Sample records for "provide sufficient sample"

  1. Method and apparatus for nitrogen oxide determination

    DOEpatents

    Hohorst, Frederick A.

    1990-01-01

    Method and apparatus for determining nitrogen oxide content in a high temperature process gas, which involves withdrawing a sample portion of a high temperature gas containing nitrogen oxide from a source to be analyzed. The sample portion is passed through a restrictive flow conduit, which may be a capillary or a restriction orifice. The restrictive flow conduit is heated to a temperature sufficient to maintain the flowing sample portion at an elevated temperature at least as great as the temperature of the high temperature gas source, to thereby provide that deposition of ammonium nitrate within the restrictive flow conduit cannot occur. The sample portion is then drawn into an aspirator device. A heated motive gas is passed to the aspirator device at a temperature at least as great as the temperature of the high temperature gas source. The motive gas is passed through the nozzle of the aspirator device under conditions sufficient to aspirate the heated sample portion through the restrictive flow conduit and produce a mixture of the sample portion in the motive gas at a dilution of the sample portion sufficient to provide that deposition of ammonium nitrate from the mixture cannot occur at reduced temperature. A portion of the cooled dilute mixture is then passed to analytical means capable of detecting nitric oxide.

  2. System and method for liquid extraction electrospray-assisted sample transfer to solution for chemical analysis

    DOEpatents

    Kertesz, Vilmos; Van Berkel, Gary J.

    2016-07-12

    A system for sampling a surface includes a surface sampling probe comprising a solvent liquid supply conduit and a distal end, and a sample collector for suspending a sample collection liquid adjacent to the distal end of the probe. A first electrode provides a first voltage to solvent liquid at the distal end of the probe. The first voltage produces a field sufficient to generate an electrospray plume at the distal end of the probe. A second electrode provides a second voltage and is positioned to produce a plume-directing field sufficient to direct the electrospray droplets and ions to the suspended sample collection liquid. The second voltage is less than the first voltage in absolute value. A voltage supply system supplies the voltages to the first electrode and the second electrode. The first electrode can apply the first voltage directly to the solvent liquid. A method for sampling a surface is also disclosed.

  3. 40 CFR 265.91 - Ground-water monitoring system.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... sufficient to yield ground-water samples that are: (i) Representative of background ground-water quality in... not required provided that provisions for sampling upgradient and downgradient water quality will... perforated, and packed with gravel or sand where necessary, to enable sample collection at depths where...

  4. 12 CFR 715.8 - Requirements for verification of accounts and passbooks.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... selection: (ii) A sample which is representative of the population from which it was selected; (iii) An equal chance of selecting each dollar in the population; (iv) Sufficient accounts in both number and... consistent with GAAS if such methods provide for: (i) Sufficient accounts in both number and scope on which...

  5. Sampling methods for amphibians in streams in the Pacific Northwest.

    Treesearch

    R. Bruce Bury; Paul Stephen Corn

    1991-01-01

    Methods describing how to sample aquatic and semiaquatic amphibians in small streams and headwater habitats in the Pacific Northwest are presented. We developed a technique that samples 10-meter stretches of selected streams, which was adequate to detect presence or absence of amphibian species and provided sample sizes statistically sufficient to compare abundance of...

  6. 40 CFR 92.127 - Emission measurement accuracy.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Emission measurement accuracy. (a) Good engineering practice dictates that exhaust emission sample analyzer... resolution read-out systems such as computers, data loggers, etc., can provide sufficient accuracy and...

  7. 40 CFR 92.127 - Emission measurement accuracy.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Emission measurement accuracy. (a) Good engineering practice dictates that exhaust emission sample analyzer... resolution read-out systems such as computers, data loggers, etc., can provide sufficient accuracy and...

  8. 40 CFR 92.127 - Emission measurement accuracy.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Emission measurement accuracy. (a) Good engineering practice dictates that exhaust emission sample analyzer... resolution read-out systems such as computers, data loggers, etc., can provide sufficient accuracy and...

  9. 40 CFR 92.127 - Emission measurement accuracy.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Emission measurement accuracy. (a) Good engineering practice dictates that exhaust emission sample analyzer... resolution read-out systems such as computers, data loggers, etc., can provide sufficient accuracy and...

  10. 40 CFR 92.127 - Emission measurement accuracy.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Emission measurement accuracy. (a) Good engineering practice dictates that exhaust emission sample analyzer... resolution read-out systems such as computers, data loggers, etc., can provide sufficient accuracy and...

  11. Biobriefcase aerosol collector

    DOEpatents

    Bell, Perry M. [Tracy, CA]; Christian, Allen T. [Madison, WI]; Bailey, Christopher G. [Pleasanton, CA]; Willis, Ladona [Manteca, CA]; Masquelier, Donald A. [Tracy, CA]; Nasarabadi, Shanavaz L. [Livermore, CA]

    2009-09-22

    A system for sampling air and collecting particles entrained in the air that potentially include bioagents. The system comprises providing a receiving surface, directing a liquid to the receiving surface, and producing a liquid surface. Samples of air are collected and directed so that the particles entrained in them impact the liquid surface, and particles potentially including bioagents become captured in the liquid. The collector directs the air to the liquid surface so that it impacts with sufficient velocity to entrain the particles into the liquid, while causing only minor turbulence on the surface and insignificant evaporation of the liquid.

  12. Optimal sampling for radiotelemetry studies of spotted owl habitat and home range.

    Treesearch

    Andrew B. Carey; Scott P. Horton; Janice A. Reid

    1989-01-01

    Radiotelemetry studies of spotted owl (Strix occidentalis) ranges and habitat-use must be designed efficiently to estimate parameters needed for a sample of individuals sufficient to describe the population. Independent data are required by analytical methods and provide the greatest return of information per effort. We examined time series of...

  13. Identification and Quantitative Analysis of Acetaminophen, Acetylsalicylic Acid, and Caffeine in Commercial Analgesic Tablets by LC-MS

    ERIC Educational Resources Information Center

    Fenk, Christopher J.; Hickman, Nicole M.; Fincke, Melissa A.; Motry, Douglas H.; Lavine, Barry

    2010-01-01

    An undergraduate LC-MS experiment is described for the identification and quantitative determination of acetaminophen, acetylsalicylic acid, and caffeine in commercial analgesic tablets. This inquiry-based experimental procedure requires minimal sample preparation and provides good analytical results. Students are provided sufficient background…

  14. SMALL AREA ESTIMATION OF INDICATORS OF STREAM CONDITION FOR MAIA USING HIERARCHICAL BAYES PREDICTION MODELS

    EPA Science Inventory

    Probability surveys of stream and river resources (hereafter referred to as streams) provide reliable estimates of stream condition when the areas for the estimates have sufficient number of sample sites. Monitoring programs are frequently asked to provide estimates for areas th...

  15. Amendment to examination and investigation sample requirements--FDA. Direct final rule.

    PubMed

    1998-09-25

    The Food and Drug Administration (FDA) is amending its regulations regarding the collection of twice the quantity of food, drug, or cosmetic estimated to be sufficient for analysis. This action increases the dollar amount that FDA will consider to determine whether to routinely collect a reserve sample of a food, drug, or cosmetic product in addition to the quantity sufficient for analysis. Experience has demonstrated that the current dollar amount does not adequately cover the cost of most quantities sufficient for analysis plus reserve samples. This direct final rule is part of FDA's continuing effort to achieve the objectives of the President's "Reinventing Government" initiative, and is intended to reduce the burden of unnecessary regulations on food, drugs, and cosmetics without diminishing the protection of the public health. Elsewhere in this issue of the Federal Register, FDA is publishing a companion proposed rule under FDA's usual procedures for notice and comment to provide a procedural framework to finalize the rule in the event the agency receives any significant adverse comment and withdraws this direct final rule.

  16. Consensus for second-order multi-agent systems with position sampled data

    NASA Astrophysics Data System (ADS)

    Wang, Rusheng; Gao, Lixin; Chen, Wenhai; Dai, Dameng

    2016-10-01

    In this paper, the consensus problem with position sampled data for second-order multi-agent systems is investigated. The interaction topology among the agents is depicted by a directed graph. Full-order and reduced-order observers with position sampled data are proposed, from which two kinds of sampled-data-based consensus protocols are constructed. With the proposed sampled protocols, the consensus convergence analysis of a continuous-time multi-agent system is equivalently transformed into that of a discrete-time system. Then, by using matrix theory and a sampled control analysis method, some necessary and sufficient consensus conditions based on the coupling parameters, the spectrum of the Laplacian matrix, and the sampling period are obtained. As the sampling period tends to zero, the established necessary and sufficient conditions degenerate to those of the continuous-time protocol, consistent with existing results for the continuous-time case. Finally, the effectiveness of the established results is illustrated by a simple simulation example. Project supported by the Natural Science Foundation of Zhejiang Province, China (Grant No. LY13F030005) and the National Natural Science Foundation of China (Grant No. 61501331).
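
    The sampled-data mechanism this record describes can be illustrated with a much-simplified sketch. All gains, the graph, and the sampling period below are hypothetical, and velocities are assumed directly measurable here, whereas the paper reconstructs them with observers: positions are sampled every h seconds and held (zero-order hold) between samples.

```python
import numpy as np

def simulate(h=0.05, dt=0.005, T=30.0, alpha=1.0, beta=2.0):
    """Second-order consensus with zero-order-hold position sampling.

    Three double-integrator agents on a path graph. The control uses
    positions sampled every h seconds (held between samples) plus
    continuous velocity damping. Gains and graph are illustrative.
    """
    L = np.array([[1., -1., 0.],
                  [-1., 2., -1.],
                  [0., -1., 1.]])   # path-graph Laplacian
    x = np.array([0., 5., 10.])     # positions
    v = np.zeros(3)                 # velocities
    x_held = x.copy()               # last sampled positions (ZOH)
    steps_per_sample = int(round(h / dt))
    for k in range(int(T / dt)):
        if k % steps_per_sample == 0:
            x_held = x.copy()       # sampling instant: refresh held positions
        u = -alpha * (L @ x_held) - beta * v
        x = x + dt * v              # forward-Euler integration
        v = v + dt * u
    return x.max() - x.min()        # position spread (0 = consensus)

spread = simulate()
print(f"final position spread: {spread:.2e}")
```

    With these gains the continuous-time closed loop is stable for every nonzero Laplacian eigenvalue, and a small enough sampling period preserves stability, which is the flavor of the period-dependent conditions the record derives.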

  17. Biocompatible, smooth, plasma-treated nickel-titanium surface--an adequate platform for cell growth.

    PubMed

    Chrzanowski, W; Szade, J; Hart, A D; Knowles, J C; Dalby, M J

    2012-02-01

    High nickel content is believed to reduce the number of biomedical applications of nickel-titanium alloy due to the reported toxicity of nickel. Reducing nickel release and minimizing the cells' exposure to nickel can optimize the biocompatibility of the alloy and increase its use in applications where its shape memory effects and pseudoelasticity are particularly useful, e.g., spinal implants. Many treatments have been tried to improve the biocompatibility of Ni-Ti, and results suggest that a native, smooth surface could provide sufficient biological tolerance. We hypothesized that the native surface of nickel-titanium supports cell differentiation and ensures good biocompatibility. Three types of surface modification were investigated: thermal oxidation, alkali treatment, and plasma sputtering, compared with a smooth, ground surface. Thermal oxidation caused a drop in surface nickel content, while negligible chemistry changes were observed for plasma-modified samples when compared with control ground samples. In contrast, alkali treatment caused a significant increase in surface nickel concentration and accelerated nickel release. Nickel release was also accelerated in samples thermally oxidized at 600 °C, while in other samples it remained at a low level. Both thermal oxidation and alkali treatment increased the roughness of the surface, but mean roughness R(a) was significantly greater for the alkali-treated samples. Ground and plasma-modified samples had 'smooth' surfaces with R(a) = 4 nm. Deformability tests showed that the adhesion of the surface layers on samples oxidized at 600 °C and on alkali-treated samples was not sufficient; the layer delaminated upon deformation. Cell cytoskeletons on the samples with high nickel content or release were less developed, suggesting some negative effects of nickel on cell growth; these effects were observed primarily during initial cell contact with the surface. The most favorable cell responses were observed for ground and plasma-sputtered surfaces. These studies indicated that smooth, plasma-modified surfaces provide sufficient properties for cells to grow. © The Author(s), 2011.

  18. DNA methylation profiling of genomic DNA isolated from urine in diabetic chronic kidney disease: A pilot study

    PubMed Central

    Sexton-Oates, Alexandra; Carmody, Jake; Ekinci, Elif I.; Dwyer, Karen M.; Saffery, Richard

    2018-01-01

    Aim To characterise the genomic DNA (gDNA) yield from urine and the quality of derived methylation data generated from the widely used Illumina Infinium MethylationEPIC (HM850K) platform, and to compare this with buffy coat samples. Background DNA methylation is the most widely studied epigenetic mark, and variations in DNA methylation profile have been implicated in diabetes, which affects approximately 415 million people worldwide. Methods The QIAamp Viral RNA Mini Kit and QIAamp DNA micro kit were used to extract DNA from frozen and fresh urine samples, as well as from increasing volumes of fresh urine. Buffy coats matched to the frozen urine were also obtained, and DNA was extracted from them using the QIAamp DNA Mini Kit. Genomic DNA at concentrations greater than 20 μg/mL was used for methylation analysis on the HM850K array. Results Irrespective of extraction technique or the use of fresh versus frozen urine samples, limited genomic DNA was obtained using a starting sample volume of 5 mL (0–0.86 μg/mL). To optimize the yield, we increased starting volumes to 50 mL of fresh urine, which yielded only 0–9.66 μg/mL. A different kit, the QIAamp DNA Micro Kit, was trialled in six fresh urine samples and ten frozen urine samples, with inadequate DNA yields of 0–17.7 μg/mL and 0–1.6 μg/mL, respectively. Sufficient genomic DNA was obtained from only 4 of the initial 41 frozen urine samples (10%) for DNA methylation profiling. In comparison, all four buffy coat samples (100%) provided sufficient genomic DNA. Conclusion High-quality data can be obtained provided a sufficient yield of genomic DNA is isolated. Despite optimizing various extraction methodologies, the modest amount of genomic DNA derived from urine may limit the generalisability of this approach for the identification of DNA methylation biomarkers of chronic diabetic kidney disease. PMID:29462136

  19. DNA methylation profiling of genomic DNA isolated from urine in diabetic chronic kidney disease: A pilot study.

    PubMed

    Lecamwasam, Ashani; Sexton-Oates, Alexandra; Carmody, Jake; Ekinci, Elif I; Dwyer, Karen M; Saffery, Richard

    2018-01-01

    To characterise the genomic DNA (gDNA) yield from urine and the quality of derived methylation data generated from the widely used Illumina Infinium MethylationEPIC (HM850K) platform, and to compare this with buffy coat samples. DNA methylation is the most widely studied epigenetic mark, and variations in DNA methylation profile have been implicated in diabetes, which affects approximately 415 million people worldwide. The QIAamp Viral RNA Mini Kit and QIAamp DNA micro kit were used to extract DNA from frozen and fresh urine samples, as well as from increasing volumes of fresh urine. Buffy coats matched to the frozen urine were also obtained, and DNA was extracted from them using the QIAamp DNA Mini Kit. Genomic DNA at concentrations greater than 20 μg/mL was used for methylation analysis on the HM850K array. Irrespective of extraction technique or the use of fresh versus frozen urine samples, limited genomic DNA was obtained using a starting sample volume of 5 mL (0–0.86 μg/mL). To optimize the yield, we increased starting volumes to 50 mL of fresh urine, which yielded only 0–9.66 μg/mL. A different kit, the QIAamp DNA Micro Kit, was trialled in six fresh urine samples and ten frozen urine samples, with inadequate DNA yields of 0–17.7 μg/mL and 0–1.6 μg/mL, respectively. Sufficient genomic DNA was obtained from only 4 of the initial 41 frozen urine samples (10%) for DNA methylation profiling. In comparison, all four buffy coat samples (100%) provided sufficient genomic DNA. High-quality data can be obtained provided a sufficient yield of genomic DNA is isolated. Despite optimizing various extraction methodologies, the modest amount of genomic DNA derived from urine may limit the generalisability of this approach for the identification of DNA methylation biomarkers of chronic diabetic kidney disease.

  20. A Database Design and Development Case: Home Theater Video

    ERIC Educational Resources Information Center

    Ballenger, Robert; Pratt, Renee

    2012-01-01

    This case consists of a business scenario of a small video rental store, Home Theater Video, which provides background information, a description of the functional business requirements, and sample data. The case provides sufficient information to design and develop a moderately complex database to assist Home Theater Video in solving their…

  1. Exploration of the factor structure of the Kirton Adaption-Innovation Inventory using bootstrapping estimation.

    PubMed

    Im, Subin; Min, Soonhong

    2013-04-01

    Exploratory factor analyses of the Kirton Adaption-Innovation Inventory (KAI), which serves to measure individual cognitive styles, generally indicate three factors: sufficiency of originality, efficiency, and rule/group conformity. In contrast, a 2005 study by Im and Hu using confirmatory factor analysis supported a four-factor structure, dividing the sufficiency of originality dimension into two subdimensions, idea generation and preference for change. This study extends Im and Hu's (2005) study of a derived version of the KAI by providing additional evidence of the four-factor structure. Specifically, the authors test the robustness of the parameter estimates to the violation of normality assumptions in the sample using bootstrap methods. A bias-corrected confidence interval bootstrapping procedure conducted among a sample of 356 participants--members of the Arkansas Household Research Panel, with middle SES and average age of 55.6 yr. (SD = 13.9)--showed that the four-factor model with two subdimensions of sufficiency of originality fits the data significantly better than the three-factor model in non-normality conditions.
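
    The bootstrapping idea used in this record can be sketched generically. This is a plain percentile bootstrap on synthetic data, not the bias-corrected procedure or the KAI data from the study; all numbers are illustrative.

```python
import random
import statistics

def percentile_bootstrap_ci(data, stat=statistics.mean, n_boot=2000,
                            alpha=0.05, seed=42):
    """Percentile bootstrap CI for a statistic, no normality assumption.

    Resample the data with replacement n_boot times, compute the
    statistic on each resample, and take the empirical alpha/2 and
    1 - alpha/2 quantiles of the resampled statistics.
    """
    rng = random.Random(seed)
    n = len(data)
    boot_stats = sorted(
        stat([data[rng.randrange(n)] for _ in range(n)])
        for _ in range(n_boot)
    )
    lo = boot_stats[int((alpha / 2) * n_boot)]
    hi = boot_stats[int((1 - alpha / 2) * n_boot) - 1]
    return lo, hi

# Skewed (non-normal) synthetic sample, mimicking the kind of setting
# where normal-theory intervals are suspect.
sample = [1, 1, 2, 2, 2, 3, 3, 4, 5, 5, 6, 8, 9, 12, 20]
lo, hi = percentile_bootstrap_ci(sample)
print(f"95% bootstrap CI for the mean: ({lo:.2f}, {hi:.2f})")
```

    Bias-corrected (BCa) intervals, as in the study, additionally adjust these quantiles for bias and skew in the bootstrap distribution.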

  2. Ultra-accelerated natural sunlight exposure testing

    DOEpatents

    Jorgensen, Gary J.; Bingham, Carl; Goggin, Rita; Lewandowski, Allan A.; Netter, Judy C.

    2000-06-13

    Process and apparatus for providing ultra accelerated natural sunlight exposure testing of samples under controlled weathering without introducing unrealistic failure mechanisms in exposed materials and without breaking reciprocity relationships between flux exposure levels and cumulative dose that includes multiple concurrent levels of temperature and relative humidity at high levels of natural sunlight comprising: a) concentrating solar flux uniformly; b) directing the controlled uniform sunlight onto sample materials in a chamber enclosing multiple concurrent levels of temperature and relative humidity to allow the sample materials to be subjected to accelerated irradiance exposure factors for a sufficient period of time in days to provide a corresponding time of about at least a years worth of representative weathering of the sample materials.

  3. Broad Consent for Research on Biospecimens: The Views of Actual Donors at Four U.S. Medical Centers.

    PubMed

    Warner, Teddy D; Weil, Carol J; Andry, Christopher; Degenholtz, Howard B; Parker, Lisa; Carithers, Latarsha J; Feige, Michelle; Wendler, David; Pentz, Rebecca D

    2018-04-01

    Commentators are concerned that broad consent may not provide biospecimen donors with sufficient information regarding possible future research uses of their tissue. We surveyed, using interviews, 302 cancer patients who had recently provided broad consent at four diverse academic medical centers. The majority of donors believed that the consent form provided them with sufficient information regarding future possible uses of their biospecimens. Donors expressed very positive views regarding tissue donation in general and endorsed the use of their biospecimens in future research across a wide range of contexts. Concerns regarding future uses were limited to for-profit research and research by investigators in other countries. These results support the use of broad consent to store and use biological samples in future research.

  4. Digital Holographic Microscopy, a Method for Detection of Microorganisms in Plume Samples from Enceladus and Other Icy Worlds

    PubMed Central

    Bedrossian, Manuel; Lindensmith, Chris

    2017-01-01

    Detection of extant microbial life on Earth and elsewhere in the Solar System requires the ability to identify and enumerate micrometer-scale, essentially featureless cells. On Earth, bacteria are usually enumerated by culture plating or epifluorescence microscopy. Culture plates require long incubation times and can only count culturable strains, and epifluorescence microscopy requires extensive staining and concentration of the sample and instrumentation that is not readily miniaturized for space. Digital holographic microscopy (DHM) represents an alternative technique with no moving parts and higher throughput than traditional microscopy, making it potentially useful in space for detection of extant microorganisms provided that sufficient numbers of cells can be collected. Because sample collection is expected to be the limiting factor for space missions, especially to outer planets, it is important to quantify the limits of detection of any proposed technique for extant life detection. Here we use both laboratory and field samples to measure the limits of detection of an off-axis digital holographic microscope (DHM). A statistical model is used to estimate any instrument's probability of detection at various bacterial concentrations based on the optical performance characteristics of the instrument, as well as estimate the confidence interval of detection. This statistical model agrees well with the limit of detection of 10^3 cells/mL that was found experimentally with laboratory samples. In environmental samples, active cells were immediately evident at concentrations of 10^4 cells/mL. Published estimates of cell densities for Enceladus plumes yield up to 10^4 cells/mL, which is well within the off-axis DHM's limits of detection at confidence intervals greater than or equal to 95%, assuming sufficient sample volumes can be collected. The quantitative phase imaging provided by DHM allowed minerals to be distinguished from cells. Off-axis DHM's ability for rapid low-level bacterial detection and counting shows its viability as a technique for detection of extant microbial life, provided that the cells can be captured intact and delivered to the sample chamber in a sufficient volume of liquid for imaging. Key Words: In situ life detection—Extant microorganisms—Holographic microscopy—Ocean Worlds—Enceladus—Imaging. Astrobiology 17, 913–925. PMID:28708412
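
    The record's detection-probability reasoning can be approximated by a simple Poisson sketch (the imaged volume below is a hypothetical assumption, not the instrument's actual parameter): if cells are randomly dispersed at concentration c (cells/mL) and the microscope images a volume V (mL), the cell count in view is Poisson with mean cV, so the chance of seeing at least one cell is 1 - exp(-cV).

```python
import math

def p_detect(conc_cells_per_ml, imaged_volume_ml):
    """P(at least one cell in the imaged volume), Poisson model."""
    return 1.0 - math.exp(-conc_cells_per_ml * imaged_volume_ml)

def conc_for_confidence(p, imaged_volume_ml):
    """Concentration needed to see >= 1 cell with probability p."""
    return -math.log(1.0 - p) / imaged_volume_ml

V = 3e-3  # mL imaged per run -- an illustrative assumption
for c in (1e2, 1e3, 1e4):
    print(f"c = {c:8.0f} cells/mL -> P(detect) = {p_detect(c, V):.3f}")
print(f"c needed for 95% confidence: {conc_for_confidence(0.95, V):.0f} cells/mL")
```

    Inverting the model this way is how a limit of detection at a stated confidence level can be tied to the volume of liquid actually delivered to the sample chamber.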

  5. Preparation of DNA-containing extract for PCR amplification

    DOEpatents

    Dunbar, John M.; Kuske, Cheryl R.

    2006-07-11

    Environmental samples typically include impurities that interfere with PCR amplification and DNA quantitation. Samples of soil, river water, and aerosol were taken from the environment and added to an aqueous buffer (with or without detergent). Cells from the sample are lysed, releasing their DNA into the buffer. After removing insoluble cell components, the remaining soluble DNA-containing extract is treated with N-phenacylthiazolium bromide, which causes rapid precipitation of impurities. Centrifugation provides a supernatant that can be used or diluted for PCR amplification of DNA, or further purified. The method may provide a DNA-containing extract sufficiently pure for PCR amplification within 5–10 minutes.

  6. Passive injection control for microfluidic systems

    DOEpatents

    Paul, Phillip H.; Arnold, Don W.; Neyer, David W.

    2004-12-21

    Apparatus for eliminating siphoning, "dead" regions, and fluid concentration gradients in microscale analytical devices. In its most basic embodiment, the present invention affords passive injection control for both electric field-driven and pressure-driven systems by providing additional fluid flow channels or auxiliary channels disposed on either side of a sample separation column. The auxiliary channels are sized such that volumetric fluid flow rate through these channels, while sufficient to move the sample away from the sample injection region in a timely fashion, is less than that through the sample separation channel or chromatograph.

  7. Rapid DNA extraction protocol for detection of alpha-1 antitrypsin deficiency from dried blood spots by real-time PCR.

    PubMed

    Struniawski, R; Szpechcinski, A; Poplawska, B; Skronski, M; Chorostowska-Wynimko, J

    2013-01-01

    Dried blood spot (DBS) specimens have been successfully employed for large-scale diagnostics of α1-antitrypsin (AAT) deficiency as an easy-to-collect-and-transport alternative to plasma/serum. In the present study we propose a fast, efficient, and cost-effective protocol for DNA extraction from DBS samples that provides sufficient quantity and quality of DNA and effectively eliminates natural PCR inhibitors, allowing successful AAT genotyping by real-time PCR and direct sequencing. DNA extracted from 84 DBS samples from chronic obstructive pulmonary disease patients was genotyped for AAT deficiency variants by real-time PCR. The results of DBS AAT genotyping were validated by serum IEF phenotyping and AAT concentration measurement. The proposed protocol allowed successful DNA extraction from all analyzed DBS samples. Both quantity and quality of DNA were sufficient for real-time PCR and, if necessary, for genetic sequence analysis. A 100% concordance between DBS AAT genotypes and serum phenotypes was achieved in positive detection of the two major deficiency alleles, S and Z. Both assays, DBS AAT genotyping by real-time PCR and serum AAT phenotyping by IEF, positively identified the PI*S and PI*Z alleles in 8 of 84 (9.5%) and 16 of 84 (19.0%) patients, respectively. In conclusion, the proposed protocol noticeably reduces the cost and hands-on time of DBS sample preparation, providing genomic DNA of sufficient quantity and quality for real-time PCR or genetic sequence analysis. Consequently, it is ideally suited for large-scale AAT deficiency screening programs and should be the method of choice.

  8. Evaluation of water-quality data and monitoring program for Lake Travis, near Austin, Texas

    USGS Publications Warehouse

    Rast, Walter; Slade, Raymond M.

    1998-01-01

    The multiple-comparison tests indicate that, for some constituents, a single sampling site for a constituent or property might adequately characterize the water quality of Lake Travis for that constituent or property. However, multiple sampling sites are required to provide information of sufficient temporal and spatial resolution to accurately evaluate other water-quality constituents for the reservoir. For example, the water-quality data from surface samples and from bottom samples indicate that nutrients (nitrogen, phosphorus) might require additional sampling sites for a more accurate characterization of their in-lake dynamics.

  9. Clinical decision making and the expected value of information.

    PubMed

    Willan, Andrew R

    2007-01-01

    The results of the HOPE study, a randomized clinical trial, provide strong evidence that 1) ramipril prevents the composite outcome of cardiovascular death, myocardial infarction, or stroke in patients who are at high risk of a cardiovascular event, and 2) ramipril is cost-effective at a threshold willingness-to-pay of $10,000 to prevent an event of the composite outcome. In this report the concept of the expected value of information is used to determine whether the information provided by the HOPE study is sufficient for decision making in the US and Canada. Using the cost-effectiveness data from a clinical trial, or from a meta-analysis of several trials, one can determine, based on the number of future patients that would benefit from the health technology under investigation, the expected value of sample information (EVSI) of a future trial as a function of proposed sample size. If the EVSI exceeds the cost for any particular sample size, then the current information is insufficient for decision making and a future trial is indicated. If, on the other hand, there is no sample size for which the EVSI exceeds the cost, then there is sufficient information for decision making and no future trial is required. Using the data from the HOPE study, these concepts are applied for various assumptions regarding the fixed and variable cost of a future trial and the number of patients who would benefit from ramipril. Expected value of information methods provide a decision-analytic alternative to the standard likelihood methods for assessing the evidence provided by cost-effectiveness data from randomized clinical trials.
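
    The value-of-information logic can be sketched with the simpler expected value of perfect information (EVPI) for a normally distributed incremental net benefit; EVSI refines this by valuing the information from a trial of a specific sample size, and is bounded above by EVPI. All inputs below are illustrative assumptions, not the HOPE study's values.

```python
import math

def norm_pdf(z):
    return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

def norm_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def evpi_per_person(b0, s0):
    """EVPI per patient for a Normal(b0, s0^2) incremental net benefit.

    The expected opportunity loss of deciding with current information:
    s0 * phi(b0/s0) - |b0| * Phi(-|b0|/s0)  (unit normal loss function).
    """
    z = b0 / s0
    return s0 * norm_pdf(z) - abs(b0) * norm_cdf(-abs(z))

# Illustrative inputs: mean incremental net benefit $150/patient,
# sd $400/patient, 100,000 future patients who would use the therapy.
b0, s0, n_future = 150.0, 400.0, 100_000
total_evpi = n_future * evpi_per_person(b0, s0)
print(f"population EVPI: ${total_evpi:,.0f}")
```

    If no achievable trial has EVSI above its fixed plus variable cost, current information is sufficient for decision making, which is the rule the record applies to the HOPE data.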

  10. 29 CFR 1910.1029 - Coke oven emissions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., including at least one sample during each shift for each battery and each job classification within the... controls to control coke oven emissions during charging operations: (a) One of the following methods of...) Aspiration systems designed and operated to provide sufficient negative pressure and flow volume to...

  11. 29 CFR 1910.1029 - Coke oven emissions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., including at least one sample during each shift for each battery and each job classification within the... controls to control coke oven emissions during charging operations: (a) One of the following methods of...) Aspiration systems designed and operated to provide sufficient negative pressure and flow volume to...

  12. 29 CFR 1910.1029 - Coke oven emissions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., including at least one sample during each shift for each battery and each job classification within the... controls to control coke oven emissions during charging operations: (a) One of the following methods of...) Aspiration systems designed and operated to provide sufficient negative pressure and flow volume to...

  13. Automated Analysis of Child Phonetic Production Using Naturalistic Recordings

    ERIC Educational Resources Information Center

    Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill

    2014-01-01

    Purpose: Conventional resource-intensive methods for child phonetic development studies are often impractical for sampling and analyzing child vocalizations in sufficient quantity. The purpose of this study was to provide new information on early language development by an automated analysis of child phonetic production using naturalistic…

  14. Study site characterization. Chapter 2

    Treesearch

    Chris Potter; Richard Birdsey

    2008-01-01

    This chapter is an overview of the main site characterization requirements at landscape-scale sampling locations. The overview is organized according to multiple "Site Attribute" headings that require descriptions throughout a given study site area, leading ultimately to a sufficient overall site characterization. Guidance is provided to describe the major...

  15. Differences by Degree: Evidence of the Net Financial Rates of Return to Undergraduate Study for England and Wales

    ERIC Educational Resources Information Center

    Walker, Ian; Zhu, Yu

    2011-01-01

    This paper provides estimates of the impact of higher education qualifications on the earnings of graduates in the U.K. by subject studied. We use data from the recent U.K. Labour Force Surveys which provide a sufficiently large sample to consider the effects of the subject studied, class of first degree, and postgraduate qualifications. Ordinary…

  16. DESCARTES' RULE OF SIGNS AND THE IDENTIFIABILITY OF POPULATION DEMOGRAPHIC MODELS FROM GENOMIC VARIATION DATA.

    PubMed

    Bhaskar, Anand; Song, Yun S

    2014-01-01

    The sample frequency spectrum (SFS) is a widely-used summary statistic of genomic variation in a sample of homologous DNA sequences. It provides a highly efficient dimensional reduction of large-scale population genomic data and its mathematical dependence on the underlying population demography is well understood, thus enabling the development of efficient inference algorithms. However, it has been recently shown that very different population demographies can actually generate the same SFS for arbitrarily large sample sizes. Although in principle this nonidentifiability issue poses a thorny challenge to statistical inference, the population size functions involved in the counterexamples are arguably not so biologically realistic. Here, we revisit this problem and examine the identifiability of demographic models under the restriction that the population sizes are piecewise-defined where each piece belongs to some family of biologically-motivated functions. Under this assumption, we prove that the expected SFS of a sample uniquely determines the underlying demographic model, provided that the sample is sufficiently large. We obtain a general bound on the sample size sufficient for identifiability; the bound depends on the number of pieces in the demographic model and also on the type of population size function in each piece. In the cases of piecewise-constant, piecewise-exponential and piecewise-generalized-exponential models, which are often assumed in population genomic inferences, we provide explicit formulas for the bounds as simple functions of the number of pieces. Lastly, we obtain analogous results for the "folded" SFS, which is often used when there is ambiguity as to which allelic type is ancestral. Our results are proved using a generalization of Descartes' rule of signs for polynomials to the Laplace transform of piecewise continuous functions.
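As background to the summary statistic discussed above, both the unfolded and the folded SFS can be computed directly from a 0/1 matrix of derived-allele calls. This is a minimal illustrative sketch; it does not implement the paper's identifiability bounds:

```python
import numpy as np

def sfs(genotypes):
    """Unfolded SFS from a (n_samples, n_sites) 0/1 matrix (1 = derived allele)."""
    n = genotypes.shape[0]
    counts = genotypes.sum(axis=0)
    # entry i-1 is the number of sites where the derived allele appears in
    # exactly i of the n sequences; monomorphic sites (0 or n) are excluded
    return np.array([(counts == i).sum() for i in range(1, n)])

def folded_sfs(spectrum):
    """Fold the SFS when the ancestral/derived labeling is ambiguous."""
    n = len(spectrum) + 1
    folded = []
    for i in range(1, n // 2 + 1):
        if i == n - i:
            folded.append(spectrum[i - 1])          # middle class, no partner
        else:
            folded.append(spectrum[i - 1] + spectrum[n - i - 1])
    return np.array(folded)
```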

  17. DESCARTES’ RULE OF SIGNS AND THE IDENTIFIABILITY OF POPULATION DEMOGRAPHIC MODELS FROM GENOMIC VARIATION DATA1

    PubMed Central

    Bhaskar, Anand; Song, Yun S.

    2016-01-01

    The sample frequency spectrum (SFS) is a widely-used summary statistic of genomic variation in a sample of homologous DNA sequences. It provides a highly efficient dimensional reduction of large-scale population genomic data and its mathematical dependence on the underlying population demography is well understood, thus enabling the development of efficient inference algorithms. However, it has been recently shown that very different population demographies can actually generate the same SFS for arbitrarily large sample sizes. Although in principle this nonidentifiability issue poses a thorny challenge to statistical inference, the population size functions involved in the counterexamples are arguably not so biologically realistic. Here, we revisit this problem and examine the identifiability of demographic models under the restriction that the population sizes are piecewise-defined where each piece belongs to some family of biologically-motivated functions. Under this assumption, we prove that the expected SFS of a sample uniquely determines the underlying demographic model, provided that the sample is sufficiently large. We obtain a general bound on the sample size sufficient for identifiability; the bound depends on the number of pieces in the demographic model and also on the type of population size function in each piece. In the cases of piecewise-constant, piecewise-exponential and piecewise-generalized-exponential models, which are often assumed in population genomic inferences, we provide explicit formulas for the bounds as simple functions of the number of pieces. Lastly, we obtain analogous results for the “folded” SFS, which is often used when there is ambiguity as to which allelic type is ancestral. Our results are proved using a generalization of Descartes’ rule of signs for polynomials to the Laplace transform of piecewise continuous functions. PMID:28018011

  18. 40 CFR 86.1338-84 - Emission measurement accuracy.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...

  19. 40 CFR 86.1338-84 - Emission measurement accuracy.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...

  20. 40 CFR 86.1338-84 - Emission measurement accuracy.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...

  1. 40 CFR 86.1338-84 - Emission measurement accuracy.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... engineering practice dictates that exhaust emission sample analyzer readings below 15 percent of full scale... computers, data loggers, etc., can provide sufficient accuracy and resolution below 15 percent of full scale... spaced points, using good engineering judgement, below 15 percent of full scale are made to ensure the...

  2. Method for isolating nucleic acids

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hurt, Jr., Richard Ashley; Elias, Dwayne A.

    The current disclosure provides methods and kits for isolating nucleic acid from an environmental sample. The current methods and compositions further provide methods for isolating nucleic acids by reducing adsorption of nucleic acids by charged ions and particles within an environmental sample. The methods of the current disclosure provide methods for isolating nucleic acids by releasing adsorbed nucleic acids from charged particles during the nucleic acid isolation process. The current disclosure facilitates the isolation of nucleic acids of sufficient quality and quantity to enable one of ordinary skill in the art to utilize or analyze the isolated nucleic acids for a wide variety of applications, including sequencing or species population analysis.

  3. Characterizations of linear sufficient statistics

    NASA Technical Reports Server (NTRS)

    Peters, B. C., Jr.; Reoner, R.; Decell, H. P., Jr.

    1977-01-01

    A surjective bounded linear operator T from a Banach space X to a Banach space Y must be a sufficient statistic for a dominated family of probability measures defined on the Borel sets of X. These results were applied, so that they characterize linear sufficient statistics for families of the exponential type, including as special cases the Wishart and multivariate normal distributions. The latter result was used to establish precisely which procedures for sampling from a normal population had the property that the sample mean was a sufficient statistic.
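For context, sufficiency of a statistic is usually verified via the Fisher-Neyman factorization criterion; the normal-mean case mentioned above is the standard instance. This is textbook background, not material from the paper itself:

```latex
% Fisher-Neyman: T is sufficient for \theta iff the density factors as
f_\theta(x) = g_\theta\!\bigl(T(x)\bigr)\, h(x).

% For X_1,\dots,X_n \sim N(\mu,\sigma^2) with \sigma^2 known, using
% \sum_i (x_i-\mu)^2 = \sum_i (x_i-\bar{x})^2 + n(\bar{x}-\mu)^2:
\prod_{i=1}^n f_\mu(x_i)
  = \underbrace{\exp\!\Bigl(-\tfrac{n}{2\sigma^2}(\bar{x}-\mu)^2\Bigr)}_{g_\mu(\bar{x})}
    \cdot
    \underbrace{(2\pi\sigma^2)^{-n/2}
      \exp\!\Bigl(-\tfrac{1}{2\sigma^2}\sum_{i=1}^n (x_i-\bar{x})^2\Bigr)}_{h(x)},

% so the sample mean \bar{x} is sufficient for \mu.
```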

  4. Do sufficient vitamin D levels at the end of summer in children and adolescents provide an assurance of vitamin D sufficiency at the end of winter? A cohort study.

    PubMed

    Shakeri, Habibesadat; Pournaghi, Seyed-Javad; Hashemi, Javad; Mohammad-Zadeh, Mohammad; Akaberi, Arash

    2017-10-26

    The changes in serum 25-hydroxyvitamin D (25(OH)D) in adolescents from summer to winter and the optimal serum vitamin D levels in the summer needed to ensure adequate vitamin D levels at the end of winter are currently unknown. This cohort study was conducted to address this knowledge gap. Sixty-eight participants aged 7-18 years who had sufficient vitamin D levels at the end of the summer in 2011 were selected using stratified random sampling. Subsequently, the participants' vitamin D levels were measured at the end of the winter in 2012. A receiver operating characteristic (ROC) curve was used to determine optimal cutoff points for vitamin D at the end of the summer to predict sufficient vitamin D levels at the end of the winter. The results indicated that 89.7% of all the participants had a decrease in vitamin D levels from summer to winter: 14.7% of them were vitamin D-deficient, 36.8% had insufficient vitamin D concentrations and only 48.5% were able to maintain sufficient vitamin D. The optimal cutoff point to provide assurance of sufficient serum vitamin D at the end of the winter was 40 ng/mL at the end of the summer. Sex, age and vitamin D levels at the end of the summer were significant predictors of non-sufficient vitamin D at the end of the winter. In this age group, a dramatic reduction in vitamin D was observed over the follow-up period. Sufficient vitamin D at the end of the summer did not guarantee vitamin D sufficiency at the end of the winter. We found 40 ng/mL to be an optimal cutoff point.
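An ROC-derived cutoff like the 40 ng/mL value above is often chosen by maximizing Youden's J statistic (sensitivity + specificity - 1). Whether the authors used this exact criterion is not stated in the abstract, so treat the following as a generic sketch with hypothetical variable names and synthetic data:

```python
import numpy as np

def youden_cutoff(values, outcome):
    """Pick the cutoff on `values` that maximizes Youden's J for `outcome`.

    values: e.g. end-of-summer 25(OH)D levels (hypothetical variable)
    outcome: 1 if the subject remained sufficient at the end of winter
    """
    best_cut, best_j = None, -1.0
    for c in np.unique(values):
        pred = values >= c                     # predict "will stay sufficient"
        tp = np.sum(pred & (outcome == 1))
        fn = np.sum(~pred & (outcome == 1))
        tn = np.sum(~pred & (outcome == 0))
        fp = np.sum(pred & (outcome == 0))
        sens = tp / (tp + fn) if tp + fn else 0.0
        spec = tn / (tn + fp) if tn + fp else 0.0
        j = sens + spec - 1.0                  # Youden's J statistic
        if j > best_j:
            best_j, best_cut = j, c
    return best_cut, best_j
```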

  5. Characterizing the zenithal night sky brightness in large territories: how many samples per square kilometre are needed?

    NASA Astrophysics Data System (ADS)

    Bará, Salvador

    2018-01-01

    A recurring question arises when trying to characterize, by means of measurements or theoretical calculations, the zenithal night sky brightness throughout a large territory: how many samples per square kilometre are needed? The optimum sampling distance should allow reconstructing, with sufficient accuracy, the continuous zenithal brightness map across the whole region, whilst at the same time avoiding unnecessary and redundant oversampling. This paper attempts to provide some tentative answers to this issue, using two complementary tools: the luminance structure function and the Nyquist-Shannon spatial sampling theorem. The analysis of several regions of the world, based on the data from the New world atlas of artificial night sky brightness, suggests that, as a rule of thumb, about one measurement per square kilometre could be sufficient for determining the zenithal night sky brightness of artificial origin at any point in a region to within ±0.1 mag arcsec⁻² (in the root-mean-square sense) of its true value in the Johnson-Cousins V band. The exact reconstruction of the zenithal night sky brightness maps from samples taken at the Nyquist rate seems to be considerably more demanding.
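The Nyquist-Shannon argument in the abstract reduces to a one-line calculation: if the brightness map is effectively band-limited to a highest spatial frequency f_max, the sampling grid spacing must not exceed 1/(2 f_max). A toy sketch (the 0.5 cycles/km figure is an assumption for illustration, not a value from the paper):

```python
def nyquist_spacing(f_max_cycles_per_km):
    # Nyquist-Shannon: sample at at least twice the highest spatial frequency,
    # i.e. grid spacing no larger than 1 / (2 * f_max)
    return 1.0 / (2.0 * f_max_cycles_per_km)

def samples_per_km2(f_max_cycles_per_km):
    # density of a square sampling grid at the Nyquist spacing
    dx = nyquist_spacing(f_max_cycles_per_km)
    return 1.0 / dx**2

# e.g. a map band-limited to 0.5 cycles/km needs a 1 km grid,
# i.e. one sample per square kilometre
```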

  6. Reconstructing gravitational wave source parameters via direct comparisons to numerical relativity I: Method

    NASA Astrophysics Data System (ADS)

    Lange, Jacob; O'Shaughnessy, Richard; Healy, James; Lousto, Carlos; Shoemaker, Deirdre; Lovelace, Geoffrey; Scheel, Mark; Ossokine, Serguei

    2016-03-01

    In this talk, we describe a procedure to reconstruct the parameters of sufficiently massive coalescing compact binaries via direct comparison with numerical relativity simulations. For sufficiently massive sources, existing numerical relativity simulations are long enough to cover the observationally accessible part of the signal. Due to the signal's brevity, the posterior parameter distribution it implies is broad, simple, and easily reconstructed from information gained by comparing to only the sparse sample of existing numerical relativity simulations. We describe how followup simulations can corroborate and improve our understanding of a detected source. Since our method can include all physics provided by full numerical relativity simulations of coalescing binaries, it provides a valuable complement to alternative techniques which employ approximations to reconstruct source parameters. Supported by NSF Grant PHY-1505629.

  7. Characterizing dispersal patterns in a threatened seabird with limited genetic structure

    Treesearch

    Laurie A. Hall; Per J. Palsboll; Steven R. Beissinger; James T. Harvey; Martine Berube; Martin G. Raphael; Kim Nelson; Richard T. Golightly; Laura McFarlane-Tranquilla; Scott H. Newman; M. Zachariah Peery

    2009-01-01

    Genetic assignment methods provide an appealing approach for characterizing dispersal patterns on ecological time scales, but require sufficient genetic differentiation to accurately identify migrants and a large enough sample size of migrants to, for example, compare dispersal between sexes or age classes. We demonstrate that assignment methods can be rigorously used...

  8. 40 CFR 267.13 - What are my waste analysis requirements?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... representative sample of the wastes. At a minimum, the analysis must contain all the information needed to treat... analysis for these parameters will provide sufficient information on the waste's properties to comply with... 40 Protection of Environment 27 2014-07-01 2014-07-01 false What are my waste analysis...

  9. 40 CFR 267.13 - What are my waste analysis requirements?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... representative sample of the wastes. At a minimum, the analysis must contain all the information needed to treat... analysis for these parameters will provide sufficient information on the waste's properties to comply with... 40 Protection of Environment 28 2013-07-01 2013-07-01 false What are my waste analysis...

  10. 40 CFR 267.13 - What are my waste analysis requirements?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... representative sample of the wastes. At a minimum, the analysis must contain all the information needed to treat... analysis for these parameters will provide sufficient information on the waste's properties to comply with... 40 Protection of Environment 27 2011-07-01 2011-07-01 false What are my waste analysis...

  11. 40 CFR 267.13 - What are my waste analysis requirements?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... representative sample of the wastes. At a minimum, the analysis must contain all the information needed to treat... analysis for these parameters will provide sufficient information on the waste's properties to comply with... 40 Protection of Environment 28 2012-07-01 2012-07-01 false What are my waste analysis...

  12. 40 CFR 267.13 - What are my waste analysis requirements?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... representative sample of the wastes. At a minimum, the analysis must contain all the information needed to treat... analysis for these parameters will provide sufficient information on the waste's properties to comply with... 40 Protection of Environment 26 2010-07-01 2010-07-01 false What are my waste analysis...

  13. In-gas-cell laser ionization studies of plutonium isotopes at IGISOL

    NASA Astrophysics Data System (ADS)

    Pohjalainen, I.; Moore, I. D.; Kron, T.; Raeder, S.; Sonnenschein, V.; Tomita, H.; Trautmann, N.; Voss, A.; Wendt, K.

    2016-06-01

    In-gas-cell resonance laser ionization has been performed on long-lived isotopes of Pu at the IGISOL facility, Jyväskylä. This initiates a new programme of research towards high-resolution optical spectroscopy of heavy actinide elements which can be produced in sufficient quantities at research reactors and transported to facilities elsewhere. In this work a new gas cell has been constructed for fast extraction of laser-ionized elements. Samples of 238-240,242Pu and 244Pu have been evaporated from Ta filaments, laser ionized, mass separated and delivered to the collinear laser spectroscopy station. Here we report on the performance of the gas cell through studies of the mass spectra obtained in helium and argon, before and after the radiofrequency quadrupole cooler-buncher. This provides valuable insight into the gas phase chemistry exhibited by Pu, which has been additionally supported by measurements of ion time profiles. The resulting monoatomic yields are sufficient for collinear laser spectroscopy. A gamma-ray spectroscopic analysis of the Pu samples shows a good agreement with the assay provided by the Mainz Nuclear Chemistry department.

  14. Comparison of DNA extraction methods for human gut microbial community profiling.

    PubMed

    Lim, Mi Young; Song, Eun-Ji; Kim, Sang Ho; Lee, Jangwon; Nam, Young-Do

    2018-03-01

    The human gut harbors a vast range of microbes that have significant impact on health and disease. Therefore, gut microbiome profiling holds promise for use in early diagnosis and precision medicine development. Accurate profiling of the highly complex gut microbiome requires DNA extraction methods that provide sufficient coverage of the original community as well as adequate quality and quantity. We tested nine different DNA extraction methods using three commercial kits (TianLong Stool DNA/RNA Extraction Kit (TS), QIAamp DNA Stool Mini Kit (QS), and QIAamp PowerFecal DNA Kit (QP)) with or without additional bead-beating step using manual or automated methods and compared them in terms of DNA extraction ability from human fecal sample. All methods produced DNA in sufficient concentration and quality for use in sequencing, and the samples were clustered according to the DNA extraction method. Inclusion of bead-beating step especially resulted in higher degrees of microbial diversity and had the greatest effect on gut microbiome composition. Among the samples subjected to bead-beating method, TS kit samples were more similar to QP kit samples than QS kit samples. Our results emphasize the importance of mechanical disruption step for a more comprehensive profiling of the human gut microbiome.

  15. Modified zirconium-eriochrome cyanine R determination of fluoride

    USGS Publications Warehouse

    Thatcher, L.L.

    1957-01-01

    The Eriochrome Cyanine R method for determining fluoride in natural water has been modified to provide a single, stable reagent solution, eliminate interference from oxidizing agents, extend the concentration range to 3 p.p.m., and extend the phosphate tolerance. Temperature effect was minimized; sulfate error was eliminated by precipitation. The procedure is sufficiently tolerant to interferences found in natural and polluted waters to permit the elimination of prior distillation for most samples. The method has been applied to 500 samples.

  16. Determination of 241Am in soil using an automated nuclear radiation measurement laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Engstrom, D.E.; White, M.G.; Dunaway, P.B.

    The recent completion of REECo's Automated Laboratory and associated software systems has provided a significant increase in capability while reducing manpower requirements. The system is designed to perform gamma spectrum analyses on the large numbers of samples required by the current Nevada Applied Ecology Group (NAEG) and Plutonium Distribution Inventory Program (PDIP) soil sampling programs while maintaining sufficient sensitivities as defined by earlier investigations of the same type. The hardware and systems are generally described in this paper, with emphasis being placed on spectrum reduction and the calibration procedures used for soil samples. (auth)

  17. The Monitoring the Future Project After Thirty-Two Years: Design and Procedures. Monitoring the Future Occasional Paper 64

    ERIC Educational Resources Information Center

    Bachman, Jerald G.; Johnston, Lloyd D.; O'Malley, Patrick M.; Schulenberg, John E.

    2006-01-01

    This occasional paper updates and extends earlier papers in the Monitoring the Future project. It provides a detailed description of the project's design, including sampling design, data collection procedures, measurement content, and questionnaire format. It attempts to include sufficient information for others who wish to evaluate the results,…

  18. A Sampling of Community-Based Housing Efforts at Pine Ridge Indian Reservation

    ERIC Educational Resources Information Center

    Wood, Clinton L.; Clevenger, Caroline M.

    2012-01-01

    Pine Ridge Indian Reservation is in need of several thousand houses to alleviate overcrowding and improve living conditions. The United States government has failed to provide appropriate or sufficient housing and other individuals and organizations that have attempted to build homes for the Lakota have met with widely varying results. This paper…

  19. Requirements for Minimum Sample Size for Sensitivity and Specificity Analysis

    PubMed Central

    Adnan, Tassha Hilda

    2016-01-01

    Sensitivity and specificity analysis is commonly used for screening and diagnostic tests. The main issue researchers face is determining the sample sizes sufficient for screening and diagnostic studies. Although formulas for sample size calculation are available, the majority of researchers are not mathematicians or statisticians, so sample size calculation may not be easy for them. This review paper provides sample size tables for sensitivity and specificity analysis. These tables were derived from the formulation of the sensitivity and specificity test using Power Analysis and Sample Size (PASS) software, based on the desired type I error, power and effect size. Approaches to using the tables are also discussed. PMID:27891446
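For readers without access to the tables, a commonly cited closed form for these sample sizes (Buderer's 1996 formulation; the paper's PASS-derived tables may differ in detail) is easy to compute directly:

```python
import math
from statistics import NormalDist

def n_for_sensitivity(sens, precision, prevalence, alpha=0.05):
    """Minimum total N to estimate sensitivity within +/- `precision`.

    Buderer (1996): n_cases = z^2 * Se * (1 - Se) / d^2, then divide by the
    disease prevalence to get the total screened sample needed to yield
    that many cases.
    """
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    n_cases = z * z * sens * (1.0 - sens) / precision**2
    return math.ceil(n_cases / prevalence)

def n_for_specificity(spec, precision, prevalence, alpha=0.05):
    # same idea, but scaled by the proportion of disease-free subjects
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    n_controls = z * z * spec * (1.0 - spec) / precision**2
    return math.ceil(n_controls / (1.0 - prevalence))
```

For example, estimating an expected sensitivity of 0.90 to within ±0.05 at 10% prevalence requires screening 1383 subjects under this formulation.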

  20. Evaluating multi-level models to test occupancy state responses of Plethodontid salamanders

    USGS Publications Warehouse

    Kroll, Andrew J.; Garcia, Tiffany S.; Jones, Jay E.; Dugger, Catherine; Murden, Blake; Johnson, Josh; Peerman, Summer; Brintz, Ben; Rochelle, Michael

    2015-01-01

    Plethodontid salamanders are diverse and widely distributed taxa and play critical roles in ecosystem processes. Due to salamander use of structurally complex habitats, and because only a portion of a population is available for sampling, evaluation of sampling designs and estimators is critical to provide strong inference about Plethodontid ecology and responses to conservation and management activities. We conducted a simulation study to evaluate the effectiveness of multi-scale and hierarchical single-scale occupancy models in the context of a Before-After Control-Impact (BACI) experimental design with multiple levels of sampling. Also, we fit the hierarchical single-scale model to empirical data collected for Oregon slender and Ensatina salamanders across two years on 66 forest stands in the Cascade Range, Oregon, USA. All models were fit within a Bayesian framework. Estimator precision in both models improved with increasing numbers of primary and secondary sampling units, underscoring the potential gains accrued when adding secondary sampling units. Both models showed evidence of estimator bias at low detection probabilities and low sample sizes; this problem was particularly acute for the multi-scale model. Our results suggested that sufficient sample sizes at both the primary and secondary sampling levels could ameliorate this issue. Empirical data indicated Oregon slender salamander occupancy was associated strongly with the amount of coarse woody debris (posterior mean = 0.74; SD = 0.24); Ensatina occupancy was not associated with amount of coarse woody debris (posterior mean = -0.01; SD = 0.29). Our simulation results indicate that either model is suitable for use in an experimental study of Plethodontid salamanders provided that sample sizes are sufficiently large. However, hierarchical single-scale and multi-scale models describe different processes and estimate different parameters. As a result, we recommend careful consideration of study questions and objectives prior to sampling data and fitting models.
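The core estimation problem in the simulation study above, that only part of a population is available for sampling, can be illustrated with a minimal single-season occupancy simulation. This is a generic sketch, not the authors' multi-scale or BACI models, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_occupancy(n_sites, n_visits, psi, p, rng):
    """Simulate detection histories under a single-season occupancy model."""
    # true occupancy state per site (primary sampling unit)
    z = rng.random(n_sites) < psi
    # per-visit detections (secondary sampling units): only occupied sites
    # can yield detections, each with probability p
    y = (rng.random((n_sites, n_visits)) < p) & z[:, None]
    return z, y

z, y = simulate_occupancy(n_sites=1000, n_visits=4, psi=0.6, p=0.3, rng=rng)

# the naive estimate counts only sites with at least one detection, so it
# misses occupied-but-undetected sites: E[naive] = psi * (1 - (1-p)^J)
naive = y.any(axis=1).mean()
expected_naive = 0.6 * (1 - (1 - 0.3) ** 4)
```

The gap between `naive` and the true psi of 0.6 is exactly why occupancy models that estimate detection probability explicitly are needed.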

  1. Effect of ambient humidity on the rate at which blood spots dry and the size of the spot produced.

    PubMed

    Denniff, Philip; Woodford, Lynsey; Spooner, Neil

    2013-08-01

    For shipping and storage, dried blood spot (DBS) samples must be sufficiently dry to protect the integrity of the sample. When the blood is spotted, the humidity has the potential to affect the size of the spot created and the speed at which it dries. The areas of the DBS produced on three types of substrate were not affected by the humidity under which they were generated. DBS samples reached a steady moisture content 150 min after spotting, or 90 min at humidities below 60% relative humidity. All packaging materials examined provided some degree of protection from extreme external conditions. However, none of the packaging examined provided a total moisture barrier against extreme environmental conditions. Humidity was shown not to affect the spot area, and DBS samples were ready for shipping and storage 2 h after spotting. The packaging solutions examined all provided good protection from external high-humidity conditions.

  2. Linking models and data on vegetation structure

    NASA Astrophysics Data System (ADS)

    Hurtt, G. C.; Fisk, J.; Thomas, R. Q.; Dubayah, R.; Moorcroft, P. R.; Shugart, H. H.

    2010-06-01

    For more than a century, scientists have recognized the importance of vegetation structure in understanding forest dynamics. Now future satellite missions such as Deformation, Ecosystem Structure, and Dynamics of Ice (DESDynI) hold the potential to provide unprecedented global data on vegetation structure needed to reduce uncertainties in terrestrial carbon dynamics. Here, we briefly review the uses of data on vegetation structure in ecosystem models, develop and analyze theoretical models to quantify model-data requirements, and describe recent progress using a mechanistic modeling approach utilizing a formal scaling method and data on vegetation structure to improve model predictions. Generally, both limited sampling and coarse resolution averaging lead to model initialization error, which in turn is propagated in subsequent model prediction uncertainty and error. In cases with representative sampling, sufficient resolution, and linear dynamics, errors in initialization tend to compensate at larger spatial scales. However, with inadequate sampling, overly coarse resolution data or models, and nonlinear dynamics, errors in initialization lead to prediction error. A robust model-data framework will require both models and data on vegetation structure sufficient to resolve important environmental gradients and tree-level heterogeneity in forest structure globally.

  3. Estimation of pyrethroid pesticide intake using regression ...

    EPA Pesticide Factsheets

    Population-based estimates of pesticide intake are needed to characterize exposure for particular demographic groups based on their dietary behaviors. Regression modeling performed on measurements of selected pesticides in composited duplicate diet samples allowed (1) estimation of pesticide intakes for a defined demographic community, and (2) comparison of dietary pesticide intakes between the composite and individual samples. Extant databases were useful for assigning individual samples to composites, but they could not provide the breadth of information needed to ensure measurable levels in every composite. Composite sample measurements were found to be good predictors of pyrethroid pesticide levels in their individual sample constituents where sufficient measurements are available above the method detection limit. Statistical inference shows little evidence of differences between individual and composite measurements and suggests that regression modeling of food groups based on composite dietary samples may provide an effective tool for estimating dietary pesticide intake for a defined population. The research presented in the journal article will improve the community's ability to determine exposures through the dietary route with a less burdensome and costly method.
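The composite-to-individual regression idea above can be sketched with ordinary least squares on synthetic data. The numbers here are entirely synthetic; the EPA study's actual measurements are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(0)

# synthetic "composite" pyrethroid measurements (x) and the level in one
# constituent individual sample (y), assumed linearly related with noise
x = np.linspace(1.0, 10.0, 50)
y = 2.0 + 1.5 * x + rng.normal(0.0, 0.3, size=x.size)

# fit y = a + b*x by least squares (np.polyfit returns [slope, intercept])
b, a = np.polyfit(x, y, 1)

# predicted individual-sample level for a new composite measurement of 4.0
y_hat = a + b * 4.0
```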

  4. Laser capture microdissection of embryonic cells and preparation of RNA for microarray assays.

    PubMed

    Redmond, Latasha C; Pang, Christopher J; Dumur, Catherine; Haar, Jack L; Lloyd, Joyce A

    2014-01-01

    In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice-isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure(®) LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM.

  5. Laser Capture Microdissection of Embryonic Cells and Preparation of RNA for Microarray Assays

    PubMed Central

    Redmond, Latasha C.; Pang, Christopher J.; Dumur, Catherine; Haar, Jack L.; Lloyd, Joyce A.

    2014-01-01

    In order to compare the global gene expression profiles of different embryonic cell types, it is first necessary to isolate the specific cells of interest. The purpose of this chapter is to provide a step-by-step protocol to perform laser capture microdissection (LCM) on embryo samples and obtain sufficient amounts of high-quality RNA for microarray hybridizations. Using the LCM/microarray strategy on mouse embryo samples has some challenges, because the cells of interest are available in limited quantities. The first step in the protocol is to obtain embryonic tissue, and immediately cryoprotect and freeze it in a cryomold containing Optimal Cutting Temperature freezing media (Sakura Finetek), using a dry ice–isopentane bath. The tissue is then cryosectioned, and the microscope slides are processed to fix, stain, and dehydrate the cells. LCM is employed to isolate specific cell types from the slides, identified under the microscope by virtue of their morphology. Detailed protocols are provided for using the currently available ArcturusXT LCM instrument and CapSure® LCM Caps, to which the selected cells adhere upon laser capture. To maintain RNA integrity, upon removing a slide from the final processing step, or attaching the first cells on the LCM cap, LCM is completed within 20 min. The cells are then immediately recovered from the LCM cap using a denaturing solution that stabilizes RNA integrity. RNA is prepared using standard methods, modified for working with small samples. To ensure the validity of the microarray data, the quality of the RNA is assessed using the Agilent bioanalyzer. Only RNA that is of sufficient integrity and quantity is used to perform microarray assays. This chapter provides guidance regarding troubleshooting and optimization to obtain high-quality RNA from cells of limited availability, obtained from embryo samples by LCM. PMID:24318813

  6. A High-Precision Counter Using the DSP Technique

    DTIC Science & Technology

    2004-09-01

DSP is not good enough to process all the 1-second samples. The cache memory is also not sufficient to store all the sampling data. So we cut the...sampling number in a cycle is not good enough to achieve an accuracy less than 2×10⁻¹¹. For this reason, a correlation operation is performed for... not good enough to process all the 1-second samples. The cache memory is also not sufficient to store all the sampling data. We will solve this

  7. Transfer Learning for Class Imbalance Problems with Inadequate Data.

    PubMed

    Al-Stouhi, Samir; Reddy, Chandan K

    2016-07-01

A fundamental problem in data mining is to effectively build robust classifiers in the presence of skewed data distributions. Class imbalance classifiers are trained specifically for skewed distribution datasets. Existing methods assume an ample supply of training examples as a fundamental prerequisite for constructing an effective classifier. However, when sufficient data is not readily available, the development of a representative classification algorithm becomes even more difficult due to the unequal distribution between classes. We provide a unified framework that can take advantage of auxiliary data using a transfer learning mechanism while simultaneously building a robust classifier to tackle this imbalance issue in the presence of few training samples in a particular target domain of interest. Transfer learning methods use auxiliary data to augment learning when training examples are not sufficient, and in this paper we develop a method that is optimized to simultaneously augment the training data and induce balance into skewed datasets. We propose a novel boosting-based instance-transfer classifier with a label-dependent update mechanism that simultaneously compensates for class imbalance and incorporates samples from an auxiliary domain to improve classification. We provide theoretical and empirical validation of our method and apply it to healthcare and text classification applications.

  8. Low energy cyclotron for radiocarbon dating

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Welch, J.J.

    1984-12-01

The measurement of naturally occurring radioisotopes whose half-lives are less than a few hundred million years but more than a few years provides information about the temporal behavior of geologic and climatic processes and the temporal history of meteoritic bodies, as well as the production mechanisms of these radioisotopes. A new extremely sensitive technique for measuring these radioisotopes at tandem Van de Graaff and cyclotron facilities has been very successful, though the high cost and limited availability have been discouraging. We have built and tested a low energy cyclotron for radiocarbon dating similar in size to a conventional mass spectrometer. These tests clearly show that with the addition of a conventional ion source, the low energy cyclotron can perform the extremely high sensitivity ¹⁴C measurements that are now done at accelerator facilities. We found that no significant background is present when the cyclotron is tuned to accelerate ¹⁴C negative ions and the transmission efficiency is adequate to perform radiocarbon dating on milligram samples of carbon. The internal ion source used did not produce sufficient current to detect ¹⁴C directly at modern concentrations. We show how a conventional carbon negative ion source, located outside the cyclotron magnet, would produce sufficient beam and provide for quick sampling, making radiocarbon dating of milligram samples with a modest laboratory instrument feasible.

  9. Optical microsensor for continuous glucose measurements in interstitial fluid

    NASA Astrophysics Data System (ADS)

    Olesberg, Jonathon T.; Cao, Chuanshun; Yager, Jeffrey R.; Prineas, John P.; Coretsopoulos, Chris; Arnold, Mark A.; Olafsen, Linda J.; Santilli, Michael

    2006-02-01

Tight control of blood glucose levels has been shown to dramatically reduce the long-term complications of diabetes. Current invasive technology for monitoring glucose levels is effective but underutilized by people with diabetes because of the pain of repeated finger-sticks, the inconvenience of handling samples of blood, and the cost of reagent strips. A continuous glucose sensor coupled with an insulin delivery system could provide closed-loop glucose control without the need for discrete sampling or user intervention. We describe an optical glucose microsensor based on absorption spectroscopy in interstitial fluid that can potentially be implanted to provide continuous glucose readings. Light from a GaInAsSb LED in the 2.2-2.4 μm wavelength range is passed through a sample of interstitial fluid and a linear variable filter before being detected by an uncooled, 32-element GaInAsSb detector array. Spectral resolution is provided by the linear variable filter, which has a 10 nm band pass and a center wavelength that varies from 2.18-2.38 μm (4600-4200 cm⁻¹) over the length of the detector array. The sensor assembly is a monolithic design requiring no coupling optics. In the present system, the LED running with 100 mA of drive current delivers 20 nW of power to each of the detector pixels, which have a noise-equivalent power of 3 pW/√Hz. This is sufficient to provide a signal-to-noise ratio of 4500 √Hz under detector-noise limited conditions. This signal-to-noise ratio corresponds to a spectral noise level less than 10 μAU for a five-minute integration, which should be sufficient for sub-millimolar glucose detection.

  10. The Number of Patients and Events Required to Limit the Risk of Overestimation of Intervention Effects in Meta-Analysis—A Simulation Study

    PubMed Central

    Thorlund, Kristian; Imberger, Georgina; Walsh, Michael; Chu, Rong; Gluud, Christian; Wetterslev, Jørn; Guyatt, Gordon; Devereaux, Philip J.; Thabane, Lehana

    2011-01-01

    Background Meta-analyses including a limited number of patients and events are prone to yield overestimated intervention effect estimates. While many assume bias is the cause of overestimation, theoretical considerations suggest that random error may be an equal or more frequent cause. The independent impact of random error on meta-analyzed intervention effects has not previously been explored. It has been suggested that surpassing the optimal information size (i.e., the required meta-analysis sample size) provides sufficient protection against overestimation due to random error, but this claim has not yet been validated. Methods We simulated a comprehensive array of meta-analysis scenarios where no intervention effect existed (i.e., relative risk reduction (RRR) = 0%) or where a small but possibly unimportant effect existed (RRR = 10%). We constructed different scenarios by varying the control group risk, the degree of heterogeneity, and the distribution of trial sample sizes. For each scenario, we calculated the probability of observing overestimates of RRR>20% and RRR>30% for each cumulative 500 patients and 50 events. We calculated the cumulative number of patients and events required to reduce the probability of overestimation of intervention effect to 10%, 5%, and 1%. We calculated the optimal information size for each of the simulated scenarios and explored whether meta-analyses that surpassed their optimal information size had sufficient protection against overestimation of intervention effects due to random error. Results The risk of overestimation of intervention effects was usually high when the number of patients and events was small and this risk decreased exponentially over time as the number of patients and events increased. The number of patients and events required to limit the risk of overestimation depended considerably on the underlying simulation settings. 
Surpassing the optimal information size generally provided sufficient protection against overestimation. Conclusions Random errors are a frequent cause of overestimation of intervention effects in meta-analyses. Surpassing the optimal information size will provide sufficient protection against overestimation. PMID:22028777
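The simulation logic described above can be sketched in a few lines: under a true RRR of 0%, track how often the pooled RRR estimate exceeds 20% as patients accumulate. The block size, control risk, and replication count below are illustrative assumptions, not the paper's actual settings.

```python
import random

def simulate_overestimation(p_control=0.1, true_rrr=0.0, block=500,
                            n_blocks=10, reps=300, threshold=0.20, seed=1):
    """Probability that the pooled relative-risk-reduction (RRR) estimate
    exceeds `threshold` at each cumulative block of patients, when the
    true RRR is `true_rrr`."""
    rng = random.Random(seed)
    p_treat = p_control * (1 - true_rrr)
    exceed = [0] * n_blocks
    for _ in range(reps):
        ev_c = ev_t = n_per_arm = 0
        for b in range(n_blocks):
            half = block // 2                       # 1:1 allocation
            ev_c += sum(rng.random() < p_control for _ in range(half))
            ev_t += sum(rng.random() < p_treat for _ in range(half))
            n_per_arm += half
            if ev_c > 0:
                rrr_hat = 1 - (ev_t / n_per_arm) / (ev_c / n_per_arm)
                if rrr_hat > threshold:
                    exceed[b] += 1
    return [x / reps for x in exceed]
```

Running this reproduces the qualitative finding: the risk of a spurious ≥20% RRR is substantial with few patients and decays as the cumulative sample grows.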

  11. Analysis of the sensitivity of in vitro bioassays for androgenic, progestagenic, glucocorticoid, thyroid and estrogenic activity: Suitability for drinking and environmental waters.

    PubMed

    Leusch, Frederic D L; Neale, Peta A; Hebert, Armelle; Scheurer, Marco; Schriks, Merijn C M

    2017-02-01

    The presence of endocrine disrupting chemicals in the aquatic environment poses a risk for ecosystem health. Consequently there is a need for sensitive tools, such as in vitro bioassays, to monitor endocrine activity in environmental waters. The aim of the current study was to assess whether current in vitro bioassays are suitable to detect endocrine activity in a range of water types. The reviewed assays included androgenic (n=11), progestagenic (n=6), glucocorticoid (n=5), thyroid (n=5) and estrogenic (n=8) activity in both agonist and antagonist mode. Existing in vitro bioassay data were re-evaluated to determine assay sensitivity, with the calculated method detection limit compared with measured hormonal activity in treated wastewater, surface water and drinking water to quantify whether the studied assays were sufficiently sensitive for environmental samples. With typical sample enrichment, current in vitro bioassays are sufficiently sensitive to detect androgenic activity in treated wastewater and surface water, with anti-androgenic activity able to be detected in most environmental waters. Similarly, with sufficient enrichment, the studied mammalian assays are able to detect estrogenic activity even in drinking water samples. Fewer studies have focused on progestagenic and glucocorticoid activity, but some of the reviewed bioassays are suitable for detecting activity in treated wastewater and surface water. Even less is known about (anti)thyroid activity, but the available data suggests that the more sensitive reviewed bioassays are still unlikely to detect this type of activity in environmental waters. The findings of this review can help provide guidance on in vitro bioassay selection and required sample enrichment for optimised detection of endocrine activity in environmental waters. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Sample size planning for composite reliability coefficients: accuracy in parameter estimation via narrow confidence intervals.

    PubMed

    Terry, Leann; Kelley, Ken

    2012-11-01

    Composite measures play an important role in psychology and related disciplines. Composite measures almost always have error. Correspondingly, it is important to understand the reliability of the scores from any particular composite measure. However, the point estimates of the reliability of composite measures are fallible and thus all such point estimates should be accompanied by a confidence interval. When confidence intervals are wide, there is much uncertainty in the population value of the reliability coefficient. Given the importance of reporting confidence intervals for estimates of reliability, coupled with the undesirability of wide confidence intervals, we develop methods that allow researchers to plan sample size in order to obtain narrow confidence intervals for population reliability coefficients. We first discuss composite reliability coefficients and then provide a discussion on confidence interval formation for the corresponding population value. Using the accuracy in parameter estimation approach, we develop two methods to obtain accurate estimates of reliability by planning sample size. The first method provides a way to plan sample size so that the expected confidence interval width for the population reliability coefficient is sufficiently narrow. The second method ensures that the confidence interval width will be sufficiently narrow with some desired degree of assurance (e.g., 99% assurance that the 95% confidence interval for the population reliability coefficient will be less than W units wide). The effectiveness of our methods was verified with Monte Carlo simulation studies. We demonstrate how to easily implement the methods with easy-to-use and freely available software. ©2011 The British Psychological Society.
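Terry and Kelley's method targets composite reliability coefficients specifically; the underlying accuracy-in-parameter-estimation logic (choose the smallest n whose expected confidence-interval width is below a target) is easiest to see in the simplest normal-mean setting, sketched below under that simplifying assumption rather than as their actual procedure.

```python
import math
from statistics import NormalDist

def n_for_ci_width(sigma, omega, conf=0.95):
    """Smallest n so the two-sided CI for a normal mean (sigma known)
    has expected width at most omega -- the AIPE logic in miniature.
    CI width = 2 * z * sigma / sqrt(n), solved for n and rounded up."""
    z = NormalDist().inv_cdf(0.5 + conf / 2)
    return math.ceil((2 * z * sigma / omega) ** 2)
```

For example, halving the target width roughly quadruples the required n, which is why narrow intervals for reliability coefficients can demand large samples.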

  13. Method of evaluation of process of red blood cell sedimentation based on photometry of droplet samples.

    PubMed

    Aristov, Alexander; Nosova, Ekaterina

    2017-04-01

    The paper focuses on research aimed at creating and testing a new approach to evaluate the processes of aggregation and sedimentation of red blood cells for purpose of its use in clinical laboratory diagnostics. The proposed method is based on photometric analysis of blood sample formed as a sessile drop. The results of clinical approbation of this method are given in the paper. Analysis of the processes occurring in the sample in the form of sessile drop during the process of blood cells sedimentation is described. The results of experimental studies to evaluate the effect of the droplet sample focusing properties on light radiation transmittance are presented. It is shown that this method significantly reduces the sample volume and provides sufficiently high sensitivity to the studied processes.

  14. Revisiting sample size: are big trials the answer?

    PubMed

    Lurati Buse, Giovanna A L; Botto, Fernando; Devereaux, P J

    2012-07-18

    The superiority of the evidence generated in randomized controlled trials over observational data is not only conditional to randomization. Randomized controlled trials require proper design and implementation to provide a reliable effect estimate. Adequate random sequence generation, allocation implementation, analyses based on the intention-to-treat principle, and sufficient power are crucial to the quality of a randomized controlled trial. Power, or the probability of the trial to detect a difference when a real difference between treatments exists, strongly depends on sample size. The quality of orthopaedic randomized controlled trials is frequently threatened by a limited sample size. This paper reviews basic concepts and pitfalls in sample-size estimation and focuses on the importance of large trials in the generation of valid evidence.
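The dependence of power on sample size discussed above follows the standard two-proportion sample-size formula; a minimal sketch using the normal approximation with 1:1 allocation (not code from the paper):

```python
import math
from statistics import NormalDist

def n_per_arm(p1, p2, alpha=0.05, power=0.80):
    """Patients per arm to detect a difference between event
    proportions p1 and p2 (two-sided alpha, normal approximation)."""
    nd = NormalDist()
    z_a = nd.inv_cdf(1 - alpha / 2)
    z_b = nd.inv_cdf(power)
    pbar = (p1 + p2) / 2
    num = (z_a * math.sqrt(2 * pbar * (1 - pbar))
           + z_b * math.sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return math.ceil(num / (p1 - p2) ** 2)
```

Detecting a drop from 10% to 5% at 80% power already requires several hundred patients per arm, which illustrates why small orthopaedic trials are so often underpowered.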

  15. Influence of Sampling Effort on the Estimated Richness of Road-Killed Vertebrate Wildlife

    NASA Astrophysics Data System (ADS)

    Bager, Alex; da Rosa, Clarissa A.

    2011-05-01

Road-killed mammals, birds, and reptiles were collected weekly from highways in southern Brazil in 2002 and 2005. The objective was to assess variation in estimates of road-kill impacts on species richness produced by different sampling efforts, and to provide information to aid in the experimental design of future sampling. Richness observed in weekly samples was compared with sampling for different periods. In each period, the list of road-killed species was evaluated based on estimates of the community structure derived from the weekly samplings, and by the presence of the ten species most subject to road mortality, and also of threatened species. Weekly samples were sufficient only for reptiles and mammals, considered separately. Richness estimated from the biweekly samples was equal to that found in the weekly samples, and gave satisfactory results for sampling the most abundant and threatened species. The ten most affected species showed constant road-mortality rates, independent of sampling interval, and also maintained their dominance structure. Birds required greater sampling effort. When the composition of road-killed species varies seasonally, it is necessary to take biweekly samples for a minimum of one year. Weekly or more-frequent sampling for periods longer than two years is necessary to provide a reliable estimate of total species richness.

  16. Influence of sampling effort on the estimated richness of road-killed vertebrate wildlife.

    PubMed

    Bager, Alex; da Rosa, Clarissa A

    2011-05-01

Road-killed mammals, birds, and reptiles were collected weekly from highways in southern Brazil in 2002 and 2005. The objective was to assess variation in estimates of road-kill impacts on species richness produced by different sampling efforts, and to provide information to aid in the experimental design of future sampling. Richness observed in weekly samples was compared with sampling for different periods. In each period, the list of road-killed species was evaluated based on estimates of the community structure derived from the weekly samplings, and by the presence of the ten species most subject to road mortality, and also of threatened species. Weekly samples were sufficient only for reptiles and mammals, considered separately. Richness estimated from the biweekly samples was equal to that found in the weekly samples, and gave satisfactory results for sampling the most abundant and threatened species. The ten most affected species showed constant road-mortality rates, independent of sampling interval, and also maintained their dominance structure. Birds required greater sampling effort. When the composition of road-killed species varies seasonally, it is necessary to take biweekly samples for a minimum of one year. Weekly or more-frequent sampling for periods longer than two years is necessary to provide a reliable estimate of total species richness.
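The effect of sampling effort on observed richness can be illustrated with a species-accumulation curve: cumulative richness after k sampling occasions, averaged over random orderings of the occasions. The weekly species lists in the usage example are hypothetical, not the study's data.

```python
import random

def accumulation_curve(weekly_lists, reps=200, seed=0):
    """Mean cumulative species richness after k sampling occasions,
    averaged over `reps` random orderings of the occasions."""
    rng = random.Random(seed)
    k = len(weekly_lists)
    totals = [0.0] * k
    for _ in range(reps):
        order = list(weekly_lists)
        rng.shuffle(order)
        seen = set()
        for i, week in enumerate(order):
            seen.update(week)          # richness only ever grows
            totals[i] += len(seen)
    return [t / reps for t in totals]
```

Where the curve flattens indicates the effort beyond which extra sampling adds few new species, which is the question the study asks of its weekly versus biweekly schedules.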

  17. Method of quantitating dsDNA

    DOEpatents

    Stark, Peter C.; Kuske, Cheryl R.; Mullen, Kenneth I.

    2002-01-01

A method for quantitating dsDNA in an aqueous sample solution containing an unknown amount of dsDNA. A first aqueous test solution containing a known amount of a fluorescent dye-dsDNA complex and at least one fluorescence-attenuating contaminant is prepared. The fluorescence intensity of the test solution is measured. The first test solution is diluted by a known amount to provide a second test solution having a known concentration of dsDNA. The fluorescence intensity of the second test solution is measured. Additional diluted test solutions are similarly prepared until a sufficiently dilute test solution having a known amount of dsDNA is prepared that has a fluorescence intensity that is not attenuated upon further dilution. The value of the maximum absorbance of this solution between 200-900 nanometers (nm), referred to herein as the threshold absorbance, is measured. A sample solution having an unknown amount of dsDNA and an absorbance identical to that of the sufficiently dilute test solution at the same chosen wavelength is prepared. Dye is then added to the sample solution to form the fluorescent dye-dsDNA complex, after which the fluorescence intensity of the sample solution is measured and the quantity of dsDNA in the sample solution is determined. Once the threshold absorbance of a sample solution obtained from a particular environment has been determined, any similarly prepared sample solution taken from a similar environment and having the same value for the threshold absorbance can be quantified for dsDNA by adding a large excess of dye to the sample solution and measuring its fluorescence intensity.

  18. 49 CFR 40.263 - What happens when an employee is unable to provide a sufficient amount of saliva for an alcohol...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... sufficient amount of saliva for an alcohol screening test? (a) As the STT, you must take the following steps if an employee is unable to provide sufficient saliva to complete a test on a saliva screening device (e.g., the employee does not provide sufficient saliva to activate the device). (1) You must conduct...

  19. Centralized and decentralized global outer-synchronization of asymmetric recurrent time-varying neural network by data-sampling.

    PubMed

    Lu, Wenlian; Zheng, Ren; Chen, Tianping

    2016-03-01

    In this paper, we discuss outer-synchronization of the asymmetrically connected recurrent time-varying neural networks. By using both centralized and decentralized discretization data sampling principles, we derive several sufficient conditions based on three vector norms to guarantee that the difference of any two trajectories starting from different initial values of the neural network converges to zero. The lower bounds of the common time intervals between data samples in centralized and decentralized principles are proved to be positive, which guarantees exclusion of Zeno behavior. A numerical example is provided to illustrate the efficiency of the theoretical results. Copyright © 2015 Elsevier Ltd. All rights reserved.

  20. Note: A pure-sampling quantum Monte Carlo algorithm with independent Metropolis.

    PubMed

    Vrbik, Jan; Ospadov, Egor; Rothstein, Stuart M

    2016-07-14

    Recently, Ospadov and Rothstein published a pure-sampling quantum Monte Carlo algorithm (PSQMC) that features an auxiliary Path Z that connects the midpoints of the current and proposed Paths X and Y, respectively. When sufficiently long, Path Z provides statistical independence of Paths X and Y. Under those conditions, the Metropolis decision used in PSQMC is done without any approximation, i.e., not requiring microscopic reversibility and without having to introduce any G(x → x'; τ) factors into its decision function. This is a unique feature that contrasts with all competing reptation algorithms in the literature. An example illustrates that dependence of Paths X and Y has adverse consequences for pure sampling.

  1. Note: A pure-sampling quantum Monte Carlo algorithm with independent Metropolis

    NASA Astrophysics Data System (ADS)

    Vrbik, Jan; Ospadov, Egor; Rothstein, Stuart M.

    2016-07-01

    Recently, Ospadov and Rothstein published a pure-sampling quantum Monte Carlo algorithm (PSQMC) that features an auxiliary Path Z that connects the midpoints of the current and proposed Paths X and Y, respectively. When sufficiently long, Path Z provides statistical independence of Paths X and Y. Under those conditions, the Metropolis decision used in PSQMC is done without any approximation, i.e., not requiring microscopic reversibility and without having to introduce any G(x → x'; τ) factors into its decision function. This is a unique feature that contrasts with all competing reptation algorithms in the literature. An example illustrates that dependence of Paths X and Y has adverse consequences for pure sampling.

  2. Use of self-collected capillary blood samples for islet autoantibody screening in relatives: a feasibility and acceptability study.

    PubMed

    Liu, Y; Rafkin, L E; Matheson, D; Henderson, C; Boulware, D; Besser, R E J; Ferrara, C; Yu, L; Steck, A K; Bingley, P J

    2017-07-01

    To evaluate the feasibility of using self-collected capillary blood samples for islet autoantibody testing to identify risk in relatives of people with Type 1 diabetes. Participants were recruited via the observational TrialNet Pathway to Prevention study, which screens and monitors relatives of people with Type 1 diabetes for islet autoantibodies. Relatives were sent kits for capillary blood collection, with written instructions, an online instructional video link and a questionnaire. Sera from capillary blood samples were tested for autoantibodies to glutamic acid decarboxylase, islet antigen-2, insulin and zinc transporter 8. 'Successful' sample collection was defined as obtaining sufficient volume and quality to provide definitive autoantibody results, including confirmation of positive results by repeat assay. In 240 relatives who returned samples, the median (range) age was 15.5 (1-49) years and 51% were male. Of these samples, 98% were sufficient for glutamic acid decarboxylase, islet antigen-2 and zinc transporter 8 autoantibody testing and 84% for insulin autoantibody testing and complete autoantibody screen. The upper 90% confidence bound for unsuccessful collection was 4.4% for glutamic acid decarboxylase, islet antigen-2 and/or zinc transporter 8 autoantibody assays, and 19.3% for insulin autoantibodies. Despite 43% of 220 questionnaire respondents finding capillary blood collection uncomfortable or painful, 82% preferred home self-collection of capillary blood samples compared with outpatient venepuncture (90% of those aged <8 years, 83% of those aged 9-18 years and 73% of those aged >18 years). The perceived difficulty of collecting capillary blood samples did not affect success rate. Self-collected capillary blood sampling offers a feasible alternative to venous sampling, with the potential to facilitate autoantibody screening for Type 1 diabetes risk. © 2017 Diabetes UK.

  3. Ground truth crop proportion summaries for US segments, 1976-1979

    NASA Technical Reports Server (NTRS)

    Horvath, R. (Principal Investigator); Rice, D.; Wessling, T.

    1981-01-01

The original ground truth data were collected, digitized, and registered to LANDSAT data for use in the LACIE and AgRISTARS projects. The numerous ground truth categories were consolidated into fewer classes of crops or crop conditions, and occurrences of these classes were counted for each segment. Tables are presented in which the individual entries are the percentage of total segment area assigned to a given class. The ground truth summaries were prepared from a 20% sample of the scene. An analysis indicates that a sample of this size provides sufficient accuracy for use of the data in initial segment screening.

  4. Multiwavelength absorbance of filter deposits for determination of environmental tobacco smoke and black carbon

    NASA Astrophysics Data System (ADS)

    Lawless, Phil A.; Rodes, Charles E.; Ensor, David S.

    A multiwavelength optical absorption technique has been developed for Teflon filters used for personal exposure sampling with sufficient sensitivity to allow apportionments of environmental tobacco smoke and soot (black) carbon to be made. Measurements on blank filters show that the filter material itself contributes relatively little to the total absorbance and filters from the same lot have similar characteristics; this makes retrospective analysis of filters quite feasible. Using an integrating sphere radiometer and multiple wavelengths to provide specificity, the determination of tobacco smoke and carbon with reasonable accuracy is possible on filters not characterized before exposure. This technique provides a low cost, non-destructive exposure assessment alternative to both standard thermo-gravimetric elemental carbon evaluations on quartz filters and cotinine analyses from urine or saliva samples. The method allows the same sample filter to be used for assessment of mass, carbon, and tobacco smoke without affecting the deposit.
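The apportionment step, splitting a measured absorbance spectrum into tobacco-smoke and black-carbon contributions given reference spectra for each, amounts to a two-component least-squares unmixing. A minimal sketch with made-up spectra (the paper's actual calibration spectra and wavelengths are not reproduced here):

```python
def apportion(abs_meas, spec_ets, spec_bc):
    """Least-squares coefficients (a, b) minimizing
    ||a*spec_ets + b*spec_bc - abs_meas||^2 over the wavelengths,
    solved via the 2x2 normal equations."""
    see = sum(e * e for e in spec_ets)
    scc = sum(c * c for c in spec_bc)
    sec = sum(e * c for e, c in zip(spec_ets, spec_bc))
    sem = sum(e * m for e, m in zip(spec_ets, abs_meas))
    scm = sum(c * m for c, m in zip(spec_bc, abs_meas))
    det = see * scc - sec * sec        # nonzero if spectra are independent
    a = (sem * scc - scm * sec) / det
    b = (scm * see - sem * sec) / det
    return a, b
```

The multiple wavelengths are what make the split identifiable: with spectrally distinct reference shapes, the normal equations have a unique solution.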

  5. A sampling strategy for promoting and assessing medical student retention of physical examination skills.

    PubMed

    Williams, Reed G; Klamen, Debra L; Mayer, David; Valaski, Maureen; Roberts, Nicole K

    2007-10-01

    Skill acquisition and maintenance requires spaced deliberate practice. Assessing medical students' physical examination performance ability is resource intensive. The authors assessed the nature and size of physical examination performance samples necessary to accurately estimate total physical examination skill. Physical examination assessment data were analyzed from second year students at the University of Illinois College of Medicine at Chicago in 2002, 2003, and 2004 (N = 548). Scores on subgroups of physical exam maneuvers were compared with scores on the total physical exam, to identify sound predictors of total test performance. Five exam subcomponents were sufficiently correlated to overall test performance and provided adequate sensitivity and specificity to serve as a means to prompt continued student review and rehearsal of physical examination technical skills. Selection and administration of samples of the total physical exam provide a resource-saving approach for promoting and estimating overall physical examination skills retention.

  6. Staple Food Self-Sufficiency of Farmers Household Level in The Great Solo

    NASA Astrophysics Data System (ADS)

    Darsono

    2017-04-01

Analysis of food security at the household level is a novel measurement approach; standard measures usually cover only the regional and national levels. The household approach is expected to provide a basis for sharper food policy formulation. The purposes of this study are to identify the condition of staple food self-sufficiency, and to find the main factors affecting the dynamics of staple food self-sufficiency, at the farm household level in Great Solo. The study uses primary data from a sample of 50 farmers and secondary data for Great Solo (Surakarta city, Boyolali, Sukoharjo, Karanganyar, Wonogiri, Sragen and Klaten). The compiled panel data were analyzed with linear probability regression models. The results showed that farm households in Great Solo have a surplus of staple food (rice), with an average consumption rate of 96.8 kg/capita/year. This figure is lower than the national rate of 136.7 kg/capita/year. The main factors affecting the level of food self-sufficiency at the farm household level are rice production, rice consumption, land tenure, and number of family members. Key recommendations from this study are to increase the scale of land cultivated for rice farming and to diversify consumption beyond rice.

  7. Architectural design of an Algol interpreter

    NASA Technical Reports Server (NTRS)

    Jackson, C. K.

    1971-01-01

    The design of a syntax-directed interpreter for a subset of Algol is described. It is a conceptual design with sufficient details and completeness but as much independence of implementation as possible. The design includes a detailed description of a scanner, an analyzer described in the Floyd-Evans productions, a hash-coded symbol table, and an executor. Interpretation of sample programs is also provided to show how the interpreter functions.

  8. The endothelial sample size analysis in corneal specular microscopy clinical examinations.

    PubMed

    Abib, Fernando C; Holzchuh, Ricardo; Schaefer, Artur; Schaefer, Tania; Godois, Ronialci

    2012-05-01

    To evaluate endothelial cell sample size and statistical error in corneal specular microscopy (CSM) examinations. One hundred twenty examinations were conducted with 4 types of corneal specular microscopes: 30 each with the Bio-Optics, CSO, Konan, and Topcon instruments. All endothelial image data were analyzed by the respective instrument software and also by the Cells Analyzer software, with a method developed in our lab. A reliability degree (RD) of 95% and a relative error (RE) of 0.05 were used as cut-off values to analyze the images of counted endothelial cells, called samples. The sample size mean was the number of cells evaluated on the images obtained with each device. Only examinations with RE < 0.05 were considered statistically correct and suitable for comparison with future examinations. The Cells Analyzer software was used to calculate the RE and a customized sample size for all examinations. Bio-Optics: sample size, 97 ± 22 cells; RE, 6.52 ± 0.86; only 10% of the examinations had sufficient endothelial cell quantity (RE < 0.05); customized sample size, 162 ± 34 cells. CSO: sample size, 110 ± 20 cells; RE, 5.98 ± 0.98; only 16.6% of the examinations had sufficient endothelial cell quantity (RE < 0.05); customized sample size, 157 ± 45 cells. Konan: sample size, 80 ± 27 cells; RE, 10.6 ± 3.67; none of the examinations had sufficient endothelial cell quantity (RE > 0.05); customized sample size, 336 ± 131 cells. Topcon: sample size, 87 ± 17 cells; RE, 10.1 ± 2.52; none of the examinations had sufficient endothelial cell quantity (RE > 0.05); customized sample size, 382 ± 159 cells. A very high number of CSM examinations had sampling errors according to the Cells Analyzer software. The endothelial sample (the cells counted per examination) needs to include more cells to be reliable and reproducible. The Cells Analyzer tutorial routine will be useful for CSM examination reliability and reproducibility.
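
    The reliability/relative-error cut-offs above imply a minimum cell count for a given variability. A sketch using the standard normal-approximation sample-size formula; the Cells Analyzer's exact method is not specified in the abstract, and the coefficient of variation used here is an illustrative assumption:

```python
import math
import statistics

def required_cells(cv, rel_error=0.05, reliability=0.95):
    """Minimum cells to count so the mean estimate's relative error stays
    within rel_error at the given reliability (normal approximation):
    n = (z * CV / RE)^2, where z is the two-sided normal quantile."""
    z = statistics.NormalDist().inv_cdf(1 - (1 - reliability) / 2)
    return math.ceil((z * cv / rel_error) ** 2)

# Illustrative CV of 0.3 for endothelial cell measurements
print(required_cells(0.3))  # 139 cells at RD 95%, RE 0.05
```

Halving the tolerated relative error roughly quadruples the required count, which is consistent with the customized sample sizes in the abstract being several times larger than the cells actually counted.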

  9. Design and Feasibility Assessment of a Retrospective Epidemiological Study of Coal-Fired Power Plant Emissions in the Pittsburgh Pennsylvania Region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Richard A. Bilonick; Daniel Connell; Evelyn Talbott

    2006-12-20

    Eighty-nine (89) percent of the electricity supplied in the 35-county Pittsburgh region (comprising parts of the states of Pennsylvania, Ohio, West Virginia, and Maryland) is generated by coal-fired power plants, making this an ideal region in which to study the effects of the fine airborne particulates designated as PM{sub 2.5} emitted by the combustion of coal. This report demonstrates that during the period from 1999-2006 (1) sufficient and extensive exposure data, in particular samples of speciated PM{sub 2.5} components from 1999 to 2003, and including gaseous co-pollutants and weather, have been collected, (2) sufficient and extensive mortality, morbidity, and related health outcomes data are readily available, and (3) the relationship between health effects and fine particulates can most likely be satisfactorily characterized using a combination of sophisticated statistical methodologies including latent variable modeling (LVM) and generalized linear autoregressive moving average (GLARMA) time series analysis. This report provides detailed information on the available exposure data and the available health outcomes data for the construction of a comprehensive database suitable for analysis, illustrates the application of various statistical methods to characterize the relationship between health effects and exposure, and provides a road map for conducting the proposed study. In addition, a detailed work plan for conducting the study is provided and includes a list of tasks and an estimated budget. A substantial portion of the total study cost is attributed to the cost of analyzing a large number of archived PM{sub 2.5} filters. Analysis of a representative sample of the filters supports the reliability of this invaluable but as-yet untapped resource. These filters hold the key to having sufficient data on the components of PM{sub 2.5} but have a limited shelf life. If the archived filters are not analyzed promptly, the important and costly information they contain will be lost.

  10. Bayesian geostatistics in health cartography: the perspective of malaria.

    PubMed

    Patil, Anand P; Gething, Peter W; Piel, Frédéric B; Hay, Simon I

    2011-06-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision.
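
    Converting a sample of candidate maps into a regional prediction with uncertainty, as described above, can be sketched as follows; the "posterior" maps here are random stand-ins, not output of a fitted BG model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a BG posterior: 500 candidate prevalence maps on a 20x20
# grid (real samples would come from an MCMC fit to survey data)
maps = rng.beta(2, 5, size=(500, 20, 20))

# A "region" is a boolean mask over the grid
region = np.zeros((20, 20), dtype=bool)
region[5:15, 5:15] = True

# One regional-average prevalence per sampled map...
regional = maps[:, region].mean(axis=1)

# ...summarised as a point prediction with a 95% credible interval,
# so the spread across maps carries through to predictive precision
pred = regional.mean()
lo, hi = np.percentile(regional, [2.5, 97.5])
print(round(float(pred), 3), round(float(lo), 3), round(float(hi), 3))
```

Any other feature of the unknown map (a maximum, an exceedance probability) is handled the same way: compute it per sampled map, then summarise across the sample.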

  11. Bayesian geostatistics in health cartography: the perspective of malaria

    PubMed Central

    Patil, Anand P.; Gething, Peter W.; Piel, Frédéric B.; Hay, Simon I.

    2011-01-01

    Maps of parasite prevalences and other aspects of infectious diseases that vary in space are widely used in parasitology. However, spatial parasitological datasets rarely, if ever, have sufficient coverage to allow exact determination of such maps. Bayesian geostatistics (BG) is a method for finding a large sample of maps that can explain a dataset, in which maps that do a better job of explaining the data are more likely to be represented. This sample represents the knowledge that the analyst has gained from the data about the unknown true map. BG provides a conceptually simple way to convert these samples to predictions of features of the unknown map, for example regional averages. These predictions account for each map in the sample, yielding an appropriate level of predictive precision. PMID:21420361

  12. Gas chromatography/matrix-isolation apparatus

    DOEpatents

    Reedy, G.T.

    1986-06-10

    A gas-sample collection device provides for matrix isolation of individual gas bands from a gas chromatographic separation and for the spectroscopic analysis of the individual sample bands. The device includes a vacuum chamber containing a rotatably supported, specular carousel having at least one reflecting surface for holding a sample deposited thereon. A gas inlet is provided for depositing a mixture of sample and matrix material on the reflecting surface, which is maintained at a sufficiently low temperature to cause solidification. A first parabolic mirror directs an incident beam of electromagnetic radiation, such as in the infrared (IR) spectrum, from a source onto the sample/matrix mixture, while a second parabolic mirror directs a second beam of electromagnetic radiation reflected by the specular surface to an IR spectrometer for determining the absorption spectra of the sample material deposited on the reflecting surface. The pair of off-axis parabolic mirrors, having a common focal point, are positioned outside of the vacuum chamber and may be displaced in combination for improved beam positioning and alignment. The carousel is provided with an aperture for each reflecting surface to facilitate accurate positioning of the incident beam relative to the gas samples under analysis. Improved gas-sample deposition is ensured by the use of a long-focal-length stereomicroscope positioned outside of the vacuum chamber for monitoring sample formation through a window, while the sample collector is positioned outside of the zone bounded by the incident and reflected electromagnetic beams for improved sample access and monitoring. 10 figs.

  13. Gas chromatography/matrix-isolation apparatus

    DOEpatents

    Reedy, Gerald T.

    1986-01-01

    A gas-sample collection device provides for matrix isolation of individual gas bands from a gas chromatographic separation and for the spectroscopic analysis of the individual sample bands. The device includes a vacuum chamber containing a rotatably supported, specular carousel having at least one reflecting surface for holding a sample deposited thereon. A gas inlet is provided for depositing a mixture of sample and matrix material on the reflecting surface, which is maintained at a sufficiently low temperature to cause solidification. A first parabolic mirror directs an incident beam of electromagnetic radiation, such as in the infrared (IR) spectrum, from a source onto the sample/matrix mixture, while a second parabolic mirror directs a second beam of electromagnetic radiation reflected by the specular surface to an IR spectrometer for determining the absorption spectra of the sample material deposited on the reflecting surface. The pair of off-axis parabolic mirrors, having a common focal point, are positioned outside of the vacuum chamber and may be displaced in combination for improved beam positioning and alignment. The carousel is provided with an aperture for each reflecting surface to facilitate accurate positioning of the incident beam relative to the gas samples under analysis. Improved gas-sample deposition is ensured by the use of a long-focal-length stereomicroscope positioned outside of the vacuum chamber for monitoring sample formation through a window, while the sample collector is positioned outside of the zone bounded by the incident and reflected electromagnetic beams for improved sample access and monitoring.

  14. Finding the Fertile Phase: Low-Cost Luteinizing Hormone Sticks Versus Electronic Fertility Monitor.

    PubMed

    Barron, Mary Lee; Vanderkolk, Kaitlin; Raviele, Kathleen

    To investigate whether generic Wondfo ovulation sticks (WLH) are sufficiently sensitive to the luteinizing hormone (LH) surge in urine when used with the Marquette Fertility Algorithm. The electronic hormonal fertility monitor (EHFM) is highly accurate in detecting the LH surge, but the cost of the monitor and the accompanying test sticks has increased over the last several years. The EHFM detects the LH surge at 20 milli-international units per milliliter (mIU/mL); the WLH sticks are slightly less sensitive at 25 mIU/mL. A convenience sample of women using the Marquette Method of Natural Family Planning with the EHFM to avoid pregnancy was recruited (N = 54). Each participant used the EHFM every morning after cycle day 6 and tested morning and evening urine with the WLH stick until the day following detection of the LH surge on the EHFM. Forty-two women provided 219 cycles. The frequency of LH surge detection was 182/219 (83.1%) for the EHFM and 203/219 (92.7%) for the WLH sticks. Agreement between the EHFM and the WLH sticks on the day of the LH surge was 97.7%. High-fertility readings giving warning of peak fertility at least 5 days before peak occurred in 67% of cycles for the WLH sticks versus 47.7% for the EHFM. The paired-sample correlation for high fertility was .174 (p = .014) and the paired-samples t was -4.729 (p < .001). The WLH stick is sufficiently sensitive to use in place of the EHFM for determining peak fertility with the Marquette Fertility Algorithm. Even with minimal use, WLH sticks cost about half the price of the monitor strips and provide more flexibility of use. Cost differences increase with the number of sticks used per cycle. Further research with a larger sample is needed to verify the results.
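
    The headline detection frequencies follow directly from the counts reported in the abstract:

```python
# Counts from the abstract: 42 women contributed 219 cycles
cycles = 219
ehfm_surges = 182   # LH surge detected by the electronic monitor
wlh_surges = 203    # LH surge detected by the Wondfo sticks

# Detection frequency as a percentage of cycles, to one decimal place
ehfm_rate = round(100 * ehfm_surges / cycles, 1)
wlh_rate = round(100 * wlh_surges / cycles, 1)
print(ehfm_rate, wlh_rate)  # 83.1 92.7, matching the reported percentages
```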

  15. A Pilot Study Using Mixed GPS/Narrative Interview Methods to Understand Geospatial Behavior in Homeless Populations.

    PubMed

    North, Carol S; Wohlford, Sarah E; Dean, Denis J; Black, Melissa; Balfour, Margaret E; Petrovich, James C; Downs, Dana L; Pollio, David E

    2017-08-01

    Tracking the movements of homeless populations presents methodological difficulties, but understanding their movements in space and time is needed to inform optimal placement of services. This pilot study developed, tested, and refined methods that apply global positioning system (GPS) technology paired with individual narratives to chronicle the movements of homeless populations. Details of methods development, difficulties encountered and addressed, and geospatial findings are provided. A pilot sample of 29 adults was recruited from a low-demand homeless shelter in the downtown area of Fort Worth, Texas. Pre- and post-deployment interviews provided participant characteristics and planned and retrospectively reported travels. Only one of the first eight deployments returned with sufficient usable data. Ultimately 19 participants returned the GPS device with >20 h of usable data. Protocol adjustments addressing methodological difficulties resulted in 81% of subsequent participants returning with sufficient usable data. This study established methods and demonstrated feasibility for tracking homeless population travels.

  16. Construct Validity and Reliability of the Questionnaire on the Quality of Physician-Patient Interaction in Adults With Hypertension.

    PubMed

    Hickman, Ronald L; Clochesy, John M; Hetland, Breanna; Alaamri, Marym

    2017-04-01

    There are limited reliable and valid measures of the patient-provider interaction among adults with hypertension. Therefore, the purpose of this report is to describe the construct validity and reliability of the Questionnaire on the Quality of Physician-Patient Interaction (QQPPI) in community-dwelling adults with hypertension. A convenience sample of 109 participants with hypertension was recruited and administered the QQPPI at baseline and 8 weeks later. An exploratory factor analysis established that a 12-item, 2-factor structure for the QQPPI was valid in this sample. The modified QQPPI proved to have sufficient internal consistency and test-retest reliability. The modified QQPPI is a valid and reliable measure of the provider-patient interaction, a construct posited to impact self-management, in adults with hypertension.

  17. Tank 241-AZ-102 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    RASMUSSEN, J.H.

    1999-08-02

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AZ-102. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AZ-102 required to satisfy the Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase 1: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High Level Waste Feed Data Quality Objectives (L&H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). The Tank Characterization Technical Sampling Basis document (Brown et al. 1998) indicates that these issues, except the Equipment DQO, apply to tank 241-AZ-102 for this sampling event. The Equipment DQO is applied for shear strength measurements of the solids segments only. Poppiti (1999) requires additional americium-241 analyses of the sludge segments. Brown et al. (1998) also identify safety screening, regulatory issues, and provision of samples to the Privatization Contractor(s) as applicable issues for this tank. However, these issues will not be addressed via this sampling event. Reynolds et al. (1999) concluded that information from previous sampling events was sufficient to satisfy the safety screening requirements for tank 241-AZ-102. Push mode core samples will be obtained from risers 15C and 24A to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples, composite the liquids and solids, perform chemical analyses, and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AZ-102 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plan.

  18. Temporal and spatial resolution required for imaging myocardial function

    NASA Astrophysics Data System (ADS)

    Eusemann, Christian D.; Robb, Richard A.

    2004-05-01

    4-D functional analysis of myocardial mechanics is an area of significant interest and research in cardiology and vascular/interventional radiology. Current multidimensional analysis is limited by the insufficient temporal resolution of x-ray and magnetic resonance based techniques, but recent improvements in system design hold promise for faster, higher-resolution scans that improve images of moving structures and allow more accurate functional studies, such as of the heart. This paper provides a basis for the requisite temporal and spatial resolution for useful imaging during individual segments of the cardiac cycle. Multiple sample rates during systole and diastole are compared to determine an adequate sample frequency to reduce regional myocardial tracking errors. Concurrently, out-of-plane resolution has to be sufficiently high to minimize the partial volume effect. Temporal resolution and out-of-plane spatial resolution are related factors that must be considered together. The data used for this study are a DSR dynamic volume image dataset with high temporal and spatial resolution, using implanted fiducial markers to track myocardial motion. The results of this study suggest a reduced exposure and scan time for x-ray and magnetic resonance imaging methods, since a lower sample rate during systole is sufficient, whereas the period of rapid filling during diastole requires higher sampling. This could potentially reduce the cost of these procedures and allow higher patient throughput.

  19. Surveying multiple health professional team members within institutional settings: an example from the nursing home industry.

    PubMed

    Clark, Melissa A; Roman, Anthony; Rogers, Michelle L; Tyler, Denise A; Mor, Vincent

    2014-09-01

    Quality improvement and cost containment initiatives in health care increasingly involve interdisciplinary teams of providers. To understand organizational functioning, information is often needed from multiple members of a leadership team since no one person may have sufficient knowledge of all aspects of the organization. To minimize survey burden, it is ideal to ask unique questions of each member of the leadership team in areas of their expertise. However, this risks substantial missing data if all eligible members of the organization do not respond to the survey. Nursing home administrators (NHA) and directors of nursing (DoN) play important roles in the leadership of long-term care facilities. Surveys were administered to NHAs and DoNs from a random, nationally representative sample of U.S. nursing homes about the impact of state policies, market forces, and organizational factors that impact provider performance and residents' outcomes. Responses were obtained from a total of 2,686 facilities (response rate [RR] = 66.6%) in which at least one individual completed the questionnaire and 1,693 facilities (RR = 42.0%) in which both providers participated. No evidence of nonresponse bias was detected. A high-quality representative sample of two providers in a long-term care facility can be obtained. It is possible to optimize data collection by obtaining unique information about the organization from each provider while minimizing the number of items asked of each individual. However, sufficient resources must be available for follow-up to nonresponders with particular attention paid to lower resourced, lower quality facilities caring for higher acuity residents in highly competitive nursing home markets. © The Author(s) 2014.

  20. Surveying Multiple Health Professional Team Members within Institutional Settings: An Example from the Nursing Home Industry

    PubMed Central

    Clark, Melissa A.; Roman, Anthony; Rogers, Michelle L.; Tyler, Denise A.; Mor, Vincent

    2015-01-01

    Quality improvement and cost containment initiatives in health care increasingly involve interdisciplinary teams of providers. To understand organizational functioning, information is often needed from multiple members of a leadership team since no one person may have sufficient knowledge of all aspects of the organization. To minimize survey burden, it is ideal to ask unique questions of each member of the leadership team in areas of their expertise. However, this risks substantial missing data if all eligible members of the organization do not respond to the survey. Nursing Home Administrators (NHA) and Directors of Nursing (DoN) play important roles in the leadership of long-term care facilities. Surveys were administered to NHAs and DoNs from a random, nationally-representative sample of U.S. nursing homes about the impact of state policies, market forces, and organizational factors that impact provider performance and residents’ outcomes. Responses were obtained from a total of 2686 facilities [Response Rate=66.6%] in which at least one individual completed the questionnaire and 1693 facilities [Response Rate=42.0%] in which both providers participated. No evidence of non-response bias was detected. A high-quality representative sample of two providers in a long-term care facility can be obtained. It is possible to optimize data collection by obtaining unique information about the organization from each provider while minimizing the number of items asked of each individual. However, sufficient resources must be available for follow-up to non-responders with particular attention paid to lower resourced, lower quality facilities caring for higher acuity residents in highly competitive nursing home markets. PMID:24500999

  1. Detecting higher-order wavefront errors with an astigmatic hybrid wavefront sensor.

    PubMed

    Barwick, Shane

    2009-06-01

    The reconstruction of wavefront errors from measurements over subapertures can be made more accurate if a fully characterized quadratic surface can be fitted to the local wavefront surface. An astigmatic hybrid wavefront sensor with added neural network postprocessing is shown to have this capability, provided that the focal image of each subaperture is sufficiently sampled. Furthermore, complete local curvature information is obtained with a single image without splitting beam power.
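
    Fitting a fully characterized quadratic surface to sampled wavefront values over a subaperture can be sketched as an ordinary least-squares fit; the sensor model and the neural-network postprocessing of the paper are not reproduced, and all coefficients here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Known quadratic wavefront over one subaperture:
# w(x, y) = a*x^2 + b*x*y + c*y^2 + d*x + e*y + f
true = np.array([0.8, -0.3, 0.5, 0.1, -0.2, 0.05])

# Sample the subaperture on a grid (the "sufficiently sampled" condition)
x, y = np.meshgrid(np.linspace(-1, 1, 9), np.linspace(-1, 1, 9))
x, y = x.ravel(), y.ravel()
A = np.column_stack([x**2, x * y, y**2, x, y, np.ones_like(x)])
w = A @ true + rng.normal(0, 1e-4, x.size)  # small measurement noise

# Least-squares fit of the six quadratic coefficients
coef, *_ = np.linalg.lstsq(A, w, rcond=None)

# Complete local curvature information from the quadratic terms
wxx, wxy, wyy = 2 * coef[0], coef[1], 2 * coef[2]
print(np.round(coef, 3))
```

The curvatures follow from the fitted quadratic terms alone, which is what lets a single sufficiently sampled subaperture image yield complete local curvature information.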

  2. Development of Automated Moment Tensor Software at the Prototype International Data Center

    DTIC Science & Technology

    2000-09-01

    Berkeley Digital Seismic Network stations in the 100 to 500 km distance range. With sufficient azimuthal coverage this method is found to perform...the solution reported by NIED (http://argent.geo.bosai.go.jp/freesia/event/hypo/joho.html). The normal mechanism obtained by the three-component...Digital Seismic Network stations. These stations provide more than 100 degrees of azimuthal coverage, which is an adequate sampling of the focal

  3. 3D Diffraction Microscope Provides a First Deep View

    NASA Astrophysics Data System (ADS)

    Miao, Jianwei

    2005-03-01

    When a coherent diffraction pattern is sampled at a spacing sufficiently finer than the Bragg peak frequency (i.e., the inverse of the sample size), the phase information is in principle encoded inside the diffraction pattern and can be directly retrieved by using an iterative process. By combining this oversampling phasing method with either coherent X-rays or electrons, a novel form of diffraction microscopy has recently been developed to image nanoscale materials and biological structures. In this talk, I will present the principle of the oversampling method, discuss the first experimental demonstration of this microscope, and illustrate some applications in nanoscience and biology.
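
    The oversampling condition can be illustrated numerically: embedding the object in a larger array samples the diffraction pattern finer than the inverse object size, and the inverse transform of the measured intensities then recovers the object's full autocorrelation (the extra information that makes phase retrieval possible). A sketch with assumed sizes:

```python
import numpy as np

n_obj, n_arr = 16, 64  # object extent vs. padded (detector) array size
obj = np.zeros((n_arr, n_arr))
obj[:n_obj, :n_obj] = np.random.default_rng(3).random((n_obj, n_obj))

# Diffraction intensities: padding the object samples the pattern
# n_arr/n_obj = 4x finer than the Bragg-peak spacing 1/(object size)
intensity = np.abs(np.fft.fft2(obj)) ** 2
sigma = n_arr / n_obj  # linear oversampling ratio per dimension

# Wiener-Khinchin: the inverse FFT of the intensities is the object's
# autocorrelation, whose support is twice the object's extent
autocorr = np.fft.ifft2(intensity).real
print(sigma)
```

Rows of the autocorrelation beyond twice the object extent are zero; an iterative algorithm exploits exactly this known empty region (the support constraint) to recover the missing phases.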

  4. Errors in Measuring Water Potentials of Small Samples Resulting from Water Adsorption by Thermocouple Psychrometer Chambers 1

    PubMed Central

    Bennett, Jerry M.; Cortes, Peter M.

    1985-01-01

    The adsorption of water by thermocouple psychrometer assemblies is known to cause errors in the determination of water potential. Experiments were conducted to evaluate the effect of sample size and psychrometer chamber volume on measured water potentials of leaf discs, leaf segments, and sodium chloride solutions. Reasonable agreement was found between soybean (Glycine max L. Merr.) leaf water potentials measured on 5-millimeter radius leaf discs and large leaf segments. Results indicated that while errors due to adsorption may be significant when using small volumes of tissue, if sufficient tissue is used the errors are negligible. Because of the relationship between water potential and volume in plant tissue, the errors due to adsorption were larger with turgid tissue. Large psychrometers which were sealed into the sample chamber with latex tubing appeared to adsorb more water than those sealed with flexible plastic tubing. Estimates are provided of the amounts of water adsorbed by two different psychrometer assemblies and the amount of tissue sufficient for accurate measurements of leaf water potential with these assemblies. It is also demonstrated that water adsorption problems may have generated low water potential values which in prior studies have been attributed to large cut surface area to volume ratios. PMID:16664367

  5. Errors in measuring water potentials of small samples resulting from water adsorption by thermocouple psychrometer chambers.

    PubMed

    Bennett, J M; Cortes, P M

    1985-09-01

    The adsorption of water by thermocouple psychrometer assemblies is known to cause errors in the determination of water potential. Experiments were conducted to evaluate the effect of sample size and psychrometer chamber volume on measured water potentials of leaf discs, leaf segments, and sodium chloride solutions. Reasonable agreement was found between soybean (Glycine max L. Merr.) leaf water potentials measured on 5-millimeter radius leaf discs and large leaf segments. Results indicated that while errors due to adsorption may be significant when using small volumes of tissue, if sufficient tissue is used the errors are negligible. Because of the relationship between water potential and volume in plant tissue, the errors due to adsorption were larger with turgid tissue. Large psychrometers which were sealed into the sample chamber with latex tubing appeared to adsorb more water than those sealed with flexible plastic tubing. Estimates are provided of the amounts of water adsorbed by two different psychrometer assemblies and the amount of tissue sufficient for accurate measurements of leaf water potential with these assemblies. It is also demonstrated that water adsorption problems may have generated low water potential values which in prior studies have been attributed to large cut surface area to volume ratios.

  6. DNA nanomechanics allows direct digital detection of complementary DNA and microRNA targets.

    PubMed

    Husale, Sudhir; Persson, Henrik H J; Sahin, Ozgur

    2009-12-24

    Techniques to detect and quantify DNA and RNA molecules in biological samples have had a central role in genomics research. Over the past decade, several techniques have been developed to improve detection performance and reduce the cost of genetic analysis. In particular, significant advances in label-free methods have been reported. Yet detection of DNA molecules at concentrations below the femtomolar level requires amplified detection schemes. Here we report a unique nanomechanical response of hybridized DNA and RNA molecules that serves as an intrinsic molecular label. Nanomechanical measurements on a microarray surface have sufficient background signal rejection to allow direct detection and counting of hybridized molecules. The digital response of the sensor provides a large dynamic range that is critical for gene expression profiling. We have measured differential expressions of microRNAs in tumour samples; such measurements have been shown to help discriminate between the tissue origins of metastatic tumours. Two hundred picograms of total RNA is found to be sufficient for this analysis. In addition, the limit of detection in pure samples is found to be one attomolar. These results suggest that nanomechanical read-out of microarrays promises attomolar-level sensitivity and large dynamic range for the analysis of gene expression, while eliminating biochemical manipulations, amplification and labelling.

  7. Use of buccal swabs for sampling DNA from nestling and adult birds

    USGS Publications Warehouse

    Handel, Colleen M.; Pajot, Lisa; Talbot, Sandra L.; Sage, George K.

    2006-01-01

    We evaluated the feasibility and efficiency of using swabs to collect buccal epithelial cells from small (2- to 13-g) birds as a source of DNA for genetic studies. We used commercially available buccal swab kits to collect samples from 42 adult and 39 nestling (4- to 8-day-old) black-capped chickadees (Poecile atricapillus) and from six 4-day-old nestling boreal chickadees (P. hudsonica). We compared DNA from buccal epithelial samples to that from blood samples from the same individuals. We extracted sufficient quantities of DNA for analysis from all buccal samples, and samples remained viable even after being stored in original plastic sampling tubes at room temperature for up to 18 months. Yields were equivalent whether extracted using the proprietary quick-extraction solution provided with buccal swab kits or using a salt-extraction process with inexpensive reagents. Yields of DNA from buccal samples were consistently lower than those from blood samples, but quantities were sufficient for all analyses. Assignment of sex, based on DNA extracted from paired buccal and blood samples, was identical for all 87 birds. We found no difference in the genotypes obtained from buccal and blood samples for 12 individuals tested using 5 microsatellite loci and found perfect concordance in sequencing of an 823-base-pair segment within the control region of mitochondrial DNA for 7 individuals tested. Use of buccal swabs is highly recommended as a rapid, noninvasive technique for sampling avian genomic DNA, especially for extremely young altricial nestlings or small-bodied adults, or for any birds for which blood sampling may be impossible or stressful.

  8. In situ diagnostics of the crystal-growth process through neutron imaging: application to scintillators

    DOE PAGES

    Tremsin, Anton S.; Makowska, Małgorzata G.; Perrodin, Didier; ...

    2016-04-12

    Neutrons are known to be unique probes in situations where other types of radiation fail to penetrate samples and their surrounding structures. In this paper it is demonstrated how thermal and cold neutron radiography can provide time-resolved imaging of materials while they are being processed (e.g. while growing single crystals). The processing equipment, in this case furnaces, and the scintillator materials are opaque to conventional X-ray interrogation techniques. The distribution of the europium activator within a BaBrCl:Eu scintillator (0.1 and 0.5% nominal doping concentrations per mole) is studied in situ during the melting and solidification processes with a temporal resolution of 5–7 s. The strong tendency of the Eu dopant to segregate during the solidification process is observed in repeated cycles, with Eu forming clusters on multiple length scales (only for clusters larger than ~50 µm, as limited by the resolution of the present experiments). It is also demonstrated that the dopant concentration can be quantified even for very low concentration levels (~0.1%) in 10 mm thick samples. The interface between the solid and liquid phases can also be imaged, provided there is a sufficient change in concentration of one of the elements with a sufficient neutron attenuation cross section. Tomographic imaging of the BaBrCl:0.1%Eu sample reveals a strong correlation between crystal fractures and Eu-deficient clusters. The results of these experiments demonstrate the unique capabilities of neutron imaging for in situ diagnostics and the optimization of crystal-growth procedures.

  9. Hydrology and Water Quality near Bromide Pavilion in Chickasaw National Recreation Area, Murray County, Oklahoma, 2000

    USGS Publications Warehouse

    Andrews, William J.; Burrough, Steven P.

    2002-01-01

    The Bromide Pavilion in Chickasaw National Recreation Area drew many thousands of people annually to drink the mineral-rich waters piped from nearby Bromide and Medicine Springs. Periodic detection of fecal coliform bacteria in water piped to the pavilion from the springs, low yields of the springs, or flooding by adjacent Rock Creek prompted National Park Service officials to discontinue piping of the springs to the pavilion in the 1970s. Park officials would like to resume piping mineralized spring water to the pavilion to restore it as a visitor attraction, but they are concerned about the ability of the springs to provide sufficient quantities of potable water. Pumping and sampling of Bromide and Medicine Springs and Rock Creek six times during 2000 indicate that these springs may not provide sufficient water for Bromide Pavilion to supply large numbers of visitors. A potential problem with piping water from Medicine Spring is the presence of an undercut, overhanging cliff composed of conglomerate, which may collapse. Evidence of intermittent inundation of the springs by Rock Creek and seepage of surface water into the spring vaults from the adjoining creek pose a threat of contamination of the springs. Escherichia coli, fecal coliform, and fecal streptococcal bacteria were detected in some samples from the springs, indicating possible fecal contamination. Cysts of Giardia lamblia and oocysts of Cryptosporidium parvum protozoa were not detected in the creek or the springs. Total culturable enteric viruses were detected in only one water sample taken from Rock Creek.

  10. Chemical quality and regulatory compliance of drinking water in Iceland.

    PubMed

    Gunnarsdottir, Maria J; Gardarsson, Sigurdur M; Jonsson, Gunnar St; Bartram, Jamie

    2016-11-01

    Assuring sufficient quality of drinking water is of great importance for public wellbeing and prosperity. Nations have developed regulatory systems with the aim of providing drinking water of sufficient quality and of minimizing the risk of contamination of the water supply in the first place. In this study the chemical quality of Icelandic drinking water was evaluated by systematically analyzing results from audit monitoring in which 53 parameters were assessed for 345 samples from 79 aquifers, serving 74 water supply systems. Compliance with the Icelandic Drinking Water Regulation (IDWR) was evaluated with regard to parametric values, minimum sampling requirements, and limit of detection. Water quality compliance was divided into health-related chemicals and indicators, and analyzed according to supply size. Samples from a few individual locations were benchmarked against natural background levels (NBLs) in order to identify potential pollution sources. The results show that compliance was 99.97% for health-related chemicals and 99.44% for indicator parameters, indicating that Icelandic groundwater abstracted for drinking water supply is generally of high quality with no expected health risks. In 10 of the 74 water supply systems tested there was an indication of anthropogenic chemical pollution, either at the source or in the network, and in another 6 water supplies there was a need to improve the water intake to prevent surface water intrusion. Benchmarking against the NBLs proved useful in tracing potential pollution sources, providing a tool for identifying pollution at an early stage.

  11. Sequim Marine Research Laboratory routine environmental measurements during CY-1977

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fix, J.J.; Blumer, P.J.

    1978-06-01

    Beginning in 1976, a routine environmental program was established at the Marine Research Laboratory (MRL) at Sequim, Washington. The program is intended to demonstrate the negligible impact of current MRL operations on the surrounding environs and to provide baseline data through which any cumulative impact could be detected. The sampling frequency is greater during the first 2 years of the program to provide sufficient initial information to allow reliable estimates of observed radionuclide concentrations and to construct a long-term sampling program. The program is designed, primarily, to determine levels of radioactivity present in selected biota in Sequim Bay. The biota were selected because of their presence near the laboratory and their capacity to concentrate trace elements. Other samples were obtained to determine the radionuclides in Sequim Bay and laboratory drinking water, as well as the ambient radiation exposure levels and surface deposition of fallout radionuclides for the laboratory area. Appendix A provides a summary of the analytical methods used. The present document includes data obtained during CY 1977 in addition to CY-1976 data published previously.

  12. Molecular epidemiological and phylogenetic analyses of canine parvovirus in domestic dogs and cats in Beijing, 2010-2013.

    PubMed

    Wu, Jing; Gao, Xin-Tao; Hou, Shao-Hua; Guo, Xiao-Yu; Yang, Xue-Shong; Yuan, Wei-Feng; Xin, Ting; Zhu, Hong-Fei; Jia, Hong

    2015-10-01

    Fifty-five samples (15.62%) collected from dogs and cats in Beijing during 2010-2013 were identified as canine parvovirus (CPV) infections. The nucleotide identities and amino acid (aa) similarities were 98.2-100% and 97.7-100%, respectively, when compared with the reference isolates. Several synonymous and non-synonymous mutations were also recorded for the first time. New CPV-2a was dominant, accounting for 90.90% of the samples. Two of the 16 samples collected from cats were identified as new CPV-2a (12.5%), showing nucleotide identities of 100% with those from dogs. Twelve samples (15.78%) collected from completely immunized dogs were found to be new CPV-2a, which means CPV-2 vaccines may not provide sufficient protection against the epidemic strains.

  13. Laboratory study of adsorption and deliquescence on the surface of Mars

    NASA Astrophysics Data System (ADS)

    Nikolakakos, George; Whiteway, James A.

    2018-07-01

    A sample of the zeolitic mineral chabazite was subjected to a range of water vapor pressures and temperatures found on present day Mars. Laser Raman scattering was applied to detect the relative amounts of water and carbon dioxide adsorbed by the sample. Results show that zeolites are capable of adsorbing water from the atmosphere on diurnal time scales and that Raman scattering spectroscopy provides a promising method for detecting this process during a landed mission. When the water vapor pressure and temperature were sufficiently low, the zeolite sample also adsorbed carbon dioxide, resulting in the simultaneous adsorption of water and carbon dioxide on the surface mineral grains. Additional experiments were carried out using a mixture of magnesium perchlorate and chabazite. The sample of mixed surface material remained visually unchanged during water adsorption, but was found to darken during deliquescence.

  14. Determination of proenkephalin products in brain tissue by high-performance liquid chromatography and a modified bioassay procedure.

    PubMed

    Bailey, C; Kitchen, I

    1985-06-01

    A method is described for the separation of proenkephalin products using gradient high-performance liquid chromatography preceded by Sep-Pak chromatography. Samples can be assayed simply by use of a modified mouse vas deferens bioassay which is sufficiently sensitive for most applications. The preliminary Sep-Pak chromatography method excludes alpha-neoendorphin and the dynorphins and thus provides a suitable procedure for separation of prodynorphin and proenkephalin products.

  15. Voids and constraints on nonlinear clustering of galaxies

    NASA Technical Reports Server (NTRS)

    Vogeley, Michael S.; Geller, Margaret J.; Park, Changbom; Huchra, John P.

    1994-01-01

    Void statistics of the galaxy distribution in the Center for Astrophysics Redshift Survey provide strong constraints on galaxy clustering in the nonlinear regime, i.e., on scales R ≤ 10/h Mpc. Computation of high-order moments of the galaxy distribution requires a sample that (1) densely traces the large-scale structure and (2) covers sufficient volume to obtain good statistics. The CfA redshift survey densely samples structure on scales ≤ 10/h Mpc and has sufficient depth and angular coverage to approach a fair sample on these scales. In the nonlinear regime, the void probability function (VPF) for CfA samples exhibits apparent agreement with hierarchical scaling (such scaling implies that the N-point correlation functions for N > 2 depend only on pairwise products of the two-point function xi(r)). However, simulations of cosmological models show that this scaling in redshift space does not necessarily imply such scaling in real space, even in the nonlinear regime; peculiar velocities cause distortions which can yield erroneous agreement with hierarchical scaling. The underdensity probability measures the frequency of 'voids' with density rho < 0.2 rho-bar. This statistic reveals a paucity of very bright galaxies (L > L*) in the 'voids.' Underdensities are ≥ 2 sigma more frequent in bright galaxy samples than in samples that include fainter galaxies. Comparison of void statistics of CfA samples with simulations of a range of cosmological models favors models with Gaussian primordial fluctuations and Cold Dark Matter (CDM)-like initial power spectra. Biased models tend to produce voids that are too empty. We also compare these data with three specific models of the Cold Dark Matter cosmogony: an unbiased, open-universe CDM model (omega = 0.4, h = 0.5) provides a good match to the VPF of the CfA samples. Biasing of the galaxy distribution in the 'standard' CDM model (omega = 1, b = 1.5; see below for definitions) and a nonzero-cosmological-constant CDM model (omega = 0.4, h = 0.6, lambda_0 = 0.6, b = 1.3) produce voids that are too empty. All three simulations match the observed VPF and underdensity probability for samples of very bright (M < M* = -19.2) galaxies, but produce voids that are too empty when compared with samples that include fainter galaxies.

  16. An original method to evaluate the transport parameters and reconstruct the electric field in solid-state photodetectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Santi, A.; Piacentini, G.; Zanichelli, M.

    2014-05-12

    A method for reconstructing the spatial profile of the electric field along the thickness of a generic bulk solid-state photodetector is proposed. Furthermore, the mobility and lifetime of both electrons and holes can be evaluated contextually. The method is based on a minimization procedure built up from current transient profiles induced by laser pulses in a planar detector at different applied voltages. The procedure was tested in CdTe planar detectors for X- and gamma-rays. The devices were measured in a single-carrier transport configuration by impinging laser light on the sample cathode. This method could be suitable for many other devices provided that they are made of materials with sufficiently high resistivity, i.e., with a sufficiently low density of intrinsic carriers.

  17. Protein structure determination by electron diffraction using a single three-dimensional nanocrystal.

    PubMed

    Clabbers, M T B; van Genderen, E; Wan, W; Wiegers, E L; Gruene, T; Abrahams, J P

    2017-09-01

    Three-dimensional nanometre-sized crystals of macromolecules currently resist structure elucidation by single-crystal X-ray crystallography. Here, a single nanocrystal with a diffracting volume of only 0.14 µm³, i.e. no more than 6 × 10⁵ unit cells, provided sufficient information to determine the structure of a rare dimeric polymorph of hen egg-white lysozyme by electron crystallography. This is at least an order of magnitude smaller than was previously possible. The molecular-replacement solution, based on a monomeric polyalanine model, provided sufficient phasing power to show side-chain density, and automated model building was used to reconstruct the side chains. Diffraction data were acquired using the rotation method with parallel beam diffraction on a Titan Krios transmission electron microscope equipped with a novel in-house-designed 1024 × 1024 pixel Timepix hybrid pixel detector for low-dose diffraction data collection. Favourable detector characteristics include the ability to accurately discriminate single high-energy electrons from X-rays and count them, fast readout to finely sample reciprocal space and a high dynamic range. This work, together with other recent milestones, suggests that electron crystallography can provide an attractive alternative in determining biological structures.
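    The two quoted numbers are mutually consistent: dividing the diffracting volume by the number of unit cells gives a per-cell volume of a plausible order for a protein crystal. A quick arithmetic check:

```python
# 0.14 µm^3 expressed in nm^3, split over ~6e5 unit cells.
volume_nm3 = 0.14 * 1e9
unit_cells = 6e5
per_cell_nm3 = volume_nm3 / unit_cells  # a few hundred nm^3 per unit cell
```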

  18. Protein structure determination by electron diffraction using a single three-dimensional nanocrystal

    PubMed Central

    Clabbers, M. T. B.; van Genderen, E.; Wiegers, E. L.; Gruene, T.; Abrahams, J. P.

    2017-01-01

    Three-dimensional nanometre-sized crystals of macromolecules currently resist structure elucidation by single-crystal X-ray crystallography. Here, a single nanocrystal with a diffracting volume of only 0.14 µm³, i.e. no more than 6 × 10⁵ unit cells, provided sufficient information to determine the structure of a rare dimeric polymorph of hen egg-white lysozyme by electron crystallography. This is at least an order of magnitude smaller than was previously possible. The molecular-replacement solution, based on a monomeric polyalanine model, provided sufficient phasing power to show side-chain density, and automated model building was used to reconstruct the side chains. Diffraction data were acquired using the rotation method with parallel beam diffraction on a Titan Krios transmission electron microscope equipped with a novel in-house-designed 1024 × 1024 pixel Timepix hybrid pixel detector for low-dose diffraction data collection. Favourable detector characteristics include the ability to accurately discriminate single high-energy electrons from X-rays and count them, fast readout to finely sample reciprocal space and a high dynamic range. This work, together with other recent milestones, suggests that electron crystallography can provide an attractive alternative in determining biological structures. PMID:28876237

  19. 49 CFR 40.195 - What happens when an individual is unable to provide a sufficient amount of urine for a pre...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... provide a sufficient amount of urine for a pre-employment follow-up or return-to-duty test because of a... providing a sufficient specimen for a pre-employment follow-up or return-to-duty test and the condition... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Drug Tests...

  20. 49 CFR 40.195 - What happens when an individual is unable to provide a sufficient amount of urine for a pre...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... provide a sufficient amount of urine for a pre-employment follow-up or return-to-duty test because of a... providing a sufficient specimen for a pre-employment follow-up or return-to-duty test and the condition... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Drug Tests...

  1. 49 CFR 40.195 - What happens when an individual is unable to provide a sufficient amount of urine for a pre...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... provide a sufficient amount of urine for a pre-employment follow-up or return-to-duty test because of a... providing a sufficient specimen for a pre-employment follow-up or return-to-duty test and the condition... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Drug Tests...

  2. 49 CFR 40.195 - What happens when an individual is unable to provide a sufficient amount of urine for a pre...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... provide a sufficient amount of urine for a pre-employment follow-up or return-to-duty test because of a... providing a sufficient specimen for a pre-employment follow-up or return-to-duty test and the condition... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Drug Tests...

  3. 49 CFR 40.195 - What happens when an individual is unable to provide a sufficient amount of urine for a pre...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... provide a sufficient amount of urine for a pre-employment follow-up or return-to-duty test because of a... providing a sufficient specimen for a pre-employment follow-up or return-to-duty test and the condition... Transportation PROCEDURES FOR TRANSPORTATION WORKPLACE DRUG AND ALCOHOL TESTING PROGRAMS Problems in Drug Tests...

  4. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  5. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  6. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  7. 49 CFR 40.193 - What happens when an employee does not provide a sufficient amount of urine for a drug test?

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... sufficient amount of urine for a drug test? 40.193 Section 40.193 Transportation Office of the Secretary of... § 40.193 What happens when an employee does not provide a sufficient amount of urine for a drug test... sufficient amount of urine to permit a drug test (i.e., 45 mL of urine). (b) As the collector, you must do...

  8. Study samples are too small to produce sufficiently precise reliability coefficients.

    PubMed

    Charter, Richard A

    2003-04-01

    In a survey of journal articles, test manuals, and test critique books, the author found that a mean sample size (N) of 260 participants had been used for reliability studies on 742 tests. The distribution was skewed because the median sample size for the total sample was only 90. The median sample sizes for the internal consistency, retest, and interjudge reliabilities were 182, 64, and 36, respectively. The author presented sample size statistics for the various internal consistency methods and types of tests. In general, the author found that the sample sizes that were used in the internal consistency studies were too small to produce sufficiently precise reliability coefficients, which in turn could cause imprecise estimates of examinee true-score confidence intervals. The results also suggest that larger sample sizes have been used in the last decade compared with those that were used in earlier decades.
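    The link between sample size and the precision of a reliability coefficient can be made concrete with a Fisher z confidence interval, a common textbook approximation when the coefficient is treated like a correlation. This sketch is illustrative, not the author's own computation:

```python
import math

def reliability_ci(r, n, z_crit=1.96):
    """Approximate 95% CI for a reliability coefficient via the
    Fisher z transform, treating it like a Pearson correlation."""
    z = math.atanh(r)
    se = 1.0 / math.sqrt(n - 3)
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)

# The same r = .80 is far less precise at the median retest N of 64
# than at a larger N of 400:
wide = reliability_ci(0.80, 64)
narrow = reliability_ci(0.80, 400)
```

    At N = 64 the interval spans roughly 0.69 to 0.87, wide enough to blur true-score confidence intervals, which is the author's point.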

  9. Detecting insect pollinator declines on regional and global scales

    USGS Publications Warehouse

    LeBuhn, Gretchen; Droege, Sam; Connor, Edward F.; Gemmill-Herren, Barbara; Potts, Simon G.; Minckley, Robert L.; Griswold, Terry; Jean, Robert; Kula, Emanuel; Roubik, David W.; Cane, Jim; Wright, Karen W.; Frankie, Gordon; Parker, Frank

    2013-01-01

    Recently there has been considerable concern about declines in bee communities in agricultural and natural habitats. The value of pollination to agriculture, provided primarily by bees, is >$200 billion/year worldwide, and in natural ecosystems it is thought to be even greater. However, no monitoring program exists to accurately detect declines in abundance of insect pollinators; thus, it is difficult to quantify the status of bee communities or estimate the extent of declines. We used data from 11 multiyear studies of bee communities to devise a program to monitor pollinators at regional, national, or international scales. In these studies, 7 different methods for sampling bees were used and bees were sampled on 3 different continents. We estimated that a monitoring program with 200-250 sampling locations each sampled twice over 5 years would provide sufficient power to detect small (2-5%) annual declines in the number of species and in total abundance and would cost U.S.$2,000,000. To detect declines as small as 1% annually over the same period would require >300 sampling locations. Given the role of pollinators in food security and ecosystem function, we recommend establishment of integrated regional and international monitoring programs to detect changes in pollinator communities.
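    The power reasoning behind the 200-250-location recommendation can be sketched with a normal approximation for a paired start/end design. The site-level variability value below is an assumed placeholder, not an estimate from the 11 studies:

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def detection_power(annual_decline, years, n_sites, sd_log_ratio, alpha_z=1.645):
    """Power of a one-sided paired test for a decline (normal approximation).
    Each site is sampled at the start and end of the period; sd_log_ratio is
    the assumed site-level SD of the log count ratio (illustrative)."""
    effect = -math.log(1.0 - annual_decline) * years  # total log decline
    se = sd_log_ratio / math.sqrt(n_sites)
    return phi(effect / se - alpha_z)

# ~225 sites, a 3% annual decline over 5 years, assumed SD of 0.6:
p = detection_power(0.03, 5, 225, 0.6)
```

    With these assumed inputs, power comes out high for a 3% annual decline and drops sharply with fewer sites, consistent with the study's conclusion that smaller declines require more locations.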

  10. Evaluation of errors in quantitative determination of asbestos in rock

    NASA Astrophysics Data System (ADS)

    Baietto, Oliviero; Marini, Paola; Vitaliti, Martina

    2016-04-01

    The quantitative determination of the asbestos content of rock matrices is a complex operation that is susceptible to substantial errors. The principal methodologies for the analysis are Scanning Electron Microscopy (SEM) and Phase Contrast Optical Microscopy (PCOM). Although the resolution of PCOM is inferior to that of SEM, PCOM analysis has several advantages, including greater representativeness of the analyzed sample, more effective recognition of chrysotile, and a lower cost. The DIATI LAA internal methodology for PCOM analysis is based on a mild grinding of a rock sample, its subdivision into 5-6 grain size classes smaller than 2 mm, and a subsequent microscopic analysis of a portion of each class. PCOM is based on the optical properties of asbestos and of the liquids of known refractive index in which the particles under analysis are immersed. The error evaluation in the analysis of rock samples, contrary to the analysis of airborne filters, cannot be based on a statistical distribution. For airborne filters, a binomial (Poisson) distribution, which theoretically defines the variation in the count of fibers resulting from the observation of analysis fields chosen randomly on the filter, can be applied. The analysis of rock matrices instead cannot lean on any statistical distribution, because the principal object of the analysis is the size of the asbestiform fibers and fiber bundles observed, and the resulting relationship between the weight of the fibrous component and that of the granular one. The error estimates generally provided by public and private institutions vary between 50 and 150 percent, but there are no specific studies that discuss the origin of the error or link it to the asbestos content. Our work aims to provide a reliable estimation of the error in relation to the applied methodologies and to the total asbestos content, especially for values close to the legal limits.
The error assessments must be made by repeating the same analysis on the same sample, in order to estimate both the error in the representativeness of the sample and the error related to the sensitivity of the operator, and thus to provide a sufficiently reliable uncertainty for the method. We used about 30 natural rock samples with different asbestos contents, performing 3 analyses on each sample to obtain a sufficiently representative trend of the percentage. Furthermore, on one chosen sample, we performed 10 repetitions of the analysis to define the error of the methodology more precisely.

  11. Intelligence Failure: How a Commander Can Prevent It

    DTIC Science & Technology

    2009-10-23

    The job of intelligence is to provide the decision maker with sufficient understanding of the enemy to make... ...reinforce these lessons.

  12. Differences in prescription opioid analgesic availability: comparing minority and white pharmacies across Michigan.

    PubMed

    Green, Carmen R; Ndao-Brumblay, S Khady; West, Brady; Washington, Tamika

    2005-10-01

    Little is known about physical barriers to adequate pain treatment for minorities. This investigation explored sociodemographic determinants of pain medication availability in Michigan pharmacies. A cross-sectional survey-based study was designed using census data and data provided by Michigan community retail pharmacists. A sufficient opioid analgesic supply was defined as stocking at least one long-acting, one short-acting, and one combination opioid analgesic. Pharmacies located in minority or white (≥70% white residents) zip code areas were randomly selected by using a 2-stage sampling selection process (response rate, 80%). Of the 190 pharmacies surveyed, most were located in white areas (51.6%) and had sufficient supplies (84.1%). After accounting for zip code median age and stratifying by income, pharmacies in white areas (odds ratio, 13.36 high income vs 54.42 low income) and noncorporate pharmacies (odds ratio, 24.92 high income vs 3.61 low income) were more likely to have sufficient opioid analgesic supplies (P < .005). Racial differences in the odds of having a sufficient supply were significantly higher in low-income areas than in high-income areas. Having a pharmacy located near a hospital did not change the availability of opioid analgesics. Persons living in predominantly minority areas experienced significant barriers to accessing pain medication, with greater disparities in low-income areas regardless of ethnic composition. Differences were also found on the basis of pharmacy type, suggesting variability in pharmacists' decision making. Michigan pharmacies in minority zip codes were 52 times less likely to carry sufficient opioid analgesics than pharmacies in white zip codes regardless of income. Lower-income areas and corporate pharmacies were less likely to carry sufficient opioid analgesics. This study illustrates barriers to pain care and has public health implications.
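    Odds ratios like those reported above come from the usual 2×2 cross-product. A minimal illustration with made-up counts (not the study's data):

```python
def odds_ratio(with_supply_a, without_a, with_supply_b, without_b):
    """Odds ratio comparing group A to group B from a 2x2 table."""
    return (with_supply_a / without_a) / (with_supply_b / without_b)

# Hypothetical counts: 90 of 98 group-A pharmacies stock sufficient
# supplies vs 70 of 92 group-B pharmacies.
or_example = odds_ratio(90, 8, 70, 22)
```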

  13. Distinguishing Internet-facing ICS Devices Using PLC Programming Information

    DTIC Science & Technology

    2014-06-19

    ...250 ms over the course of 10,000 samples provided sufficient data to test for statistically significant changes to ladder logic execution times with a...

  14. ASPRS Digital Imagery Guideline Image Gallery Discussion

    NASA Technical Reports Server (NTRS)

    Ryan, Robert

    2002-01-01

    The objectives of the image gallery are to 1) give users and providers a simple means of identifying appropriate imagery for a given application/feature extraction; and 2) define imagery sufficiently to be described in engineering and acquisition terms. This viewgraph presentation includes a discussion of edge response and aliasing for image processing, and a series of images illustrating the effects of signal to noise ratio (SNR) on images. Another series of images illustrates how images are affected by varying the ground sample distances (GSD).

  15. Reporting of sample size calculations in analgesic clinical trials: ACTTION systematic review.

    PubMed

    McKeown, Andrew; Gewandter, Jennifer S; McDermott, Michael P; Pawlowski, Joseph R; Poli, Joseph J; Rothstein, Daniel; Farrar, John T; Gilron, Ian; Katz, Nathaniel P; Lin, Allison H; Rappaport, Bob A; Rowbotham, Michael C; Turk, Dennis C; Dworkin, Robert H; Smith, Shannon M

    2015-03-01

    Sample size calculations determine the number of participants required to have sufficiently high power to detect a given treatment effect. In this review, we examined the reporting quality of sample size calculations in 172 publications of double-blind randomized controlled trials of noninvasive pharmacologic or interventional (ie, invasive) pain treatments published in European Journal of Pain, Journal of Pain, and Pain from January 2006 through June 2013. Sixty-five percent of publications reported a sample size calculation but only 38% provided all elements required to replicate the calculated sample size. In publications reporting at least 1 element, 54% provided a justification for the treatment effect used to calculate sample size, and 24% of studies with continuous outcome variables justified the variability estimate. Publications of clinical pain condition trials reported a sample size calculation more frequently than experimental pain model trials (77% vs 33%, P < .001) but did not differ in the frequency of reporting all required elements. No significant differences in reporting of any or all elements were detected between publications of trials with industry and nonindustry sponsorship. Twenty-eight percent included a discrepancy between the reported number of planned and randomized participants. This study suggests that sample size calculation reporting in analgesic trial publications is usually incomplete. Investigators should provide detailed accounts of sample size calculations in publications of clinical trials of pain treatments, which is necessary for reporting transparency and communication of pre-trial design decisions. In this systematic review of analgesic clinical trials, sample size calculations and the required elements (eg, treatment effect to be detected; power level) were incompletely reported. A lack of transparency regarding sample size calculations may raise questions about the appropriateness of the calculated sample size. 
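    The elements the review says must be reported (effect size to detect, variability, alpha, power) plug directly into the standard two-sample formula. A sketch using illustrative pain-scale numbers, not values from any reviewed trial:

```python
import math
from statistics import NormalDist

def n_per_group(delta, sd, alpha=0.05, power=0.80):
    """Per-group n for a two-sample comparison of means (normal
    approximation): n = 2 * (z_{1-a/2} + z_{power})^2 * (sd/delta)^2."""
    z = NormalDist().inv_cdf
    return math.ceil(2.0 * (z(1 - alpha / 2) + z(power)) ** 2 * (sd / delta) ** 2)

# e.g. detecting a 1-point difference on a 0-10 pain scale, assumed SD = 2:
n = n_per_group(delta=1.0, sd=2.0)  # 63 per group
```

    Omitting any one of these inputs from a publication makes the calculation impossible to replicate, which is the incompleteness the review quantifies.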

  16. Fast emulation of track reconstruction in the CMS simulation

    NASA Astrophysics Data System (ADS)

    Komm, Matthias; CMS Collaboration

    2017-10-01

    Simulated samples of various physics processes are a key ingredient of analyses seeking to unlock the physics behind LHC collision data. Ever larger samples are required to keep up with the increasing amounts of recorded data. During sample generation, significant computing time is spent on the reconstruction of charged-particle tracks from energy deposits, which additionally scales with the pileup conditions. In CMS, the FastSimulation package is developed to provide a fast alternative to the standard simulation and reconstruction workflow. It employs various techniques to emulate track reconstruction effects in particle collision events. Several analysis groups in CMS are utilizing the package, in particular those requiring many samples to scan the parameter space of physics models (e.g. SUSY) or those estimating systematic uncertainties. The strategies for and recent developments in this emulation are presented, including a novel, flexible implementation of tracking emulation that retains a sufficient, tuneable accuracy.
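    The emulation idea can be illustrated with a toy parameterisation: instead of reconstructing hits, apply an efficiency and a Gaussian resolution directly to the generated track. The numbers here are arbitrary placeholders, not CMS tunes:

```python
import random

def emulate_track(true_pt, efficiency=0.95, resolution=0.02):
    """Toy fast-simulation step: reconstruct a track with a given
    efficiency and smear its pT by a relative Gaussian resolution."""
    if random.random() > efficiency:
        return None  # track not reconstructed
    return random.gauss(true_pt, resolution * true_pt)

random.seed(12345)
tracks = [emulate_track(10.0) for _ in range(1000)]
reco = [pt for pt in tracks if pt is not None]
```

    Replacing the full hit-level reconstruction with such a parameterisation is what makes a fast simulation orders of magnitude cheaper, at the cost of tuning the efficiency and resolution functions against the full chain.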

  17. Photo-induced ultrasound microscopy for photo-acoustic imaging of non-absorbing specimens

    NASA Astrophysics Data System (ADS)

    Tcarenkova, Elena; Koho, Sami V.; Hänninen, Pekka E.

    2017-08-01

    Photo-Acoustic Microscopy (PAM) has raised high interest for in-vivo imaging due to its ability to preserve the near-diffraction-limited spatial resolution of optical microscopes while extending the penetration depth to the mm range. Another advantage of PAM is that it is a label-free technique: any substance that absorbs the PAM excitation laser light can be viewed. However, not all sample structures of interest absorb sufficiently to provide contrast for imaging. This work describes a novel imaging method that makes it possible to visualize optically transparent samples that lack intrinsic photo-acoustic contrast, without the addition of contrast agents. A thin, strongly light-absorbing layer next to the sample is used to generate a strong ultrasonic signal. This signal, when recorded from the opposite side, contains ultrasonic transmission information about the sample, and thus the method can be used to obtain an ultrasound transmission image on any PAM.

  18. Accounting for between-study variation in incremental net benefit in value of information methodology.

    PubMed

    Willan, Andrew R; Eckermann, Simon

    2012-10-01

    Previous applications of value of information methods for determining optimal sample size in randomized clinical trials have assumed no between-study variation in mean incremental net benefit. By adopting a hierarchical model, we provide a solution for determining optimal sample size with this assumption relaxed. The solution is illustrated with two examples from the literature. Expected net gain increases with increasing between-study variation, reflecting the increased uncertainty in incremental net benefit and the reduced extent to which data are borrowed from previous evidence. Hence, a new trial can become optimal even where current evidence would be deemed sufficient under the assumption of no between-study variation. However, despite the increase in expected net gain, the optimal sample size in the illustrated examples is relatively insensitive to the amount of between-study variation. Furthermore, percentage losses in expected net gain were small even when choosing sample sizes that reflected widely different between-study variation. Copyright © 2011 John Wiley & Sons, Ltd.

  19. Thermoelectric properties by high temperature annealing

    NASA Technical Reports Server (NTRS)

    Chen, Gang (Inventor); Kumar, Shankar (Inventor); Ren, Zhifeng (Inventor); Lee, Hohyun (Inventor)

    2009-01-01

    The present invention generally provides methods of improving thermoelectric properties of alloys by subjecting them to one or more high temperature annealing steps, performed at temperatures at which the alloys exhibit a mixed solid/liquid phase, followed by cooling steps. For example, in one aspect, such a method of the invention can include subjecting an alloy sample to a temperature that is sufficiently elevated to cause partial melting of at least some of the grains. The sample can then be cooled so as to solidify the melted grain portions such that each solidified grain portion exhibits an average chemical composition, characterized by a relative concentration of elements forming the alloy, that is different than that of the remainder of the grain.

  20. Absorption-emission optrode and methods of use thereof

    DOEpatents

    Hirschfeld, T.B.

    1990-05-29

    A method and apparatus are described for monitoring the physical and chemical properties of a sample fluid by measuring an optical signal generated by a fluorescent substance and modulated by an absorber substance. The emission band of the fluorescent substance overlaps the absorption band of the absorber substance, and the degree of overlap is dependent on the physical and chemical properties of the sample fluid. The fluorescent substance and absorber substance are immobilized on a substrate so that an effective number of molecules thereof are sufficiently close for resonant energy transfer to occur, thereby providing highly efficient modulation of the fluorescent emissions of the fluorescent substance by the absorber substance. 4 figs.

  1. Back strength and flexibility of EMS providers in practicing prehospital providers.

    PubMed

    Crill, Matthew T; Hostler, David

    2005-06-01

    In the execution of prehospital care duties, an EMS provider may be required to carry equipment and patients over long distances or over multiple flights of stairs at any time of the day. At a minimum, a prehospital provider must have sufficient lower back strength and hamstring flexibility to prevent musculoskeletal injury while lifting. This study administered fitness assessments related to the occupational activities of the prehospital provider with the purpose of describing the incidence of occupational back injury and percentage of providers with known risk factors for back injury. Ninety subjects were tested during a regional EMS conference. Men were significantly taller and heavier than women and had significantly less hamstring flexibility. Body Mass Index was 30.7 +/- 7.2 in men and 28 +/- 5.7 in women. However, no significant differences were noted in an extension test of back strength. When surveyed, 47.8% of subjects reported a back injury in the previous 6 months but only 39.1% of these injuries were sustained while performing EMS duties. While only 13% of these injuries resulted in missed work, 52.2% reported their injury interfered with their daily activities. In spite of the physical nature of the profession, EMS providers in our sample were significantly overweight according to their Body Mass Index and may lack sufficient back strength and flexibility for safe execution of their duties. This group of professionals may be at risk for occupational injury and should be targeted for interventions to improve strength and flexibility.

  2. DNA Yield From Tissue Samples in Surgical Pathology and Minimum Tissue Requirements for Molecular Testing.

    PubMed

    Austin, Melissa C; Smith, Christina; Pritchard, Colin C; Tait, Jonathan F

    2016-02-01

    Complex molecular assays are increasingly used to direct therapy and provide diagnostic and prognostic information but can require relatively large amounts of DNA. To provide data to pathologists to help them assess tissue adequacy and provide prospective guidance on the amount of tissue that should be procured. We used slide-based measurements to establish a relationship between processed tissue volume and DNA yield by A260 from 366 formalin-fixed, paraffin-embedded tissue samples submitted for the 3 most common molecular assays performed in our laboratory (EGFR, KRAS, and BRAF). We determined the average DNA yield per unit of tissue volume, and we used the distribution of DNA yields to calculate the minimum volume of tissue that should yield sufficient DNA 99% of the time. All samples with a volume greater than 8 mm(3) yielded at least 1 μg of DNA, and more than 80% of samples producing less than 1 μg were extracted from less than 4 mm(3) of tissue. Nine cubic millimeters of tissue should produce more than 1 μg of DNA 99% of the time. We conclude that 2 tissue cores, each 1 cm long and obtained with an 18-gauge needle, will almost always provide enough DNA for complex multigene assays, and our methodology may be readily extrapolated to individual institutional practice.
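    The two-core recommendation can be sanity-checked with cylinder geometry. A hypothetical back-of-envelope sketch; the ~0.84 mm bore assumed for an 18-gauge needle is a nominal value, not a figure from the study.

```python
import math

# Assumed nominal inner diameter of an 18-gauge needle (actual bore varies
# with needle wall type) and a 1 cm core length, as in the recommendation.
bore_mm = 0.84
core_length_mm = 10.0

# Cylinder volume of one core, in cubic millimeters
core_mm3 = math.pi * (bore_mm / 2) ** 2 * core_length_mm
print(round(core_mm3, 1), round(2 * core_mm3, 1))  # -> 5.5 11.1 (mm^3)
```

    Two such cores give roughly 11 mm^3, comfortably above the 9 mm^3 threshold derived in the study.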

  3. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography

    PubMed Central

    Jørgensen, J. S.; Sidky, E. Y.

    2015-01-01

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization. PMID:25939620

  4. How little data is enough? Phase-diagram analysis of sparsity-regularized X-ray computed tomography.

    PubMed

    Jørgensen, J S; Sidky, E Y

    2015-06-13

    We introduce phase-diagram analysis, a standard tool in compressed sensing (CS), to the X-ray computed tomography (CT) community as a systematic method for determining how few projections suffice for accurate sparsity-regularized reconstruction. In CS, a phase diagram is a convenient way to study and express certain theoretical relations between sparsity and sufficient sampling. We adapt phase-diagram analysis for empirical use in X-ray CT for which the same theoretical results do not hold. We demonstrate in three case studies the potential of phase-diagram analysis for providing quantitative answers to questions of undersampling. First, we demonstrate that there are cases where X-ray CT empirically performs comparably with a near-optimal CS strategy, namely taking measurements with Gaussian sensing matrices. Second, we show that, in contrast to what might have been anticipated, taking randomized CT measurements does not lead to improved performance compared with standard structured sampling patterns. Finally, we show preliminary results of how well phase-diagram analysis can predict the sufficient number of projections for accurately reconstructing a large-scale image of a given sparsity by means of total-variation regularization.
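    The empirical phase-diagram procedure amounts to estimating, on a grid of undersampling and sparsity levels, the fraction of random trials in which a sparse signal is exactly recovered. A toy sketch under stated assumptions: it uses Gaussian sensing matrices as in the paper's benchmark case, but substitutes Orthogonal Matching Pursuit for the paper's l1/total-variation solvers, and the function names are illustrative.

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: greedy recovery of a k-sparse x from y = A x."""
    residual, support = y.copy(), []
    for _ in range(k):
        support.append(int(np.argmax(np.abs(A.T @ residual))))   # best-matching column
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)  # LS fit on support
        residual = y - A[:, support] @ coef
    x_hat = np.zeros(A.shape[1])
    x_hat[support] = coef
    return x_hat

def success_rate(m, n, k, trials=50, seed=0):
    """Empirical recovery probability at one (undersampling m/n, sparsity k/m) point."""
    rng = np.random.default_rng(seed)
    wins = 0
    for _ in range(trials):
        A = rng.standard_normal((m, n)) / np.sqrt(m)  # Gaussian sensing matrix
        x = np.zeros(n)
        x[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
        wins += np.linalg.norm(omp(A, A @ x, k) - x) < 1e-6
    return wins / trials
```

    Sweeping success_rate over a grid of (m/n, k/m) values and contouring the 50% level traces the empirical phase-transition curve that the paper compares across sampling strategies.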

  5. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-05-19

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy "Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO)" (Nguyen 1999a), "Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO)" (Nguyen 1999b), "Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L&H DQO)" (Patello et al. 1999), and "Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO)" (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide sub-samples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  6. Development of twelve microsatellite loci in the red tree corals Primnoa resedaeformis and Primnoa pacifica

    USGS Publications Warehouse

    Morrison, Cheryl L.; Springmann, Marcus J.; Shroades, Kelsey; Stone, Robert P.

    2015-01-01

    A suite of tetra-, penta-, and hexa-nucleotide microsatellite loci was developed from Roche 454 pyrosequencing data for the cold-water octocorals Primnoa resedaeformis and P. pacifica. Twelve of 98 primer sets tested consistently amplified in 30 P. resedaeformis samples from Baltimore Canyon (western North Atlantic Ocean) and in 24 P. pacifica samples (Shutter Ridge, eastern Gulf of Alaska). The loci displayed moderate levels of allelic diversity (average 7.5 alleles/locus) and heterozygosity (average 47%). Levels of genetic diversity were sufficient to produce unique multi-locus genotypes and to distinguish species. These common species are long-lived (hundreds of years) and provide essential fish habitat (P. pacifica), yet their populations are afforded little protection from human activities. These loci will be used to determine regional patterns of population connectivity to inform effective marine spatial planning and ecosystem-based fisheries management.

  7. PCR-based detection of Toxoplasma gondii DNA in blood and ocular samples for diagnosis of ocular toxoplasmosis.

    PubMed

    Bourdin, C; Busse, A; Kouamou, E; Touafek, F; Bodaghi, B; Le Hoang, P; Mazier, D; Paris, L; Fekkar, A

    2014-11-01

    PCR detection of Toxoplasma gondii in blood has been suggested as a possibly efficient method for the diagnosis of ocular toxoplasmosis (OT) and furthermore for genotyping the strain involved in the disease. To assess this hypothesis, we performed PCR on 121 peripheral blood samples from 104 patients showing clinical and/or biological evidence of ocular toxoplasmosis and on 284 control samples (from 258 patients). We tested 2 different extraction protocols, using either 200 μl (small volume) or 2 ml (large volume) of whole blood. Sensitivity was poor: 4.1% and 25% for the small- and large-volume extractions, respectively. In comparison, PCR with ocular samples yielded 35.9% sensitivity, while immunoblotting and calculation of the Goldmann-Witmer coefficient yielded 47.6% and 72.3% sensitivities, respectively. Performing these three methods together provided 89.4% sensitivity. Whatever the origin of the sample (ocular or blood), PCR provided higher sensitivity for immunocompromised patients than for their immunocompetent counterparts. Consequently, PCR detection of Toxoplasma gondii in blood samples cannot currently be considered a sufficient tool for the diagnosis of OT, and ocular sampling remains necessary for the biological diagnosis of OT. Copyright © 2014, American Society for Microbiology. All Rights Reserved.

  8. Device for collecting and analyzing matrix-isolated samples

    DOEpatents

    Reedy, Gerald T.

    1979-01-01

    A gas-sample collection device is disclosed for matrix isolation of individual gas bands from a gas chromatographic separation and for presenting these distinct samples for spectrometric examination. The device includes a vacuum chamber containing a rotatably supported, specular carrousel having a number of external, reflecting surfaces around its axis of rotation for holding samples. A gas inlet is provided for depositing sample and matrix material on the individual reflecting surfaces maintained at a sufficiently low temperature to cause solidification. Two optical windows or lenses are installed in the vacuum chamber walls for transmitting a beam of electromagnetic radiation, for instance infrared light, through a selected sample. Positioned within the chamber are two concave mirrors, the first aligned to receive the light beam from one of the lenses and focus it to the sample on one of the reflecting surfaces of the carrousel. The second mirror is aligned to receive reflected light from that carrousel surface and to focus it outwardly through the second lens. The light beam transmitted from the sample is received by a spectrometer for determining absorption spectra.

  9. Sampling considerations for modal analysis with damping

    NASA Astrophysics Data System (ADS)

    Park, Jae Young; Wakin, Michael B.; Gilbert, Anna C.

    2015-03-01

    Structural health monitoring (SHM) systems are critical for monitoring aging infrastructure (such as buildings or bridges) in a cost-effective manner. Wireless sensor networks that sample vibration data over time are particularly appealing for SHM applications due to their flexibility and low cost. However, in order to extend the battery life of wireless sensor nodes, it is essential to minimize the amount of vibration data these sensors must collect and transmit. In recent work, we have studied the performance of the Singular Value Decomposition (SVD) applied to the collection of data and provided a new finite-sample analysis characterizing conditions under which this simple technique, also known as the Proper Orthogonal Decomposition (POD), can correctly estimate the mode shapes of the structure. Specifically, we provided theoretical guarantees on the number and duration of samples required in order to estimate a structure's mode shapes to a desired level of accuracy. In that previous work, however, we considered simplified Multiple-Degree-Of-Freedom (MDOF) systems with no damping. In this paper we consider MDOF systems with proportional damping and show that, with sufficiently light damping, the POD can continue to provide accurate estimates of a structure's mode shapes. We support our discussion with new analytical insight and experimental demonstrations. In particular, we study the tradeoffs between the level of damping, the sampling rate and duration, and the accuracy to which the structure's mode shapes can be estimated.
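    The POD idea described above, stacking sensor time histories into a matrix and reading mode-shape estimates off the SVD, can be sketched on synthetic data. The 2-DOF system, damping ratio, and frequencies below are invented for illustration and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two orthogonal mode shapes of a hypothetical 2-DOF structure
phi1 = np.array([1.0, 1.0]) / np.sqrt(2)
phi2 = np.array([1.0, -1.0]) / np.sqrt(2)

# Lightly damped modal responses (1% of critical) at distinct frequencies
t = np.linspace(0.0, 20.0, 4000)
q1 = 3.0 * np.exp(-0.01 * 2.0 * t) * np.sin(2.0 * t)
q2 = 1.0 * np.exp(-0.01 * 5.0 * t) * np.sin(5.0 * t)

# Sensors-by-time displacement matrix, plus a little measurement noise
X = np.outer(phi1, q1) + np.outer(phi2, q2) + 0.01 * rng.standard_normal((2, t.size))

# POD: the leading left singular vectors estimate the mode shapes (up to sign)
U, s, Vt = np.linalg.svd(X, full_matrices=False)
est = U[:, 0] * np.sign(U[0, 0])
print(np.round(est, 2))  # close to [0.71, 0.71]
```

    With heavier damping, or with modal amplitudes too close together, the singular vectors rotate away from the true shapes; that degradation is exactly the tradeoff the paper analyzes.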

  10. A method for release and multiple strand amplification of small quantities of DNA from endospores of the fastidious bacterium Pasteuria penetrans.

    PubMed

    Mauchline, T H; Mohan, S; Davies, K G; Schaff, J E; Opperman, C H; Kerry, B R; Hirsch, P R

    2010-05-01

    To establish a reliable protocol to extract DNA from Pasteuria penetrans endospores for use as template in multiple strand amplification, thus providing sufficient material for genetic analyses. To develop a highly sensitive PCR-based diagnostic tool for P. penetrans. An optimized method to decontaminate endospores, release and purify DNA enabled multiple strand amplification. DNA purity was assessed by cloning and sequencing gyrB and 16S rRNA gene fragments obtained from PCR using generic primers. Samples indicated to be 100% P. penetrans by the gyrB assay were estimated at 46% using the 16S rRNA gene. No bias was detected on cloning and sequencing 12 housekeeping and sporulation gene fragments from amplified DNA. The detection limit by PCR with Pasteuria-specific 16S rRNA gene primers following multiple strand amplification of DNA extracted using the method was a single endospore. Generation of large quantities of DNA will facilitate genomic sequencing of P. penetrans. Apparent differences in sample purity are explained by variations in 16S rRNA gene copy number in Eubacteria leading to exaggerated estimations of sample contamination. Detection of single endospores will facilitate investigations of P. penetrans molecular ecology. These methods will advance studies on P. penetrans and facilitate research on other obligate and fastidious micro-organisms where it is currently impractical to obtain DNA in sufficient quantity and quality.

  11. Integrating scales of seagrass monitoring to meet conservation needs

    USGS Publications Warehouse

    Neckles, Hilary A.; Kopp, Blaine S.; Peterson, Bradley J.; Pooler, Penelope S.

    2012-01-01

    We evaluated a hierarchical framework for seagrass monitoring in two estuaries in the northeastern USA: Little Pleasant Bay, Massachusetts, and Great South Bay/Moriches Bay, New York. This approach includes three tiers of monitoring that are integrated across spatial scales and sampling intensities. We identified monitoring attributes for determining attainment of conservation objectives to protect seagrass ecosystems from estuarine nutrient enrichment. Existing mapping programs provided large-scale information on seagrass distribution and bed sizes (tier 1 monitoring). We supplemented this with bay-wide, quadrat-based assessments of seagrass percent cover and canopy height at permanent sampling stations following a spatially distributed random design (tier 2 monitoring). Resampling simulations showed that four observations per station were sufficient to minimize bias in estimating mean percent cover on a bay-wide scale, and sample sizes of 55 stations in a 624-ha system and 198 stations in a 9,220-ha system were sufficient to detect absolute temporal increases in seagrass abundance from 25% to 49% cover and from 4% to 12% cover, respectively. We made high-resolution measurements of seagrass condition (percent cover, canopy height, total and reproductive shoot density, biomass, and seagrass depth limit) at a representative index site in each system (tier 3 monitoring). Tier 3 data helped explain system-wide changes. Our results suggest tiered monitoring as an efficient and feasible way to detect and predict changes in seagrass systems relative to multi-scale conservation objectives.
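    The resampling logic behind the quoted station counts can be illustrated with a Monte Carlo power sketch; the normal model and the 30-point between-station SD below are assumptions for illustration, not values from the monitoring data.

```python
import numpy as np

def detection_power(n_stations, mean0, mean1, sd, sims=2000, seed=2):
    """Monte Carlo power of a two-sided z-test for a change in mean percent
    cover between two bay-wide surveys of n_stations stations each.  The
    between-station SD is an assumed, illustrative value."""
    rng = np.random.default_rng(seed)
    before = rng.normal(mean0, sd, (sims, n_stations)).clip(0, 100)
    after = rng.normal(mean1, sd, (sims, n_stations)).clip(0, 100)
    se = np.sqrt(before.var(axis=1, ddof=1) / n_stations
                 + after.var(axis=1, ddof=1) / n_stations)
    z = (after.mean(axis=1) - before.mean(axis=1)) / se
    return float((np.abs(z) > 1.959964).mean())  # two-sided 5% critical value

# e.g., 55 stations and a 25% -> 49% cover change, assumed between-station SD of 30:
print(detection_power(55, 25.0, 49.0, 30.0))
```

    Running such simulations across station counts, as the authors did with resampled field data, identifies the smallest design that still detects the target change reliably.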

  12. Reflections on experimental research in medical education.

    PubMed

    Cook, David A; Beckman, Thomas J

    2010-08-01

    As medical education research advances, it is important that education researchers employ rigorous methods for conducting and reporting their investigations. In this article we discuss several important yet oft-neglected issues in designing experimental research in education. First, randomization controls for only a subset of possible confounders. Second, the posttest-only design is inherently stronger than the pretest-posttest design, provided the study is randomized and the sample is sufficiently large. Third, demonstrating the superiority of an educational intervention in comparison to no intervention does little to advance the art and science of education. Fourth, comparisons involving multifactorial interventions are hopelessly confounded, have limited application to new settings, and do little to advance our understanding of education. Fifth, single-group pretest-posttest studies are susceptible to numerous validity threats. Finally, educational interventions (including the comparison group) must be described in sufficient detail to allow replication.

  13. Effective removal of co-purified inhibitors from extracted DNA samples using synchronous coefficient of drag alteration (SCODA) technology.

    PubMed

    Schmedes, Sarah; Marshall, Pamela; King, Jonathan L; Budowle, Bruce

    2013-07-01

    Various types of biological samples present challenges for extraction of DNA suitable for subsequent molecular analyses. Commonly used extraction methods, such as silica membrane columns and phenol-chloroform, while highly successful may still fail to provide a sufficiently pure DNA extract with some samples. Synchronous coefficient of drag alteration (SCODA), implemented in Boreal Genomics' Aurora Nucleic Acid Extraction System (Boreal Genomics, Vancouver, BC), is a new technology that offers the potential to remove inhibitors effectively while simultaneously concentrating DNA. In this initial study, SCODA was tested for its ability to remove various concentrations of forensically and medically relevant polymerase chain reaction (PCR) inhibitors naturally found in tissue, hair, blood, plant, and soil samples. SCODA was used to purify and concentrate DNA from intentionally contaminated DNA samples containing known concentrations of hematin, humic acid, melanin, and tannic acid. The internal positive control (IPC) provided in the Quantifiler™ Human DNA Quantification Kit (Life Technologies, Foster City, CA) and short tandem repeat (STR) profiling (AmpFℓSTR® Identifiler® Plus PCR Amplification Kit; Life Technologies, Foster City, CA) were used to measure inhibition effects and hence purification. SCODA methodology yielded overall higher efficiency of purification of highly contaminated samples compared with the QIAquick® PCR Purification Kit (Qiagen, Valencia, CA). SCODA-purified DNA yielded no cycle shift of the IPC for each sample and yielded greater allele percentage recovery and relative fluorescence unit values compared with the QIAquick® purification method. The Aurora provided an automated, minimal-step approach to successfully remove inhibitors and concentrate DNA from challenged samples.

  14. Predicting nitrate discharge dynamics in mesoscale catchments using the lumped StreamGEM model and Bayesian parameter inference

    NASA Astrophysics Data System (ADS)

    Woodward, Simon James Roy; Wöhling, Thomas; Rode, Michael; Stenger, Roland

    2017-09-01

    The common practice of infrequent (e.g., monthly) stream water quality sampling for state of the environment monitoring may, when combined with high resolution stream flow data, provide sufficient information to accurately characterise the dominant nutrient transfer pathways and predict annual catchment yields. In the proposed approach, we use the spatially lumped catchment model StreamGEM to predict daily stream flow and nitrate concentration (mg/L NO3-N) in four contrasting mesoscale headwater catchments based on four years of daily rainfall, potential evapotranspiration, and stream flow measurements, and monthly or daily nitrate concentrations. Posterior model parameter distributions were estimated using the Markov Chain Monte Carlo sampling code DREAM(ZS) and a log-likelihood function assuming heteroscedastic, t-distributed residuals. Despite high uncertainty in some model parameters, the flow and nitrate calibration data was well reproduced across all catchments (Nash-Sutcliffe efficiency against log-transformed data, NSL, in the range 0.62-0.83 for daily flow and 0.17-0.88 for nitrate concentration). The slight increase in the size of the residuals for a separate validation period was considered acceptable (NSL in the range 0.60-0.89 for daily flow and 0.10-0.74 for nitrate concentration, excluding one data set with limited validation data). Proportions of flow and nitrate discharge attributed to near-surface, fast seasonal groundwater and slow deeper groundwater were consistent with expectations based on catchment geology. The results for the Weida Stream in Thuringia, Germany, using monthly as opposed to daily nitrate data were, for all intents and purposes, identical, suggesting that four years of monthly nitrate sampling provides sufficient information for calibration of the StreamGEM model and prediction of catchment dynamics.
This study highlights the remarkable effectiveness of process based, spatially lumped modelling with commonly available monthly stream sample data, to elucidate high resolution catchment function, when appropriate calibration methods are used that correctly handle the inherent uncertainties.
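    The Nash-Sutcliffe efficiency against log-transformed data (NSL) used to score these fits can be written down directly. A minimal sketch; the nse function, the example series, and the small eps offset are illustrative, not from the paper.

```python
import numpy as np

def nse(obs, sim, log_transform=False, eps=1e-6):
    """Nash-Sutcliffe efficiency.  With log_transform=True it is computed on
    log-transformed series (the NSL variant quoted above); eps is an assumed
    small offset guarding against zero flows."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    if log_transform:
        obs, sim = np.log(obs + eps), np.log(sim + eps)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

# A simulation that overestimates every flow by 10% still scores near 1:
flows = np.array([1.0, 2.0, 4.0, 8.0, 4.0, 2.0])
print(round(nse(flows, 1.1 * flows, log_transform=True), 3))  # -> 0.979
```

    The log transform downweights high-flow errors, which is why NSL is a natural score when low-flow nitrate dynamics matter.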

  15. The Global Precipitation Mission

    NASA Technical Reports Server (NTRS)

    Braun, Scott; Kummerow, Christian

    2000-01-01

    The Global Precipitation Mission (GPM), expected to begin around 2006, is a follow-up to the Tropical Rainfall Measuring Mission (TRMM). Unlike TRMM, which primarily samples the tropics, GPM will sample both the tropics and mid-latitudes. The primary, or core, satellite will be a single, enhanced TRMM satellite that can quantify the 3-D spatial distributions of precipitation and its associated latent heat release. The core satellite will be complemented by a constellation of very small and inexpensive drones with passive microwave instruments that will sample the rainfall with sufficient frequency not only to be of climate interest, but also to have local, short-term impact by providing global rainfall coverage at approx. 3-h intervals. The data are expected to have a substantial impact on quantitative precipitation estimation/forecasting and on data assimilation into global and mesoscale numerical models. Based upon previous studies of rainfall data assimilation, GPM is expected to lead to significant improvements in forecasts of extratropical and tropical cyclones. For example, GPM rainfall data can provide improved initialization of frontal systems over the Pacific and Atlantic Oceans. The purpose of this talk is to provide information about GPM to the USWRP (U.S. Weather Research Program) community and to discuss impacts on quantitative precipitation estimation/forecasting and data assimilation.

  16. Tank 241-AY-101 Privatization Push Mode Core Sampling and Analysis Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEMPLETON, A.M.

    2000-01-12

    This sampling and analysis plan (SAP) identifies characterization objectives pertaining to sample collection, laboratory analytical evaluation, and reporting requirements for samples obtained from tank 241-AY-101. The purpose of this sampling event is to obtain information about the characteristics of the contents of 241-AY-101 required to satisfy Data Quality Objectives For RPP Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For High-Level Waste Feed Batch X (HLW DQO) (Nguyen 1999a), Data Quality Objectives For TWRS Privatization Phase I: Confirm Tank T Is An Appropriate Feed Source For Low-Activity Waste Feed Batch X (LAW DQO) (Nguyen 1999b), Low Activity Waste and High-Level Waste Feed Data Quality Objectives (L and H DQO) (Patello et al. 1999), and Characterization Data Needs for Development, Design, and Operation of Retrieval Equipment Developed through the Data Quality Objective Process (Equipment DQO) (Bloom 1996). Special instructions regarding support to the LAW and HLW DQOs are provided by Baldwin (1999). Push mode core samples will be obtained from risers 15G and 150 to provide sufficient material for the chemical analyses and tests required to satisfy these data quality objectives. The 222-S Laboratory will extrude core samples; composite the liquids and solids; perform chemical analyses on composite and segment samples; archive half-segment samples; and provide subsamples to the Process Chemistry Laboratory. The Process Chemistry Laboratory will prepare test plans and perform process tests to evaluate the behavior of the 241-AY-101 waste undergoing the retrieval and treatment scenarios defined in the applicable DQOs. Requirements for analyses of samples originating in the process tests will be documented in the corresponding test plans and are not within the scope of this SAP.

  17. Depositing nanoparticles on a silicon substrate using a freeze drying technique.

    PubMed

    Sigehuzi, Tomoo

    2017-08-28

    For the microscopic observation of nanoparticles, adequate sample preparation is essential. Much research has been performed on preparation methods that yield aggregate-free samples. A freeze drying technique, which requires only a -80 °C freezer and a freeze dryer, is shown to provide an on-substrate dispersion of mostly isolated nanoparticles. The particle density could be made sufficiently high for efficient observations using atomic force microscopy. Since this sandwich method is purely physical, it could be applied to deposit various nanoparticles independent of their surface chemical properties. Suspension film thickness, or the dimensionality of the suspension film, was shown to be crucial for the isolation of the particles. Silica nanoparticles were dispersed on a silicon substrate using this method and the sample properties were examined using atomic force microscopy.

  18. Rainfall, Streamflow, and Water-Quality Data During Stormwater Monitoring, Halawa Stream Drainage Basin, Oahu, Hawaii, July 1, 2000 to June 30, 2001

    USGS Publications Warehouse

    Presley, Todd K.

    2001-01-01

    The State of Hawaii Department of Transportation Stormwater Monitoring Program was implemented on January 1, 2001. The program includes the collection of rainfall, streamflow, and water-quality data at selected sites in the Halawa Stream drainage basin. Rainfall and streamflow data were collected from July 1, 2000 to June 30, 2001. Few storms during the year met criteria for antecedent dry conditions or provided enough runoff to sample. The storm of June 5, 2001 was sufficiently large to cause runoff. On June 5, 2001, grab samples were collected at five sites along North Halawa and Halawa Streams. The five samples were later analyzed for nutrients, trace metals, oil and grease, total petroleum hydrocarbons, fecal coliform, biological and chemical oxygen demands, total suspended solids, and total dissolved solids.

  19. Fast Cooling and Vitrification of Aqueous Solutions for Cryopreservation

    NASA Astrophysics Data System (ADS)

    Warkentin, Matt; Husseini, Naji; Berejnov, Viatcheslav; Thorne, Robert

    2006-03-01

    In many applications, a small volume of aqueous solution must be cooled at a rate sufficient to produce amorphous solid water. Two prominent examples include flash-freezing of protein crystals for X-ray data collection and freezing of cells (e.g., spermatozoa) for cryopreservation. The cooling rate required to vitrify pure water (~10^6 K/s) is unattainable for volumes that might contain cells or protein crystals, but the required rate can be reduced by adding cryoprotectants. We report the first measurements of the critical concentration required to produce a vitrified sample as a function of the sample's volume, the cryogen into which the sample is plunged, and the temperature of the cryogen, for a wide range of cryoprotectants. These experiments have broad practical consequences for cryopreservation, and provide insight into the physics of glass formation in aqueous systems.

  20. The Potassium-Argon Laser Experiment (KArLE): In Situ Geochronology for Planetary Robotic Missions

    NASA Technical Reports Server (NTRS)

    Cohen, Barbara

    2016-01-01

    The Potassium (K) - Argon (Ar) Laser Experiment (KArLE) will make in situ noble-gas geochronology measurements aboard planetary robotic landers and rovers. Laser-Induced Breakdown Spectroscopy (LIBS) is used to measure the K abundance in a sample and to release its noble gases; the evolved Ar is measured by mass spectrometry (MS); and relative K content is related to absolute Ar abundance by sample mass, determined by optical measurement of the ablated volume. KArLE measures a whole-rock K-Ar age to 10% or better for rocks 2 Ga or older, sufficient to resolve the absolute age of many planetary samples. The LIBS-MS approach is attractive because the analytical components have been flight proven, do not require further technical development, and provide complementary measurements as well as in situ geochronology.

  1. High efficiency direct detection of ions from resonance ionization of sputtered atoms

    DOEpatents

    Gruen, Dieter M.; Pellin, Michael J.; Young, Charles E.

    1986-01-01

    A method and apparatus are provided for trace and other quantitative analysis with high efficiency of a component in a sample, with the analysis involving the removal by ion or other bombardment of a small quantity of ion and neutral atom groups from the sample, the conversion of selected neutral atom groups to photoions by laser-initiated resonance ionization spectroscopy (RIS), the selective deflection of the photoions for separation from the original ion group emanating from the sample, and the detection of the photoions as a measure of the quantity of the component. In some embodiments, the original ion group is accelerated prior to the RIS step for separation purposes. Noise and other interference are reduced by shielding the detector from primary and secondary ions and deflecting the photoions sufficiently to avoid the primary and secondary ions.

  2. High efficiency direct detection of ions from resonance ionization of sputtered atoms

    DOEpatents

    Gruen, D.M.; Pellin, M.J.; Young, C.E.

    1985-01-16

    A method and apparatus are provided for trace and other quantitative analysis with high efficiency of a component in a sample, with the analysis involving the removal by ion or other bombardment of a small quantity of ion and neutral atom groups from the sample, the conversion of selected neutral atom groups to photoions by laser-initiated resonance ionization spectroscopy (RIS), the selective deflection of the photoions for separation from the original ion group emanating from the sample, and the detection of the photoions as a measure of the quantity of the component. In some embodiments, the original ion group is accelerated prior to the RIS step for separation purposes. Noise and other interference are reduced by shielding the detector from primary and secondary ions and deflecting the photoions sufficiently to avoid the primary and secondary ions.

  3. Registered nurses' perceptions of cultural and linguistic hospital resources.

    PubMed

    Whitman, Marilyn V; Davis, Jullet A

    2009-01-01

    As the patient population continues to diversify, the need to provide care that is culturally and linguistically appropriate is intensifying. This study describes the perceptions of registered nurses (RNs) in Alabama hospitals regarding the training and resources available for providing culturally and linguistically appropriate care. The population consists of all RNs working in Alabama hospitals. A sample of 1976 RNs was obtained using an online survey. The findings indicate that although some resources and training are currently provided to nurses, the majority of respondents still lack sufficient resources and training to provide culturally and linguistically appropriate care. The lack of uniformity in resources and training makes it difficult to ensure that all healthcare providers are receiving the same information. However, hospitals do have the flexibility to tailor training to areas that are specific to their population needs.

  4. Determinants of utilization of sufficient tetanus toxoid immunization during pregnancy: evidence from the Kenya Demographic and Health Survey, 2008-2009.

    PubMed

    Haile, Zelalem T; Chertok, Ilana R Azulay; Teweldeberhan, Asli K

    2013-06-01

    Although the effectiveness of tetanus toxoid (TT) immunization during pregnancy in preventing maternal and neonatal tetanus is well established, in many developing countries, TT immunization programs are underutilized. The objective of this study was to examine factors associated with sufficient TT immunization among postpartum women in Kenya. Population based secondary data analysis was conducted using de-identified data from the 2008-2009 Kenyan Demographic and Health Survey (KDHS) for 1,370 female participants who had a live birth during or within 12 months of the cross-sectional survey. Chi-square test and independent sample t test were conducted to assess bivariate associations and a multivariable logistic regression analysis was conducted to examine associations before and after adjustment for demographic, socioeconomic, cultural, and access to care factors. The main factors contributing to having been sufficiently immunized against tetanus were lower birth order, higher household wealth index, women's employment, making joint health-related decisions with a partner, and higher number of antenatal care visits. Implications for health care providers and other professionals involved in development of strategies and interventions aimed at improving immunization rates are discussed.

  5. General administrative rulings and decisions; amendment to the examination and investigation sample requirements; companion document to direct final rule--FDA. Proposed rule.

    PubMed

    1998-09-25

    The Food and Drug Administration (FDA) is proposing to amend its regulations regarding the collection of twice the quantity of food, drug, or cosmetic estimated to be sufficient for analysis. This action increases the dollar amount that FDA will consider to determine whether to routinely collect a reserve sample of a food, drug, or cosmetic product in addition to the quantity sufficient for analysis. Experience has demonstrated that the current dollar amount does not adequately cover the cost of most quantities sufficient for analysis plus reserve samples. This proposed rule is a companion to the direct final rule published elsewhere in this issue of the Federal Register. This action is part of FDA's continuing effort to achieve the objectives of the President's "Reinventing Government" initiative, and it is intended to reduce the burden of unnecessary regulations on food, drugs, and cosmetics without diminishing the protection of the public health.

  6. Systems analysis of BCL2 protein family interactions establishes a model to predict responses to chemotherapy.

    PubMed

    Lindner, Andreas U; Concannon, Caoimhín G; Boukes, Gerhardt J; Cannon, Mary D; Llambi, Fabien; Ryan, Deborah; Boland, Karen; Kehoe, Joan; McNamara, Deborah A; Murray, Frank; Kay, Elaine W; Hector, Suzanne; Green, Douglas R; Huber, Heinrich J; Prehn, Jochen H M

    2013-01-15

    Apoptotic desensitization is a hallmark of cancer cells, but present knowledge of molecular systems controlling apoptosis has yet to provide significant prognostic insights. Here, we report findings from a systems study of the intrinsic pathway of apoptosis by BCL2 family proteins and clinical translation of its findings into a model with applications in colorectal cancer (CRC). By determining absolute protein quantifications in CRC cells and patient tumor samples, we found that BAK and BAX were expressed more highly than their antiapoptotic inhibitors. This counterintuitive finding suggested that sole inhibition of effector BAX and BAK could not be sufficient for systems stability in nonstressed cells. Assuming a model of direct effector activation by BH3-only proteins, we calculated that the amount of stress-induced BH3-only proteins required to activate mitochondrial apoptosis could predict individual death responses of CRC cells to 5-fluorouracil/oxaliplatin. Applying this model predictor to protein profiles in tumor and matched normal tissue samples from 26 patients with CRCs, we found that differences in protein quantities were sufficient to model the increased tumor sensitivity to chemotherapy compared with normal tissue. In addition, these differences were sufficient to differentiate clinical responders from nonresponders with high confidence. Applications of our model, termed DR_MOMP, were used to assess the impact of apoptosis-sensitizing drugs in lowering the necessary dose of state-of-the-art chemotherapy in individual patients. Together, our findings offer a ready clinical tool with the potential to tailor chemotherapy to individual patients.

  7. On the Ability of Space- Based Passive and Active Remote Sensing Observations of CO2 to Detect Flux Perturbations to the Carbon Cycle

    NASA Technical Reports Server (NTRS)

    Crowell, Sean M. R.; Kawa, S. Randolph; Browell, Edward V.; Hammerling, Dorit M.; Moore, Berrien; Schaefer, Kevin; Doney, Scott C.

    2018-01-01

    Space-borne observations of CO2 are vital to gaining understanding of the carbon cycle in regions of the world that are difficult to measure directly, such as the tropical terrestrial biosphere, the high northern and southern latitudes, and in developing nations such as China. Measurements from passive instruments such as GOSAT (Greenhouse Gases Observing Satellite) and OCO-2 (Orbiting Carbon Observatory 2), however, are constrained by solar zenith angle limitations as well as sensitivity to the presence of clouds and aerosols. Active measurements such as those in development for the Active Sensing of CO2 Emissions over Nights, Days and Seasons (ASCENDS) mission show strong potential for making measurements in the high-latitude winter and in cloudy regions. In this work we examine the enhanced flux constraint provided by the improved coverage from an active measurement such as ASCENDS. The simulation studies presented here show that with sufficient precision, ASCENDS will detect permafrost thaw and fossil fuel emissions shifts at annual and seasonal time scales, even in the presence of transport errors, representativeness errors, and biogenic flux errors. While OCO-2 can detect some of these perturbations at the annual scale, the seasonal sampling provided by ASCENDS provides the stronger constraint. Plain Language Summary: Active and passive remote sensors show the potential to provide unprecedented information on the carbon cycle. With the all-season sampling, active remote sensors are more capable of constraining high-latitude emissions. The reduced sensitivity to cloud and aerosol also makes active sensors more capable of providing information in cloudy and polluted scenes with sufficient accuracy. These experiments account for errors that are fundamental to the top-down approach for constraining emissions, and even including these sources of error, we show that satellite remote sensors are critical for understanding the carbon cycle.

  8. Advances in the measurement of sulfur isotopes by multi-collector ICP-MS (MC-ICP-MS)

    NASA Astrophysics Data System (ADS)

    Ridley, W. I.; Wilson, S. A.; Anthony, M. W.

    2006-12-01

    The demonstrated capability to measure 34S/32S by MC-ICP-MS with a precision (2σ) of ~0.2 per mil has many potential applications in geochemistry. However, a number of obstacles limit this potential. First, to achieve the precision indicated above requires sufficient mass resolution to separate isobaric interferences of 16O2 and 17O2 on 32S and 34S, respectively. These requirements for high resolution mean overall instrument sensitivity is reduced. Second, current methods preclude analysis of samples with complex matrices, a common characteristic of sulfur-bearing geologic materials. Here, we describe and discuss a method that provides both efficient removal of matrix constituents and pre-concentration of S, thus overcoming these obstacles. The method involves the separation of sulfur from matrix constituents by high pressure (1000 psi) ion chromatography (HPIC), followed by isotope measurement using MC-ICP-MS. This combination allows for analysis of liquid samples with a wide range of S concentrations. A powerful advantage of this technique is the efficient separation of many sulfur species from matrix cations and anions (for instance in a seawater or acid mine drainage matrix), as well as the separation of sulfur species, e.g., sulfate, sulfite, thiosulfate, thiocyanate, from each other for isotope analysis. The automated HPIC system uses a carbonate-bicarbonate eluent with eluent suppression, and has sufficient baseline separation to collect the various sulfur species as pure fractions. The individual fractions are collected over a specific time interval based upon a pre-determined elution profile and peak retention times. The addition of a second ion exchange column into the system allows pre-concentration of sulfur species by 2-3 orders of magnitude for samples that otherwise would have sulfur concentrations too low to provide precise isotopic ratios. The S isotope ratios are measured by MC-ICP-MS using a desolvating sample introduction system and a standard-sample bracketing method employing standards that are well characterized for sulfur isotope composition using stable isotope gas mass spectrometry. Data are collected in time-resolved mode, which reduces analytical time and allows for flexibility in data integration. Preliminary data indicate that sulfur species do not fractionate during the column chemistry.

  9. Optimization of HPV DNA detection in urine by improving collection, storage, and extraction.

    PubMed

    Vorsters, A; Van den Bergh, J; Micalessi, I; Biesmans, S; Bogers, J; Hens, A; De Coster, I; Ieven, M; Van Damme, P

    2014-11-01

    The benefits of using urine for the detection of human papillomavirus (HPV) DNA have been evaluated in disease surveillance, epidemiological studies, and screening for cervical cancers in specific subgroups. HPV DNA testing in urine is being considered for important purposes, notably the monitoring of HPV vaccination in adolescent girls and young women who do not wish to have a vaginal examination. The need to optimize and standardize sampling, storage, and processing has been reported. In this paper, we examined the impact of a DNA-conservation buffer, the extraction method, and urine sampling on the detection of HPV DNA and human DNA in urine provided by 44 women with a cytologically normal but HPV DNA-positive cervical sample. Ten women provided first-void and midstream urine samples. DNA analysis was performed using real-time PCR to allow quantification of HPV and human DNA. The results showed that an optimized method for HPV DNA detection in urine should (a) prevent DNA degradation during extraction and storage, (b) recover cell-free HPV DNA in addition to cell-associated DNA, (c) process a sufficient volume of urine, and (d) use a first-void sample. In addition, we found that detectable human DNA in urine may not be a good internal control for sample validity. HPV prevalence data that are based on urine samples collected, stored, and/or processed under suboptimal conditions may underestimate infection rates.

  10. Static versus dynamic sampling for data mining

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    John, G.H.; Langley, P.

    1996-12-31

    As data warehouses grow to the point where one hundred gigabytes is considered small, the computational efficiency of data-mining algorithms on large databases becomes increasingly important. Using a sample from the database can speed up the data-mining process, but this is only acceptable if it does not reduce the quality of the mined knowledge. To this end, we introduce the "Probably Close Enough" criterion to describe the desired properties of a sample. Sampling usually refers to the use of static statistical tests to decide whether a sample is sufficiently similar to the large database, in the absence of any knowledge of the tools the data miner intends to use. We discuss dynamic sampling methods, which take into account the mining tool being used and can thus give better samples. We describe dynamic schemes that observe a mining tool's performance on training samples of increasing size and use these results to determine when a sample is sufficiently large. We evaluate these sampling methods on data from the UCI repository and conclude that dynamic sampling is preferable.
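The dynamic scheme described above — grow the sample, watch the mining tool's performance, and stop once further growth yields only marginal gains — can be sketched as follows. The dataset and the nearest-centroid "mining tool" are toy stand-ins, not the paper's setup:

```python
import random

def make_dataset(n, seed=0):
    # Synthetic stand-in "database": binary label is 1 iff x + y > 1.0.
    rng = random.Random(seed)
    rows = []
    for _ in range(n):
        x, y = rng.random(), rng.random()
        rows.append(((x, y), int(x + y > 1.0)))
    return rows

def train_and_score(sample, holdout):
    # Toy "mining tool": a nearest-centroid classifier.
    sums = {0: [0.0, 0.0, 0], 1: [0.0, 0.0, 0]}
    for (x, y), label in sample:
        sums[label][0] += x
        sums[label][1] += y
        sums[label][2] += 1
    centroids = {lab: (sx / c, sy / c)
                 for lab, (sx, sy, c) in sums.items() if c > 0}

    def predict(p):
        return min(centroids, key=lambda lab: (p[0] - centroids[lab][0]) ** 2
                                            + (p[1] - centroids[lab][1]) ** 2)

    return sum(predict(p) == lab for p, lab in holdout) / len(holdout)

def dynamic_sample_size(database, holdout, start=50, epsilon=0.005):
    # Grow the training sample geometrically and stop once doubling it
    # no longer improves holdout accuracy by at least epsilon -- i.e.
    # the sample is "probably close enough" for this mining tool.
    n = start
    prev = train_and_score(database[:n], holdout)
    while 2 * n <= len(database):
        n *= 2
        acc = train_and_score(database[:n], holdout)
        if acc - prev < epsilon:
            return n, acc
        prev = acc
    return n, prev

database = make_dataset(20000, seed=1)
holdout = make_dataset(2000, seed=2)
n, acc = dynamic_sample_size(database, holdout)
print(f"stopped at sample size {n} with holdout accuracy {acc:.3f}")
```

The stopping rule is deliberately tool-specific: a different learner would trace a different learning curve and could justify a different sample size on the same database, which is the contrast with static, tool-agnostic tests.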

  11. Acoustic Enrichment of Extracellular Vesicles from Biological Fluids.

    PubMed

    Ku, Anson; Lim, Hooi Ching; Evander, Mikael; Lilja, Hans; Laurell, Thomas; Scheding, Stefan; Ceder, Yvonne

    2018-06-11

    Extracellular vesicles (EVs) have emerged as a rich source of biomarkers providing diagnostic and prognostic information in diseases such as cancer. Large-scale investigations into the contents of EVs in clinical cohorts are warranted, but a major obstacle is the lack of a rapid, reproducible, efficient, and low-cost methodology to enrich EVs. Here, we demonstrate the applicability of an automated acoustic-based technique to enrich EVs, termed acoustic trapping. Using this technology, we have successfully enriched EVs from cell culture conditioned media and urine and blood plasma from healthy volunteers. The acoustically trapped samples contained EVs ranging from exosomes to microvesicles in size and contained detectable levels of intravesicular microRNAs. Importantly, this method showed high reproducibility and yielded sufficient quantities of vesicles for downstream analysis. The enrichment could be obtained from a sample volume of 300 μL or less, an equivalent to 30 min of enrichment time, depending on the sensitivity of downstream analysis. Taken together, acoustic trapping provides a rapid, automated, low-volume compatible, and robust method to enrich EVs from biofluids. Thus, it may serve as a novel tool for EV enrichment from large number of samples in a clinical setting with minimum sample preparation.

  12. Dose coverage calculation using a statistical shape model—applied to cervical cancer radiotherapy

    NASA Astrophysics Data System (ADS)

    Tilly, David; van de Schoot, Agustinus J. A. J.; Grusell, Erik; Bel, Arjan; Ahnesjö, Anders

    2017-05-01

    A comprehensive methodology for treatment simulation and evaluation of dose coverage probabilities is presented in which a population-based statistical shape model (SSM) provides samples of fraction-specific patient geometry deformations. The learning data consist of vector fields from deformable image registration of repeated imaging, giving intra-patient deformations that are mapped to an average patient serving as a common frame of reference. The SSM is created by extracting the most dominating eigenmodes through principal component analysis of the deformations from all patients. Sampling a deformation is thus reduced to sampling weights for enough of the most dominating eigenmodes to describe the deformations. For the cervical cancer patient datasets in this work, we found seven eigenmodes to be sufficient to capture 90% of the variance in the deformations, and only three eigenmodes were needed for stability in the simulated dose coverage probabilities. The normality assumption of the eigenmode weights was tested and found relevant for the 20 most dominating eigenmodes except for the first. Individualization of the SSM is demonstrated to be improved using two deformation samples from a new patient. The probabilistic evaluation provided additional information about the trade-offs compared to the conventional single dataset treatment planning.
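The SSM construction and sampling described above can be sketched with synthetic data. The array shapes, variance spectrum, and 90% threshold below are assumed stand-ins, not the paper's cervical-cancer data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in learning data: 200 registered deformation fields, each flattened
# to a 300-component vector, with a decaying variance spectrum so a few
# eigenmodes dominate (purely illustrative).
deformations = rng.normal(size=(200, 300)) * np.exp(-np.arange(300) / 5.0)

# PCA via SVD on the mean-centred deformations.
mean = deformations.mean(axis=0)
X = deformations - mean
U, S, Vt = np.linalg.svd(X, full_matrices=False)
var = S**2 / (X.shape[0] - 1)           # variance captured per eigenmode
frac = np.cumsum(var) / var.sum()

# Keep the fewest eigenmodes explaining 90% of the variance (seven in the
# paper's data; dataset-dependent here).
k = int(np.searchsorted(frac, 0.90)) + 1
modes = Vt[:k]                           # dominant eigenmodes (k x 300)
sigma = np.sqrt(var[:k])                 # per-mode standard deviations

def sample_deformation():
    # Draw Gaussian eigenmode weights and reconstruct a fraction-specific
    # deformation field, per the normality assumption on the weights.
    w = rng.normal(scale=sigma)
    return mean + w @ modes

d = sample_deformation()
print(k, d.shape)
```

Each simulated treatment fraction then just needs k weight draws rather than a full registered deformation field, which is what makes large-scale dose coverage simulation tractable.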

  13. Adaptable Constrained Genetic Programming: Extensions and Applications

    NASA Technical Reports Server (NTRS)

    Janikow, Cezary Z.

    2005-01-01

    An evolutionary algorithm applies evolution-based principles to problem solving. To solve a problem, the user defines the space of potential solutions, the representation space. Sample solutions are encoded in a chromosome-like structure. The algorithm maintains a population of such samples, which undergo simulated evolution by means of mutation, crossover, and survival of the fittest principles. Genetic Programming (GP) uses tree-like chromosomes, providing very rich representation suitable for many problems of interest. GP has been successfully applied to a number of practical problems such as learning Boolean functions and designing hardware circuits. To apply GP to a problem, the user needs to define the actual representation space, by defining the atomic functions and terminals labeling the actual trees. The sufficiency principle requires that the label set be sufficient to build the desired solution trees. The closure principle allows the labels to mix in any arity-consistent manner. To satisfy both principles, the user is often forced to provide a large label set, with ad hoc interpretations or penalties to deal with undesired local contexts. This unfortunately enlarges the actual representation space, and thus usually slows down the search. In the past few years, three different methodologies have been proposed to allow the user to alleviate the closure principle by providing means to define, and to process, constraints on mixing the labels in the trees. Last summer we proposed a new methodology to further alleviate the problem by discovering local heuristics for building quality solution trees. A pilot system was implemented last summer and tested throughout the year. This summer we have implemented a new revision, and produced a User's Manual so that the pilot system can be made available to other practitioners and researchers. We have also designed, and partly implemented, a larger system capable of dealing with much more powerful heuristics.
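The sufficiency and closure principles can be illustrated with a minimal GP-style sketch. The label set, fitness function, and random search below are hypothetical illustrations, not the ACGP system described above:

```python
import random

# Label set for Boolean-function GP. Sufficiency: the target (XOR) must be
# expressible with these labels. Closure: every function takes and returns
# 0/1, so labels can mix in any arity-consistent way.
FUNCS = {
    "and": (2, lambda a, b: a & b),
    "or":  (2, lambda a, b: a | b),
    "not": (1, lambda a: 1 - a),
}
TERMS = ["x", "y"]

def random_tree(rng, depth=3):
    # Grow a random tree; terminals end a branch.
    if depth == 0 or rng.random() < 0.3:
        return rng.choice(TERMS)
    name = rng.choice(list(FUNCS))
    arity, _ = FUNCS[name]
    return (name, *[random_tree(rng, depth - 1) for _ in range(arity)])

def evaluate(tree, env):
    if isinstance(tree, str):
        return env[tree]
    name, *kids = tree
    _, fn = FUNCS[name]
    return fn(*(evaluate(k, env) for k in kids))

def fitness(tree):
    # Fraction of XOR truth-table rows the tree gets right.
    cases = [(x, y, x ^ y) for x in (0, 1) for y in (0, 1)]
    return sum(evaluate(tree, {"x": x, "y": y}) == out
               for x, y, out in cases) / len(cases)

rng = random.Random(42)
population = [random_tree(rng) for _ in range(200)]
best = max(population, key=fitness)
print(f"best fitness in random population: {fitness(best)}")
```

Dropping "not" from FUNCS would violate sufficiency for this target, since XOR is not a monotone function of its inputs; enlarging FUNCS with rarely useful labels keeps closure but inflates the search space, which is the trade-off the constrained-GP work addresses.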

  14. The Performance of a PN Spread Spectrum Receiver Preceded by an Adaptive Interference Suppression Filter.

    DTIC Science & Technology

    1982-12-01

    Notation (from the report's symbol list): Sequence; dj — Estimate of the Desired Signal; DEL — Sampling Time Interval; DS — Direct Sequence; c — Sufficient Statistic; E/T — Signal Power; Erfc — Complementary Error... Namely, a white Gaussian noise (WGN) generator was added. Also, a statistical subroutine was added in order to assess performance improvement at the... reference code and then passed through a correlation detector whose output is the sufficient statistic. Using a threshold device and the sufficient
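The correlation detector named in the excerpt above can be sketched in a few lines. The code length, amplitudes, and noise level are assumed illustrative parameters, not values from the report:

```python
import random

rng = random.Random(7)
N = 128                                          # chips per data bit (assumed)
code = [rng.choice((-1, 1)) for _ in range(N)]   # PN reference code

def sufficient_statistic(bit, amplitude=1.0, noise_sigma=1.0):
    # Received chips: the data bit spread by the PN code plus WGN,
    # despread by correlating against the local reference code.
    rx = [bit * c * amplitude + rng.gauss(0.0, noise_sigma) for c in code]
    return sum(r * c for r, c in zip(rx, code))  # correlator output

def detect(bit):
    # Threshold device on the correlator output.
    return 1 if sufficient_statistic(bit) > 0.0 else -1

bits = [rng.choice((-1, 1)) for _ in range(200)]
errors = sum(detect(b) != b for b in bits)
print(f"bit errors over 200 bits: {errors}")
```

The correlator concentrates the bit energy (mean ±N·amplitude) against noise of standard deviation √N·σ, so at these settings the processing gain makes errors vanishingly rare; an adaptive interference suppression filter, as in the report's title, would precede this stage when narrowband interference is present.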

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alstone, Peter; Jacobson, Arne; Mills, Evan

    Efforts to promote rechargeable electric lighting as a replacement for fuel-based light sources in developing countries are typically predicated on the notion that lighting service levels can be maintained or improved while reducing the costs and environmental impacts of existing practices. However, the extremely low incomes of those who depend on fuel-based lighting create a need to balance the hypothetically possible or desirable levels of light with those that are sufficient and affordable. In a pilot study of four night vendors in Kenya, we document a field technique we developed to simultaneously measure the effectiveness of lighting service provided by a lighting system and conduct a survey of lighting service demand by end-users. We took gridded illuminance measurements across each vendor's working and selling area, with users indicating the sufficiency of light at each point. User light sources included a mix of kerosene-fueled hurricane lanterns, pressure lamps, and LED lanterns. We observed illuminance levels ranging from just above zero to 150 lux. The LED systems markedly improved the lighting service levels over those provided by kerosene-fueled hurricane lanterns. Users reported that the minimum acceptable threshold was about 2 lux. The results also indicated that the LED lamps in use by the subjects did not always provide sufficient illumination over the desired retail areas. Our sample size is much too small, however, to reach any conclusions about requirements in the broader population. Given the small number of subjects and very specific type of user, our results should be regarded as indicative rather than conclusive. We recommend replicating the method at larger scales and across a variety of user types and contexts. Policymakers should revisit the subject of recommended illuminance levels regularly as LED technology advances and the price/service balance point evolves.

  16. Simultaneous determination of chloroquine and its three metabolites in human plasma, whole blood and urine by ion-pair high-performance liquid chromatography.

    PubMed

    Houzé, P; de Reynies, A; Baud, F J; Benatar, M F; Pays, M

    1992-02-14

    A method was developed for the separation and measurement of chloroquine and three metabolites (desethylchloroquine, bisdesethylchloroquine and 4-amino-7-chloroquinoline) in biological samples by ion-pair high-performance liquid chromatography with UV detection. The method uses 2,3-diaminonaphthalene as an internal standard and provides a limit of detection between 1 and 2 ng/ml for chloroquine and its metabolites. The assay was linear in the range 12.5-250 ng/ml and the analytical recovery and reproducibility were sufficient. The assay was applied to the analysis of biological samples from a patient undergoing chloroquine chemoprophylaxis and a patient who had ingested chloroquine in a suicide attempt.

  17. Profiling of polar metabolites in biological extracts using diamond hydride-based aqueous normal phase chromatography.

    PubMed

    Callahan, Damien L; De Souza, David; Bacic, Antony; Roessner, Ute

    2009-07-01

    Highly polar metabolites, such as sugars and most amino acids, are not retained by conventional RP LC columns. Without sufficient retention, low-concentration compounds are not detected due to ion suppression and structural isomers are not resolved. In contrast, hydrophilic interaction chromatography (HILIC) and aqueous normal phase chromatography (ANP) retain compounds based on their hydrophilicity and therefore provide a means of separating highly polar compounds. Here, an ANP method based on the diamond hydride stationary phase is presented for profiling biological small molecules by LC. A rapid separation system based upon a fast gradient that delivers reproducible chromatography is presented. Approximately 1000 compounds were reproducibly detected in human urine samples and clear differences between these samples were identified. This chromatography was also applied to xylem fluid from soybean (Glycine max) plants, in which 400 compounds were detected. This method greatly increases the metabolite coverage over RP-only metabolite profiling in biological samples. We show that both forms of chromatography are necessary for untargeted comprehensive metabolite profiling and that the diamond hydride stationary phase provides a good option for polar metabolite analysis.

  18. Hemolivia and hepatozoon: haemogregarines with tangled evolutionary relationships.

    PubMed

    Kvičerová, Jana; Hypša, Václav; Dvořáková, Nela; Mikulíček, Peter; Jandzik, David; Gardner, Michael George; Javanbakht, Hossein; Tiar, Ghoulem; Siroký, Pavel

    2014-09-01

    The generic name Hemolivia has been used for haemogregarines characterized by morphological and biological features. The few molecular studies, focused on other haemogregarine genera but involving Hemolivia samples, indicated its close relationship to the genus Hepatozoon. Here we analyze molecular data for Hemolivia from a broad geographic area and host spectrum and provide detailed morphological documentation of the included samples. Based on molecular analyses in the context of other haemogregarines, we demonstrate that several sequences deposited in GenBank from isolates described as Hepatozoon belong to the Hemolivia cluster. This illustrates the overall difficulty of recognizing Hemolivia and Hepatozoon without sufficient morphological and molecular information. The close proximity of both genera is also reflected in uncertainty about their precise phylogeny when using 18S rDNA. They cluster with almost identical likelihood either as two sister taxa or as monophyletic Hemolivia within paraphyletic Hepatozoon. However, regardless of these difficulties, the results presented here provide a reliable background for the unequivocal placement of new samples into the Hemolivia/Hepatozoon complex. Copyright © 2014 Elsevier GmbH. All rights reserved.

  19. Working toward Self-Sufficiency.

    ERIC Educational Resources Information Center

    Caplan, Nathan

    1985-01-01

    Upon arrival in the United States, the Southeast Asian "Boat People" faced a multitude of problems that would seem to have hindered their achieving economic self-sufficiency. Nonetheless, by the time of a 1982 research study which interviewed nearly 1,400 refugee households, 25 percent of all the households in the sample had achieved…

  20. 7 CFR 58.244 - Number of samples.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 3 2013-01-01 2013-01-01 false Number of samples. 58.244 Section 58.244 Agriculture... Procedures § 58.244 Number of samples. As many samples shall be taken from each dryer production lot as is necessary to assure proper composition and quality control. A sufficient number of representative samples...

  1. 7 CFR 58.244 - Number of samples.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 3 2010-01-01 2010-01-01 false Number of samples. 58.244 Section 58.244 Agriculture... Procedures § 58.244 Number of samples. As many samples shall be taken from each dryer production lot as is necessary to assure proper composition and quality control. A sufficient number of representative samples...

  2. Martian Chemical and Isotopic Reference Standards in Earth-based Laboratories — An Invitation for Geochemical, Astrobiological, and Engineering Dialog on Considering a Weathered Chondrite for Mars Sample Return.

    NASA Astrophysics Data System (ADS)

    Ashley, J. W.; Tait, A. W.; Velbel, M. A.; Boston, P. J.; Carrier, B. L.; Cohen, B. A.; Schröder, C.; Bland, P.

    2017-12-01

    Exogenic rocks (meteorites) found on Mars 1) have unweathered counterparts on Earth; 2) weather differently than indigenous rocks; and 3) may be ideal habitats for putative microorganisms and subsequent biosignature preservation. These attributes show the potential of meteorites for addressing hypothesis-driven science. They raise the question of whether chondritic meteorites, of sufficient weathering intensity, might be considered as candidates for sample return in a potential future mission. Pursuant to this discussion are the following questions. A) Is there anything to be learned from the laboratory study of a martian chondrite that cannot be learned from indigenous materials; and if so, B) is the science value high enough to justify recovery? If both A and B answer affirmatively, then C) what are the engineering constraints for sample collection for Mars 2020 and potential follow-on missions; and finally D) what is the likelihood of finding a favorable sample? Observations relevant to these questions include: i) Since 2005, 24 candidate and confirmed meteorites have been identified on Mars at three rover landing sites, demonstrating their ubiquity and setting expectations for future finds. All have been heavily altered by a variety of physical and chemical processes. While the majority of these are irons (not suitable for recovery), several are weathered stony meteorites. ii) Exogenic reference materials provide the only chemical/isotope standards on Mars, permitting quantification of alteration rates if residence ages can be attained; and possibly enabling the removal of Late Amazonian weathering overprints from other returned samples. iii) Recent studies have established the habitability of chondritic meteorites with terrestrial microorganisms, recommending their consideration when exploring astrobiological questions. 
High reactivity, organic content, and permeability show stony meteorites to be more attractive for colonization and subsequent biosignature preservation than Earth rocks. iv) Compressive strengths of most ordinary chondrites are within the range of rocks being tested for the Mars 2020 drill bits, provided that sufficient size, stability, and flatness of a target can be achieved. Alternatively, the regolith collection bit could be employed for unconsolidated material.

  3. Injection current minimization of InAs/InGaAs quantum dot laser by optimization of its active region and reflectivity of laser cavity edges

    NASA Astrophysics Data System (ADS)

    Korenev, V. V.; Savelyev, A. V.; Zhukov, A. E.; Maximov, M. V.

    2015-11-01

    Ways to optimize the key parameters of the active region and the edge reflectivity of an edge-emitting semiconductor quantum dot laser are provided. It is shown that, with an optimal cavity length and sufficiently large dispersion, a lasing spectrum of a given width can be obtained at an injection current up to an order of magnitude lower than in a non-optimized sample. The influence of internal loss and edge reflection is also studied in detail.

  4. Ultrafast electron diffraction with megahertz MeV electron pulses from a superconducting radio-frequency photoinjector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Feng, L. W.; Lin, L.; Huang, S. L.

    We report ultrafast relativistic electron diffraction operating at a megahertz repetition rate, where the electron beam is produced in a superconducting radio-frequency (rf) photoinjector. We show that the beam quality is sufficiently high to provide clear diffraction patterns from gold and aluminium samples. With the number of electrons several orders of magnitude higher than that from a normal-conducting photocathode rf gun, such high-repetition-rate ultrafast MeV electron diffraction may open up many new opportunities in ultrafast science.

  5. Mission Advantages of NEXT: Nasa's Evolutionary Xenon Thruster

    NASA Technical Reports Server (NTRS)

    Oleson, Steven; Gefert, Leon; Benson, Scott; Patterson, Michael; Noca, Muriel; Sims, Jon

    2002-01-01

    With the demonstration of the NSTAR propulsion system on the Deep Space One mission, the range of the Discovery class of NASA missions can now be expanded. NSTAR lacks, however, sufficient performance for many of the more challenging Office of Space Science (OSS) missions. Recent studies have shown that NASA's Evolutionary Xenon Thruster (NEXT) ion propulsion system is the best choice for many exciting potential OSS missions including outer planet exploration and inner solar system sample returns. The NEXT system provides the higher power, higher specific impulse, and higher throughput required by these science missions.

  6. Level 1 environmental assessment performance evaluation. Final report jun 77-oct 78

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estes, E.D.; Smith, F.; Wagoner, D.E.

    1979-02-01

    The report gives results of a two-phased evaluation of Level 1 environmental assessment procedures. Results from Phase I, a field evaluation of the Source Assessment Sampling System (SASS), showed that the SASS train performed well within the desired factor-of-3 Level 1 accuracy limit. Three sample runs were made with two SASS trains sampling simultaneously and from approximately the same sampling point in a horizontal duct. A Method-5 train was used to estimate the 'true' particulate loading. The sampling systems were upstream of the control devices to ensure collection of sufficient material for comparison of total particulate, particle size distribution, organic classes, and trace elements. Phase II consisted of providing each of three organizations with three types of control samples to challenge the spectrum of Level 1 analytical procedures: an artificial sample in methylene chloride, an artificial sample on a flyash matrix, and a real sample composed of the combined XAD-2 resin extracts from all Phase I runs. Phase II results showed that when the Level 1 analytical procedures are carefully applied, data of acceptable accuracy are obtained. Estimates of intralaboratory and interlaboratory precision are made.

  7. Reflectance of metallic indium for solar energy applications

    NASA Technical Reports Server (NTRS)

    Bouquet, F. L.; Hasegawa, T.

    1984-01-01

    An investigation has been conducted in order to compile quantitative data on the reflective properties of metallic indium. The fabricated samples were of sufficiently high quality that differences from similar second-surface silvered mirrors were not apparent to the human eye. Three second-surface mirror samples were prepared by means of vacuum deposition techniques, yielding indium thicknesses of approximately 1000 A. Both hemispherical and specular measurements were made. It is concluded that metallic indium possesses a sufficiently high specular reflectance to be potentially useful in many solar energy applications.

  8. Stochastic stability properties of jump linear systems

    NASA Technical Reports Server (NTRS)

    Feng, Xiangbo; Loparo, Kenneth A.; Ji, Yuandong; Chizeck, Howard J.

    1992-01-01

    Jump linear systems are defined as a family of linear systems with randomly jumping parameters (usually governed by a Markov jump process) and are used to model systems subject to failures or changes in structure. The authors study stochastic stability properties in jump linear systems and the relationship among various moment and sample path stability properties. It is shown that all second moment stability properties are equivalent and are sufficient for almost sure sample path stability, and a testable necessary and sufficient condition for second moment stability is derived. The Lyapunov exponent method for the study of almost sure sample stability is discussed, and a theorem which characterizes the Lyapunov exponents of jump linear systems is presented.
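
    The testable second-moment condition itself is not given in the abstract. For the discrete-time jump linear system x_{k+1} = A_{θ_k} x_k, a standard criterion from the Markov jump linear systems literature checks the spectral radius of an augmented matrix built from the mode matrices and the transition probabilities. A minimal numpy sketch, assuming that discrete-time form (the paper treats a more general setting):

```python
import numpy as np

def ms_stable(A_list, P):
    """Second-moment (mean-square) stability test for the discrete-time
    jump linear system x_{k+1} = A_{theta_k} x_k, where theta_k is a
    Markov chain with row-stochastic transition matrix P
    (P[j, i] = Pr(theta_{k+1} = i | theta_k = j)).

    The conditional second moments Q_i(k) = E[x_k x_k^T 1{theta_k = i}]
    evolve linearly; the system is mean-square stable iff the spectral
    radius of the augmented operator below is < 1.
    """
    N = len(A_list)
    blocks = [[P[j, i] * np.kron(A_list[j], A_list[j]) for j in range(N)]
              for i in range(N)]
    cal_A = np.block(blocks)
    return bool(max(abs(np.linalg.eigvals(cal_A))) < 1.0)

# Two nilpotent modes -- each asymptotically stable on its own -- that
# destabilize each other under deterministic alternation:
A1 = np.array([[0.0, 2.0], [0.0, 0.0]])
A2 = np.array([[0.0, 0.0], [2.0, 0.0]])
P_swap = np.array([[0.0, 1.0], [1.0, 0.0]])
print(ms_stable([A1, A2], P_swap))  # False: switching excites growth
```

    This also illustrates the abstract's point that second moment stability is stronger than stability of the individual modes: each A_i here is stable alone, yet the switched system fails the test.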

  9. Improving tritium exposure reconstructions using accelerator mass spectrometry

    PubMed Central

    Hunt, J. R.; Vogel, J. S.; Knezovich, J. P.

    2010-01-01

    Direct measurement of tritium atoms by accelerator mass spectrometry (AMS) enables rapid low-activity tritium measurements from milligram-sized samples and permits greater ease of sample collection, faster throughput, and increased spatial and/or temporal resolution. Because existing methodologies for quantifying tritium have some significant limitations, the development of tritium AMS has allowed improvements in reconstructing tritium exposure concentrations from environmental measurements and provides an important additional tool in assessing the temporal and spatial distribution of chronic exposure. Tritium exposure reconstructions using AMS were previously demonstrated for a tree growing on known levels of tritiated water and for trees exposed to atmospheric releases of tritiated water vapor. In these analyses, tritium levels were measured from milligram-sized samples with sample preparation times of a few days. Hundreds of samples were analyzed within a few months of sample collection and resulted in the reconstruction of spatial and temporal exposure from tritium releases. Although the current quantification limit of tritium AMS is not adequate to determine natural environmental variations in tritium concentrations, it is expected to be sufficient for studies assessing possible health effects from chronic environmental tritium exposure. PMID:14735274

  10. Technical Basis for the Removal of Unremediated Nitrate Salt Sampling (UNS) to Support LANL Treatment Studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Funk, David John

    2016-05-05

    The sampling of unremediated nitrate salts (UNS) was originally proposed by the U.S. Department of Energy (DOE) and Los Alamos National Security, LLC (LANS) (collectively, the Permittees) as a means to ensure adequate understanding and characterization of the problematic waste stream created when the Permittees remediated these nitrate salt-bearing wastes with an organic absorbent. The proposal to sample the UNS was driven by a lack of understanding of the radioactive contamination release that occurred within the underground repository at the Waste Isolation Pilot Plant (WIPP) on February 14, 2014, as well as by recommendations made by a Peer Review Team. As discussed, the Permittees believe that current knowledge and understanding of the waste has sufficiently matured that this additional sampling is not required. Perhaps more importantly, the risk of both chemical and radiological exposure to the workers sampling the UNS drum material is unwarranted. This memo provides the technical justification and rationale for excluding the UNS sampling from the treatment studies.

  11. Evaluating information content of SNPs for sample-tagging in re-sequencing projects.

    PubMed

    Hu, Hao; Liu, Xiang; Jin, Wenfei; Hilger Ropers, H; Wienker, Thomas F

    2015-05-15

    Sample-tagging is designed for identification of accidental sample mix-ups, a major issue in re-sequencing studies. In this work, we develop a model to measure the information content of SNPs, so that we can optimize a panel of SNPs that approaches the maximal information for discrimination. The analysis shows that as few as 60 optimized SNPs can differentiate the individuals in a population as large as the present world population, and only 30 optimized SNPs are in practice sufficient for labeling up to 100 thousand individuals. In simulated populations of 100 thousand individuals, the average Hamming distances generated by the optimized set of 30 SNPs are larger than 18, and the duality frequency is lower than 1 in 10 thousand. This strategy of sample discrimination proves robust for large sample sizes and across different datasets. The optimized sets of SNPs are designed for Whole Exome Sequencing, and a program is provided for SNP selection, allowing for customized SNP numbers and genes of interest. A sample-tagging plan based on this framework will improve re-sequencing projects in terms of reliability and cost-effectiveness.
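
    As a rough illustration of why a few dozen informative SNPs suffice: for unlinked SNPs at minor-allele frequency 0.5, two unrelated individuals differ at each SNP genotype with probability 1 − (0.25² + 0.5² + 0.25²) = 0.625, so 30 SNPs give an expected Hamming distance near 18.75 and a per-pair collision probability around 0.375³⁰ ≈ 2 × 10⁻¹³. A simulation sketch (panel composition and allele frequencies are hypothetical, not the paper's optimized set):

```python
import itertools
import random

def genotype_panel(n_individuals, n_snps, maf=0.5, seed=7):
    """Simulate unlinked biallelic SNP genotypes (minor-allele counts
    0/1/2) under Hardy-Weinberg proportions. Hypothetical stand-in for
    an optimized tagging panel, which would select maximally
    informative (high-MAF, independent) SNPs."""
    rng = random.Random(seed)
    p_hom_ref = (1 - maf) ** 2
    p_het = 2 * maf * (1 - maf)

    def draw():
        r = rng.random()
        if r < p_hom_ref:
            return 0
        return 1 if r < p_hom_ref + p_het else 2

    return [[draw() for _ in range(n_snps)]
            for _ in range(n_individuals)]

def hamming(a, b):
    """Number of SNPs at which two genotype 'barcodes' disagree."""
    return sum(x != y for x, y in zip(a, b))

panel = genotype_panel(300, 30)
dists = [hamming(a, b) for a, b in itertools.combinations(panel, 2)]
collisions = sum(d == 0 for d in dists)  # identical barcodes = mix-up risk
mean_dist = sum(dists) / len(dists)      # expected ~18.75 for MAF 0.5
```

    In this simulation every pair of individuals is separated by a nonzero Hamming distance, which is the property that lets a small panel flag accidental sample swaps.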

  12. Preanalytical Errors in Hematology Laboratory- an Avoidable Incompetence.

    PubMed

    Kaur, Harsimran; Narang, Vikram; Selhi, Pavneet Kaur; Sood, Neena; Singh, Aminder

    2016-01-01

    Quality assurance in the hematology laboratory is a must to assure laboratory users of reliable test results with a high degree of precision and accuracy. Even after many advances in hematology laboratory practice, preanalytical errors remain a challenge for practicing pathologists. This study was undertaken with the objective of evaluating the types and frequency of preanalytical errors in the hematology laboratory of our center. All samples received in the Hematology Laboratory of Dayanand Medical College and Hospital, Ludhiana, India over a period of one year (July 2013-July 2014) were included in the study, and preanalytical variables such as clotted samples, insufficient quantity, wrong sample, missing label, and wrong label were studied. Of 471,006 samples received in the laboratory, preanalytical errors in the above categories were found in 1802 samples. The most common error was clotted samples (1332 samples, 0.28% of the total) followed by insufficient quantity (328 samples, 0.06%), wrong sample (96 samples, 0.02%), missing label (24 samples, 0.005%), and wrong label (22 samples, 0.005%). Preanalytical errors are frequent in laboratories and can be corrected by regular analysis of the variables involved. Rectification can be achieved by regular education of the staff.

  13. Sensitivity enhancement and contrasting information provided by free radicals in oriented-sample NMR of bicelle-reconstituted membrane proteins.

    PubMed

    Tesch, Deanna M; Nevzorov, Alexander A

    2014-02-01

    Elucidating structure and topology of membrane proteins (MPs) is essential for unveiling functionality of these important biological constituents. Oriented-sample solid-state NMR (OS-NMR) is capable of providing such information on MPs under nearly physiological conditions. However, two dimensional OS-NMR experiments can take several days to complete due to long longitudinal relaxation times combined with the large number of scans to achieve sufficient signal sensitivity in biological samples. Here, free radicals 5-DOXYL stearic acid, TEMPOL, and CAT-1 were added to uniformly (15)N-labeled Pf1 coat protein reconstituted in DMPC/DHPC bicelles, and their effect on the longitudinal relaxation times (T1Z) was investigated. The dramatically shortened T1Z's allowed for the signal gain per unit time to be used for either: (i) up to a threefold reduction of the total experimental time at 99% magnetization recovery or (ii) obtaining up to 74% signal enhancement between the control and radical samples during constant experimental time at "optimal" relaxation delays. In addition, through OS-NMR and high-field EPR studies, free radicals were able to provide positional constraints in the bicelle system, which provide a description of the location of each residue in Pf1 coat protein within the bicellar membranes. This information can be useful in the determination of oligomerization states and immersion depths of larger membrane proteins. Copyright © 2013 Elsevier Inc. All rights reserved.

  14. The 2015-2016 SEPMAP Program at NASA JSC: Science, Engineering, and Program Management Training

    NASA Technical Reports Server (NTRS)

    Graham, L.; Archer, D.; Bakalyar, J.; Berger, E.; Blome, E.; Brown, R.; Cox, S.; Curiel, P.; Eid, R.; Eppler, D.; hide

    2017-01-01

    The Systems Engineering Project Management Advancement Program (SEPMAP) at NASA Johnson Space Center (JSC) is an employee development program designed to provide graduate-level training in project management and systems engineering. The program includes an applied learning project with engineering and integrated science goals and requirements. The teams were presented with a task: collect a representative sample set from a field site using a hexacopter platform, as if performing a scientific reconnaissance to assess whether the site is of sufficient scientific interest to justify exploration by astronauts. Four teams worked through the eighteen-month course to design customized sampling payloads integrated with the hexacopter, and then operated the aircraft to meet sampling requirements of number (= 5) and mass (= 5g each). The "Mars Yard" at JSC was utilized for this purpose. This project activity closely parallels NASA plans for the future exploration of Mars, where remote sites will be reconnoitered ahead of crewed exploration.

  15. Prediction-based sampled-data H∞ controller design for attitude stabilisation of a rigid spacecraft with disturbances

    NASA Astrophysics Data System (ADS)

    Zhu, Baolong; Zhang, Zhiping; Zhou, Ding; Ma, Jie; Li, Shunli

    2017-08-01

    This paper investigates the H∞ control problem of the attitude stabilisation of a rigid spacecraft with external disturbances using prediction-based sampled-data control strategy. Aiming to achieve a 'virtual' closed-loop system, a type of parameterised sampled-data controller is designed by introducing a prediction mechanism. The resultant closed-loop system is equivalent to a hybrid system featured by a continuous-time and an impulsive differential system. By using a time-varying Lyapunov functional, a generalised bounded real lemma (GBRL) is first established for a kind of impulsive differential system. Based on this GBRL and Lyapunov functional approach, a sufficient condition is derived to guarantee the closed-loop system to be asymptotically stable and to achieve a prescribed H∞ performance. In addition, the controller parameter tuning is cast into a convex optimisation problem. Simulation and comparative results are provided to illustrate the effectiveness of the developed control scheme.

  16. A microsampling method for genotyping coral symbionts

    NASA Astrophysics Data System (ADS)

    Kemp, D. W.; Fitt, W. K.; Schmidt, G. W.

    2008-06-01

    Genotypic characterization of Symbiodinium symbionts in hard corals has routinely involved coring, or the removal of branches or a piece of the coral colony. These methods can potentially underestimate the complexity of the Symbiodinium community structure and may produce lesions. This study demonstrates that microscale sampling of individual coral polyps provided sufficient DNA for identifying zooxanthellae clades by RFLP analyses, and subclades through the use of PCR amplification of the ITS-2 region of rDNA and denaturing-gradient gel electrophoresis. Using this technique it was possible to detect distinct ITS-2 types of Symbiodinium from two or three adjacent coral polyps. These methods can be used to intensely sample coral-symbiont population/communities while causing minimal damage. The effectiveness and fine scale capabilities of these methods were demonstrated by sampling and identifying phylotypes of Symbiodinium clades A, B, and C that co-reside within a single Montastraea faveolata colony.

  17. A 'feather-trap' for collecting DNA samples from birds.

    PubMed

    Maurer, Golo; Beck, Nadeena; Double, Michael C

    2010-01-01

    Genetic analyses of birds are usually based on DNA extracted from a blood sample. For some species, however, obtaining blood samples is difficult because they are sensitive to handling, pose a conservation or animal welfare concern, or evade capture. In such cases, feathers obtained from live birds in the wild can provide an alternative source of DNA. Here, we provide the first description and evaluation of a 'feather-trap', consisting of small strips of double-sided adhesive tape placed close to a nest with chicks, as a simple, inexpensive and minimally invasive method to collect feathers. The feather-trap was tested in tropical conditions on the Australian pheasant coucal (Centropus phasianinus). None of the 12 pairs of coucals on which the feather-trap was used abandoned the nest, and feeding rates did not differ from those of birds not exposed to a feather-trap. On average, 4.2 feathers were collected per trap over 2-5 days and, despite exposure to monsoonal rain, DNA was extracted from 71.4% of samples, albeit at low concentrations. The amount of genomic DNA extracted from each feather was sufficient to reliably genotype individuals at up to five microsatellite loci for parentage analysis. We show that a feather-trap can provide a reliable alternative for obtaining DNA in species where taking blood is difficult. It may also prove useful for collecting feather samples for other purposes, e.g. stable-isotope analysis. © 2009 Blackwell Publishing Ltd.

  18. Comparisons of NDT Methods to Inspect Cork and Cork filled Epoxy Bands

    NASA Technical Reports Server (NTRS)

    Lingbloom, Mike

    2007-01-01

    Sheet cork and cork filled epoxy provide external insulation for the Reusable Solid Rocket Motor (RSRM) on the Nation's Space Transportation System (STS). Interest in the reliability of the external insulation bonds has increased since the Columbia incident. A non-destructive test (NDT) method that will provide the best inspection for these bonds has been under evaluation. Electronic shearography has been selected as the primary NDT method for inspection of these bond lines in the RSRM production flow. ATK Launch Systems Group has purchased an electronic shearography system that includes a vacuum chamber used for evaluation of test parts and custom vacuum windows for inspection of full-scale motors. Although electronic shearography has been selected as the primary method for inspection of the external bonds, other existing technologies continue to be investigated. The NASA/Marshall Space Flight Center (MSFC) NDT department has inspected several samples, for comparison with electronic shearography, using various inspection systems in its laboratory: X-ray backscatter, terahertz imaging, and microwave imaging. The samples tested have some programmed flaws as well as some flaws that occurred naturally during sample fabrication. These samples provide sufficient flaw variation for the evaluation of the different inspection systems. This paper describes and compares the basic functionality, test methods, and test results, including dissection, for each inspection technology.

  19. Latex samples for RAMSES electrophoresis experiment on IML 2

    NASA Technical Reports Server (NTRS)

    Seaman, Geoffrey V. F.; Knox, Robert J.

    1994-01-01

    The objectives of these reported studies were to provide ground-based support services to the flight experiment team for the RAMSES experiment flown aboard IML-2. The specific areas of support included consultation on the performance of particle-based electrophoresis studies, development of methods for the preparation of suitable samples for the flight hardware, the screening of particles to obtain suitable candidates for the flight experiment, and the electrophoretic characterization of sample particle preparations. The first phases of these studies were performed under this contract, while the follow-on work was performed under grant number NAG8 1081, 'Preparation and Characterization of Latex Samples for RAMSES Experiment on IML 2.' During this first phase of the experiment, the following benchmarks were achieved: methods were tested for the concentration and resuspension of latex samples in the greater than 0.4 micron diameter range to provide moderately high solids content samples free of the particle aggregation that interfered with the normal functioning of the RAMSES hardware. Various candidate latex preparations were screened, and two candidate types of latex were identified for use in the flight experiments: carboxylate-modified latex (CML) and acrylic acid-acrylamide-modified latex (AAM). These latexes have relatively hydrophilic surfaces, are not prone to aggregation, and display sufficiently low electrophoretic mobilities in the flight buffer so that they can be used to make mixtures to test the resolving power of the flight hardware.

  20. Legionella detection by culture and qPCR: Comparing apples and oranges.

    PubMed

    Whiley, Harriet; Taylor, Michael

    2016-01-01

    Legionella spp. are the causative agent of Legionnaires' disease and an opportunistic pathogen of significant public health concern. Identification and quantification from environmental sources are crucial for identifying outbreak origins and providing sufficient information for risk assessment and disease prevention. There is currently a range of methods for quantifying Legionella spp. from environmental sources, but the two most widely used and accepted are culture and real-time polymerase chain reaction (qPCR). This paper reviews these two methods and outlines their advantages and limitations. Studies from the last 10 years that have concurrently used culture and qPCR to quantify Legionella spp. from environmental sources have been compiled. In 26 of 28 studies, Legionella was detected at a higher rate using qPCR than culture, while only one study detected equivalent levels of Legionella spp. using both qPCR and culture. Aggregating the environmental samples from all 28 studies, 2856/3967 (72%) tested positive for the presence of Legionella spp. using qPCR and 1331/3967 (34%) using culture. The lack of correlation between methods highlights the need to develop an acceptable standardized method for quantification that is sufficient for risk assessment and management of this human pathogen.
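
    The aggregated detection rates can be reproduced directly from the pooled counts; the sketch below also adds a pooled two-proportion z-test as a rough gauge of the gap, under the simplifying (and not strictly correct) assumption that the aggregated samples are independent rather than clustered by study:

```python
from math import sqrt

# Pooled counts from the review: of 3967 environmental samples,
# qPCR detected Legionella spp. in 2856 and culture in 1331.
n = 3967
pos_qpcr, pos_culture = 2856, 1331

p1, p2 = pos_qpcr / n, pos_culture / n
print(f"qPCR: {p1:.0%}, culture: {p2:.0%}")  # qPCR: 72%, culture: 34%

# Pooled two-proportion z-test; independence across samples is an
# assumption, since samples cluster within the 28 source studies.
p_pool = (pos_qpcr + pos_culture) / (2 * n)
z = (p1 - p2) / sqrt(p_pool * (1 - p_pool) * (2 / n))
```

    Even with this crude treatment the z statistic is enormous, consistent with the review's conclusion that qPCR and culture results cannot be compared as equivalent measurements.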

  1. Recruitment of Older Adults: Success May Be in the Details

    PubMed Central

    McHenry, Judith C.; Insel, Kathleen C.; Einstein, Gilles O.; Vidrine, Amy N.; Koerner, Kari M.; Morrow, Daniel G.

    2015-01-01

    Purpose: Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Results: Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. Implications: The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. PMID:22899424

  2. Kit for detecting nucleic acid sequences using competitive hybridization probes

    DOEpatents

    Lucas, Joe N.; Straume, Tore; Bogen, Kenneth T.

    2001-01-01

    A kit is provided for detecting a target nucleic acid sequence in a sample, the kit comprising: a first hybridization probe which includes a nucleic acid sequence that is sufficiently complementary to selectively hybridize to a first portion of the target sequence, the first hybridization probe including a first complexing agent for forming a binding pair with a second complexing agent; and a second hybridization probe which includes a nucleic acid sequence that is sufficiently complementary to selectively hybridize to a second portion of the target sequence to which the first hybridization probe does not selectively hybridize, the second hybridization probe including a detectable marker; a third hybridization probe which includes a nucleic acid sequence that is sufficiently complementary to selectively hybridize to a first portion of the target sequence, the third hybridization probe including the same detectable marker as the second hybridization probe; and a fourth hybridization probe which includes a nucleic acid sequence that is sufficiently complementary to selectively hybridize to a second portion of the target sequence to which the third hybridization probe does not selectively hybridize, the fourth hybridization probe including the first complexing agent for forming a binding pair with the second complexing agent; wherein the first and second hybridization probes are capable of simultaneously hybridizing to the target sequence and the third and fourth hybridization probes are capable of simultaneously hybridizing to the target sequence, the detectable marker is not present on the first or fourth hybridization probes and the first, second, third, and fourth hybridization probes each include a competitive nucleic acid sequence which is sufficiently complementary to a third portion of the target sequence that the competitive sequences of the first, second, third, and fourth hybridization probes compete with each other to hybridize to the third portion of 
the target sequence.

  3. Reliability and statistical power analysis of cortical and subcortical FreeSurfer metrics in a large sample of healthy elderly.

    PubMed

    Liem, Franziskus; Mérillat, Susan; Bezzola, Ladina; Hirsiger, Sarah; Philipp, Michel; Madhyastha, Tara; Jäncke, Lutz

    2015-03-01

    FreeSurfer is a tool to quantify cortical and subcortical brain anatomy automatically and noninvasively. Previous studies have reported reliability and statistical power analyses in relatively small samples or only selected one aspect of brain anatomy. Here, we investigated reliability and statistical power of cortical thickness, surface area, volume, and the volume of subcortical structures in a large sample (N=189) of healthy elderly subjects (64+ years). Reliability (intraclass correlation coefficient) of cortical and subcortical parameters is generally high (cortical: ICCs>0.87, subcortical: ICCs>0.95). Surface-based smoothing increases reliability of cortical thickness maps, while it decreases reliability of cortical surface area and volume. Nevertheless, statistical power of all measures benefits from smoothing. When aiming to detect a 10% difference between groups, the number of subjects required to test effects with sufficient power over the entire cortex varies between cortical measures (cortical thickness: N=39, surface area: N=21, volume: N=81; 10mm smoothing, power=0.8, α=0.05). For subcortical regions this number is between 16 and 76 subjects, depending on the region. We also demonstrate the advantage of within-subject designs over between-subject designs. Furthermore, we publicly provide a tool that allows researchers to perform a priori power analysis and sensitivity analysis to help evaluate previously published studies and to design future studies with sufficient statistical power. Copyright © 2014 Elsevier Inc. All rights reserved.
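
    The paper's power-analysis tool is not reproduced here, but the core a priori calculation can be sketched with a normal-approximation sample-size formula for a two-group comparison. The abstract's per-measure N values additionally depend on each measure's variability (for a 10% group difference, Cohen's d = 0.10 × mean/SD), so the d values below are purely illustrative:

```python
from math import ceil
from statistics import NormalDist

def n_per_group(d, alpha=0.05, power=0.8):
    """A priori sample size per group for a two-sided, two-sample
    comparison (normal approximation to the t-test), given Cohen's d."""
    z = NormalDist().inv_cdf
    z_alpha = z(1 - alpha / 2)  # two-sided significance threshold
    z_beta = z(power)           # quantile for the desired power
    return ceil(2 * ((z_alpha + z_beta) / d) ** 2)

# Illustrative effect sizes only; mean/SD ratios differ by measure
# and region, which is why the abstract's N ranges so widely.
for d in (0.4, 0.5, 0.8):
    print(f"d={d}: n={n_per_group(d)} per group")
```

    The pattern matches the abstract: the noisier the measure (smaller d for the same 10% difference), the more subjects are required for 80% power.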

  4. Empirical evaluation of sufficient similarity in dose-response for environmental risk assessment of a mixture of 11 pyrethroids.

    EPA Science Inventory

    Chemical mixtures in the environment are often the result of a dynamic process. When dose-response data are available on random samples throughout the process, equivalence testing can be used to determine whether the mixtures are sufficiently similar based on a pre-specified biol...

  5. State-of-the-art practices in farmland biodiversity monitoring for North America and Europe.

    PubMed

    Herzog, Felix; Franklin, Janet

    2016-12-01

    Policy makers and farmers need to know the status of farmland biodiversity in order to meet conservation goals and evaluate management options. Based on a review of 11 monitoring programs in Europe and North America and on related literature, we identify the design choices or attributes of a program that balance monitoring costs and usefulness for stakeholders. A useful program monitors habitats, vascular plants, and possibly faunal groups (ecosystem service providers, charismatic species) using a stratified random sample of the agricultural landscape, including marginal and intensive regions. The size of landscape samples varies with the grain of the agricultural landscape; for example, samples are smaller in Europe and larger in North America. Raw data are collected in a rolling survey, which distributes sampling over several years. Sufficient practical experience is now available to implement broad monitoring schemes on both continents. Technological developments in remote sensing, metagenomics, and social media may offer new opportunities for affordable farmland biodiversity monitoring and help to lower the overall costs of monitoring programs.

  6. Evaluation of four automated protocols for extraction of DNA from FTA cards.

    PubMed

    Stangegaard, Michael; Børsting, Claus; Ferrero-Miliani, Laura; Frank-Hansen, Rune; Poulsen, Lena; Hansen, Anders J; Morling, Niels

    2013-10-01

    Extraction of DNA using magnetic bead-based techniques on automated DNA extraction instruments provides a fast, reliable, and reproducible method for DNA extraction from various matrices. Here, we have compared the yield and quality of DNA extracted from FTA cards using four automated extraction protocols on three different instruments. The extraction processes were repeated up to six times with the same pieces of FTA cards. The sample material on the FTA cards was either blood or buccal cells. With the QIAamp DNA Investigator and QIAsymphony DNA Investigator kits, it was possible to extract DNA from the FTA cards in all six rounds of extractions in sufficient amount and quality to obtain complete short tandem repeat (STR) profiles on a QIAcube and a QIAsymphony SP. With the PrepFiler Express kit, almost all the extractable DNA was extracted in the first two rounds of extractions. Furthermore, we demonstrated that it was possible to successfully extract sufficient DNA for STR profiling from previously processed FTA card pieces that had been stored at 4 °C for up to 1 year. This showed that rare or precious FTA card samples may be saved for future analyses even though some DNA was already extracted from the FTA cards.

  7. Powerful Statistical Inference for Nested Data Using Sufficient Summary Statistics

    PubMed Central

    Dowding, Irene; Haufe, Stefan

    2018-01-01

    Hierarchically-organized data arise naturally in many psychology and neuroscience studies. As the standard assumption of independent and identically distributed samples does not hold for such data, two important problems are to accurately estimate group-level effect sizes, and to obtain powerful statistical tests against group-level null hypotheses. A common approach is to summarize subject-level data by a single quantity per subject, which is often the mean or the difference between class means, and treat these as samples in a group-level t-test. This “naive” approach is, however, suboptimal in terms of statistical power, as it ignores information about the intra-subject variance. To address this issue, we review several approaches to deal with nested data, with a focus on methods that are easy to implement. With what we call the sufficient-summary-statistic approach, we highlight a computationally efficient technique that can improve statistical power by taking into account within-subject variances, and we provide step-by-step instructions on how to apply this approach to a number of frequently-used measures of effect size. The properties of the reviewed approaches and the potential benefits over a group-level t-test are quantitatively assessed on simulated data and demonstrated on EEG data from a simulated-driving experiment. PMID:29615885
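One common instantiation of the sufficient-summary-statistic idea is fixed-effect inverse-variance weighting of per-subject effect estimates, which down-weights subjects with large within-subject variance instead of treating all subject means equally. A minimal sketch under the simplifying assumptions that per-subject variances are known and between-subject variance is ignored (function name and numbers illustrative):

```python
from math import sqrt
from statistics import NormalDist

def weighted_group_z(means, variances):
    """Fixed-effect inverse-variance combination of per-subject effects.

    means     -- per-subject effect estimates (e.g. class-mean differences)
    variances -- squared standard errors of those estimates
    Returns (pooled_effect, z, two_sided_p).
    """
    weights = [1.0 / v for v in variances]
    pooled = sum(w * m for w, m in zip(weights, means)) / sum(weights)
    se = 1.0 / sqrt(sum(weights))        # SE of the pooled effect
    z = pooled / se
    p = 2 * (1 - NormalDist().cdf(abs(z)))
    return pooled, z, p

# Three subjects: the noisy third subject is strongly down-weighted.
effect, z, p = weighted_group_z([0.5, 0.6, 0.1], [0.04, 0.05, 0.50])
```

Compared with a naive t-test on unweighted subject means, noisy subjects contribute proportionally less to the pooled effect, which is the source of the power gain the abstract describes.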

  8. ADEQUACY OF VISUALLY CLASSIFIED PARTICLE COUNT STATISTICS FROM REGIONAL STREAM HABITAT SURVEYS

    EPA Science Inventory

    Streamlined sampling procedures must be used to achieve a sufficient sample size with limited resources in studies undertaken to evaluate habitat status and potential management-related habitat degradation at a regional scale. At the same time, these sampling procedures must achi...

  9. The Influence of Mark-Recapture Sampling Effort on Estimates of Rock Lobster Survival

    PubMed Central

    Kordjazi, Ziya; Frusher, Stewart; Buxton, Colin; Gardner, Caleb; Bird, Tomas

    2016-01-01

    Five annual capture-mark-recapture surveys on Jasus edwardsii were used to evaluate the effect of sample size and fishing effort on the precision of estimated survival probability. Datasets of different numbers of individual lobsters (ranging from 200 to 1,000 lobsters) were created by random subsampling from each annual survey. This process of random subsampling was also used to create 12 datasets of different levels of effort based on three levels of the number of traps (15, 30 and 50 traps per day) and four levels of the number of sampling-days (2, 4, 6 and 7 days). The most parsimonious Cormack-Jolly-Seber (CJS) model for estimating survival probability shifted from a constant model towards sex-dependent models with increasing sample size and effort. A sample of 500 lobsters or 50 traps used on four consecutive sampling-days was required for obtaining precise survival estimations for males and females, separately. Reduced sampling effort of 30 traps over four sampling days was sufficient if a survival estimate for both sexes combined was sufficient for management of the fishery. PMID:26990561

  10. Community-Level Physiological Profiling of Microbial Communities in Constructed Wetlands: Effects of Sample Preparation.

    PubMed

    Button, Mark; Weber, Kela; Nivala, Jaime; Aubron, Thomas; Müller, Roland Arno

    2016-03-01

    Community-level physiological profiling (CLPP) using BIOLOG® EcoPlates™ has become a popular method for characterizing and comparing the functional diversity, functional potential, and metabolic activity of heterotrophic microbial communities. The method was originally developed for profiling soil communities; however, its usage has expanded into the fields of ecotoxicology, agronomy, and the monitoring and profiling of microbial communities in various wastewater treatment systems, including constructed wetlands for water pollution control. When performing CLPP on aqueous samples from constructed wetlands, a wide variety of sample characteristics can be encountered and challenges may arise due to excessive solids, color, or turbidity. The aim of this study was to investigate the impacts of different sample preparation methods on CLPP performed on a variety of aqueous samples covering a broad range of physical and chemical characteristics. The results show that using filter paper, centrifugation, or settling helped clarify samples for subsequent CLPP analysis; however, these methods were not as effective as dilution for the darkest samples. Dilution was able to provide suitable clarity for the darkest samples; however, 100-fold dilution significantly affected the carbon source utilization patterns (CSUPs), particularly with samples that were already partially or fully clear. Ten-fold dilution also had some effect on the CSUPs of samples which were originally clear; however, the effect was minimal. Based on these findings, for this specific set of samples, a 10-fold dilution provided a good balance between ease of use, sufficient clarity (for dark samples), and limited effect on CSUPs. The process and findings outlined here can serve future studies looking to utilize CLPP for functional analysis of microbial communities and can also assist in comparing data from studies where different sample preparation methods were utilized.


  11. Provision of Fluoride Varnish to Medicaid-Enrolled Children by Physicians: The Massachusetts Experience

    PubMed Central

    Isong, Inyang A; Silk, Hugh; Rao, Sowmya R; Perrin, James M; Savageau, Judith A; Donelan, Karen

    2011-01-01

    Objectives To evaluate the impact of a 2008 Massachusetts (MA) Medicaid policy that reimburses physicians for providing fluoride varnish (FV) to eligible children in medical settings. Data Source Survey of a sample of primary care physicians in MA. Study Design Cross-sectional survey of a sample of physicians who provide care to MassHealth (MA Medicaid)-enrolled children. Dependent variables: history of completed preventive dental skills training, and FV provision. Independent variables: oral health knowledge, FV attitudes, and physician and practice characteristics. Principal Findings Overall, 19 percent of respondents had completed the training required to be eligible to bill for FV provision. Only 5 percent of physicians were providing FV. Most respondents (63 percent) were not familiar with the new policy, and only 25 percent felt that FV should be provided during well-child visits. Most physicians (60 percent) did not feel that the reimbursement rate of U.S.$26/application was sufficient; 17 percent said that they would not provide FV, regardless of payment. The most common barriers to FV provision were a lack of time and logistical challenges. Conclusions Our findings suggest that simply reimbursing physicians for FV provision is insufficient to ensure provider participation. The success of this policy will likely require addressing the barriers identified. PMID:21762142

  12. HPMCD: the database of human microbial communities from metagenomic datasets and microbial reference genomes.

    PubMed

    Forster, Samuel C; Browne, Hilary P; Kumar, Nitin; Hunt, Martin; Denise, Hubert; Mitchell, Alex; Finn, Robert D; Lawley, Trevor D

    2016-01-04

    The Human Pan-Microbe Communities (HPMC) database (http://www.hpmcd.org/) provides a manually curated, searchable, metagenomic resource to facilitate investigation of human gastrointestinal microbiota. Over the past decade, the application of metagenome sequencing to elucidate the microbial composition and functional capacity present in the human microbiome has revolutionized many concepts in our basic biology. When sufficient high quality reference genomes are available, whole genome metagenomic sequencing can provide direct biological insights and high-resolution classification. The HPMC database provides species level, standardized phylogenetic classification of over 1800 human gastrointestinal metagenomic samples. This is achieved by combining a manually curated list of bacterial genomes from human faecal samples with over 21000 additional reference genomes representing bacteria, viruses, archaea and fungi with manually curated species classification and enhanced sample metadata annotation. A user-friendly, web-based interface provides the ability to search for (i) microbial groups associated with health or disease state, (ii) health or disease states and community structure associated with a microbial group, (iii) the enrichment of a microbial gene or sequence and (iv) enrichment of a functional annotation. The HPMC database enables detailed analysis of human microbial communities and supports research from basic microbiology and immunology to therapeutic development in human health and disease. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. Prevention of Surgical Fires: A Certification Course for Healthcare Providers.

    PubMed

    Fisher, Marquessa

    2015-08-01

    An estimated 550 to 650 surgical fires occur annually in the United States. Surgical fires may have severe consequences, including burns, disfigurement, long-term medical care, or death. This article introduces a potential certification program for the prevention of surgical fires. A pilot study was conducted with a convenience sample of 10 anesthesia providers who participated in the education module. The overall objective was to educate surgical team members and to prepare them to become certified in surgical fire prevention. On completion of the education module, participants completed the 50-question certification examination. The mean pretest score was 66%; none of the participants had enough correct responses (85%) to be considered competent in surgical fire prevention. The mean posttest score was 92.80%, with all participants answering at least 85% of the questions correctly. A paired-samples t test showed a statistically significant increase in knowledge: t (df = 9) = 11.40; P = .001. Results of the pilot study indicate that this course can remediate gaps in knowledge of surgical fire prevention for providers. Their poor performance on the pretest suggests that many providers may not receive sufficient instruction in surgical fire prevention.

  14. Testicular biopsy in psittacine birds (Psittaciformes): comparative evaluation of testicular reproductive status by endoscopic, histologic, and cytologic examination.

    PubMed

    Hänse, Maria; Krautwald-Junghanns, Maria-Elisabeth; Reitemeier, Susanne; Einspanier, Almuth; Schmidt, Volker

    2013-12-01

    Knowledge of the reproductive cycle of male parrots is important for examining the male genital tract and for successful breeding, especially of endangered species. To evaluate different diagnostic methods and criteria concerning the classification of reproductive stages, we examined 20 testicular samples obtained at necropsy in psittacine birds of different species and testicular biopsy samples collected from 9 cockatiels (Nymphicus hollandicus) and 7 rose-ringed parakeets (Psittacula krameri) by endoscopy 4 times over a 12-month period. The testicular reproductive status was assessed histologically and then compared with the macroscopic appearance of the testicles and cytologic results. The histologic examination was nondiagnostic in 19 of 59 testicular biopsy samples. By contrast, the cytologic preparations were diagnostic in 57 of 59 biopsy samples. The results of the cytologic examination coincided with the histologic results in 34 of 38 biopsy samples and 18 of 20 necropsy samples. Macroscopic parameters displayed some differences between reproductive stages but provided an unreliable indication of the reproductive status. These results suggest that microscopic examination of a testicular biopsy sample is a reliable method for evaluating the reproductive status of male parrots and is preferable to the macroscopic evaluation of the testicle. Cytologic examination provides fast preliminary results, even when the histologic preparation is not sufficient for evaluation, but results may be erroneous. Thus, a combination of histologic and cytologic examination is recommended for evaluating testicular reproductive status.

  15. Electrophoretic sample insertion. [device for uniformly distributing samples in flow path

    NASA Technical Reports Server (NTRS)

    Mccreight, L. R. (Inventor)

    1974-01-01

    Two conductive screens located in the flow path of an electrophoresis sample separation apparatus are charged electrically. The sample is introduced between the screens, and the charge is sufficient to disperse and hold the samples across the screens. When the charge is terminated, the samples are uniformly distributed in the flow path. Additionally, a first separation by charged properties has been accomplished.

  16. Spatially confined low-power optically pumped ultrafast synchrotron x-ray nanodiffraction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Joonkyu; Zhang, Qingteng; Chen, Pice

    2015-08-27

    The combination of ultrafast optical excitation and time-resolved synchrotron x-ray nanodiffraction provides unique insight into the photoinduced dynamics of materials, with the spatial resolution required to probe individual nanostructures or small volumes within heterogeneous materials. Optically excited x-ray nanobeam experiments are challenging because the high total optical power required for experimentally relevant optical fluences leads to mechanical instability due to heating. For a given fluence, tightly focusing the optical excitation reduces the average optical power by more than three orders of magnitude and thus ensures sufficient thermal stability for x-ray nanobeam studies. Delivering optical pulses via a scannable fiber-coupled optical objective provides a well-defined excitation geometry during rotation and translation of the sample and allows the selective excitation of isolated areas within the sample. Finally, experimental studies of the photoinduced lattice dynamics of a 35 nm BiFeO3 thin film on a SrTiO3 substrate demonstrate the potential to excite and probe nanoscale volumes.

  17. Prehospital delay in individuals with acute coronary disease: concordance of medical records and follow-up phone interviews.

    PubMed

    Goldberg, Robert J; Osganian, Stavroula; Zapka, Jane; Mitchell, Paul; Bittner, Vera; Daya, Mo; Luepker, Russell

    2002-01-01

    Patient-associated delay in seeking medical care in persons with acute coronary disease is receiving increasing importance given the time-dependent benefits associated with myocardial reperfusion therapies. We examined the extent of concordance between self-reported information about prehospital delay provided by patients to hospital staff at the time of hospitalization for coronary disease compared with information obtained from a telephone interview approximately 2 months following hospital discharge. The sample included 316 patients with acute myocardial infarction or unstable angina at 43 hospitals who had delay time information available from both data sources. The extent of agreement between the medical record and telephone accounts of delay was 47% in the total study sample, 53% in patients with acute myocardial infarction, and 40% in patients with unstable angina. These results suggest that a telephone interview carried out several months following hospitalization for acute coronary disease may not provide sufficiently reliable information about prehospital delay. Copyright 2002 S. Karger AG, Basel
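The "extent of agreement" reported above is raw percent agreement between the two data sources; chance-corrected agreement (Cohen's kappa) is often reported alongside it. A minimal sketch of both, with hypothetical prehospital-delay categories:

```python
from collections import Counter

def percent_agreement_and_kappa(pairs):
    """Raw agreement and Cohen's kappa for paired categorical ratings.

    pairs -- list of (source_a, source_b) category labels,
             e.g. (medical record, telephone interview).
    """
    n = len(pairs)
    agree = sum(a == b for a, b in pairs) / n
    # marginal category frequencies for each source
    pa = Counter(a for a, _ in pairs)
    pb = Counter(b for _, b in pairs)
    # agreement expected by chance from the marginals
    expected = sum(pa[c] * pb[c] for c in set(pa) | set(pb)) / (n * n)
    return agree, (agree - expected) / (1 - expected)

# Hypothetical delay categories for six patients.
pairs = [("<2h", "<2h"), ("<2h", "2-6h"), ("2-6h", "2-6h"),
         (">6h", ">6h"), (">6h", "2-6h"), ("<2h", "<2h")]
agree, kappa = percent_agreement_and_kappa(pairs)  # agree ≈ 0.67, kappa = 0.52
```

Kappa falls below raw agreement whenever part of the observed agreement is attributable to the marginal distributions alone, which is why a 47% raw agreement figure understates how poor the concordance is once chance is accounted for.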

  18. A novel application of artificial neural network for wind speed estimation

    NASA Astrophysics Data System (ADS)

    Fang, Da; Wang, Jianzhou

    2017-05-01

    Providing accurate multi-step wind speed estimation models has increasing significance because of the important technical and economic impacts of wind speed on power grid security and environmental benefits. In this study, combined strategies for wind speed forecasting are proposed based on an intelligent data processing system using artificial neural networks (ANNs). A generalized regression neural network and an Elman neural network are employed to form two hybrid models. The approach uses one ANN to model the samples, achieving data denoising and assimilation, and applies the other to predict wind speed using the pre-processed samples. The proposed method is evaluated in terms of the predictive improvements of the hybrid models over a single ANN and a typical forecasting method. To provide sufficient cases for the study, four observation sites with monthly average wind speed for four given years in Western China were used to test the models. Multiple evaluation methods demonstrated that the proposed method provides a promising alternative technique for monthly average wind speed estimation.

  19. Reviews and syntheses: guiding the evolution of the observing system for the carbon cycle through quantitative network design

    NASA Astrophysics Data System (ADS)

    Kaminski, Thomas; Rayner, Peter Julian

    2017-10-01

    Various observational data streams have been shown to provide valuable constraints on the state and evolution of the global carbon cycle. These observations have the potential to reduce uncertainties in past, current, and predicted natural and anthropogenic surface fluxes. In particular such observations provide independent information for verification of actions as requested by the Paris Agreement. It is, however, difficult to decide which variables to sample, and how, where, and when to sample them, in order to achieve an optimal use of the observational capabilities. Quantitative network design (QND) assesses the impact of a given set of existing or hypothetical observations in a modelling framework. QND has been used to optimise in situ networks and assess the benefit to be expected from planned space missions. This paper describes recent progress and highlights aspects that are not yet sufficiently addressed. It demonstrates the advantage of an integrated QND system that can simultaneously evaluate a multitude of observational data streams and assess their complementarity and redundancy.

  20. Diagnosing prosopagnosia in East Asian individuals: Norms for the Cambridge Face Memory Test-Chinese.

    PubMed

    McKone, Elinor; Wan, Lulu; Robbins, Rachel; Crookes, Kate; Liu, Jia

    2017-07-01

    The Cambridge Face Memory Test (CFMT) is widely accepted as providing a valid and reliable tool in diagnosing prosopagnosia (inability to recognize people's faces). Previously, large-sample norms have been available only for Caucasian-face versions, suitable for diagnosis in Caucasian observers. These are invalid for observers of different races due to potentially severe other-race effects. Here, we provide large-sample norms (N = 306) for East Asian observers on an Asian-face version (CFMT-Chinese). We also demonstrate methodological suitability of the CFMT-Chinese for prosopagnosia diagnosis (high internal reliability, approximately normal distribution, norm-score range sufficiently far above chance). Additional findings were a female advantage on mean performance, plus a difference between participants living in the East (China) or the West (international students, second-generation children of immigrants), which we suggest might reflect personality differences associated with willingness to emigrate. Finally, we demonstrate suitability of the CFMT-Chinese for individual differences studies that use correlations within the normal range.

  1. Simulation of Forward and Inverse X-ray Scattering From Shocked Materials

    NASA Astrophysics Data System (ADS)

    Barber, John; Marksteiner, Quinn; Barnes, Cris

    2012-02-01

    The next generation of high-intensity, coherent light sources should generate sufficient brilliance to perform in-situ coherent x-ray diffraction imaging (CXDI) of shocked materials. In this work, we present beginning-to-end simulations of this process. This includes the calculation of the partially-coherent intensity profiles of self-amplified stimulated emission (SASE) x-ray free electron lasers (XFELs), as well as the use of simulated, shocked molecular-dynamics-based samples to predict the evolution of the resulting diffraction patterns. In addition, we will explore the corresponding inverse problem by performing iterative phase retrieval to generate reconstructed images of the simulated sample. The development of these methods in the context of materials under extreme conditions should provide crucial insights into the design and capabilities of shocked in-situ imaging experiments.

  2. Accuracy assessment with complex sampling designs

    Treesearch

    Raymond L. Czaplewski

    2010-01-01

    A reliable accuracy assessment of remotely sensed geospatial data requires a sufficiently large probability sample of expensive reference data. Complex sampling designs reduce cost or increase precision, especially with regional, continental and global projects. The General Restriction (GR) Estimator and the Recursive Restriction (RR) Estimator separate a complex...

  3. 40 CFR 86.1537 - Idle test run.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Heavy-Duty Engines, New Methanol-Fueled Natural Gas-Fueled, and Liquefied Petroleum Gas-Fueled Diesel-Cycle Heavy-Duty Engines, New Otto-Cycle Light-Duty Trucks, and New Methanol-Fueled Natural Gas-Fueled... dilute sampling. (6) For bag sampling, sample idle emissions long enough to obtain a sufficient bag...

  4. Determination of the botanical origin of honey by front-face synchronous fluorescence spectroscopy.

    PubMed

    Lenhardt, Lea; Zeković, Ivana; Dramićanin, Tatjana; Dramićanin, Miroslav D; Bro, Rasmus

    2014-01-01

    Front-face synchronous fluorescence spectroscopy combined with chemometrics is used to classify honey samples according to their botanical origin. Synchronous fluorescence spectra of three monofloral (linden, sunflower, and acacia), polyfloral (meadow mix), and fake (fake acacia and linden) honey types (109 samples) were collected in an excitation range of 240-500 nm for synchronous wavelength intervals of 30-300 nm. Chemometric analysis of the gathered data included principal component analysis and partial least squares discriminant analysis. Mean cross-validated classification errors of 0.2 and 4.8% were found for a model that accounts only for monofloral samples and for a model that includes both the monofloral and polyfloral groups, respectively. The results demonstrate that the synchronous fluorescence spectra of different honeys differ significantly because of their distinct physical and chemical characteristics and provide sufficient data for clear differentiation among honey groups. The spectra of fake honey samples showed pronounced differences from those of genuine honey, and these samples are easily recognized on the basis of their synchronous fluorescence spectra. The study demonstrated that this method is a valuable and promising technique for honey authentication.
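The paper's chemometric pipeline (PCA followed by PLS-DA) is more involved than can be shown briefly; as a hypothetical, much simpler stand-in that still illustrates assigning a spectrum to the botanical class whose mean spectrum it most resembles, here is a nearest-centroid sketch (all intensities and class names invented):

```python
def nearest_centroid_predict(train, labels, sample):
    """Classify a spectrum by squared Euclidean distance to per-class mean spectra.

    train  -- list of spectra (equal-length lists of intensities)
    labels -- botanical class label of each training spectrum
    sample -- spectrum to classify
    """
    centroids = {}
    for lab in set(labels):
        members = [s for s, l in zip(train, labels) if l == lab]
        # element-wise mean over the class members
        centroids[lab] = [sum(vals) / len(members) for vals in zip(*members)]

    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))

    return min(centroids, key=lambda lab: dist(centroids[lab], sample))

# Toy three-channel "spectra" with distinct peaks per class.
train = [[1.0, 0.0, 0.0], [0.9, 0.1, 0.0], [0.0, 0.0, 1.0], [0.0, 0.1, 0.9]]
labels = ["acacia", "acacia", "linden", "linden"]
print(nearest_centroid_predict(train, labels, [0.8, 0.2, 0.0]))  # → acacia
```

Real synchronous fluorescence spectra have hundreds of wavelength channels and overlapping bands, which is why the authors use PLS-DA with cross-validation rather than raw distances.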

  5. 9 CFR 3.80 - Primary enclosures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... injuring themselves; and (xi) Provide sufficient space for the nonhuman primates to make normal postural... maintained so as to provide sufficient space to allow each nonhuman primate to make normal postural... postural adjustments and movements within the primary enclosure. Different species of prosimians vary in...

  6. Non-terminal blood sampling techniques in guinea pigs.

    PubMed

    Birck, Malene M; Tveden-Nyborg, Pernille; Lindblad, Maiken M; Lykkesfeldt, Jens

    2014-10-11

    Guinea pigs possess several biological similarities to humans and are validated experimental animal models(1-3). However, the use of guinea pigs currently represents a relatively narrow area of research and descriptive data on specific methodology is correspondingly scarce. The anatomical features of guinea pigs are slightly different from other rodent models, hence modulation of sampling techniques to accommodate for species-specific differences, e.g., compared to mice and rats, are necessary to obtain sufficient and high quality samples. As both long and short term in vivo studies often require repeated blood sampling the choice of technique should be well considered in order to reduce stress and discomfort in the animals but also to ensure survival as well as compliance with requirements of sample size and accessibility. Venous blood samples can be obtained at a number of sites in guinea pigs e.g., the saphenous and jugular veins, each technique containing both advantages and disadvantages(4,5). Here, we present four different blood sampling techniques for either conscious or anaesthetized guinea pigs. The procedures are all non-terminal procedures provided that sample volumes and number of samples do not exceed guidelines for blood collection in laboratory animals(6). All the described methods have been thoroughly tested and applied for repeated in vivo blood sampling in studies within our research facility.

  7. Rapid and effective processing of blood specimens for diagnostic PCR using filter paper and Chelex-100.

    PubMed Central

    Polski, J M; Kimzey, S; Percival, R W; Grosso, L E

    1998-01-01

    AIM: To provide a more efficient method for isolating DNA from peripheral blood for use in diagnostic DNA mutation analysis. METHODS: The use of blood impregnated filter paper and Chelex-100 in DNA isolation was evaluated and compared with standard DNA isolation techniques. RESULTS: In polymerase chain reaction (PCR) based assays of five point mutations, identical results were obtained with DNA isolated routinely from peripheral blood and isolated using the filter paper and Chelex-100 method. CONCLUSION: In the clinical setting, this method provides a useful alternative to conventional DNA isolation. It is easily implemented and inexpensive, and provides sufficient, stable DNA for multiple assays. The potential for specimen contamination is reduced because most of the steps are performed in a single microcentrifuge tube. In addition, this method provides for easy storage and transport of samples from the point of acquisition. PMID:9893748

  8. Rapid and effective processing of blood specimens for diagnostic PCR using filter paper and Chelex-100.

    PubMed

    Polski, J M; Kimzey, S; Percival, R W; Grosso, L E

    1998-08-01

    To provide a more efficient method for isolating DNA from peripheral blood for use in diagnostic DNA mutation analysis. The use of blood impregnated filter paper and Chelex-100 in DNA isolation was evaluated and compared with standard DNA isolation techniques. In polymerase chain reaction (PCR) based assays of five point mutations, identical results were obtained with DNA isolated routinely from peripheral blood and isolated using the filter paper and Chelex-100 method. In the clinical setting, this method provides a useful alternative to conventional DNA isolation. It is easily implemented and inexpensive, and provides sufficient, stable DNA for multiple assays. The potential for specimen contamination is reduced because most of the steps are performed in a single microcentrifuge tube. In addition, this method provides for easy storage and transport of samples from the point of acquisition.

  9. Estimation of breeding values using selected pedigree records.

    PubMed

    Morton, Richard; Howarth, Jordan M

    2005-06-01

    Fish bred in tanks or ponds cannot be easily tagged individually. The parentage of any individual may be determined by DNA fingerprinting, but this is sufficiently expensive that large numbers cannot be fingerprinted. The measurement of the objective trait can be made on a much larger sample relatively cheaply. This article deals with experimental designs for selecting individuals to be fingerprinted and for the estimation of the individual and family breeding values. The general setup provides estimates both for genetic effects, regarded as fixed or random, and for fixed effects due to known regressors. The family effects can be well estimated even when very small numbers are fingerprinted, provided that they are the individuals with the most extreme phenotypes.

  10. Symmetry-breaking phase transitions in highly concentrated semen

    PubMed Central

    Creppy, Adama; Plouraboué, Franck; Praud, Olivier; Druart, Xavier; Cazin, Sébastien; Yu, Hui

    2016-01-01

    New experimental evidence of self-motion in a confined active suspension is presented. Depositing a fresh semen sample in an annular microfluidic chip leads to a spontaneous vortex state of the fluid at sufficiently large sperm concentration. The rotation occurs unpredictably clockwise or counterclockwise and is robust and stable. Furthermore, for highly active and concentrated semen, richer dynamics can occur, such as self-sustained or damped rotation oscillations. Experimental results obtained with systematic dilution provide clear evidence of a phase transition towards collective motion associated with local alignment of spermatozoa, akin to the Vicsek model. A macroscopic theory based on previously derived self-organized hydrodynamics models is adapted to this context and provides predictions consistent with the observed stationary motion. PMID:27733694

  11. Using Language Sampling in Clinical Assessments with Bilingual Children: Challenges and Future Directions

    PubMed Central

    Gutiérrez-Clellen, Vera F.; Simon-Cereijido, Gabriela

    2012-01-01

    Current language tests designed to assess Spanish-English-speaking children have limited clinical accuracy and do not provide sufficient information to plan language intervention. In contrast, spontaneous language samples obtained in the two languages can help identify language impairment with higher accuracy. In this article, we describe several diagnostic indicators that can be used in language assessments based on spontaneous language samples. First, based on previous research with monolingual and bilingual English speakers, we show that a verb morphology composite measure in combination with a measure of mean length of utterance (MLU) can provide valuable diagnostic information for English development in bilingual children. Dialectal considerations are discussed. Second, we discuss the available research with bilingual Spanish speakers and show a series of procedures to be used for the analysis of Spanish samples: (a) limited MLU and proportional use of ungrammatical utterances; (b) limited grammatical accuracy on articles, verbs, and clitic pronouns; and (c) limited MLU, omission of theme arguments, and limited use of ditransitive verbs. Third, we illustrate the analysis of verb argument structure using a rubric as an assessment tool. Estimated scores on morphological and syntactic measures are expected to increase the sensitivity of clinical assessments with young bilingual children. Further research using other measures of language will be needed for older school-age children. PMID:19851951

  12. An audit of the statistics and the comparison with the parameter in the population

    NASA Astrophysics Data System (ADS)

    Bujang, Mohamad Adam; Sa'at, Nadiah; Joys, A. Reena; Ali, Mariana Mohamad

    2015-10-01

    The sample size needed to estimate the statistics for particular parameters closely is a recurring issue. Although a sample size may have been calculated from the objective of the study, it is difficult to confirm whether the resulting statistics are close to the parameters of a particular population. Meanwhile, a p-value of less than 0.05 is widely used as inferential evidence. This study therefore audited results analyzed from various subsamples and statistical analyses and compared them with the parameters of three different populations. Eight types of statistical analysis, each with eight subsamples, were analyzed. The statistics were consistent and close to the parameters when the study sample covered at least 15% to 35% of the population. A larger sample size is needed to estimate parameters involving categorical variables than those involving numerical variables. Sample sizes of 300 to 500 are sufficient to estimate the parameters of a medium-sized population.
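    The audit procedure can be illustrated with a toy experiment: draw subsamples covering a growing fraction of a synthetic population and measure how far each sample mean falls from the population parameter. The population below is simulated, and the 15% to 35% finding comes from the study itself, not from this sketch.

```python
import random
import statistics

random.seed(42)
population = [random.gauss(50.0, 10.0) for _ in range(10_000)]
mu = statistics.mean(population)  # the "parameter" the audit tries to recover

errors = {}
for frac in (0.01, 0.05, 0.15, 0.35):
    n = int(frac * len(population))
    sample = random.sample(population, n)
    errors[frac] = abs(statistics.mean(sample) - mu)
    print(f"{frac:5.0%}  n={n:5d}  |sample mean - parameter| = {errors[frac]:.3f}")
```

    As the sampled fraction grows, the gap between statistic and parameter shrinks, which is the behavior the audit quantifies across its eight analyses.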

  13. Proteomic Challenges: Sample Preparation Techniques for Microgram-Quantity Protein Analysis from Biological Samples

    PubMed Central

    Feist, Peter; Hummon, Amanda B.

    2015-01-01

    Proteins regulate many cellular functions, and analyzing the presence and abundance of proteins in biological samples is a central focus in proteomics. The discovery and validation of biomarkers, pathways, and drug targets for various diseases can be accomplished using mass spectrometry-based proteomics. However, with mass-limited samples like tumor biopsies, it can be challenging to obtain sufficient amounts of protein to generate high-quality mass spectrometric data. Techniques developed for macroscale quantities recover sufficient amounts of protein from milligram quantities of starting material, but sample losses become crippling with these techniques when only microgram amounts of material are available. To combat this challenge, proteomicists have developed micro-scale techniques that are compatible with decreased sample size (100 μg or lower) and still enable excellent proteome coverage. Extraction, contaminant removal, protein quantitation, and sample handling techniques for the microgram protein range are reviewed here, with an emphasis on liquid chromatography and bottom-up mass spectrometry-compatible techniques. Also, a range of biological specimens, including mammalian tissues and model cell culture systems, are discussed. PMID:25664860

  14. Augmentation of the IUE Ultraviolet Spectral Atlas

    NASA Astrophysics Data System (ADS)

    Wu, Chi-Chao

    IUE is the only, and the last, satellite that will support a survey program to record the ultraviolet spectra of a large number of bright normal stars. It is important to have a library of high-quality, low-dispersion spectra of a sufficient number of stars to provide good coverage in spectral type and luminosity class. Such a library is invaluable for stellar population synthesis of galaxies, studying the nature of distant galaxies, establishing a UV spectral classification system, providing comparison stars for interstellar extinction studies and for peculiar objects or binary systems, studying the effects of temperature, gravity, and metallicity on stellar UV spectra, and as a teaching aid. We propose to continue observations of normal stars in order to provide (1) a stellar library as complete as practical, which will be able to support astronomical research by the scientific community long into the future, and (2) a sufficient sample of stars to guard against variability and peculiarity, and to allow a finite range of temperature, gravity, and metallicity in a given spectral type-luminosity class combination. Our primary goal is to collect the data and make them available to the community immediately (without claiming the 6-month proprietary right). The data will be published in the IUE Newsletter as soon as practical and will be prepared for distribution by the IUE Observatory and the NSSDC.

  15. Evaluating single-pass catch as a tool for identifying spatial pattern in fish distribution

    USGS Publications Warehouse

    Bateman, Douglas S.; Gresswell, Robert E.; Torgersen, Christian E.

    2005-01-01

    We evaluate the efficacy of single-pass electrofishing without blocknets as a tool for collecting spatially continuous fish distribution data in headwater streams. We compare spatial patterns in abundance, sampling effort, and length-frequency distributions from single-pass sampling of coastal cutthroat trout (Oncorhynchus clarki clarki) to data obtained from a more precise multiple-pass removal electrofishing method in two mid-sized (500–1000 ha) forested watersheds in western Oregon. Abundance estimates from single- and multiple-pass removal electrofishing were positively correlated in both watersheds (r = 0.99 and 0.86). There were no significant trends in capture probabilities at the watershed scale (P > 0.05). Moreover, among-sample variation in fish abundance was higher than within-sample error in both streams, indicating that increased precision of unit-scale abundance estimates would provide less information on patterns of abundance than increasing the fraction of habitat units sampled. In the two watersheds, respectively, single-pass electrofishing captured 78 and 74% of the estimated population of cutthroat trout with 7 and 10% of the effort. At the scale of intermediate-sized watersheds, single-pass electrofishing exhibited a sufficient level of precision to be effective in detecting spatial patterns of cutthroat trout abundance and may be a useful tool for providing the context for investigating fish-habitat relationships at multiple scales.
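    For two passes, the multiple-pass removal method used as the reference above is commonly reduced to the Zippin removal estimator. The sketch below is the standard textbook form with made-up catch counts, not code or data from the study.

```python
def two_pass_removal(c1, c2):
    """Two-pass removal (Zippin) abundance estimate.
    c1, c2: fish caught on the first and second electrofishing passes.
    Valid only for declining catches (c1 > c2)."""
    if c1 <= c2:
        raise ValueError("removal estimator needs declining catches (c1 > c2)")
    p_hat = (c1 - c2) / c1       # estimated per-pass capture probability
    n_hat = c1 ** 2 / (c1 - c2)  # estimated abundance in the habitat unit
    return n_hat, p_hat

# Hypothetical habitat unit: 45 trout on pass one, 12 on pass two.
n_hat, p_hat = two_pass_removal(45, 12)
print(round(n_hat, 1), round(p_hat, 2))
```

    Comparing single-pass catch (here, 45) to such unit-scale estimates is the correlation the study reports.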

  16. A novel, privacy-preserving cryptographic approach for sharing sequencing data

    PubMed Central

    Cassa, Christopher A; Miller, Rachel A; Mandl, Kenneth D

    2013-01-01

    Objective DNA samples are often processed and sequenced in facilities external to the point of collection. These samples are routinely labeled with patient identifiers or pseudonyms, allowing for potential linkage to identity and private clinical information if intercepted during transmission. We present a cryptographic scheme to securely transmit externally generated sequence data which does not require any patient identifiers, public key infrastructure, or the transmission of passwords. Materials and methods This novel encryption scheme cryptographically protects participant sequence data using a shared secret key that is derived from a unique subset of an individual’s genetic sequence. This scheme requires access to a subset of an individual’s genetic sequence to acquire full access to the transmitted sequence data, which helps to prevent sample mismatch. Results We validate that the proposed encryption scheme is robust to sequencing errors, population uniqueness, and sibling disambiguation, and provides sufficient cryptographic key space. Discussion Access to a set of an individual’s genotypes and a mutually agreed cryptographic seed is needed to unlock the full sequence, which provides additional sample authentication and authorization security. We present modest fixed and marginal costs to implement this transmission architecture. Conclusions It is possible for genomics researchers who sequence participant samples externally to protect the transmission of sequence data using unique features of an individual’s genetic sequence. PMID:23125421
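    A minimal sketch of the idea, assuming a hypothetical set of loci and genotype encoding: hash a subset of genotype calls together with a mutually agreed seed to derive a symmetric key, then use that key for bulk encryption. The XOR keystream below is a toy stand-in for a real cipher and does not reproduce the paper's scheme.

```python
import hashlib

def derive_key(genotypes, seed, length=32):
    """Derive a shared symmetric key from a subset of genotype calls plus a
    mutually agreed seed (loci and encoding here are hypothetical)."""
    material = seed.encode() + b"|" + "|".join(genotypes).encode()
    return hashlib.pbkdf2_hmac("sha256", material, b"seq-transfer", 100_000,
                               dklen=length)

def xor_stream(data, key):
    """Toy XOR keystream built from SHA-256(key || counter); applying it
    twice recovers the plaintext. A real deployment would use AES-GCM."""
    out = bytearray()
    counter = 0
    while len(out) < len(data):
        out.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(a ^ b for a, b in zip(data, out))

genotypes = ["rs123:AG", "rs456:CC", "rs789:AT"]   # hypothetical loci
key = derive_key(genotypes, seed="study-42")
ciphertext = xor_stream(b"ACGTACGTACGT", key)
print(xor_stream(ciphertext, key) == b"ACGTACGTACGT")
```

    Only a party holding the same genotype subset and seed can re-derive the key, which is what gives the scheme its sample-authentication property.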

  17. How Good Are Statistical Models at Approximating Complex Fitness Landscapes?

    PubMed Central

    du Plessis, Louis; Leventhal, Gabriel E.; Bonhoeffer, Sebastian

    2016-01-01

    Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations. PMID:27189564
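    A regression of the kind described, with single and pairwise mutation terms, can be sketched with ordinary least squares over binary mutation indicators. The toy landscape below (additive effects plus one epistatic pair) is invented for illustration and is not the authors' RNA landscape or fitting pipeline; numpy is assumed available.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

L = 5                                   # toy sequence length
X = rng.integers(0, 2, size=(200, L))   # 0/1 mutation indicators per variant

# Hidden landscape: additive effects plus one epistatic pair (toy ground truth)
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.8]) + 1.5 * X[:, 0] * X[:, 2]

# Design matrix: intercept, single-mutation terms, all pairwise products
pairs = list(itertools.combinations(range(L), 2))
cols = [np.ones(len(X))] + [X[:, i] for i in range(L)] \
       + [X[:, i] * X[:, j] for i, j in pairs]
A = np.column_stack(cols)

coef, *_ = np.linalg.lstsq(A, y, rcond=None)
k = 1 + L + pairs.index((0, 2))         # column of the (0, 2) interaction
print(round(coef[k], 2))                # should recover the epistatic term
```

    With a noiseless toy landscape the fit is exact; on a sparse sample of a real landscape, how well such coefficients generalize is precisely what the study evaluates.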

  18. Portable Dew Point Mass Spectrometry System for Real-Time Gas and Moisture Analysis

    NASA Technical Reports Server (NTRS)

    Arkin, C.; Gillespie, Stacey; Ratzel, Christopher

    2010-01-01

    A portable instrument incorporates both mass spectrometry and dew point measurement to provide real-time, quantitative gas measurements of helium, nitrogen, oxygen, argon, and carbon dioxide, along with real-time, quantitative moisture analysis. The Portable Dew Point Mass Spectrometry (PDP-MS) system comprises a single quadrupole mass spectrometer and a high vacuum system consisting of a turbopump and a diaphragm-backing pump. A capacitive membrane dew point sensor was placed upstream of the MS, but still within the pressure-flow control pneumatic region. Pressure-flow control was achieved with an upstream precision metering valve, a capacitance diaphragm gauge, and a downstream mass flow controller. User configurable LabVIEW software was developed to provide real-time concentration data for the MS, dew point monitor, and sample delivery system pressure control, pressure and flow monitoring, and recording. The system has been designed to include in situ, NIST-traceable calibration. Certain sample tubing retains sufficient water that even if the sample is dry, the sample tube will desorb water to an amount resulting in moisture concentration errors up to 500 ppm for as long as 10 minutes. It was determined that Bev-A-Line IV was the best sample line to use. As a result of this issue, it is prudent to add a high-level humidity sensor to PDP-MS so such events can be prevented in the future.

  19. Advances in Assays and Analytical Approaches for Botulinum Toxin Detection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grate, Jay W.; Ozanich, Richard M.; Warner, Marvin G.

    2010-08-04

    Methods to detect botulinum toxin, the most poisonous substance known, are reviewed. Current assays are being developed with two main objectives in mind: 1) to obtain sufficiently low detection limits to replace the mouse bioassay with an in vitro assay, and 2) to develop rapid assays for screening purposes that are as sensitive as possible while requiring an hour or less to process the sample and obtain the result. This review emphasizes the diverse analytical approaches and devices that have been developed over the last decade, while also briefly reviewing representative older immunoassays to provide background and context.

  20. Monitoring Species of Concern Using Noninvasive Genetic Sampling and Capture-Recapture Methods

    DTIC Science & Technology

    2016-11-01

    ABBREVIATIONS AICc Akaike’s Information Criterion with small sample size correction AZGFD Arizona Game and Fish Department BMGR Barry M. Goldwater...MNKA Minimum Number Known Alive N Abundance Ne Effective Population Size NGS Noninvasive Genetic Sampling NGS-CR Noninvasive Genetic...parameter estimates from capture-recapture models require sufficient sample sizes, capture probabilities, and low capture biases. For NGS-CR, sample

  1. Performance Evaluation of the Operational Air Quality Monitor for Water Testing Aboard the International Space Station

    NASA Technical Reports Server (NTRS)

    Wallace, William T.; Limero, Thomas F.; Gazda, Daniel B.; Macatangay, Ariel V.; Dwivedi, Prabha; Fernandez, Facundo M.

    2014-01-01

    In the history of manned spaceflight, environmental monitoring has relied heavily on archival sampling. For short missions, this type of sample collection was sufficient; returned samples provided a snapshot of the presence of chemical and biological contaminants in the spacecraft air and water. However, with the construction of the International Space Station (ISS) and the subsequent extension of mission durations, soon to be up to one year, the need for enhanced, real-time environmental monitoring became more pressing. The past several years have seen the implementation of several real-time monitors aboard the ISS, complemented with reduced archival sampling. The station air is currently monitored for volatile organic compounds (VOCs) using gas chromatography-differential mobility spectrometry (Air Quality Monitor [AQM]). The water on ISS is analyzed to measure total organic carbon and biocide concentrations using the Total Organic Carbon Analyzer (TOCA) and the Colorimetric Water Quality Monitoring Kit (CWQMK), respectively. The current air and water monitors provide important data, but the number and size of the different instruments makes them impractical for future exploration missions. It is apparent that there is still a need for improvements in environmental monitoring capabilities. One such improvement could be realized by modifying a single instrument to analyze both air and water. As the AQM currently provides quantitative, compound-specific information for target compounds present in air samples, and many of the compounds are also targets for water quality monitoring, this instrument provides a logical starting point to evaluate the feasibility of this approach. 
In this presentation, we will discuss our recent studies aimed at determining an appropriate method for introducing VOCs from water samples into the gas phase and our current work, in which an electro-thermal vaporization unit has been interfaced with the AQM to analyze target analytes at the relevant concentrations at which they are routinely detected in archival water samples from the ISS.

  2. Scout-view Assisted Interior Micro-CT

    PubMed Central

    Sen Sharma, Kriti; Holzner, Christian; Vasilescu, Dragoş M.; Jin, Xin; Narayanan, Shree; Agah, Masoud; Hoffman, Eric A.; Yu, Hengyong; Wang, Ge

    2013-01-01

    Micro computed tomography (micro-CT) is a widely-used imaging technique. A challenge of micro-CT is to quantitatively reconstruct a sample larger than the field-of-view (FOV) of the detector. This scenario is characterized by truncated projections and associated image artifacts. However, for such truncated scans, a low resolution scout scan with an increased FOV is frequently acquired so as to position the sample properly. This study shows that the otherwise discarded scout scans can provide sufficient additional information to uniquely and stably reconstruct the interior region of interest. Two interior reconstruction methods are designed to utilize the multi-resolution data without a significant computational overhead. While most previous studies used numerically truncated global projections as interior data, this study uses truly hybrid scans where global and interior scans were carried out at different resolutions. Additionally, owing to the lack of standard interior micro-CT phantoms, we designed and fabricated novel interior micro-CT phantoms for this study to provide means of validation for our algorithms. Finally, two characteristic samples from separate studies were scanned to show the effect of our reconstructions. The presented methods show significant improvements over existing reconstruction algorithms. PMID:23732478

  3. Prospective surveillance of semen quality in the workplace

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schenker, M.B.; Samuels, S.J.; Perkins, C.

    We performed a prospective surveillance of semen quality among workers in the plant where 1,2-dibromo-3-chloropropane was first recognized as an occupational cause of impaired semen quality and of infertility. All male employees of the Agricultural Chemical Division were required to participate. Ninety-seven workers (92% participation) provided 258 semen samples over the 4 years of the program. Most samples were analyzed at the plant with a mini-laboratory designed for the study. Motility and shape measures were made objectively. Sixty-six subjects (68%) were non-azoospermic. Generalized multiple regression showed no significant predictors for any response, with the exception of the motility measures, which were reduced with longer times between ejaculation and assay. Between- and within-person standard deviations and correlations were calculated. Comparison of this population with fertile artificial insemination donors (16 men, 498 ejaculates) revealed generally higher ejaculate-to-ejaculate standard deviations in the worker samples. This is probably due to less well controlled conditions of sperm collection in the workplace setting. For cross-sectional studies, one ejaculate per worker is recommended as sufficient; for estimating an individual worker's mean, even three ejaculates may not provide enough precision.

  4. Impact of ADC parameters on linear optical sampling systems

    NASA Astrophysics Data System (ADS)

    Nguyen, Trung-Hien; Gay, Mathilde; Gomez-Agis, Fausto; Lobo, Sébastien; Sentieys, Olivier; Simon, Jean-Claude; Peucheret, Christophe; Bramerie, Laurent

    2017-11-01

    Linear optical sampling (LOS), based on the coherent photodetection of an optical signal under test with a low repetition-rate signal originating from a pulsed local oscillator (LO), enables the characterization of the temporal electric field of optical sources. Thanks to this technique, low-speed photodetectors and analog-to-digital converters (ADCs) can be integrated in the LOS system providing a cost-effective tool for characterizing high-speed signals. However, the impact of photodetector and ADC parameters on such LOS systems has not been explored in detail so far. These parameters, including the integration time of the track-and-hold function, the effective number of bits (ENOB) of the ADC, as well as the combined limited bandwidth of the photodetector and ADC are experimentally and numerically investigated in a LOS system for the first time. More specifically, by reconstructing 10-Gbit/s non-return-to-zero on-off keying (NRZ-OOK) and 10-Gbaud NRZ-quadrature phase-shift-keying (QPSK) signals, it is shown that a short integration time provides a better recovered signal fidelity. Furthermore, an ENOB of 6 bits and an ADC bandwidth normalized to the sampling rate of 2.8 are found to be sufficient in order to reliably monitor the considered signals.
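    The ENOB figure above connects to signal fidelity through the standard quantization-noise model, which predicts SNR ≈ 6.02·N + 1.76 dB for a full-scale sine digitized by an ideal N-bit converter. The sketch below checks that relation numerically for an ideal quantizer; it is not a model of the actual photodetector or ADC hardware in the LOS system.

```python
import math

def quantize_snr_db(n_bits, n_samples=4096):
    """SNR of an ideal n-bit quantizer digitizing a full-scale sine,
    estimated numerically; theory predicts ~6.02*n + 1.76 dB."""
    step = 2.0 / (2 ** n_bits)          # full scale spans [-1, 1)
    sig_p = err_p = 0.0
    for k in range(n_samples):
        x = math.sin(2 * math.pi * 17 * k / n_samples)  # coherent test tone
        q = step * math.floor(x / step) + step / 2      # ideal quantizer
        sig_p += x * x
        err_p += (x - q) ** 2
    return 10 * math.log10(sig_p / err_p)

print(round(quantize_snr_db(6), 1))   # ~6.02*6 + 1.76 = 37.9 dB expected
```

    Each extra effective bit buys roughly 6 dB of SNR, which is why an ENOB of 6 bits can suffice for the monitored signal formats.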

  5. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations

    PubMed Central

    Hariharan, Prasanna; D’Souza, Gavin A.; Horner, Marc; Morrison, Tina M.; Malinauskas, Richard A.; Myers, Matthew R.

    2017-01-01

    A “credible” computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing “model credibility” is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a “threshold-based” validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations) but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood-contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results (“S”) of velocity and viscous shear stress were compared with inter-laboratory experimental measurements (“D”). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student’s t-test.
However, following the threshold-based approach, a Student’s t-test comparing |S-D| and |Threshold-S| showed that, relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in the experiments, simulations, and threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and evaluating medical devices. PMID:28594889
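    The threshold comparison described above can be paraphrased numerically: form per-location comparison errors |S-D| and threshold margins |Threshold-S|, then compare the two sets with a t-test (Welch's form here). All numbers below are hypothetical, and this sketch omits the ASME V&V 20 uncertainty propagation.

```python
import math
import random

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    se2 = va / len(a) + vb / len(b)
    t = (ma - mb) / math.sqrt(se2)
    df = se2 ** 2 / ((va / len(a)) ** 2 / (len(a) - 1)
                     + (vb / len(b)) ** 2 / (len(b) - 1))
    return t, df

random.seed(3)
threshold = 600.0                                   # hypothetical stress limit
sim = [random.gauss(420, 25) for _ in range(20)]    # "S": simulated stresses
exp = [random.gauss(430, 30) for _ in range(20)]    # "D": measured stresses

comparison_error = [abs(s - d) for s, d in zip(sim, exp)]
margin = [abs(threshold - s) for s in sim]

t, df = welch_t(comparison_error, margin)
# Strongly negative t: the sim-vs-experiment disagreement is small relative
# to the distance from the threshold, so the model passes for this COU.
print(t < 0)
```

    When the stresses sit close to the threshold, the margins shrink toward the comparison errors and the test no longer supports validation, which is what the Re = 6500 case illustrates.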

  6. Use of the FDA nozzle model to illustrate validation techniques in computational fluid dynamics (CFD) simulations.

    PubMed

    Hariharan, Prasanna; D'Souza, Gavin A; Horner, Marc; Morrison, Tina M; Malinauskas, Richard A; Myers, Matthew R

    2017-01-01

    A "credible" computational fluid dynamics (CFD) model has the potential to provide a meaningful evaluation of safety in medical devices. One major challenge in establishing "model credibility" is to determine the required degree of similarity between the model and experimental results for the model to be considered sufficiently validated. This study proposes a "threshold-based" validation approach that provides a well-defined acceptance criterion, which is a function of how close the simulation and experimental results are to the safety threshold, for establishing the model validity. The validation criterion developed following the threshold approach is not only a function of the Comparison Error, E (the difference between experiments and simulations) but also takes into account the risk to patient safety posed by E. The method is applicable to scenarios in which a safety threshold can be clearly defined (e.g., the viscous shear-stress threshold for hemolysis in blood contacting devices). The applicability of the new validation approach was tested on the FDA nozzle geometry. The context of use (COU) was to evaluate whether the instantaneous viscous shear stress in the nozzle geometry at Reynolds numbers (Re) of 3500 and 6500 was below the commonly accepted threshold for hemolysis. The CFD results ("S") of velocity and viscous shear stress were compared with inter-laboratory experimental measurements ("D"). The uncertainties in the CFD and experimental results due to input parameter uncertainties were quantified following the ASME V&V 20 standard. The CFD models for both Re = 3500 and 6500 could not be sufficiently validated by performing a direct comparison between CFD and experimental results using the Student's t-test.
However, following the threshold-based approach, a Student's t-test comparing |S-D| and |Threshold-S| showed that, relative to the threshold, the CFD and experimental datasets for Re = 3500 were statistically similar and the model could be considered sufficiently validated for the COU. However, for Re = 6500, at certain locations where the shear stress is close to the hemolysis threshold, the CFD model could not be considered sufficiently validated for the COU. Our analysis showed that the model could be sufficiently validated either by reducing the uncertainties in the experiments, simulations, and threshold or by increasing the sample size for the experiments and simulations. The threshold approach can be applied to all types of computational models and provides an objective way of determining model credibility and evaluating medical devices.

  7. The Effect of Selected Intervention Tactics on Self-Sufficient Behaviors of the Homeless: An Application of the Theory of Planned Behavior.

    ERIC Educational Resources Information Center

    Moroz, Pauline

    A sample of 24 voluntary participants in a federally funded vocational training and placement program for homeless people in El Paso, Texas, was studied to identify specific interventions that increase self-sufficient behaviors of homeless individuals. Case study data were collected from orientation discussions, career counseling sessions, and…

  8. MSEBAG: a dynamic classifier ensemble generation based on `minimum-sufficient ensemble' and bagging

    NASA Astrophysics Data System (ADS)

    Chen, Lei; Kamel, Mohamed S.

    2016-01-01

    In this paper, we propose a dynamic classifier system, MSEBAG, which is characterised by searching for the 'minimum-sufficient ensemble' and bagging at the ensemble level. It adopts an 'over-generation and selection' strategy and aims to achieve a good bias-variance trade-off. In the training phase, MSEBAG first searches for the 'minimum-sufficient ensemble', which maximises the in-sample fitness with the minimal number of base classifiers. Then, starting from the 'minimum-sufficient ensemble', a backward stepwise algorithm is employed to generate a collection of ensembles. The objective is to create a collection of ensembles with a descending fitness on the data, as well as a descending complexity in the structure. MSEBAG dynamically selects the ensembles from the collection for the decision aggregation. The extended adaptive aggregation (EAA) approach, a bagging-style algorithm performed at the ensemble level, is employed for this task. EAA searches for the competent ensembles using a score function, which takes into consideration both the in-sample fitness and the confidence of the statistical inference, and averages the decisions of the selected ensembles to label the test pattern. The experimental results show that the proposed MSEBAG outperforms the benchmarks on average.
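    The 'minimum-sufficient ensemble' search can be sketched as a greedy backward elimination driven by majority-vote accuracy. This is an illustrative reading of the strategy only; the authors' fitness function and the EAA aggregation step are not reproduced here.

```python
def majority_vote_accuracy(preds, labels):
    """In-sample accuracy of a majority-vote ensemble.
    preds: list of per-classifier 0/1 prediction lists."""
    correct = 0
    for j, label in enumerate(labels):
        votes = sum(p[j] for p in preds)
        correct += int((1 if 2 * votes > len(preds) else 0) == label)
    return correct / len(labels)

def minimum_sufficient_ensemble(preds, labels):
    """Greedy backward elimination: keep dropping a classifier as long as
    in-sample majority-vote accuracy does not decrease."""
    current = list(preds)
    best = majority_vote_accuracy(current, labels)
    shrinking = True
    while shrinking and len(current) > 1:
        shrinking = False
        for i in range(len(current)):
            trial = current[:i] + current[i + 1:]
            if majority_vote_accuracy(trial, labels) >= best:
                current = trial
                best = majority_vote_accuracy(current, labels)
                shrinking = True
                break
    return current, best

labels = [0, 1, 0, 1, 1, 0]
classifiers = [labels[:], labels[:], labels[:],   # three perfect voters
               [1] * 6, [0] * 6]                  # two uninformative ones
core, acc = minimum_sufficient_ensemble(classifiers, labels)
print(len(core), acc)
```

    The redundant voters are stripped away while in-sample fitness is preserved, leaving a minimal core from which the collection of larger ensembles can then be grown.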

  9. Work-life balance and subjective well-being: the mediating role of need fulfilment.

    PubMed

    Gröpel, Peter; Kuhl, Julius

    2009-05-01

    The relationship between work-life balance (WLB) (i.e. the perceived sufficiency of the time available for work and social life) and well-being is well-documented. However, previous research failed to sufficiently explain why this relationship exists. In this research, the hypothesis was tested that a sufficient amount of the time available increases well-being because it facilitates satisfaction of personal needs. Using two separate samples (students and employees), the mediating role of need fulfilment in the relationship between WLB and well-being was supported. The results suggest that perceived sufficiency of the time available for work and social life predicts the level of well-being only if the individual's needs are fulfilled within that time.

  10. Fuel cell current collector

    DOEpatents

    Katz, Murray; Bonk, Stanley P.; Maricle, Donald L.; Abrams, Martin

    1991-01-01

    A fuel cell has a current collector plate (22) located between an electrode (20) and a separator plate (25). The collector plate has a plurality of arches (26, 28) deformed from a single flat plate in a checkerboard pattern. The arches are of sufficient height (30) to provide sufficient reactant flow area. Each arch is formed with sufficient stiffness to accept compressive load and sufficient resilience to distribute the load and maintain electrical contact.

  11. An improved taxonomic sampling is a necessary but not sufficient condition for resolving inter-families relationships in Caridean decapods.

    PubMed

    Aznar-Cormano, L; Brisset, J; Chan, T-Y; Corbari, L; Puillandre, N; Utge, J; Zbinden, M; Zuccon, D; Samadi, S

    2015-04-01

During the past decade, a large number of multi-gene analyses have aimed at resolving the phylogenetic relationships within Decapoda. However, relationships among families, and even among subfamilies, remain poorly defined. Most analyses have relied on an incomplete and opportunistic sampling of species, as well as an incomplete and opportunistic selection among the genes available for Decapoda. Here we test in the Caridea whether improving the taxonomic coverage following the hierarchical scheme of the classification, as currently accepted, provides a better phylogenetic resolution for inter-family relationships. The rich collections of the Muséum National d'Histoire Naturelle de Paris are used to sample, as far as possible, at least two species from two different genera for each family or subfamily. All potential markers are tested over this sampling. For some coding genes the amplification success varies greatly among taxa and the phylogenetic signal is highly saturated. This result probably explains the taxon heterogeneity among previously published studies. The analysis is thus restricted to the genes homogeneously amplified over the whole sampling. Thanks to this taxonomic sampling scheme, the monophyly of most families is confirmed. However, the genes commonly used in Decapoda appear ill-suited to clarifying inter-family relationships, which remain poorly resolved. Genome-wide analyses, such as transcriptome-based exon capture facilitated by next-generation sequencing methods, might provide a sounder approach to resolving deep and rapid radiations like the Caridea.

  12. 24 CFR 572.110 - Identifying and selecting eligible families for homeownership.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... otherwise qualified eligible families who have completed participation in one of the following economic self-sufficiency programs: Project Self-Sufficiency, Operation Bootstrap, Family Self-Sufficiency, JOBS, and any... for the disclosure and verification of social security numbers, as provided by part 5, subpart B, of...

  13. Pairing call-response surveys and distance sampling for a mammalian carnivore

    USGS Publications Warehouse

    Hansen, Sara J. K.; Frair, Jacqueline L.; Underwood, Harold B.; Gibbs, James P.

    2015-01-01

Density estimates accounting for differential animal detectability are difficult to acquire for wide-ranging and elusive species such as mammalian carnivores. Pairing distance sampling with call-response surveys may provide an efficient means of tracking changes in populations of coyotes (Canis latrans), a species of particular interest in the eastern United States. Blind field trials in rural New York State indicated 119-m linear error for triangulated coyote calls, and a 1.8-km distance threshold for call detectability, which was sufficient to estimate a detection function with precision using distance sampling. We conducted statewide road-based surveys with sampling locations spaced ≥6 km apart from June to August 2010. Each detected call (whether from a single animal or a group) counted as a single object, representing 1 territorial pair, because of uncertainty in the number of vocalizing animals. From 524 survey points and 75 detections, we estimated the probability of detecting a calling coyote to be 0.17 ± 0.02 SE, yielding a detection-corrected index of 0.75 pairs/10 km² (95% CI: 0.52–1.1, 18.5% CV) for a minimum of 8,133 pairs across rural New York State. Importantly, we consider this an index rather than a true estimate of abundance given the unknown probability of coyote availability for detection during our surveys. Even so, pairing distance sampling with call-response surveys provided a novel, efficient, and noninvasive means of monitoring populations of wide-ranging and elusive, albeit reliably vocal, mammalian carnivores. Our approach offers an effective new means of tracking species like coyotes, one that is readily extendable to other species and geographic extents, provided key assumptions of distance sampling are met.
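The detection-corrected index reported in this abstract can be illustrated with standard point-transect arithmetic, D ≈ n / (p̂ · k · πw²). The back-of-the-envelope sketch below plugs in the reported numbers; it is not the authors' full distance-sampling fit, so it only lands near (not exactly on) the published 0.75 pairs/10 km²:

```python
# Illustrative point-transect arithmetic: detection-corrected density is
# the count divided by the covered area times the estimated detection
# probability, D = n / (p_hat * k * pi * w**2).
import math

def corrected_density(n_detections, p_hat, n_points, truncation_km):
    covered = n_points * math.pi * truncation_km ** 2   # km^2 surveyed
    return n_detections / (p_hat * covered)             # pairs per km^2

# Values reported in the abstract: 75 detections over 524 points,
# p_hat = 0.17, 1.8-km detection threshold.
d = corrected_density(75, 0.17, 524, 1.8)
print(round(d * 10, 2))  # -> 0.83 pairs per 10 km^2
```

The flat-count approximation gives about 0.83 pairs/10 km², in the same ballpark as the 0.75 obtained from the full detection-function fit.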

  14. Investigation of transient melting of tungsten by ELMs in ASDEX Upgrade

    NASA Astrophysics Data System (ADS)

    Krieger, K.; Sieglin, B.; Balden, M.; Coenen, J. W.; Göths, B.; Laggner, F.; de Marne, P.; Matthews, G. F.; Nille, D.; Rohde, V.; Dejarnac, R.; Faitsch, M.; Giannone, L.; Herrmann, A.; Horacek, J.; Komm, M.; Pitts, R. A.; Ratynskaia, S.; Thoren, E.; Tolias, P.; ASDEX-Upgrade Team; EUROfusion MST1 Team

    2017-12-01

Repetitive melting of tungsten by power transients originating from edge localized modes (ELMs) has been studied in the tokamak experiment ASDEX Upgrade. Tungsten samples were exposed to H-mode discharges at the outer divertor target plate using the Divertor Manipulator II system. The exposed sample was designed with an elevated sloped surface inclined against the incident magnetic field to increase the projected parallel power flux to a level where transient melting by ELMs would occur. Sample exposure was controlled by moving the outer strike point to the sample location. As an extension to previous melt studies, in the new experiment both the current flow from the sample to vessel potential and the local surface temperature were measured with sufficient time resolution to resolve individual ELMs. The experiment provided for the first time a direct link between current flow and surface temperature during transient ELM events. This makes it possible to further constrain the MEMOS melt motion code predictions and to improve the validation of its underlying model assumptions. Post-exposure ex situ analysis of the retrieved samples confirms the decreased melt motion observed at shallower magnetic field line to surface angles compared to that at leading edges exposed to the parallel power flux.

  15. Onsite Gaseous Centrifuge Enrichment Plant UF6 Cylinder Destructive Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anheier, Norman C.; Cannon, Bret D.; Qiao, Hong

    2012-07-17

The IAEA safeguards approach for gaseous centrifuge enrichment plants (GCEPs) includes measurements of gross, partial, and bias defects in a statistical sampling plan. These safeguards methods consist principally of mass and enrichment nondestructive assay (NDA) verification. Destructive assay (DA) samples are collected from a limited number of cylinders for high-precision offsite mass spectrometer analysis. DA is typically used to quantify bias defects in the GCEP material balance. Under current safeguards measures, the operator collects a DA sample from a sample tap following homogenization. The sample is collected in a small UF6 sample bottle, then sealed and shipped under IAEA chain of custody to an offsite analytical laboratory. Current practice is expensive and resource intensive. We propose a novel approach for performing onsite gaseous UF6 DA analysis that provides rapid and accurate assessment of enrichment bias defects. DA samples are collected using a custom sampling device attached to a conventional sample tap. A few micrograms of gaseous UF6 are chemically adsorbed onto a sampling coupon in a matter of minutes. The collected DA sample is then analyzed onsite using Laser Ablation Absorption Ratio Spectrometry-Destructive Assay (LAARS-DA). DA results are determined in a matter of minutes at sufficient accuracy to support reliable bias defect conclusions, while greatly reducing DA sample volume, analysis time, and cost.

  16. Lunar placement of Mars quarantine facility

    NASA Technical Reports Server (NTRS)

    Davidson, James E.; Mitchell, W. F.

    1988-01-01

    Advanced mission scenarios are currently being contemplated that would call for the retrieval of surface samples from Mars, from a comet, and from other places in the solar system. An important consideration for all of these sample return missions is quarantine. Quarantine facilities on the Moon offer unique advantages over other locations. The Moon offers gravity, distance, and vacuum. It is sufficiently near the Earth to allow rapid resupply and easy communication. It is sufficiently distant to lessen the psychological impact of a quarantine facility on Earth's human inhabitants. Finally, the Moon is airless, and seems to be devoid of life. It is, therefore, more suited to contamination control efforts.

  17. 50 CFR 679.93 - Amendment 80 Program recordkeeping, permits, monitoring, and catch accounting.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... CONSERVATION AND MANAGEMENT, NATIONAL OCEANIC AND ATMOSPHERIC ADMINISTRATION, DEPARTMENT OF COMMERCE (CONTINUED... moved to the fish bin. (6) Sample storage. There is sufficient space to accommodate a minimum of 10 observer sampling baskets. This space must be within or adjacent to the observer sample station. (7) Pre...

  18. Two-sample binary phase 2 trials with low type I error and low sample size

    PubMed Central

    Litwin, Samuel; Basickes, Stanley; Ross, Eric A.

    2017-01-01

Summary We address the design of two-stage clinical trials comparing experimental and control patients. Our end-point is success or failure, however measured, with the null hypothesis that the chance of success in both arms is p0 and the alternative that it is p0 among controls and p1 > p0 among experimental patients. Standard rules have the null hypothesis rejected when the number of successes in the (E)xperimental arm, E, sufficiently exceeds C, that among (C)ontrols. Here, we combine one-sample rejection decision rules, E ≥ m, with two-sample rules of the form E − C > r to achieve two-sample tests with low sample number and low type I error. We find designs with sample numbers not far from the minimum possible under standard two-sample rules, but with a type I error of 5% rather than the 15% or 20% associated with them, and with equal power. This level of type I error is achieved locally, near the stated null, and increases to 15% or 20% when the null is significantly higher than specified. We increase the attractiveness of these designs to patients by using 2:1 randomization. Examples of the application of this new design, covering both high and low success rates under the null hypothesis, are provided. PMID:28118686
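The combined rejection rule described in this abstract (E ≥ m together with E − C > r) lends itself to an exact type I error calculation under the null by summing binomial probabilities. The sketch below uses hypothetical single-stage parameters (nE, nC, m, r chosen for illustration), not any of the paper's actual two-stage designs:

```python
# Exact type I error of the combined rule "reject iff E >= m and E - C > r"
# for a single-stage comparison with nE experimental and nC control
# patients, all with success probability p0 under the null.
from math import comb

def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def type1_error(nE, nC, p0, m, r):
    total = 0.0
    for e in range(m, nE + 1):              # one-sample rule: E >= m
        pe = binom_pmf(e, nE, p0)
        for c in range(0, nC + 1):
            if e - c > r:                   # two-sample rule: E - C > r
                total += pe * binom_pmf(c, nC, p0)
    return total

# Hypothetical 2:1 randomization: 20 experimental, 10 controls, p0 = 0.2.
alpha = type1_error(nE=20, nC=10, p0=0.2, m=8, r=4)
```

Requiring both rules to fire necessarily gives an alpha no larger than that of the one-sample rule E ≥ m alone, which is the mechanism the design exploits to trade a small sample for a low type I error near the stated null.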

  19. Recruitment of Older Adults: Success May Be in the Details.

    PubMed

    McHenry, Judith C; Insel, Kathleen C; Einstein, Gilles O; Vidrine, Amy N; Koerner, Kari M; Morrow, Daniel G

    2015-10-01

    Describe recruitment strategies used in a randomized clinical trial of a behavioral prospective memory intervention to improve medication adherence for older adults taking antihypertensive medication. Recruitment strategies represent 4 themes: accessing an appropriate population, communication and trust-building, providing comfort and security, and expressing gratitude. Recruitment activities resulted in 276 participants with a mean age of 76.32 years, and study enrollment included 207 women, 69 men, and 54 persons representing ethnic minorities. Recruitment success was linked to cultivating relationships with community-based organizations, face-to-face contact with potential study participants, and providing service (e.g., blood pressure checks) as an access point to eligible participants. Seventy-two percent of potential participants who completed a follow-up call and met eligibility criteria were enrolled in the study. The attrition rate was 14.34%. The projected increase in the number of older adults intensifies the need to study interventions that improve health outcomes. The challenge is to recruit sufficient numbers of participants who are also representative of older adults to test these interventions. Failing to recruit a sufficient and representative sample can compromise statistical power and the generalizability of study findings. © The Author 2012. Published by Oxford University Press on behalf of The Gerontological Society of America. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  20. Challenges of microtome‐based serial block‐face scanning electron microscopy in neuroscience

    PubMed Central

    WANNER, A. A.; KIRSCHMANN, M. A.

    2015-01-01

Summary Serial block-face scanning electron microscopy (SBEM) is becoming increasingly popular for a wide range of applications in many disciplines, from biology to materials science. This review focuses on applications for circuit reconstruction in neuroscience, which is one of the major driving forces advancing SBEM. Neuronal circuit reconstruction poses exceptional challenges to volume EM in terms of resolution, field of view, acquisition time and sample preparation. Mapping the connections between neurons in the brain is crucial for understanding information flow and information processing in the brain. However, information on the connectivity between hundreds or even thousands of neurons densely packed in neuronal microcircuits is still largely missing. Volume EM techniques such as serial section TEM, automated tape-collecting ultramicrotome, focused ion-beam scanning electron microscopy and SBEM (microtome-based serial block-face scanning electron microscopy) provide sufficient resolution to resolve ultrastructural details such as synapses and a sufficient field of view for dense reconstruction of neuronal circuits. While volume EM techniques are advancing, they are generating large data sets on the terabyte scale that require new image processing workflows and analysis tools. In this review, we present the recent advances in SBEM for circuit reconstruction in neuroscience and an overview of existing image processing and analysis pipelines. PMID:25907464

  1. In-vitro performance and fracture strength of thin monolithic zirconia crowns

    PubMed Central

    Weigl, Paul; Wu, Yanyun; Felber, Roland; Lauer, Hans-Christoph

    2018-01-01

PURPOSE All-ceramic restorations require extensive tooth preparation. The purpose of this in vitro study was to investigate a minimally invasive preparation and thickness of monolithic zirconia crowns that would provide sufficient mechanical endurance and strength. MATERIALS AND METHODS Crowns with a thickness of 0.2 mm (group 0.2, n=32) or 0.5 mm (group 0.5, n=32) were milled from zirconia and fixed with resin-based adhesives (groups 0.2A, 0.5A) or zinc phosphate cements (groups 0.2C, 0.5C). Half of the samples in each subgroup (n=8) underwent thermal cycling and mechanical loading (TCML) (TC: 5℃ and 55℃, 2×3,000 cycles, 2 min/cycle; ML: 50 N, 1.2×10⁶ cycles), while the other samples were stored in water (37℃/24 h). Survival rates were compared (Kaplan-Meier). The specimens surviving TCML were loaded to fracture and the maximal fracture force was determined (ANOVA; Bonferroni; α=.05). The fracture mode was analyzed. RESULTS In both 0.5 groups, all crowns survived TCML, and the comparison of fracture strength among crowns with and without TCML showed no significant difference (P=.628). Four crowns in group 0.2A and all of the crowns in group 0.2C failed during TCML. The fracture strength after 24 hours of the cemented 0.2 mm-thick crowns was significantly lower than that of the adhesively bonded crowns. All cemented crowns fractured within the crown itself, while about 80% of the adhesively bonded crowns fractured through both crown and die. CONCLUSION 0.5 mm-thick monolithic crowns possessed sufficient strength to endure physiologic performance, regardless of the type of cementation. Fracture strength of the 0.2 mm cemented crowns was too low for clinical application. PMID:29713427

  2. Experimental characterization and constitutive modeling of the mechanical behavior of molybdenum under electromagnetically applied compression-shear ramp loading

    DOE PAGES

    Alexander, C. Scott; Ding, Jow-Lian; Asay, James Russell

    2016-03-09

    Magnetically applied pressure-shear (MAPS) is a new experimental technique that provides a platform for direct measurement of material strength at extreme pressures. The technique employs an imposed quasi-static magnetic field and a pulsed power generator that produces an intense current on a planar driver panel, which in turn generates high-amplitude magnetically induced longitudinal compression and transverse shear waves into a planar sample mounted on the driver panel. In order to apply sufficiently high shear traction to the test sample, a high-strength material must be used for the driver panel. Molybdenum is a potential driver material for the MAPS experiment because of its high yield strength and sufficient electrical conductivity. To properly interpret the results and gain useful information from the experiments, it is critical to have a good understanding and a predictive capability of the mechanical response of the driver. In this work, the inelastic behavior of molybdenum under uniaxial compression and biaxial compression-shear ramp loading conditions is experimentally characterized. It is observed that an imposed uniaxial magnetic field ramped to approximately 10 T over a period of approximately 2500 μs and held near the peak for about 250 μs before testing appears to anneal the molybdenum panel. In order to provide a physical basis for model development, a general theoretical framework that incorporates electromagnetic loading and the coupling between the imposed field and the inelasticity of molybdenum was developed. Based on this framework, a multi-axial continuum model for molybdenum under electromagnetic loading is presented. The model reasonably captures all of the material characteristics displayed by the experimental data obtained from various experimental configurations. Additionally, data generated from shear loading provide invaluable information not only for validating but also for guiding the development of the material model for multiaxial loadings.

  3. Energy-efficient lighting system for television

    DOEpatents

    Cawthorne, Duane C.

    1987-07-21

A light control system for a television camera comprises an artificial light control system which is cooperative with an iris control system. The artificial light control system adjusts the power to lamps illuminating the camera viewing area so as to provide only the artificial illumination necessary to produce an adequate video signal when the camera iris is substantially open.

  4. Analysis of beryllium and depleted uranium: An overview of detection methods in aerosols and soils

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Camins, I.; Shinn, J.H.

    We conducted a survey of commercially available methods for the analysis of beryllium and depleted uranium in aerosols and soils to find a reliable, cost-effective, and sufficiently precise method for researchers involved in environmental testing at the Yuma Proving Ground, Yuma, Arizona. Criteria used for evaluation include cost, method of analysis, specificity, sensitivity, reproducibility, applicability, and commercial availability. We found that atomic absorption spectrometry with graphite furnace meets these criteria for testing samples for beryllium, and that the same method can also be used to test samples for depleted uranium. However, atomic absorption with graphite furnace is not as sensitive a measurement method for depleted uranium as it is for beryllium, so we recommend that quality control of depleted uranium analysis be maintained by testing 10 of every 1000 samples by neutron activation analysis. We also evaluated 45 companies and institutions that provide analyses of beryllium and depleted uranium. 5 refs., 1 tab.

  5. Satellite orbit and data sampling requirements

    NASA Technical Reports Server (NTRS)

    Rossow, William

    1993-01-01

Climate forcings and feedbacks vary over a wide range of time and space scales. The operation of non-linear feedbacks can couple variations at widely separated time and space scales and cause climatological phenomena to be intermittent. Consequently, monitoring of global, decadal changes in climate requires global observations that cover the whole range of space-time scales and are continuous over several decades. The sampling of smaller space-time scales must have sufficient statistical accuracy to measure the small changes in the forcings and feedbacks anticipated in the next few decades, while continuity of measurements is crucial for unambiguous interpretation of climate change. Shorter records of monthly and regional (500-1000 km) measurements with similar accuracies can also provide valuable information about climate processes when 'natural experiments' such as large volcanic eruptions or El Niños occur. In this section, existing satellite datasets and climate model simulations are used to test the satellite orbits and sampling required to achieve accurate measurements of changes in forcings and feedbacks at monthly frequency and 1000-km (regional) scale.

  6. A method for measuring total thiaminase activity in fish tissues

    USGS Publications Warehouse

    Zajicek, James L.; Tillitt, Donald E.; Honeyfield, Dale C.; Brown, Scott B.; Fitzsimons, John D.

    2005-01-01

An accurate, quantitative, and rapid method for the measurement of thiaminase activity in fish samples is required to provide sufficient information to characterize the role of dietary thiaminase in the onset of thiamine deficiency in Great Lakes salmonines. A radiometric method that uses 14C-thiamine was optimized for substrate and co-substrate (nicotinic acid) concentrations, incubation time, and sample dilution. Total thiaminase activity was successfully determined in extracts of selected Great Lakes fishes and invertebrates. Samples included whole-body and selected tissues of forage fishes. Positive control material prepared from frozen alewives Alosa pseudoharengus collected in Lake Michigan enhanced the development and application of the method. The method allowed improved discrimination of thiaminolytic activity among forage fish species and their tissues. The temperature dependence of the thiaminase activity observed in crude extracts of Lake Michigan alewives followed a Q10 = 2 relationship for the 1–37°C temperature range, which is consistent with the bacterial-derived thiaminase I protein. © Copyright by the American Fisheries Society 2005.
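The Q10 = 2 relationship mentioned in this abstract means the activity roughly doubles for every 10°C rise, i.e. rate(T2)/rate(T1) = Q10^((T2 − T1)/10); a one-line sketch (illustrative, not the authors' code):

```python
# Q10 temperature-dependence rule: the rate ratio between two assay
# temperatures is Q10 raised to the temperature difference over 10 C.
def rate_ratio(t1_c, t2_c, q10=2.0):
    return q10 ** ((t2_c - t1_c) / 10.0)

# Moving an assay from 7 C to 37 C should speed thiaminase activity 8-fold:
print(rate_ratio(7, 37))  # -> 8.0
```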

  7. Simulations of a Thin Sampling Calorimeter with GEANT/FLUKA

    NASA Technical Reports Server (NTRS)

    Lee, Jeongin; Watts, John; Howell, Leonard; Rose, M. Franklin (Technical Monitor)

    2000-01-01

    The Advanced Cosmic-ray Composition Experiment for the Space Station (ACCESS) will investigate the origin, composition and acceleration mechanism of cosmic rays by measuring the elemental composition of the cosmic rays up to 10^15 eV. These measurements will be made with a thin ionization calorimeter and a transition radiation detector. This paper reports studies of a thin sampling calorimeter concept for the ACCESS thin ionization calorimeter. For the past year, a Monte Carlo simulation study of a Thin Sampling Calorimeter (TSC) design has been conducted to predict the detector performance and to design the system for achieving the ACCESS scientific objectives. Simulation results show that the detector energy resolution function resembles a Gaussian distribution and the energy resolution of the TSC is about 40%. In addition, simulations of the detector's response to an assumed broken power law cosmic ray spectrum in the region where the 'knee' of the cosmic ray spectrum occurs have been conducted and clearly show that a thin sampling calorimeter can provide sufficiently accurate estimates of the spectral parameters to meet the science requirements of ACCESS.

  8. Gender-Specific Barriers to Self-Sufficiency among Former Supplemental Security Income Drug Addiction and Alcoholism Beneficiaries: Implications for Welfare-To-Work Programs and Services

    PubMed Central

    Hogan, Sean R; Unick, George J.; Speiglman, Richard; Norris, Jean C.

    2011-01-01

    This study examines barriers to economic self-sufficiency among a panel of 219 former Supplemental Security Income (SSI) drug addiction and alcoholism (DA&A) recipients following elimination of DA&A as an eligibility category for SSI disability benefits. Study participants were comprehensively surveyed at six measurement points following the policy change. Generalized estimating equations were used to examine full-sample and gender-specific barriers to economic self-sufficiency. Results indicate that access to transportation, age, and time are the strongest predictors of achieving self-sufficiency for both men and women leaving the welfare system. Gender-specific barriers are also identified. Future research needs to assess the generalizability of these results to other public assistance recipients. PMID:21625301

  9. Method for detection of long-lived radioisotopes in small biochemical samples

    DOEpatents

    Turteltaub, K.W.; Vogel, J.S.; Felton, J.S.; Gledhill, B.L.; Davis, J.C.

    1994-11-22

    Disclosed is a method for detection of long-lived radioisotopes in small biochemical samples, comprising: a. selecting a biological host in which radioisotopes are present in concentrations equal to or less than those in the ambient biosphere, b. preparing a long-lived radioisotope-labeled reactive chemical species, c. administering the chemical species to the biological host in doses sufficiently low to avoid significant overt damage to the biological system, d. allowing a period of time to elapse sufficient for dissemination and interaction of the chemical species with the host throughout the biological system of the host, e. isolating a reacted fraction of the biological substance from the host in a manner sufficient to avoid contamination of the substance from extraneous sources, f. converting the fraction of biological substance by suitable means to a material which efficiently produces charged ions in at least one of several possible ion sources without introduction of significant isotopic fractionation, and, g. measuring the radioisotope concentration in the material by means of direct isotopic counting. 5 figs.

  10. Method for detection of long-lived radioisotopes in small biochemical samples

    DOEpatents

    Turteltaub, Kenneth W.; Vogel, John S.; Felton, James S.; Gledhill, Barton L.; Davis, Jay C.

    1994-01-01

    Disclosed is a method for detection of long-lived radioisotopes in small biochemical samples, comprising: a. selecting a biological host in which radioisotopes are present in concentrations equal to or less than those in the ambient biosphere, b. preparing a long-lived radioisotope-labeled reactive chemical species, c. administering said chemical species to said biological host in doses sufficiently low to avoid significant overt damage to the biological system thereof, d. allowing a period of time to elapse sufficient for dissemination and interaction of said chemical species with said host throughout said biological system of said host, e. isolating a reacted fraction of the biological substance from said host in a manner sufficient to avoid contamination of said substance from extraneous sources, f. converting said fraction of biological substance by suitable means to a material which efficiently produces charged ions in at least one of several possible ion sources without introduction of significant isotopic fractionation, and, g. measuring the radioisotope concentration in said material by means of direct isotopic counting.

  11. A Flight Investigation of Exhaust-heat De-icing

    NASA Technical Reports Server (NTRS)

    Jones, Alun R; Rodert, Lewis A

    1940-01-01

    The National Advisory Committee for Aeronautics conducted exhaust-heat de-icing tests in flight to provide data needed in the application of this method. The capacity to extract heat from the exhaust gas for de-icing purposes, the quantity of heat required, and other factors were examined. The results indicate that a wing-heating system employing a spanwise exhaust tube within the leading edge of the wing removed 30 to 35 percent of the heat from exhaust gas entering the wing. Data are given from which the heat required for ice prevention can be calculated. Sample calculations have been made on the basis of existing engine power/wing area ratios to show that sufficient heating can be obtained for ice protection on modern transportation airplanes, provided that uniform distribution of the heat can be secured.

  12. Usefulness of Cochrane Skin Group reviews for clinical practice.

    PubMed

    Davila-Seijo, P; Batalla, A; Garcia-Doval, I

    2013-10-01

    Systematic reviews are one of the most important sources of information for evidence-based medicine. However, there is a general impression that these reviews rarely report results that provide sufficient evidence to change clinical practice. The aim of this study was to determine the percentage of Cochrane Skin Group reviews reporting results with the potential to guide clinical decision-making. We performed a bibliometric analysis of all the systematic reviews published by the Cochrane Skin Group up to 16 August 2012. We retrieved 55 reviews, which were analyzed and graded independently by 2 investigators into 3 categories: 0 (insufficient evidence to support or reject the use of an intervention), 1 (insufficient evidence to support or reject the use of an intervention but sufficient evidence to support recommendations or suggestions), and 2 (sufficient evidence to support or reject the use of an intervention). Our analysis showed that 25.5% (14/55) of the studies did not provide sufficient evidence to support or reject the use of the interventions studied, 45.5% (25/55) provided sufficient but not strong evidence to support recommendations or suggestions, and 29.1% (16/55) provided strong evidence to support or reject the use of 1 or more of the interventions studied. Most of the systematic reviews published by the Cochrane Skin Group provide useful information to improve clinical practice. Clinicians should read these reviews and reconsider their current practice. Copyright © 2012 Elsevier España, S.L. and AEDV. All rights reserved.

  13. Bennett ion mass spectrometers on the Pioneer Venus Bus and Orbiter

    NASA Technical Reports Server (NTRS)

    Taylor, H. A., Jr.; Brinton, H. C.; Wagner, T. C. G.; Blackwell, B. H.; Cordier, G. R.

    1980-01-01

    Identical Bennett radio-frequency ion mass spectrometer instruments on the Pioneer Venus Bus and Orbiter have provided the first in-situ measurements of the detailed composition of the planet's ionosphere. The sensitivity, resolution, and dynamic range are sufficient to provide measurements of the solar-wind-induced bow-shock, the ionopause, and highly structured distributions of up to 16 thermal ion species within the ionosphere. The use of adaptive scan and detection circuits and servo-controlled logic for ion mass and energy analysis permits detection of ion concentrations as low as 5 ions/cu cm and ion flow velocities as large as 9 km/sec for O(+). A variety of commandable modes provides ion sampling rates ranging from 0.1 to 1.6 sec between measurements of a single constituent. A lightweight sensor and electronics housing are features of a compact instrument package.

  14. A pharmacogenetics service experience for pharmacy students, residents, and fellows.

    PubMed

    Drozda, Katarzyna; Labinov, Yana; Jiang, Ruixuan; Thomas, Margaret R; Wong, Shan S; Patel, Shitalben; Nutescu, Edith A; Cavallari, Larisa H

    2013-10-14

    To utilize a comprehensive, pharmacist-led warfarin pharmacogenetics service to provide pharmacy students, residents, and fellows with clinical and research experiences involving genotype-guided therapy. First-year (P1) through fourth-year (P4) pharmacy students, pharmacy residents, and pharmacy fellows participated in a newly implemented warfarin pharmacogenetics service in a hospital setting. Students, residents, and fellows provided genotype-guided dosing recommendations as part of clinical care, or analyzed samples and data collected from patients on the service for research purposes. Students', residents', and fellows' achievement of learning objectives was assessed using a checklist based on established core competencies in pharmacogenetics. The mean competency score of the students, residents, and fellows who completed a clinical and/or research experience with the service was 97% ±3%. A comprehensive warfarin pharmacogenetics service provided unique experiential and research opportunities for pharmacy students, residents, and fellows and sufficiently addressed a number of core competencies in pharmacogenetics.

  15. Assessing Disfluencies in School-Age Children Who Stutter: How Much Speech Is Enough?

    ERIC Educational Resources Information Center

    Gregg, Brent A.; Sawyer, Jean

    2015-01-01

    The question of what size speech sample is sufficient to accurately identify stuttering and its myriad characteristics is a valid one. Short samples have a risk of over- or underrepresenting disfluency types or characteristics. In recent years, there has been a trend toward using shorter samples because they are less time-consuming for…

  16. The development and psychometric validation of the Ethical Awareness Scale.

    PubMed

    Milliken, Aimee; Ludlow, Larry; DeSanto-Madeya, Susan; Grace, Pamela

    2018-04-19

    To develop and psychometrically assess the Ethical Awareness Scale using Rasch measurement principles and a Rasch item response theory model. Critical care nurses must be equipped to provide good (ethical) patient care. This requires ethical awareness, which involves recognizing the ethical implications of all nursing actions. Ethical awareness is imperative in successfully addressing patient needs. Evidence suggests that the ethical import of everyday issues may often go unnoticed by nurses in practice. Assessing nurses' ethical awareness is a necessary first step in preparing nurses to identify and manage ethical issues in the highly dynamic critical care environment. A cross-sectional design was used in two phases of instrument development. Using Rasch principles, an item bank representing nursing actions was developed (33 items). Content validity testing was performed, and 18 items were selected for face validity testing. Two rounds of operational testing were performed with critical care nurses in Boston between February and April 2017. A Rasch analysis suggests sufficient item invariance across samples and sufficient construct validity. The analysis further demonstrates a progression of items uniformly along a hierarchical continuum; items that match respondent ability levels; response categories that are sufficiently used; and adequate internal consistency. Mean ethical awareness scores were in the low/moderate range. The results suggest the Ethical Awareness Scale is a psychometrically sound, reliable and valid measure of ethical awareness in critical care nurses. © 2018 John Wiley & Sons Ltd.

  17. High-Grading Lunar Samples for Return to Earth

    NASA Technical Reports Server (NTRS)

    Allen, Carlton; Sellar, Glenn; Nunez, Jorge; Winterhalter, Daniel; Farmer, Jack

    2009-01-01

    Astronauts on long-duration lunar missions will need the capability to "high-grade" their samples to select the highest value samples for transport to Earth and to leave others on the Moon. We are supporting studies to define the "necessary and sufficient" measurements and techniques for high-grading samples at a lunar outpost. A glovebox, dedicated to testing instruments and techniques for high-grading samples, is in operation at the JSC Lunar Experiment Laboratory.

  18. Exploring the temporal structure of heterochronous sequences using TempEst (formerly Path-O-Gen).

    PubMed

    Rambaut, Andrew; Lam, Tommy T; Max Carvalho, Luiz; Pybus, Oliver G

    2016-01-01

    Gene sequences sampled at different points in time can be used to infer molecular phylogenies on a natural timescale of months or years, provided that the sequences in question undergo measurable amounts of evolutionary change between sampling times. Data sets with this property are termed heterochronous and have become increasingly common in several fields of biology, most notably the molecular epidemiology of rapidly evolving viruses. Here we introduce the cross-platform software tool, TempEst (formerly known as Path-O-Gen), for the visualization and analysis of temporally sampled sequence data. Given a molecular phylogeny and the dates of sampling for each sequence, TempEst uses an interactive regression approach to explore the association between genetic divergence through time and sampling dates. TempEst can be used to (1) assess whether there is sufficient temporal signal in the data to proceed with phylogenetic molecular clock analysis, and (2) identify sequences whose genetic divergence and sampling date are incongruent. Examination of the latter can help identify data quality problems, including errors in data annotation, sample contamination, sequence recombination, or alignment error. We recommend that all users of the molecular clock models implemented in BEAST first check their data using TempEst prior to analysis.
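    The regression TempEst performs can be sketched as a simple root-to-tip analysis: ordinary least squares of genetic divergence against sampling date, where the slope estimates the substitution rate, the x-intercept estimates the age of the root, and large residuals flag incongruent sequences. The sketch below uses made-up, perfectly clock-like data for illustration; it is not the TempEst implementation.

```python
def root_to_tip_regression(dates, divergences):
    """Ordinary least squares: divergence = rate * date + intercept."""
    n = len(dates)
    mean_t = sum(dates) / n
    mean_d = sum(divergences) / n
    cov = sum((t - mean_t) * (d - mean_d) for t, d in zip(dates, divergences))
    var = sum((t - mean_t) ** 2 for t in dates)
    rate = cov / var                      # substitutions/site/year
    intercept = mean_d - rate * mean_t
    root_date = -intercept / rate         # x-intercept: estimated root age
    residuals = [d - (rate * t + intercept)
                 for t, d in zip(dates, divergences)]
    return rate, root_date, residuals

# Illustrative tip dates and root-to-tip distances (not real data)
dates = [2000, 2002, 2005, 2008, 2011]
divs = [0.010, 0.014, 0.020, 0.026, 0.032]
rate, root_date, residuals = root_to_tip_regression(dates, divs)
```

On real data the residuals would not vanish; sequences with unusually large residuals are the candidates for the annotation errors, contamination, recombination, or alignment problems mentioned above.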

  19. Further Results on Sufficient LMI Conditions for H∞ Static Output Feedback Control of Discrete-Time Systems

    NASA Astrophysics Data System (ADS)

    Feng, Zhi-Yong; Xu, Li; Matsushita, Shin-Ya; Wu, Min

    Further results on sufficient LMI conditions for H∞ static output feedback (SOF) control of discrete-time systems are presented in this paper, providing some new insights into this issue. First, by introducing a slack variable with block-triangular structure and choosing the coordinate transformation matrix properly, the conservativeness of one kind of existing sufficient LMI condition is further reduced. Then, by introducing a slack variable with a linear matrix equality constraint, another kind of sufficient LMI condition is proposed. Furthermore, the relation between these two kinds of LMI conditions is revealed for the first time through an analysis of the effect of different choices of coordinate transformation matrices. Finally, a numerical example is provided to demonstrate the effectiveness and merits of the proposed methods.

  20. Fermented probiotic beverages based on acid whey.

    PubMed

    Skryplonek, Katarzyna; Jasińska, Małgorzata

    2015-01-01

    Production of fermented probiotic beverages can be a good way to use acid whey. The obtained products combine the high nutritional value of whey with the health benefits claimed for probiotic bacteria. The aim of the study was to define the quality properties of beverages based on fresh acid whey and milk with the addition of buttermilk powder or sweet whey powder. Samples were inoculated with two strains of commercial probiotic cultures: Lactobacillus acidophilus La-5 or Bifidobacterium animalis Bb-12. After fermentation, samples were stored under refrigerated conditions. After 1, 4, 7, 14 and 21 days, sensory characteristics, hardness, acetaldehyde content, titratable acidity, pH and bacterial cell counts were evaluated. Throughout the storage period, the number of bacteria was higher than 8 log cfu/ml in all samples. Beverages with the La-5 strain had higher hardness and acidity, whilst samples with Bb-12 contained more acetaldehyde. Samples with buttermilk powder had better sensory properties than those with sweet whey powder. The obtained products, made of acid whey combined with milk and fortified with buttermilk powder or sweet whey powder, are a good medium for the growth and survival of the examined probiotic bacteria strains. The level of bacteria was sufficient to provide health benefits to consumers.

  1. Planar optical waveguide based sandwich assay sensors and processes for the detection of biological targets including protein markers, pathogens and cellular debris

    DOEpatents

    Martinez, Jennifer S [Santa Fe, NM; Swanson, Basil I [Los Alamos, NM; Grace, Karen M [Los Alamos, NM; Grace, Wynne K [Los Alamos, NM; Shreve, Andrew P [Santa Fe, NM

    2009-06-02

    An assay element is described including recognition ligands bound to a film on a single mode planar optical waveguide, the film from the group of a membrane, a polymerized bilayer membrane, and a self-assembled monolayer containing polyethylene glycol or polypropylene glycol groups therein and an assay process for detecting the presence of a biological target is described including injecting a biological target-containing sample into a sensor cell including the assay element, with the recognition ligands adapted for binding to selected biological targets, maintaining the sample within the sensor cell for time sufficient for binding to occur between selected biological targets within the sample and the recognition ligands, injecting a solution including a reporter ligand into the sensor cell; and, interrogating the sample within the sensor cell with excitation light from the waveguide, the excitation light provided by an evanescent field of the single mode penetrating into the biological target-containing sample to a distance of less than about 200 nanometers from the waveguide thereby exciting the fluorescent-label in any bound reporter ligand within a distance of less than about 200 nanometers from the waveguide and resulting in a detectable signal.

  2. Planar optical waveguide based sandwich assay sensors and processes for the detection of biological targets including early detection of cancers

    DOEpatents

    Martinez, Jennifer S [Santa Fe, NM; Swanson, Basil I [Los Alamos, NM; Shively, John E [Arcadia, CA; Li, Lin [Monrovia, CA

    2009-06-02

    An assay element is described including recognition ligands adapted for binding to carcinoembryonic antigen (CEA) bound to a film on a single mode planar optical waveguide, the film from the group of a membrane, a polymerized bilayer membrane, and a self-assembled monolayer containing polyethylene glycol or polypropylene glycol groups therein and an assay process for detecting the presence of CEA is described including injecting a possible CEA-containing sample into a sensor cell including the assay element, maintaining the sample within the sensor cell for time sufficient for binding to occur between CEA present within the sample and the recognition ligands, injecting a solution including a reporter ligand into the sensor cell; and, interrogating the sample within the sensor cell with excitation light from the waveguide, the excitation light provided by an evanescent field of the single mode penetrating into the biological target-containing sample to a distance of less than about 200 nanometers from the waveguide thereby exciting any bound reporter ligand within a distance of less than about 200 nanometers from the waveguide and resulting in a detectable signal.

  3. TEM preparation methods and influence of radiation damage on the beam sensitive CaCO3 shell of Emiliania huxleyi.

    PubMed

    Hoffmann, Ramona; Wochnik, Angela S; Betzler, Sophia B; Matich, Sonja; Griesshaber, Erika; Schmahl, Wolfgang W; Scheu, Christina

    2014-07-01

    The ultrastructure of biologically formed calcium carbonate crystals, such as the shell of Emiliania huxleyi, depends on environmental conditions such as pH value, temperature and salinity. They can therefore be used as indicators of climate change. However, this requires a detailed understanding of their crystal structure and chemical composition. High-resolution methods like transmission electron microscopy can provide this information on the nanoscale, provided that sufficiently thin samples can be prepared. In our study, we developed sample preparation techniques for cross-section and plan-view investigations and studied sample stability under electron bombardment. In addition to the biological material (Emiliania huxleyi), we also prepared mineralogical samples (Iceland spar) for comparison. High-resolution transmission electron microscopy imaging, electron diffraction and electron energy-loss spectroscopy studies revealed that all prepared samples are relatively stable under electron bombardment at an acceleration voltage of 300 kV when using parallel illumination. Above an accumulated dose of ~10^5 e/nm^2, the material, regardless of whether its origin was biological or geological, transformed to polycrystalline calcium oxide. Copyright © 2014 Elsevier Ltd. All rights reserved.

  4. Isolation and identification of Salmonella spp. in environmental water by molecular technology in Taiwan

    NASA Astrophysics Data System (ADS)

    Kuo, Chun Wei; Hao Huang, Kuan; Hsu, Bing Mu; Tsai, Hsien Lung; Tseng, Shao Feng; Shen, Tsung Yu; Kao, Po Min; Shen, Shu Min; Chen, Jung Sheng

    2013-04-01

    Salmonella spp. is one of the most important causal agents of waterborne diseases. The taxonomy of Salmonella is very complicated, and the genus comprises more than 2,500 serotypes. Detecting Salmonella in environmental water samples by routine culture methods, using selective media and characterization of suspect colonies based on biochemical tests and serological assays, is generally time consuming. To overcome this drawback, it is desirable to use an effective method that provides higher discrimination and more rapid identification of Salmonella in environmental water. The aim of this study is to investigate the occurrence of Salmonella using molecular techniques and to identify the serovars of Salmonella isolates from 70 environmental water samples in Taiwan. The analytical procedure included membrane filtration, non-selective pre-enrichment, and selective enrichment of Salmonella. We then isolated Salmonella strains on selective culture plates. Both the selective enrichments and the culture plates were tested by polymerase chain reaction (PCR). Finally, the serovars of Salmonella were confirmed using biochemical tests and serological assays. In this study, 15 water samples (21.4%) were identified as Salmonella-positive by PCR, and their serotypes were further identified by culture methods. The presence of Salmonella in environmental water indicates the possibility of waterborne transmission in drinking watersheds. Consequently, the authorities need to provide sufficient source protection and to maintain the system for disease prevention. Keywords: Salmonella spp., serological assay, PCR

  5. Atomic force microscopy imaging of macromolecular complexes.

    PubMed

    Santos, Sergio; Billingsley, Daniel; Thomson, Neil

    2013-01-01

    This chapter reviews amplitude modulation (AM) AFM in air and its applications to high-resolution imaging and interpretation of macromolecular complexes. We discuss single DNA molecular imaging and DNA-protein interactions, such as those with topoisomerases and RNA polymerase. We show how relative humidity can have a major influence on resolution and contrast and how it can also affect conformational switching of supercoiled DNA. Four regimes of AFM tip-sample interaction in air are defined and described, and relate to water perturbation and/or intermittent mechanical contact of the tip with either the molecular sample or the surface. Precise control and understanding of the AFM operational parameters is shown to allow the user to switch between these different regimes: an interpretation of the origins of topographical contrast is given for each regime. Perpetual water contact is shown to lead to a high-resolution mode of operation, which we term SASS (small amplitude small set-point) imaging, and which maximizes resolution while greatly decreasing tip and sample wear and any noise due to perturbation of the surface water. Thus, this chapter provides sufficient information to reliably control the AFM in the AM AFM mode of operation in order to image both heterogeneous samples and single macromolecules including complexes, with high resolution and with reproducibility. A brief introduction to AFM, its versatility and applications to biology is also given while providing references to key work and general reviews in the field.

  6. Using Boreholes as Windows into Groundwater Ecosystems

    PubMed Central

    Sorensen, James P. R.; Maurice, Louise; Edwards, François K.; Lapworth, Daniel J.; Read, Daniel S.; Allen, Debbie; Butcher, Andrew S.; Newbold, Lindsay K.; Townsend, Barry R.; Williams, Peter J.

    2013-01-01

    Groundwater ecosystems remain poorly understood yet may provide ecosystem services, make a unique contribution to biodiversity and contain useful bio-indicators of water quality. Little is known about ecosystem variability, the distribution of invertebrates within aquifers, or how representative boreholes are of aquifers. We addressed these issues using borehole imaging and single borehole dilution tests to identify three potential aquifer habitats (fractures, fissures or conduits) intercepted by two Chalk boreholes at different depths beneath the surface (34 to 98 m). These habitats were characterised by sampling the invertebrates, microbiology and hydrochemistry using a packer system to isolate them. Samples were taken with progressively increasing pumped volume to assess differences between borehole and aquifer communities. The study provides a new conceptual framework to infer the origin of water, invertebrates and microbes sampled from boreholes. It demonstrates that pumping 5 m3 at 0.4–1.8 l/sec was sufficient to entrain invertebrates from five to tens of metres into the aquifer during these packer tests. Invertebrates and bacteria were more abundant in the boreholes than in the aquifer, with associated water chemistry variations indicating that boreholes act as sites of enhanced biogeochemical cycling. There was some variability in invertebrate abundance and bacterial community structure between habitats, indicating ecological heterogeneity within the aquifer. However, invertebrates were captured in all aquifer samples, and bacterial abundance, major ion chemistry and dissolved oxygen remained similar. Therefore the study demonstrates that in the Chalk, ecosystems comprising bacteria and invertebrates extend from around the water table to 70 m below it. Hydrogeological techniques provide excellent scope for tackling outstanding questions in groundwater ecology, provided an appropriate conceptual hydrogeological understanding is applied. PMID:23936176

  7. Management Plans Technical Appendix - Phase 1 (Central Puget Sound). Volume 4

    DTIC Science & Technology

    1988-06-01

    ...measure without substantially more samples and analysis or significantly reducing the desired confidence level. Consequently, the study participants... disposal occurs in accordance with permit conditions. Compliance measures... (9) An application and a lease fee will be charged at a rate sufficient to... site are sufficient to characterize the material. The bioassays are a cost-effective measure of the biological effects of concern within the disposal...

  8. How Important Are 'Entry Effects' in Financial Incentive Programs for Welfare Recipients? Experimental Evidence from the Self-Sufficiency Project. SRDC Working Papers.

    ERIC Educational Resources Information Center

    Card, David; Robins, Philip K.; Lin, Winston

    The Self-Sufficiency Project (SSP) entry effect experiment was designed to measure the effect of the future availability of an earnings supplement on the behavior of newly enrolled income assistance (IA) recipients. It used a classical randomized design. From a sample of 3,315 single parents who recently started a new period of IA, one-half were…

  9. Evaluation of the Ticket to Work Program: Assessment of Post-Rollout Implementation and Early Impacts, Volume 1

    ERIC Educational Resources Information Center

    Thornton, Craig; Livermore, Gina; Fraker, Thomas; Stapleton, David; O'Day, Bonnie; Wittenburg, David; Weathers, Robert; Goodman, Nanette; Silva, Tim; Martin, Emily Sama; Gregory, Jesse; Wright, Debra; Mamun, Arif

    2007-01-01

    Ticket to Work and Self-Sufficiency program (TTW) was designed to enhance the market for services that help disability beneficiaries become economically self-sufficient by providing beneficiaries with a wide range of choices for obtaining services and to give employment-support service providers new financial incentives to serve beneficiaries…

  10. Strain tolerant microfilamentary superconducting wire

    DOEpatents

    Finnemore, D.K.; Miller, T.A.; Ostenson, J.E.; Schwartzkopf, L.A.; Sanders, S.C.

    1993-02-23

    A strain tolerant microfilamentary wire capable of carrying superconducting currents is provided comprising a plurality of discontinuous filaments formed from a high temperature superconducting material. The discontinuous filaments have a length at least several orders of magnitude greater than the filament diameter and are sufficiently strong while in an amorphous state to withstand compaction. A normal metal is interposed between and binds the discontinuous filaments to form a normal metal matrix capable of withstanding heat treatment for converting the filaments to a superconducting state. The geometry of the filaments within the normal metal matrix provides substantial filament-to-filament overlap, and the normal metal is sufficiently thin to allow supercurrent transfer between the overlapped discontinuous filaments but is also sufficiently thick to provide strain relief to the filaments.

  11. The Clinical Ethnographic Interview: A user-friendly guide to the cultural formulation of distress and help seeking

    PubMed Central

    Arnault, Denise Saint; Shimabukuro, Shizuka

    2013-01-01

    Transcultural nursing, psychiatry, and medical anthropology have theorized that practitioners and researchers need more flexible instruments to gather culturally relevant illness experience, meaning, and help seeking. The state of the science is sufficiently developed to allow standardized yet ethnographically sound protocols for assessment. However, vigorous calls for culturally adapted assessment models have yielded little real change in routine practice. This paper describes the conversion of the Diagnostic and Statistical Manual IV, Appendix I Outline for Cultural Formulation into a user-friendly Clinical Ethnographic Interview (CEI), and provides clinical examples of its use in a sample of highly distressed Japanese women. PMID:22194348

  12. GILA User's Manual

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    CHRISTON, MARK A.

    2003-06-01

    GILA is a finite element code that has been developed specifically to attack the class of transient, incompressible, viscous, fluid dynamics problems that are predominant in the world that surrounds us. The purpose of this document is to provide sufficient information for an experienced analyst to use GILA in an effective way. The GILA User's Manual presents a technical outline of the governing equations for time-dependent incompressible flow, and the explicit and semi-implicit projection methods used in GILA to solve the equations. This manual also presents a brief overview of some of GILA's capabilities along with the keyword input syntax and sample problems.

  13. Minimum-fuel, 3-dimensional flightpath guidance of transfer jets

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Kreindler, E.

    1984-01-01

    Minimum-fuel, three-dimensional flightpaths for commercial jet aircraft are discussed. The theoretical development is divided into two sections; both use the necessary conditions of optimal control, including singular arcs and state constraints. One section treats the initial and final portions (below 10,000 ft) of long optimal flightpaths, where all possible paths can be derived by generating fields of extremals. The other section treats complete intermediate-length, three-dimensional terminal-area flightpaths, for which only representative sample flightpaths can be computed. Sufficient detail is provided to give the student of optimal control a complex example of a useful application of optimal control theory.

  14. Effects of different preservation methods on inter simple sequence repeat (ISSR) and random amplified polymorphic DNA (RAPD) molecular markers in botanic samples.

    PubMed

    Wang, Xiaolong; Li, Lin; Zhao, Jiaxin; Li, Fangliang; Guo, Wei; Chen, Xia

    2017-04-01

    We evaluated the effects of different preservation methods (storage in a -20°C ice chest, preservation in liquid nitrogen, and drying in silica gel) on inter simple sequence repeat (ISSR) and random amplified polymorphic DNA (RAPD) analyses in various botanical specimens (including broad-leaved plants, needle-leaved plants and succulent plants) over different times (three weeks and three years), using a statistical analysis based on the number of bands, a genetic index and cluster analysis. The results demonstrate that all of these preservation methods can provide sufficient amounts of genomic DNA for ISSR and RAPD analyses; however, the effects of the different preservation methods on these analyses vary significantly, whereas preservation time has little effect. Our results provide a reference for researchers to select the most suitable preservation method, depending on their study subject, for the analysis of molecular markers based on genomic DNA. Copyright © 2017 Académie des sciences. Published by Elsevier Masson SAS. All rights reserved.

  15. Compact FPGA-based beamformer using oversampled 1-bit A/D converters.

    PubMed

    Tomov, Borislav Gueorguiev; Jensen, Jørgen Arendt

    2005-05-01

    A compact medical ultrasound beamformer architecture that uses oversampled 1-bit analog-to-digital (A/D) converters is presented. Sparse sample processing is used, as the echo signal for the image lines is reconstructed at 512 equidistant focal points along the line through its in-phase and quadrature components. That information is sufficient for presenting a B-mode image and creating a color flow map. The high sampling rate provides the necessary delay resolution for the focusing, and the low channel data width (1 bit) makes it possible to construct compact beamformer logic. The signal reconstruction is done using finite impulse response (FIR) filters applied to selected bit sequences of the delta-sigma modulator output stream. The approach allows a multichannel beamformer to fit in a single field programmable gate array (FPGA) device. A 32-channel beamformer is estimated to occupy 50% of the available logic resources in a commercially available mid-range FPGA and to be able to operate at 129 MHz. Simulation of the architecture at 140 MHz provides images with a dynamic range approaching 60 dB for an excitation frequency of 3 MHz.
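    The reconstruction idea, recovering a signal from an oversampled 1-bit delta-sigma stream with an FIR low-pass, can be illustrated in miniature. The sketch below is not the authors' beamformer logic: it models a first-order modulator and uses the simplest possible FIR, a moving average, on a DC input.

```python
def delta_sigma_1bit(signal):
    """First-order delta-sigma modulator producing a +/-1 bit stream."""
    bits, integ, fb = [], 0.0, 1.0
    for x in signal:
        integ += x - fb                   # integrate error vs. last output bit
        fb = 1.0 if integ >= 0 else -1.0  # 1-bit quantizer
        bits.append(fb)
    return bits

def fir_average(bits, taps):
    """Simplest FIR low-pass: a moving average over `taps` bits."""
    return [sum(bits[i:i + taps]) / taps
            for i in range(len(bits) - taps + 1)]

stream = delta_sigma_1bit([0.5] * 2000)   # oversampled DC input of 0.5
recon = fir_average(stream, 64)           # each output recovers ~0.5
```

A real beamformer would use longer, shaped FIR filters and apply per-channel delays before summation, but the principle is the same: the low-pass filter averages the noise-shaped bit stream back into a multi-bit sample.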

  16. Using perturbed handwriting to support writer identification in the presence of severe data constraints

    NASA Astrophysics Data System (ADS)

    Chen, Jin; Cheng, Wen; Lopresti, Daniel

    2011-01-01

    Since real data is time-consuming and expensive to collect and label, researchers have proposed approaches using synthetic variations for tasks such as signature verification, speaker authentication, handwriting recognition, and keyword spotting. The limitation of real data is particularly critical in the field of writer identification, because in forensics adversaries cannot be expected to provide sufficient data to train a classifier. It is therefore unrealistic to assume that sufficient real data will always be available to train classifiers extensively for writer identification. In addition, this field differs from many others in that we strive to preserve as much inter-writer variation as possible, yet model-perturbed handwriting might destroy such discriminability among writers. Building on work described in another paper, in which human subjects were involved in calibrating realistic-looking transformations, we measured the effects of incorporating perturbed handwriting into the training dataset. Experimental results supported our hypothesis that, with limited real data, model-perturbed handwriting improves the performance of writer identification. In particular, when only a single sample per writer was available, incorporating perturbed data achieved a 36x performance gain.

  17. Isolation and characterization of microsatellite DNA loci in the threatened flat-spired three-toothed land snail Triodopsis platysayoides

    USGS Publications Warehouse

    King, Timothy L.; Eackles, Michael S.; Garner, B. A.; van Tuinen, M.; Arbogast, B. S.

    2015-01-01

    The hermaphroditic flat-spired three-tooth land snail (Triodopsis platysayoides) is endemic to a 21-km stretch of the Cheat River Gorge of northeastern West Virginia, USA. We document the isolation and characterization of ten microsatellite DNA markers in this at-risk species. The markers displayed a moderate level of allelic diversity (averaging 7.1 alleles/locus) and heterozygosity (averaging 58.6%). Allelic diversity at seven loci was sufficient to produce unique multilocus genotypes; no indication of selfing was detected in this cosexual species. Minimal deviations from Hardy–Weinberg equilibrium and no linkage disequilibrium were observed within subpopulations, although all loci deviated from Hardy–Weinberg expectations when individuals from subpopulations were pooled. Microsatellite markers developed for T. platysayoides yielded sufficient genetic diversity to (1) distinguish all individuals sampled and assess the level of selfing; (2) address fine-scale population structuring; (3) provide novel demographic insights for the species; and (4) cross-amplify and detect allelic diversity in the congeneric T. juxtidens.
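    Whether a marker panel can distinguish all individuals sampled is conventionally quantified by the probability of identity (PID): the chance that two randomly drawn individuals share a multilocus genotype. A minimal sketch of the standard calculation, with made-up allele frequencies rather than the T. platysayoides data:

```python
from itertools import combinations

def pid_locus(freqs):
    """PID for one locus under Hardy-Weinberg: sum of squared genotype
    frequencies (homozygotes p^4, heterozygotes (2pq)^2)."""
    homo = sum(p ** 4 for p in freqs)
    het = sum((2 * p * q) ** 2 for p, q in combinations(freqs, 2))
    return homo + het

def pid_multilocus(loci):
    """Assuming independent loci, multilocus PID is the product over loci."""
    pid = 1.0
    for freqs in loci:
        pid *= pid_locus(freqs)
    return pid

# Seven hypothetical two-allele loci, each with allele frequencies 0.5/0.5
panel = [[0.5, 0.5]] * 7
pid = pid_multilocus(panel)   # ~0.001: ~1 in 1000 pairs share a genotype
```

Real panels with more alleles per locus (here, an average of 7.1) drive the multilocus PID far lower, which is why seven loci sufficed to give every sampled snail a unique genotype.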

  18. Support for children identified with acute flaccid paralysis under the global polio eradication programme in Uttar Pradesh, India: a qualitative study

    PubMed Central

    2012-01-01

    Background Cases of polio in India declined after the implementation of the polio eradication programme, especially in recent years. The programme includes surveillance of acute flaccid paralysis (AFP) to detect and diagnose cases of polio at an early stage. Under this surveillance, over 40,000 cases of AFP have been reported annually since 2007, regardless of the number of actual polio cases. Yet, not much is known about these children. We conducted a qualitative study to explore care and support for children with AFP after their diagnosis. Methods The research was conducted in a district of western Uttar Pradesh classified as a high-risk area for polio. In-depth interviews were performed with parents of children with polio (17) and with non-polio AFP (9), healthcare providers (40), and key informants from the community, including international and government officers, religious leaders, community leaders, journalists, and academics (21). Results Minimal medicine and attention were provided at government hospitals; therefore, most parents preferred private-practice doctors for their children with AFP. Many families were visited at home by authorities to have stool samples collected. Some were visited repeatedly after the sample collection, but had difficulty understanding the reasons for these visits, which involved no treatment. Financial burden was a common concern among all families. Many parents expressed resentment that their children had been affected despite receiving multiple doses of polio vaccine. Both parents and healthcare providers lacked information and knowledge, and poverty further limited access to available healthcare services. Medicines, education, and means of transportation were identified as the foremost needs of children with AFP and residual paralysis.
Conclusions Despite the high number of children diagnosed with AFP as part of the global polio eradication programme, we found they were not provided with sufficient medical support following their diagnosis. Improvement in the quality and sufficiency of the healthcare system, together with integration of AFP surveillance with other services in these underprivileged areas, may serve as a key solution. PMID:22439606

  19. Adaptive control of turbulence intensity is accelerated by frugal flow sampling.

    PubMed

    Quinn, Daniel B; van Halder, Yous; Lentink, David

    2017-11-01

    The aerodynamic performance of vehicles and animals, as well as the productivity of turbines and energy harvesters, depends on the turbulence intensity of the incoming flow. Previous studies have pointed to the potential benefits of active closed-loop turbulence control, but it is unclear what the minimal sensory and algorithmic requirements are for realizing this control. Here we show that very low-bandwidth anemometers record sufficient information for an adaptive control algorithm to converge quickly. Our online Newton-Raphson algorithm tunes the turbulence in a recirculating wind tunnel by taking readings from an anemometer in the test section. After starting at 9% turbulence intensity, the algorithm converges on values ranging from 10% to 45% in fewer than 12 iterations within 1% accuracy. By down-sampling our measurements, we show that very low-bandwidth anemometers record sufficient information for convergence. Furthermore, down-sampling accelerates convergence by smoothing gradients in turbulence intensity. Our results explain why low-bandwidth anemometers in engineering and mechanoreceptors in biology may be sufficient for adaptive control of turbulence intensity. Finally, our analysis suggests that, if certain turbulent eddy sizes are more important to control than others, frugal adaptive control schemes can be particularly computationally effective for improving performance. © 2017 The Author(s).
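    The online tuning loop can be sketched as one-dimensional Newton-Raphson root-finding on the error between measured and target intensity, with the slope estimated from two closely spaced readings. The plant model below is a made-up stand-in for the wind tunnel and anemometer, not the authors' setup.

```python
def measure_intensity(u):
    """Mock anemometer reading: turbulence intensity (%) vs. drive setting u.
    A saturating, monotonic curve stands in for the real tunnel response."""
    return 9.0 + 30.0 * u / (1.0 + u)

def tune(target, u=0.0, tol=0.01, max_iter=12, h=1e-3):
    """Iteratively adjust u until the measured intensity hits `target`."""
    for _ in range(max_iter):
        err = measure_intensity(u) - target
        if abs(err) < tol:
            break
        # finite-difference slope from two closely spaced readings
        slope = (measure_intensity(u + h) - measure_intensity(u)) / h
        u -= err / slope                  # Newton-Raphson update
    return u, measure_intensity(u)

u, intensity = tune(target=25.0)          # converges in a handful of steps
```

In this toy model convergence to within the 1% tolerance takes about five iterations, consistent with the fast convergence the abstract reports; a down-sampled (smoothed) reading would simply replace `measure_intensity` here.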

  20. A comparison study on the use of Dowex 1 and TEVA-resin in determination of 99Tc in environmental and nuclear coolant samples in a SIA system with ICP-MS detection.

    PubMed

    Kołacińska, Kamila; Samczyński, Zbigniew; Dudek, Jakub; Bojanowska-Czajka, Anna; Trojanowicz, Marek

    2018-07-01

This work presents a comparative study of two sorbents widely used in determinations of 99Tc: TEVA resin and Dowex 1. Despite sharing a similar quaternary amine functional group, the two materials represent different chromatographic methods-extraction (TEVA resin) and anion exchange (Dowex 1)-which gives them distinct properties relevant to the determination of 99Tc under flow conditions. The comparative tests, carried out in a SIA-LOV (Sequential Injection Analysis-Lab-on-Valve) system combined with mass spectrometric (ICP-MS) detection, considered several factors crucial to a resin's utility, such as sorption capacity, durability, and selectivity, which are critical for separating 99Tc from interferences. The developed and optimized analytical procedure based on the TEVA resin provided determinations of 99Tc at a minimum detectable limit (MDL) of 6.00 mBq L-1 in 50 min and has been successfully employed in analyses of samples from nuclear industrial and research units (reactor coolant and sewage) as well as from the river surrounding the nuclear reactor. The method proved sufficient for routine analysis of water samples in accordance with EPA standards. Its reliability was confirmed by analysis of the BH standard provided by the NPL for inter-laboratory proficiency tests. The 99Tc recovery for all real samples was 80-100%. Copyright © 2018 Elsevier B.V. All rights reserved.

  1. Evaluation of a Gas Chromatograph-Differential Mobility Spectrometer for Potential Water Monitoring on the International Space Station

    NASA Technical Reports Server (NTRS)

    Wallace, William T.; Limero, Thomas F.; Gazda, Daniel B.; Macatangay, Ariel V.; Dwivedi, Prabha; Fernandez, Facundo M.

    2015-01-01

    Environmental monitoring for manned spaceflight has long depended on archival sampling, which was sufficient for short missions. However, the longer mission durations aboard the International Space Station (ISS) have shown that enhanced, real-time monitoring capabilities are necessary in order to protect both the crewmembers and the spacecraft systems. Over the past several years, a number of real-time environmental monitors have been deployed on the ISS. Currently, volatile organic compounds (VOCs) in the station air are monitored by the Air Quality Monitor (AQM), a small, lightweight gas chromatograph-differential mobility spectrometer. For water monitoring, real-time monitors are used for total organic carbon (TOC) and biocide analysis. No information on the actual makeup of the TOC is provided presently, however. An improvement to the current state of environmental monitoring could be realized by modifying a single instrument to analyze both air and water. As the AQM currently provides quantitative, compound-specific information for VOCs in air samples, this instrument provides a logical starting point to evaluate the feasibility of this approach. The major hurdle for this effort lies in the liberation of the target analytes from the water matrix. In this presentation, we will discuss our recent studies, in which an electro-thermal vaporization unit has been interfaced with the AQM to analyze target VOCs at the concentrations at which they are routinely detected in archival water samples from the ISS. We will compare the results of these studies with those obtained from the instrumentation routinely used to analyze archival water samples.

  2. HPSEC reveals ubiquitous components in fluorescent dissolved organic matter across aquatic ecosystems

    NASA Astrophysics Data System (ADS)

    Wünsch, Urban; Murphy, Kathleen; Stedmon, Colin

    2017-04-01

Absorbance and fluorescence spectroscopy are efficient tools for tracing the supply, turnover and fate of dissolved organic matter (DOM). The fluorescent fraction of DOM (FDOM) can be characterized by measuring excitation-emission matrices and decomposing the combined fluorescence signal into independent underlying fractions using Parallel Factor Analysis (PARAFAC). Comparisons between studies, facilitated by the OpenFluor database, reveal highly similar components across different aquatic systems. To obtain PARAFAC models of sufficient quality, scientists traditionally rely on analyzing dozens to hundreds of samples spanning environmental gradients. A cross-validation of this approach using different analytical tools has not yet been accomplished. In this study, we applied high-performance size-exclusion chromatography (HPSEC) with online absorbance and fluorescence detectors to characterize the size-dependent optical properties of dissolved organic matter in samples from contrasting aquatic environments. Each sample produced hundreds of absorbance spectra of colored DOM (CDOM) and hundreds of matrices of FDOM intensities. This approach facilitated the detailed study of CDOM spectral slopes and further allowed the reliable implementation of PARAFAC on individual samples, revealing a high degree of overlap in the spectral properties of components identified from different sites. Moreover, many of the model components showed significant spectral congruence with spectra in the OpenFluor database. Our results provide evidence of the presence of ubiquitous FDOM components and additionally provide further support for the supramolecular assembly hypothesis. They demonstrate the potential for HPSEC to provide a wealth of new insights into the relationship between the optical and chemical properties of DOM.

  3. ELECTROFISHING DISTANCE AND NUMBER OF SPECIES COLLECTED FROM THREE RAFTABLE WESTERN RIVERS

    EPA Science Inventory

    A key issue in assessing a fish assemblage at a site is determining a sufficient sampling effort to adequately represent the species in an assemblage. Inadequate effort produces considerable noise in multiple samples at the site or under-represents the species present. Excessiv...

  4. High density FTA plates serve as efficient long-term sample storage for HLA genotyping.

    PubMed

    Lange, V; Arndt, K; Schwarzelt, C; Boehme, I; Giani, A S; Schmidt, A H; Ehninger, G; Wassmuth, R

    2014-02-01

Storage of dried blood spots (DBS) on high-density FTA(®) plates could constitute an appealing alternative to frozen storage. However, it remains controversial whether DBS are suitable for high-resolution sequencing of human leukocyte antigen (HLA) alleles. We therefore extracted DNA from DBS that had been stored for up to 4 years, using six different methods, and identified those extraction methods that recovered sufficient high-quality DNA for reliable high-resolution HLA sequencing. Further, we confirmed that frozen whole blood samples that had been stored for several years can be transferred to filter paper without compromising HLA genotyping upon extraction. In conclusion, DNA derived from high-density FTA(®) plates is suitable for high-resolution HLA sequencing, provided that appropriate extraction protocols are employed. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Following the dynamics of matter with femtosecond precision using the X-ray streaking method

    DOE PAGES

    David, C.; Karvinen, P.; Sikorski, M.; ...

    2015-01-06

X-ray Free Electron Lasers (FELs) can produce extremely intense and very short pulses, down to below 10 femtoseconds (fs). Among the key applications are ultrafast time-resolved studies of dynamics of matter by observing responses to fast excitation pulses in a pump-probe manner. Detectors with sufficient time resolution for observing these processes are not available. Therefore, such experiments typically measure a sample's full dynamics by repeating multiple pump-probe cycles at different delay times. This conventional method assumes that the sample returns to an identical or very similar state after each cycle. Here we describe a novel approach that can provide a time trace of responses following a single excitation pulse, jitter-free, with fs timing precision. We demonstrate, in an X-ray diffraction experiment, how it can be applied to the investigation of ultrafast irreversible processes.

  6. Depression Treatment Among Rural Older Adults: Preferences and Factors Influencing Future Service Use

    PubMed Central

    Kitchen, Katherine A.; McKibbin, Christine L.; Wykes, Thomas L.; Lee, Aaron A.; Carrico, Catherine P.; McConnell, Katelynn A.

    2013-01-01

    The purpose of this study was to investigate depression treatment preferences and anticipated service use in a sample of adults aged 55 years or older who reside in rural Wyoming. Sixteen participants (mean age = 59) completed 30- to 60-minute, semi-structured interviews. Qualitative methods were used to characterize common themes. Social/provider support and community gatekeepers were perceived by participants as important potential facilitators for seeking depression treatment. In contrast, perceived stigma and the value placed on self-sufficiency emerged as key barriers to seeking treatment for depression in this rural, young-old sample. Participants anticipated presenting for treatment in the primary care sector and preferred a combination of medication and psychotherapy for treatment. Participants were, however, more willing to see mental health professionals if they were first referred by a clergy member or primary care physician. PMID:24409008

  7. Development of an X-ray surface analyzer for planetary exploration

    NASA Technical Reports Server (NTRS)

    Clark, B. C.

    1972-01-01

    An ultraminiature X-ray fluorescence spectrometer was developed which can obtain data on element composition not provided by present spacecraft instrumentation. The apparatus employs two radioisotope sources (Fe-55 and Cd-109) which irradiate adjacent areas on a soil sample. Fluorescent X-rays emitted by the sample are detected by four thin-window proportional counters. Using pulse-height discrimination, the energy spectra are determined. Virtually all elements above sodium in the periodic table are detected if present at sufficient levels. Minimum detection limits range from 30 ppm to several percent, depending upon the element and the matrix. For most elements, they are below 0.5 percent. Accuracies likewise depend upon the matrix, but are generally better than plus or minus 0.5 percent for all elements of atomic number greater than 14. Elements below sodium are also detected, but as a single group.

  8. Towards real-time metabolic profiling of a biopsy specimen during a surgical operation by 1H high resolution magic angle spinning nuclear magnetic resonance: a case report.

    PubMed

    Piotto, Martial; Moussallieh, François-Marie; Neuville, Agnès; Bellocq, Jean-Pierre; Elbayed, Karim; Namer, Izzie Jacques

    2012-01-18

Providing information on cancerous tissue samples during a surgical operation can help surgeons delineate the limits of a tumoral invasion more reliably. Here, we describe the use of metabolic profiling of a colon biopsy specimen by high resolution magic angle spinning nuclear magnetic resonance spectroscopy to evaluate tumoral invasion during a simulated surgical operation. Biopsy specimens (n = 9) originating from the excised right colon of a 66-year-old Caucasian woman with an adenocarcinoma were automatically analyzed using a previously built statistical model. Metabolic profiling results were in full agreement with those of a histopathological analysis. The time-response of the technique is sufficiently fast for it to be used effectively during a real operation (17 min/sample). Metabolic profiling has the potential to become a method to rapidly characterize cancerous biopsies in the operating theater.

  9. a Novel Deep Convolutional Neural Network for Spectral-Spatial Classification of Hyperspectral Data

    NASA Astrophysics Data System (ADS)

    Li, N.; Wang, C.; Zhao, H.; Gong, X.; Wang, D.

    2018-04-01

Hyperspectral remote sensing obtains spatial and spectral information simultaneously, and joint extraction of this information is one of the most important approaches to hyperspectral image classification. In this paper, a novel deep convolutional neural network (CNN) is proposed that extracts spectral-spatial information of hyperspectral images effectively. The proposed model not only learns sufficient knowledge from a limited number of samples, but also has powerful generalization ability. The proposed framework, based on three-dimensional convolution, can extract spectral-spatial features of labeled samples effectively. Although CNNs are robust to distortion, they cannot extract features at different scales through a traditional pooling layer that has only one size of pooling window. Hence, spatial pyramid pooling (SPP) is introduced into the three-dimensional local convolutional filters for hyperspectral classification. Experimental results with a widely used hyperspectral remote sensing dataset show that the proposed model provides competitive performance.
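
    As a side note on the pooling idea: spatial pyramid pooling turns a feature map of arbitrary size into a fixed-length descriptor by max-pooling over grids of several sizes. The sketch below is our own single-channel 2D simplification (the paper applies the idea inside three-dimensional convolutional filters), with arbitrary pyramid levels.

```python
import numpy as np

def spatial_pyramid_pool(fmap, levels=(1, 2, 4)):
    """Max-pool a 2D feature map into a fixed-length vector whose size is
    independent of the map's spatial dimensions (generic SPP idea)."""
    h, w = fmap.shape
    out = []
    for n in levels:                          # an n x n grid of pooling bins
        ys = np.linspace(0, h, n + 1).astype(int)
        xs = np.linspace(0, w, n + 1).astype(int)
        for i in range(n):
            for j in range(n):
                out.append(fmap[ys[i]:ys[i + 1], xs[j]:xs[j + 1]].max())
    return np.array(out)                      # length 1 + 4 + 16 = 21 here
```

    Whatever the input size, the descriptor length is fixed by the pyramid levels, which is what lets a network with fully connected layers accept inputs of varying spatial extent.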

  10. Inferring Biological Structures from Super-Resolution Single Molecule Images Using Generative Models

    PubMed Central

    Maji, Suvrajit; Bruchez, Marcel P.

    2012-01-01

Localization-based super-resolution imaging is presently limited by sampling requirements for dynamic measurements of biological structures. Generating an image requires serial acquisition of individual molecular positions at sufficient density to define a biological structure, increasing the acquisition time. Efficient analysis of biological structures from sparse localization data could substantially improve the dynamic imaging capabilities of these methods. Using a feature extraction technique called the Hough Transform, simple biological structures are identified from both simulated and real localization data. We demonstrate that these generative models can efficiently infer biological structures in the data from far fewer localizations than are required for complete spatial sampling. Analysis at partial data densities revealed efficient recovery of clathrin vesicle size distributions and microtubule orientation angles with as little as 10% of the localization data. This approach significantly increases the temporal resolution for dynamic imaging and provides quantitatively useful biological information. PMID:22629348
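
    The Hough transform the authors rely on votes each localization into a parameter-space accumulator and reads structure off the peaks. A minimal line-detecting sketch over sparse 2D localizations, using the parameterization rho = x cos(theta) + y sin(theta) (grid sizes here are arbitrary choices, not the paper's):

```python
import numpy as np

def hough_lines(points, n_theta=180, n_rho=100, rho_max=20.0):
    """Vote each (x, y) localization into a (theta, rho) accumulator; the
    peak identifies the best-supported line rho = x*cos(t) + y*sin(t)."""
    thetas = np.linspace(0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_theta, n_rho), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)   # one rho per theta
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)                 # keep in-range bins
        acc[np.arange(n_theta)[ok], idx[ok]] += 1
    t, r = np.unravel_index(acc.argmax(), acc.shape)
    return thetas[t], (r / (n_rho - 1)) * 2 * rho_max - rho_max
```

    Because every point votes for all lines passing through it, a structure can emerge from far fewer points than full spatial sampling would require, which is the property the paper exploits.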

  11. Markov chain sampling of the O(n) loop models on the infinite plane

    NASA Astrophysics Data System (ADS)

    Herdeiro, Victor

    2017-07-01

A numerical method was recently proposed in Herdeiro and Doyon [Phys. Rev. E 94, 043322 (2016), 10.1103/PhysRevE.94.043322] showing a precise sampling of the infinite-plane two-dimensional critical Ising model for finite lattice subsections. The present note extends the method to a larger class of models, namely the O(n) loop gas models for n ∈ (1, 2]. We argue that even though the Gibbs measure is nonlocal, it is factorizable on finite subsections when sufficient information on the loops touching the boundaries is stored. Our results attempt to show that, provided an efficient Markov chain mixing algorithm and an improved discrete lattice dilation procedure, the planar limit of the O(n) models can be numerically studied with efficiency similar to the Ising case. This confirms that scale invariance is the only requirement for the present numerical method to work.

  12. Quantitative nanoscopy: Tackling sampling limitations in (S)TEM imaging of polymers and composites.

    PubMed

    Gnanasekaran, Karthikeyan; Snel, Roderick; de With, Gijsbertus; Friedrich, Heiner

    2016-01-01

Sampling limitations in electron microscopy raise the question of whether an analysis is representative of the bulk material, especially for hierarchical morphologies that extend over multiple length scales. We tackled this problem by automatically acquiring a large series of partially overlapping (S)TEM images at sufficient resolution, which were subsequently stitched together into a large-area map using an in-house developed acquisition toolbox (TU/e Acquisition ToolBox) and stitching module (TU/e Stitcher). In addition, we show that quantitative image analysis of the large-scale maps provides representative information that can be related to the synthesis and process conditions of hierarchical materials, which moves electron microscopy towards becoming a bulk characterization tool. We demonstrate the power of such an analysis by examining two different multi-phase materials that are structured over multiple length scales. Copyright © 2015 Elsevier B.V. All rights reserved.
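
    Stitching partially overlapping tiles into a large-area map hinges on estimating the translation between neighboring images. Phase correlation is one standard way to do that; the sketch below is a generic illustration, not the TU/e toolbox's algorithm.

```python
import numpy as np

def find_offset(tile_a, tile_b):
    """Estimate the integer translation between two same-sized overlapping
    tiles by locating the peak of their phase correlation (normalized
    FFT-based cross-correlation)."""
    fa, fb = np.fft.fft2(tile_a), np.fft.fft2(tile_b)
    cross = fa * np.conj(fb)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-9)).real
    dy, dx = np.unravel_index(corr.argmax(), corr.shape)
    h, w = tile_a.shape
    # unwrap circular shifts into signed offsets
    return (dy - h if dy > h // 2 else dy, dx - w if dx > w // 2 else dx)
```

    In a stitching pipeline the recovered offsets between each overlapping pair are then reconciled globally to place every tile in the mosaic.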

  13. Forecasting electricity usage using univariate time series models

    NASA Astrophysics Data System (ADS)

    Hock-Eam, Lim; Chee-Yin, Yip

    2014-12-01

Electricity is one of the most important energy sources; a sufficient supply is vital to support a country's development and growth. With changing socio-economic characteristics, increasing competition and the deregulation of the electricity supply industry, electricity demand forecasting matters even more than before. It is therefore imperative to evaluate and compare the predictive performance of various forecasting methods, which provides further insight into the weaknesses and strengths of each. In the literature, the evidence on the best forecasting method for electricity demand is mixed. This paper compares the predictive performance of univariate time series models for forecasting electricity demand using monthly data on maximum electricity load in Malaysia from January 2003 to December 2013. Results reveal that the Box-Jenkins method produces the best out-of-sample predictive performance, while the Holt-Winters exponential smoothing method performs well in-sample.
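
    The Holt-Winters method compared in the paper maintains level, trend, and seasonal components via exponential smoothing. A compact sketch of the additive variant (the initialization and smoothing constants are illustrative choices, not the paper's settings):

```python
def holt_winters_additive(y, season_len, alpha=0.3, beta=0.1, gamma=0.2, horizon=12):
    """Additive Holt-Winters smoothing: level + trend + seasonal terms."""
    # crude initialization from the first two seasons
    level = sum(y[:season_len]) / season_len
    trend = (sum(y[season_len:2 * season_len]) - sum(y[:season_len])) / season_len ** 2
    season = [y[i] - level for i in range(season_len)]
    for t in range(season_len, len(y)):
        s = season[t % season_len]
        last_level = level
        level = alpha * (y[t] - s) + (1 - alpha) * (level + trend)   # smooth level
        trend = beta * (level - last_level) + (1 - beta) * trend     # smooth trend
        season[t % season_len] = gamma * (y[t] - level) + (1 - gamma) * s
    # forecast: extrapolate the trend, reuse the matching seasonal term
    return [level + (h + 1) * trend + season[(len(y) + h) % season_len]
            for h in range(horizon)]
```

    For a monthly maximum-load series, season_len would be 12; a Box-Jenkins (ARIMA) fit would instead be selected via its autocorrelation structure.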

  14. Women's health: periodontitis and its relation to hormonal changes, adverse pregnancy outcomes and osteoporosis.

    PubMed

    Krejci, Charlene B; Bissada, Nabil F

    2012-01-01

To examine the literature on periodontitis and issues specific to women's health, namely hormonal changes, adverse pregnancy outcomes and osteoporosis. The literature was evaluated for reported associations between periodontitis and these gender-specific issues. Collectively, the literature provides a large body of evidence supporting various associations between periodontitis and hormonal changes, adverse pregnancy outcomes and osteoporosis; however, certain shortcomings were noted with respect to biases involving definitions, sample sizes and confounding variables. Specific cause-and-effect relationships could not be delineated at this time, nor could definitive treatment interventions. Future research must include randomised controlled trials with consistent definitions, adequate controls and sufficiently large sample sizes in order to clarify specific associations, identify cause-and-effect relationships, define treatment options and determine treatment interventions that will lessen the untoward effects on at-risk populations.

  15. Dependence of LTX plasma performance on surface conditions as determined by in situ analysis of plasma facing components

    NASA Astrophysics Data System (ADS)

    Lucia, M.; Kaita, R.; Majeski, R.; Bedoya, F.; Allain, J. P.; Abrams, T.; Bell, R. E.; Boyle, D. P.; Jaworski, M. A.; Schmitt, J. C.

    2015-08-01

    The Materials Analysis and Particle Probe (MAPP) diagnostic has been implemented on the Lithium Tokamak Experiment (LTX) at PPPL, providing the first in situ X-ray photoelectron spectroscopy (XPS) surface characterization of tokamak plasma facing components (PFCs). MAPP samples were exposed to argon glow discharge conditioning (GDC), lithium evaporations, and hydrogen tokamak discharges inside LTX. Samples were analyzed with XPS, and alterations to surface conditions were correlated against observed LTX plasma performance changes. Argon GDC caused the accumulation of nm-scale metal oxide layers on the PFC surface, which appeared to bury surface carbon and oxygen contamination and thus improve plasma performance. Lithium evaporation led to the rapid formation of a lithium oxide (Li2O) surface; plasma performance was strongly improved for sufficiently thick evaporative coatings. Results indicate that a 5 h argon GDC or a 50 nm evaporative lithium coating will both significantly improve LTX plasma performance.

  16. Testing the use of juvenile Salmo trutta L. as biomonitors of heavy metal pollution in freshwater.

    PubMed

    Lamas, S; Fernández, J A; Aboal, J R; Carballeira, A

    2007-02-01

Individual specimens of Salmo trutta were captured from four sampling sites in Galician rivers (NW Spain) affected by different types of contamination: diffuse urban waste, run-off from an unrestored dump at a copper mine and waste from a fish farm. The ages of the captured trout were established and only those belonging to the 1+ age class were selected for study. The liver and kidney were removed from each fish and analysed to determine the tissue concentrations of Cu, Fe and Zn. The results showed that: (i) the use of 1+ individuals allowed differentiation of contamination scenarios on the basis of the tissue concentrations of metal; (ii) the use of 1+ individuals allowed standardization of the time of exposure, which was sufficiently long for differential uptake to have taken place; (iii) liver tissue provided the best results, as less effort was required than for processing kidney tissue, and significant differences between sampling sites were detected because the intrapopulational variability in metal levels was lower than for kidney; and (iv) the levels of elements detected were not affected by basal tissue concentrations or by residual concentrations due to past contamination to which older trout may have been exposed. In addition, the use of 1+ trout may provide better results in annual environmental sampling surveys.

  17. How many stakes are required to measure the mass balance of a glacier?

    USGS Publications Warehouse

    Fountain, A.G.; Vecchia, A.

    1999-01-01

Glacier mass balance is estimated for South Cascade Glacier and Maclure Glacier using a one-dimensional regression of mass balance with altitude as an alternative to the traditional approach of contouring mass balance values. One attractive feature of regression is that it can be applied to sparse data sets where contouring is not possible and can provide an objective error of the resulting estimate. Regression methods yielded mass balance values equivalent to contouring methods. The effect of the number of mass balance measurements on the final value for the glacier showed that sample sizes as small as five stakes provided reasonable estimates, although the error estimates were greater than for larger sample sizes. Different spatial patterns of measurement locations showed no appreciable influence on the final value as long as different surface altitudes were intermittently sampled over the altitude range of the glacier. Two different regression equations were examined, a quadratic and a piecewise linear spline, and comparison of the results showed little sensitivity to the type of equation. These results point to the dominant effect of the gradient of mass balance with altitude of alpine glaciers compared to transverse variations. The number of mass balance measurements required to determine the glacier balance appears to be scale invariant for small glaciers, and five to ten stakes are sufficient.
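
    The regression approach amounts to fitting balance against altitude at the stakes and then area-weighting the fitted curve over the glacier's altitude distribution. A sketch of the quadratic variant, assuming a known hypsometry (function and variable names are ours; numpy's polyfit stands in for the regression):

```python
import numpy as np

def glacier_balance(stake_z, stake_b, band_z, band_area, degree=2):
    """Fit mass balance vs. altitude at the stakes (quadratic by default,
    one of the paper's two forms) and area-weight the fitted curve over
    the glacier's altitude bands."""
    coeffs = np.polyfit(stake_z, stake_b, degree)   # b(z) regression
    band_b = np.polyval(coeffs, band_z)             # balance per altitude band
    return np.average(band_b, weights=band_area)    # glacier-wide balance
```

    Even five stakes suffice to constrain a low-order b(z) curve, which is why the paper finds such small sample sizes adequate for small glaciers.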

  18. Freeway travel speed calculation model based on ETC transaction data.

    PubMed

    Weng, Jiancheng; Yuan, Rongliang; Wang, Ru; Wang, Chang

    2014-01-01

The real-time traffic flow conditions of a freeway are becoming critical information for freeway users and managers. Electronic toll collection (ETC) transaction data effectively record the operational information of vehicles on the freeway, providing a new way to estimate freeway travel speed. First, the paper analyzes the structure of ETC transaction data and presents the data preprocessing procedure. Then, a dual-level travel speed calculation model is established for different sample sizes. To ensure a sufficient sample size, ETC data from enter-leave toll plaza pairs spanning more than one road segment are used to calculate the travel speed of every road segment. A reduction coefficient α and a reliability weight θ for the sample vehicle speeds are introduced in the model. Finally, the model was verified by specially designed field experiments conducted on several freeways in Beijing at different time periods. The experimental results demonstrate an average relative error of about 6.5%, meaning that freeway travel speed can be estimated accurately by the proposed model. The proposed model helps improve freeway operation monitoring and management, and provides useful information for freeway travelers.
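
    At its core, each ETC enter-leave pair yields one sample speed as distance over travel time, and the segment estimate combines the samples using the weight θ and reduction coefficient α. The paper's exact formulas are not given in the abstract, so the weighted form below is an assumed illustration:

```python
def segment_speed(records, distance_km, alpha=0.95):
    """Illustrative segment-speed estimate from ETC enter/leave records.
    records: list of (t_enter_h, t_leave_h, theta) tuples, times in hours;
    theta is a per-sample reliability weight and alpha a reduction
    coefficient (both roles assumed, per the abstract's description)."""
    num = den = 0.0
    for t_in, t_out, theta in records:
        if t_out <= t_in:
            continue                        # discard malformed transactions
        v = distance_km / (t_out - t_in)    # this vehicle's speed, km/h
        num += theta * v
        den += theta
    return alpha * num / den if den else None
```

    For plaza pairs spanning several segments, the same per-vehicle speeds would be attributed to each intermediate segment to keep sample sizes sufficient, as the abstract describes.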

  19. Performance Analysis of Motion-Sensor Behavior for User Authentication on Smartphones

    PubMed Central

    Shen, Chao; Yu, Tianwen; Yuan, Sheng; Li, Yunpeng; Guan, Xiaohong

    2016-01-01

    The growing trend of using smartphones as personal computing platforms to access and store private information has stressed the demand for secure and usable authentication mechanisms. This paper investigates the feasibility and applicability of using motion-sensor behavior data for user authentication on smartphones. For each sample of the passcode, sensory data from motion sensors are analyzed to extract descriptive and intensive features for accurate and fine-grained characterization of users’ passcode-input actions. One-class learning methods are applied to the feature space for performing user authentication. Analyses are conducted using data from 48 participants with 129,621 passcode samples across various operational scenarios and different types of smartphones. Extensive experiments are included to examine the efficacy of the proposed approach, which achieves a false-rejection rate of 6.85% and a false-acceptance rate of 5.01%. Additional experiments on usability with respect to passcode length, sensitivity with respect to training sample size, scalability with respect to number of users, and flexibility with respect to screen size were provided to further explore the effectiveness and practicability. The results suggest that sensory data could provide useful authentication information, and this level of performance approaches sufficiency for two-factor authentication on smartphones. Our dataset is publicly available to facilitate future research. PMID:27005626
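
    One-class learning here means training on the genuine user's samples only and rejecting anything too far from the learned profile. A deliberately simple centroid-plus-threshold stand-in (not the classifiers used in the paper; names and the quantile rule are ours):

```python
import math

def train_profile(samples, quantile=0.95):
    """Build a one-class template from a genuine user's feature vectors:
    the centroid plus a distance threshold taken from the training
    distances. A minimal stand-in for one-class learners."""
    dim = len(samples[0])
    centroid = [sum(s[i] for s in samples) / len(samples) for i in range(dim)]
    dists = sorted(math.dist(s, centroid) for s in samples)
    threshold = dists[min(len(dists) - 1, int(quantile * len(dists)))]
    return centroid, threshold

def authenticate(profile, sample):
    """Accept a passcode-entry feature vector iff it lies within the
    learned distance threshold of the user's centroid."""
    centroid, threshold = profile
    return math.dist(sample, centroid) <= threshold
```

    Tightening the threshold trades false acceptances for false rejections, the same trade-off the paper quantifies at 5.01% and 6.85% respectively.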

  1. Accurate calibration and control of relative humidity close to 100% by X-raying a DOPC multilayer

    DOE PAGES

    Ma, Yicong; Ghosh, Sajal K.; Bera, Sambhunath; ...

    2015-01-01

In this study, we have designed a compact sample chamber that achieves accurate and continuous control of the relative humidity (RH) in the vicinity of 100%. A 1,2-dioleoyl-sn-glycero-3-phosphocholine (DOPC) multilayer can be used as a humidity sensor by measuring its inter-layer repeat distance (d-spacing) via X-ray diffraction. We convert from DOPC d-spacing to RH according to a theory given in the literature and previously measured data of DOPC multilamellar vesicles in polyvinylpyrrolidone (PVP) solutions. This curve can be used for calibration of RH close to 100%, a regime where conventional sensors do not have sufficient accuracy. We demonstrate that this control method can provide RH accuracies of 0.1 to 0.01%, a factor of 10-100 improvement over existing methods of humidity control. Our method provides continuous fine-tuning of RH for a single sample, whereas the PVP solution method requires new samples to be made for each PVP concentration. The use of this cell also potentially removes the need for an X-ray or neutron beam to pass through bulk water if one wishes to work close to the biologically relevant condition of nearly 100% RH.

  2. Characterization of aerosol scattering and spectral absorption by unique methods: a polar/imaging nephelometer and spectral reflectance measurements of aerosol samples collected on filters

    NASA Astrophysics Data System (ADS)

    Dolgos, Gergely; Martins, J. Vanderlei; Remer, Lorraine A.; Correia, Alexandre L.; Tabacniks, Manfredo; Lima, Adriana R.

    2010-02-01

Characterization of aerosol scattering and absorption properties is essential to accurate radiative transfer calculations in the atmosphere. Applications of this work include remote sensing of aerosols, corrections for aerosol distortions in satellite imagery of the surface, global climate models, and atmospheric beam propagation. Here we demonstrate successful instrument development at the Laboratory for Aerosols, Clouds and Optics at UMBC that better characterizes the aerosol scattering phase matrix using an imaging polar nephelometer (LACO-I-Neph) and enables measurement of spectral aerosol absorption from 200 nm to 2500 nm. The LACO-I-Neph measures the scattering phase function from 1.5° to 178.5° scattering angle with sufficient sensitivity to match theoretical expectations of Rayleigh scattering for various gases. Previous instruments either lack a sufficiently wide range of measured scattering angles or have sensitivities so low that the required sample amount is prohibitively high for in situ measurements. The LACO-I-Neph also returns the expected linear polarization signal of Rayleigh scattering. Previous work demonstrated the ability to measure the spectral absorption of aerosol particles using a reflectance-based characterization of aerosol samples collected on Nuclepore filters. This first-generation methodology yielded absorption measurements from 350 nm to 2500 nm. Here we demonstrate the possibility of extending this wavelength range into the deep UV, to 200 nm. This extended UV region holds much promise for identifying and characterizing aerosol types and species. The second-generation, deep-UV procedure requires careful choice of filter substrates; this choice is explored here and preliminary results are provided.

  3. Digital fast neutron radiography of steel reinforcing bar in concrete

    NASA Astrophysics Data System (ADS)

    Mitton, K.; Jones, A.; Joyce, M. J.

    2014-12-01

    Neutron imaging has previously been used to test for cracks, degradation and water content in concrete. However, these techniques often fall short of alternative non-destructive testing methods, such as γ-ray and X-ray imaging, particularly in terms of resolution. Further, thermal neutron techniques can be compromised by the significant expense of thermal neutron sources intense enough to yield satisfactory results, which often necessitates a reactor. Such arrangements are clearly not portable enough for field applications. This paper summarises the results of a study to investigate the potential for transmission radiography based on fast neutrons. The objective of this study was to determine whether the presence of heterogeneities in concrete, such as reinforcement structures, could be identified on the basis of variation in transmitted fast-neutron flux. Monte-Carlo simulations have been performed and the results from these are compared to those arising from practical tests using a 252Cf source. The experimental data have been acquired using a digital pulse-shape discrimination system that enables fast neutron transmission to be studied across an array of liquid scintillators placed in close proximity to samples under test, and read out in real time. Whilst this study does not achieve sufficient spatial resolution, a comparison of overall flux ratios does provide a basis for discriminating between samples with contrasting rebar content. This approach offers the potential for non-destructive testing that delivers a lower dose and offers better transportability and accessibility than competing approaches. It is also suitable for thick samples where γ-ray and X-ray methods can be limited.

  4. Evaluation and guidelines for using polyurethane foam (PUF) passive air samplers in double-dome chambers to assess semi-volatile organic compounds (SVOCs) in non-industrial indoor environments.

    PubMed

    Bohlin, Pernilla; Audy, Ondřej; Škrdlíková, Lenka; Kukučka, Petr; Vojta, Šimon; Přibylová, Petra; Prokeš, Roman; Čupr, Pavel; Klánová, Jana

    2014-11-01

    Indoor air pollution has been recognized as an important risk factor for human health, especially in areas where people tend to spend most of their time indoors. Many semi-volatile organic compounds (SVOCs) have primarily indoor sources and are present in orders of magnitude higher concentrations indoors than outdoors. Despite this, awareness of SVOCs in indoor air and assessment of the link between indoor concentrations and human health have lagged behind those of outdoor air. This is partially related to challenges associated with indoor sampling of SVOCs. Passive air samplers (PASs), which are widely accepted in established outdoor air monitoring networks, have been used to fill the knowledge gaps in indoor SVOC distributions. However, their applicability to indoor environments and to the assessment of human health risks lacks sufficient experimental support. To address this issue, we performed an indoor calibration study of polyurethane foam (PUF) PAS deployed in a double-dome chamber, covering both legacy and new SVOC classes. PUF-PAS and a continuous low-volume active air sampler (AAS) were co-deployed for a calibration period of twelve weeks. Based on the results from this evaluation, PUF-PAS in a double-bowl chamber is recommended for indoor sampling and health risk assessment of gas-phase SVOCs, including novel brominated flame retardants (nBFRs), provided that a sufficient exposure time is applied. Data for particle-associated SVOCs suffered from significant uncertainties in this study, caused by low detection levels and low precision. A more open chamber design for indoor studies may allow for higher sampling rates (RS) and better performance for the particle-associated SVOCs.

  5. Methods for treating a metathesis feedstock with metal alkoxides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cohen, Steven A.; Anderson, Donde R.; Wang, Zhe

    Various methods are provided for treating and reacting a metathesis feedstock. In one embodiment, the method includes providing a feedstock comprising a natural oil, chemically treating the feedstock with a metal alkoxide under conditions sufficient to diminish catalyst poisons in the feedstock, and, following the treating, combining a metathesis catalyst with the feedstock under conditions sufficient to metathesize the feedstock.

  6. Compact ultrahigh vacuum sample environments for x-ray nanobeam diffraction and imaging.

    PubMed

    Evans, P G; Chahine, G; Grifone, R; Jacques, V L R; Spalenka, J W; Schülli, T U

    2013-11-01

    X-ray nanobeams present the opportunity to obtain structural insight in materials with small volumes or nanoscale heterogeneity. The effective spatial resolution of the information derived from nanobeam techniques depends on the stability and precision with which the relative position of the x-ray optics and sample can be controlled. Nanobeam techniques include diffraction, imaging, and coherent scattering, with applications throughout materials science and condensed matter physics. Sample positioning is a significant mechanical challenge for x-ray instrumentation providing vacuum or controlled gas environments at elevated temperatures. Such environments often have masses that are too large for nanopositioners capable of the required positional accuracy of the order of a small fraction of the x-ray spot size. Similarly, the need to place x-ray optics as close as 1 cm to the sample places a constraint on the overall size of the sample environment. We illustrate a solution to the mechanical challenge in which compact ion-pumped ultrahigh vacuum chambers with masses of 1-2 kg are integrated with nanopositioners. The overall size of the environment is sufficiently small to allow their use with zone-plate focusing optics. We describe the design of sample environments for elevated-temperature nanobeam diffraction experiments and demonstrate in situ diffraction, reflectivity, and scanning nanobeam imaging of the ripening of Au crystallites on Si substrates.

  7. Compact ultrahigh vacuum sample environments for x-ray nanobeam diffraction and imaging

    NASA Astrophysics Data System (ADS)

    Evans, P. G.; Chahine, G.; Grifone, R.; Jacques, V. L. R.; Spalenka, J. W.; Schülli, T. U.

    2013-11-01

    X-ray nanobeams present the opportunity to obtain structural insight in materials with small volumes or nanoscale heterogeneity. The effective spatial resolution of the information derived from nanobeam techniques depends on the stability and precision with which the relative position of the x-ray optics and sample can be controlled. Nanobeam techniques include diffraction, imaging, and coherent scattering, with applications throughout materials science and condensed matter physics. Sample positioning is a significant mechanical challenge for x-ray instrumentation providing vacuum or controlled gas environments at elevated temperatures. Such environments often have masses that are too large for nanopositioners capable of the required positional accuracy of the order of a small fraction of the x-ray spot size. Similarly, the need to place x-ray optics as close as 1 cm to the sample places a constraint on the overall size of the sample environment. We illustrate a solution to the mechanical challenge in which compact ion-pumped ultrahigh vacuum chambers with masses of 1-2 kg are integrated with nanopositioners. The overall size of the environment is sufficiently small to allow their use with zone-plate focusing optics. We describe the design of sample environments for elevated-temperature nanobeam diffraction experiments and demonstrate in situ diffraction, reflectivity, and scanning nanobeam imaging of the ripening of Au crystallites on Si substrates.

  8. Estimating fish swimming metrics and metabolic rates with accelerometers: the influence of sampling frequency.

    PubMed

    Brownscombe, J W; Lennox, R J; Danylchuk, A J; Cooke, S J

    2018-06-21

    Accelerometry is growing in popularity for remotely measuring fish swimming metrics, but appropriate sampling frequencies for accurately measuring these metrics are not well studied. This research examined the influence of sampling frequency (1-25 Hz) with tri-axial accelerometer biologgers on estimates of overall dynamic body acceleration (ODBA), tail-beat frequency, swimming speed and metabolic rate of bonefish Albula vulpes in a swim-tunnel respirometer and free-swimming in a wetland mesocosm. In the swim tunnel, sampling frequencies of ≥ 5 Hz were sufficient to establish strong relationships between ODBA, swimming speed and metabolic rate. However, in free-swimming bonefish, estimates of metabolic rate were more variable below 10 Hz. Sampling frequencies should be at least twice the maximum tail-beat frequency to estimate this metric effectively, which is generally higher than those required to estimate ODBA, swimming speed and metabolic rate. While optimal sampling frequency probably varies among species due to tail-beat frequency and swimming style, this study provides a reference point with a medium body-sized sub-carangiform teleost fish, enabling researchers to measure these metrics effectively and maximize study duration. This article is protected by copyright. All rights reserved.
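
    The rule of thumb reported here, sampling at no less than twice the maximum tail-beat frequency, is a Nyquist-style criterion that can be encoded directly. A minimal sketch (function names are ours, for illustration only):

    ```python
    def min_sampling_frequency(max_tailbeat_hz: float) -> float:
        """Minimum logger rate (Hz) needed to resolve tail beats:
        at least twice the highest tail-beat frequency expected."""
        return 2.0 * max_tailbeat_hz

    def is_sufficient(sampling_hz: float, max_tailbeat_hz: float) -> bool:
        """True if the configured rate satisfies the twice-tail-beat rule."""
        return sampling_hz >= min_sampling_frequency(max_tailbeat_hz)
    ```

    Because battery and memory budgets scale with sampling rate, choosing the lowest rate that still satisfies this check is what lets a deployment maximize study duration, as the abstract notes.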

  9. Method of using deuterium-cluster foils for an intense pulsed neutron source

    DOEpatents

    Miley, George H.; Yang, Xiaoling

    2013-09-03

    A method is provided for producing neutrons, comprising: providing a converter foil comprising deuterium clusters; focusing a laser on the foil with power and energy sufficient to cause deuteron ions to separate from the foil; and striking a surface of a target with the deuteron ions from the converter foil with energy sufficient to cause neutron production by a reaction selected from the group consisting of D-D fusion, D-T fusion, D-metal nuclear spallation, and p-metal. A further method is provided for assembling a plurality of target assemblies for a target injector to be used in the previously mentioned manner. A further method is provided for producing neutrons, comprising: splitting a laser beam into a first beam and a second beam; striking a first surface of a target with the first beam, and an opposite second surface of the target with the second beam with energy sufficient to cause neutron production.

  10. An Evaluation of the Plant Density Estimator the Point-Centred Quarter Method (PCQM) Using Monte Carlo Simulation.

    PubMed

    Khan, Md Nabiul Islam; Hijbeek, Renske; Berger, Uta; Koedam, Nico; Grueters, Uwe; Islam, S M Zahirul; Hasan, Md Asadul; Dahdouh-Guebas, Farid

    2016-01-01

    In the Point-Centred Quarter Method (PCQM), the mean distance to the first nearest plant in each quadrant of a number of random sample points is converted to plant density. It is a quick method for plant density estimation. In recent publications the estimator equations of simple PCQM (PCQM1) and higher-order ones (PCQM2 and PCQM3, which use the distances to the second and third nearest plants, respectively) show discrepancies. This study attempts to review PCQM estimators in order to find the most accurate equation form. We tested the accuracy of different PCQM equations using Monte Carlo simulations in simulated plant populations (having 'random', 'aggregated' and 'regular' spatial patterns) and empirical ones. PCQM requires at least 50 sample points to ensure a desired level of accuracy. PCQM with a corrected estimator is more accurate than with a previously published estimator. The published PCQM versions (PCQM1, PCQM2 and PCQM3) show significant differences in accuracy of density estimation, i.e. the higher-order PCQM provides higher accuracy. However, the corrected PCQM versions show no significant differences among them as tested in various spatial patterns, except in plant assemblages with a strong repulsion (plant competition). If N is the number of sample points and R is distance, the corrected estimator of PCQM1 is 4(4N − 1)/(π ∑R²) rather than the published 12N/(π ∑R²); of PCQM2, 4(8N − 1)/(π ∑R²) rather than 28N/(π ∑R²); and of PCQM3, 4(12N − 1)/(π ∑R²) rather than 44N/(π ∑R²). If the spatial pattern of a plant association is random, PCQM1 with the corrected estimator and over 50 sample points would be sufficient to provide accurate density estimation. PCQM using just the nearest tree in each quadrant is therefore sufficient, which facilitates sampling of trees, particularly in areas with just a few hundred trees per hectare. PCQM3 provides the best density estimations for all types of plant assemblages, including the repulsion process. 
Since, in practice, the spatial pattern of a plant association remains unknown before starting a vegetation survey, for field applications the use of PCQM3 along with the corrected estimator is recommended. However, for sparse plant populations, where the use of PCQM3 may pose practical limitations, PCQM2 or PCQM1 may be applied. During application of PCQM in the field, care should be taken to summarize the distance data based on 'the inverse summation of squared distances' and not 'the summation of inverse squared distances' as erroneously published.
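
    The corrected estimators quoted in this record share one Pollard-type form, 4(4kN − 1)/(π ∑R²), where k is the PCQM order (1, 2 or 3) and the sum runs over all 4N measured distances. A minimal sketch (function name is ours):

    ```python
    import math

    def pcqm_density(distances, n_points, order=1):
        """Corrected PCQM density estimator: 4(4kN - 1) / (pi * sum(R^2)).
        distances: the order-th nearest-plant distance measured in each of
        the four quadrants at every sample point (4N values in total).
        Note the denominator is the summation of squared distances, not
        the summation of inverse squared distances."""
        k, n = order, n_points
        sum_r2 = sum(r * r for r in distances)
        return 4.0 * (4.0 * k * n - 1.0) / (math.pi * sum_r2)
    ```

    For example, with N = 2 sample points (eight quadrant distances) all equal to 1 m, PCQM1 gives 4(8 − 1)/(8π) plants per square metre; the published 12N form would overestimate this.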

  11. Two-sample binary phase 2 trials with low type I error and low sample size.

    PubMed

    Litwin, Samuel; Basickes, Stanley; Ross, Eric A

    2017-04-30

    We address design of two-stage clinical trials comparing experimental and control patients. Our end point is success or failure, however measured, with the null hypothesis that the chance of success in both arms is p₀ and the alternative that it is p₀ among controls and p₁ > p₀ among experimental patients. Standard rules will have the null hypothesis rejected when the number of successes in the (E)xperimental arm, E, sufficiently exceeds C, that among (C)ontrols. Here, we combine one-sample rejection decision rules, E ≥ m, with two-sample rules of the form E − C > r to achieve two-sample tests with low sample number and low type I error. We find designs with sample numbers not far from the minimum possible using standard two-sample rules, but with type I error of 5% rather than the 15% or 20% associated with them, and of equal power. This level of type I error is achieved locally, near the stated null, and increases to 15% or 20% when the null is significantly higher than specified. We increase the attractiveness of these designs to patients by using 2:1 randomization. Examples of the application of this new design covering both high and low success rates under the null hypothesis are provided. Copyright © 2017 John Wiley & Sons, Ltd.
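
    The combined rejection rule described in this abstract (a one-sample threshold E ≥ m joined with a two-sample margin E − C > r) and its exact local type I error under the null rate p₀ can be computed by enumerating the two binomial distributions. A minimal sketch; the sample sizes and thresholds in the usage note are illustrative choices of ours, not the paper's designs:

    ```python
    from math import comb

    def rejects(e, c, m, r):
        """Combined rule: reject the null when E >= m and E - C > r."""
        return e >= m and (e - c) > r

    def type_i_error(n_exp, n_ctrl, p0, m, r):
        """Exact rejection probability when both arms truly succeed at rate p0,
        by summing the joint binomial pmf over the rejection region."""
        def pmf(n, k, p):
            return comb(n, k) * p ** k * (1 - p) ** (n - k)
        return sum(
            pmf(n_exp, e, p0) * pmf(n_ctrl, c, p0)
            for e in range(n_exp + 1)
            for c in range(n_ctrl + 1)
            if rejects(e, c, m, r)
        )
    ```

    With 2:1 randomization, n_exp = 2 × n_ctrl; for example, type_i_error(20, 10, 0.3, 10, 3) gives the local alpha for a hypothetical 30-patient stage. Relaxing m can only enlarge the rejection region, so the error rate is monotone in that threshold.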

  12. Method for producing strain tolerant multifilamentary oxide superconducting wire

    DOEpatents

    Finnemore, D.K.; Miller, T.A.; Ostenson, J.E.; Schwartzkopf, L.A.; Sanders, S.C.

    1994-07-19

    A strain tolerant multifilamentary wire capable of carrying superconducting currents is provided comprising a plurality of discontinuous filaments formed from a high temperature superconducting material. The discontinuous filaments have a length at least several orders of magnitude greater than the filament diameter and are sufficiently strong while in an amorphous state to withstand compaction. A normal metal is interposed between and binds the discontinuous filaments to form a normal metal matrix capable of withstanding heat treatment for converting the filaments to a superconducting state. The geometry of the filaments within the normal metal matrix provides substantial filament-to-filament overlap, and the normal metal is sufficiently thin to allow supercurrent transfer between the overlapped discontinuous filaments but is also sufficiently thick to provide strain relief to the filaments. 6 figs.

  13. Method for producing strain tolerant multifilamentary oxide superconducting wire

    DOEpatents

    Finnemore, Douglas K.; Miller, Theodore A.; Ostenson, Jerome E.; Schwartzkopf, Louis A.; Sanders, Steven C.

    1994-07-19

    A strain tolerant multifilamentary wire capable of carrying superconducting currents is provided comprising a plurality of discontinuous filaments formed from a high temperature superconducting material. The discontinuous filaments have a length at least several orders of magnitude greater than the filament diameter and are sufficiently strong while in an amorphous state to withstand compaction. A normal metal is interposed between and binds the discontinuous filaments to form a normal metal matrix capable of withstanding heat treatment for converting the filaments to a superconducting state. The geometry of the filaments within the normal metal matrix provides substantial filament-to-filament overlap, and the normal metal is sufficiently thin to allow supercurrent transfer between the overlapped discontinuous filaments but is also sufficiently thick to provide strain relief to the filaments.

  14. Three Dimensional Imaging of Paraffin Embedded Human Lung Tissue Samples by Micro-Computed Tomography

    PubMed Central

    Scott, Anna E.; Vasilescu, Dragos M.; Seal, Katherine A. D.; Keyes, Samuel D.; Mavrogordato, Mark N.; Hogg, James C.; Sinclair, Ian; Warner, Jane A.; Hackett, Tillie-Louise; Lackie, Peter M.

    2015-01-01

    Background Understanding the three-dimensional (3-D) micro-architecture of lung tissue can provide insights into the pathology of lung disease. Micro computed tomography (µCT) has previously been used to elucidate lung 3D histology and morphometry in fixed samples that have been stained with contrast agents or air inflated and dried. However, non-destructive microstructural 3D imaging of formalin-fixed paraffin embedded (FFPE) tissues would facilitate retrospective analysis of extensive archives of FFPE lung samples with linked clinical data. Methods FFPE human lung tissue samples (n = 4) were scanned using a Nikon metrology µCT scanner. Semi-automatic techniques were used to segment the 3D structure of airways and blood vessels. Airspace size (mean linear intercept, Lm) was measured on µCT images and on matched histological sections from the same FFPE samples imaged by light microscopy to validate µCT imaging. Results The µCT imaging protocol provided contrast between tissue and paraffin in FFPE samples (15 mm × 7 mm). Resolution (voxel size 6.7 µm) in the reconstructed images was sufficient for semi-automatic image segmentation of airways and blood vessels as well as quantitative airspace analysis. The scans were also used to scout for regions of interest, enabling time-efficient preparation of conventional histological sections. The Lm measurements from µCT images were not significantly different from those from matched histological sections. Conclusion We demonstrated how non-destructive imaging of routinely prepared FFPE samples by laboratory µCT can be used to visualize and assess the 3D morphology of the lung, including by morphometric analysis. PMID:26030902
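
    The mean linear intercept (Lm) used here can be illustrated on a single binary test line through a segmented image: average the lengths of the airspace runs and scale by the voxel size. A simplified sketch (helper name is ours; real morphometry averages many test lines across the volume and handles truncated edge chords):

    ```python
    def mean_linear_intercept(profile, voxel_um):
        """Mean airspace chord length (Lm) along one binary test line.
        profile: sequence of 0 (airspace) / 1 (tissue) voxel labels.
        Returns the mean length of airspace runs, in micrometres."""
        chords, run = [], 0
        for v in profile:
            if v == 0:
                run += 1
            elif run:
                chords.append(run)   # a chord ends at a tissue voxel
                run = 0
        if run:                      # simplification: count a trailing chord
            chords.append(run)
        if not chords:
            return 0.0
        return voxel_um * sum(chords) / len(chords)
    ```

    At the 6.7 µm voxel size reported here, a line reading airspace-tissue-airspace with runs of 2 and 3 voxels yields Lm = 6.7 × 2.5 = 16.75 µm, which is the kind of value compared against matched histology in this study's validation.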

  15. A molecular simulation protocol to avoid sampling redundancy and discover new states.

    PubMed

    Bacci, Marco; Vitalis, Andreas; Caflisch, Amedeo

    2015-05-01

    For biomacromolecules or their assemblies, experimental knowledge is often restricted to specific states. Ambiguity pervades simulations of these complex systems because there is no prior knowledge of relevant phase space domains, and sampling recurrence is difficult to achieve. In molecular dynamics methods, ruggedness of the free energy surface exacerbates this problem by slowing down the unbiased exploration of phase space. Sampling is inefficient if dwell times in metastable states are large. We suggest a heuristic algorithm to terminate and reseed trajectories run in multiple copies in parallel. It uses a recent method to order snapshots, which provides notions of "interesting" and "unique" for individual simulations. We define criteria to guide the reseeding of runs from more "interesting" points if they sample overlapping regions of phase space. Using a pedagogical example and an α-helical peptide, the approach is demonstrated to amplify the rate of exploration of phase space and to discover metastable states not found by conventional sampling schemes. Evidence is provided that accurate kinetics and pathways can be extracted from the simulations. The method, termed PIGS for Progress Index Guided Sampling, proceeds in unsupervised fashion, is scalable, and benefits synergistically from larger numbers of replicas. Results confirm that the underlying ideas are appropriate and sufficient to enhance sampling. In molecular simulations, errors caused by not exploring relevant domains in phase space are always unquantifiable and can be arbitrarily large. Our protocol adds to the toolkit available to researchers in reducing these types of errors. This article is part of a Special Issue entitled "Recent developments of molecular dynamics". Copyright © 2014 Elsevier B.V. All rights reserved.

  16. 76 FR 62044 - Alternative Testing Requirements for Small Batch Manufacturers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-10-06

    ... every manufacturer of a children's product that is subject to a children's product safety rule shall submit sufficient samples of the children's product, or samples that are identical in all material... compliance with such children's product safety rule. Further, section 14(i)(2) requires continued testing of...

  17. A COMPARISON OF SIX BENTHIC MACROINVERTEBRATE SAMPLING METHODS IN FOUR LARGE RIVERS

    EPA Science Inventory

    In 1999, a study was conducted to compare six macroinvertebrate sampling methods in four large (boatable) rivers that drain into the Ohio River. Two methods each were adapted from existing methods used by the USEPA, USGS and Ohio EPA. Drift nets were unable to collect a suffici...

  18. Identification of Novel Prostate Cancer-Causative Gene Mutations by Representational Difference Analysis of Microdissected Prostate Cancer

    DTIC Science & Technology

    2001-03-01

    paired samples of microdissected benign and malignant prostate epithelium. The resulting subtraction products were cloned and screened in Southern blots... benign and malignant human prostate cancer. Data is given to show that microdissected tissue samples retain RNA of sufficient quality to perform gene

  19. Analysis of variograms with various sample sizes from a multispectral image

    USDA-ARS?s Scientific Manuscript database

    Variogram plays a crucial role in remote sensing application and geostatistics. It is very important to estimate variogram reliably from sufficient data. In this study, the analysis of variograms with various sample sizes of remotely sensed data was conducted. A 100x100-pixel subset was chosen from ...

  20. ESTIMATION OF TOTAL DISSOLVED NITRATE LOAD IN NATURAL STREAM FLOWS USING AN IN-STREAM MONITOR

    EPA Science Inventory

    Estuaries respond rapidly to rain events and the nutrients carried by inflowing rivers such that discrete samples at weekly or monthly intervals are inadequate to catch the maxima and minima in nutrient variability. To acquire data with sufficient sampling frequency to realistica...

  1. Impact of Market Behavior, Fleet Composition, and Ancillary Services on Revenue Sufficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frew, Bethany

    This presentation provides an overview of new and ongoing NREL research that aims to improve our understanding of reliability and revenue sufficiency challenges through modeling tools within a markets framework.

  2. Method of dehalogenation using diamonds

    DOEpatents

    Farcasiu, Malvina; Kaufman, Phillip B.; Ladner, Edward P.; Anderson, Richard R.

    2000-01-01

    A method for preparing olefins and halogenated olefins is provided comprising contacting halogenated compounds with diamonds for a sufficient time and at a sufficient temperature to convert the halogenated compounds to olefins and halogenated olefins via elimination reactions.

  3. An iterative and targeted sampling design informed by habitat suitability models for detecting focal plant species over extensive areas.

    PubMed

    Wang, Ophelia; Zachmann, Luke J; Sesnie, Steven E; Olsson, Aaryn D; Dickson, Brett G

    2014-01-01

    Prioritizing areas for management of non-native invasive plants is critical, as invasive plants can negatively impact plant community structure. Extensive and multi-jurisdictional inventories are essential to prioritize actions aimed at mitigating the impact of invasions and changes in disturbance regimes. However, previous work devoted little effort to devising sampling methods sufficient to assess the scope of multi-jurisdictional invasion over extensive areas. Here we describe a large-scale sampling design that used species occurrence data, habitat suitability models, and iterative and targeted sampling efforts to sample five species and satisfy two key management objectives: 1) detecting non-native invasive plants across previously unsampled gradients, and 2) characterizing the distribution of non-native invasive plants at landscape to regional scales. Habitat suitability models of five species were based on occurrence records and predictor variables derived from topography, precipitation, and remotely sensed data. We stratified and established field sampling locations according to predicted habitat suitability and phenological, substrate, and logistical constraints. Across previously unvisited areas, we detected at least one of our focal species on 77% of plots. In turn, we used detections from 2011 to improve habitat suitability models and sampling efforts in 2012, as well as additional spatial constraints to increase detections. These modifications resulted in a 96% detection rate at plots. The range of habitat suitability values that identified highly and less suitable habitats and their environmental conditions corresponded to field detections with mixed levels of agreement. Our study demonstrated that an iterative and targeted sampling framework can address sampling bias, reduce time costs, and increase detections. Other studies can extend the sampling framework to develop methods in other ecosystems to provide detection data. 
The sampling methods implemented here provide a meaningful tool when understanding the potential distribution and habitat of species over multi-jurisdictional and extensive areas is needed for achieving management objectives.

  4. An Iterative and Targeted Sampling Design Informed by Habitat Suitability Models for Detecting Focal Plant Species over Extensive Areas

    PubMed Central

    Wang, Ophelia; Zachmann, Luke J.; Sesnie, Steven E.; Olsson, Aaryn D.; Dickson, Brett G.

    2014-01-01

    Prioritizing areas for management of non-native invasive plants is critical, as invasive plants can negatively impact plant community structure. Extensive and multi-jurisdictional inventories are essential to prioritize actions aimed at mitigating the impact of invasions and changes in disturbance regimes. However, previous work devoted little effort to devising sampling methods sufficient to assess the scope of multi-jurisdictional invasion over extensive areas. Here we describe a large-scale sampling design that used species occurrence data, habitat suitability models, and iterative and targeted sampling efforts to sample five species and satisfy two key management objectives: 1) detecting non-native invasive plants across previously unsampled gradients, and 2) characterizing the distribution of non-native invasive plants at landscape to regional scales. Habitat suitability models of five species were based on occurrence records and predictor variables derived from topography, precipitation, and remotely sensed data. We stratified and established field sampling locations according to predicted habitat suitability and phenological, substrate, and logistical constraints. Across previously unvisited areas, we detected at least one of our focal species on 77% of plots. In turn, we used detections from 2011 to improve habitat suitability models and sampling efforts in 2012, as well as additional spatial constraints to increase detections. These modifications resulted in a 96% detection rate at plots. The range of habitat suitability values that identified highly and less suitable habitats and their environmental conditions corresponded to field detections with mixed levels of agreement. Our study demonstrated that an iterative and targeted sampling framework can address sampling bias, reduce time costs, and increase detections. Other studies can extend the sampling framework to develop methods in other ecosystems to provide detection data. 
The sampling methods implemented here provide a meaningful tool when understanding the potential distribution and habitat of species over multi-jurisdictional and extensive areas is needed for achieving management objectives. PMID:25019621

  5. No independent association between insufficient sleep and childhood obesity in the National Survey of Children's Health.

    PubMed

    Hassan, Fauziya; Davis, Matthew M; Chervin, Ronald D

    2011-04-15

    Prior studies have supported an association between insufficient sleep and childhood obesity, but most have not examined nationally representative samples or considered potential sociodemographic confounders. The main objective of this study was to use a large, nationally representative dataset to examine the possibility that insufficient sleep is associated with obesity in children, independent of sociodemographic factors. The National Survey of Children's Health is a national survey of U.S. households contacted by random digit dialing. In 2003, caregivers of 102,353 US children were surveyed. Age- and sex-specific body mass index (BMI), based on parental report of child height and weight, was available for 81,390 children aged 6-17 years. Caregivers were asked, "How many nights of sufficient sleep did your child have in the past week?" The odds of obesity (BMI ≥ 95th percentile) versus healthy weight (BMI 5th-84th percentile) was regressed on reported nights of sufficient sleep per week (categorized as 0-2, 3-5, or 6-7). Sociodemographic variables included gender, race, household education, and family income. Analyses incorporated sampling weights to derive nationally representative estimates for a 2003 population of 34 million youth. Unadjusted bivariate analyses indicated that children aged 6-11 years with 0-2 nights of sufficient sleep, in comparison to those with 6-7 nights, were more likely to be obese (OR = 1.7, 95% CI [1.2-2.3]). Among children aged 12-17 years, odds of obesity were lower among children with 3-5 nights of sufficient sleep in comparison to those with 6-7 nights (0.8, 95% CI: 0.7-0.9). However, in both age groups, adjustment for race/ethnicity, gender, family income, and household education eliminated the statistical significance of the association between nights of sufficient sleep and BMI. 
In this national sample, insufficient sleep, as judged by parents, is inconsistently associated with obesity in bivariate analyses, and not associated with obesity after adjustment for sociodemographic variables. These findings from a nationally representative sample are necessarily subject to parental perceptions, but nonetheless serve as an important reminder that the role of insufficient sleep in the childhood obesity epidemic remains unproven.

  6. Orbital Motion of Young Binaries in Ophiuchus and Upper Centaurus–Lupus

    NASA Astrophysics Data System (ADS)

    Schaefer, G. H.; Prato, L.; Simon, M.

    2018-03-01

    We present measurements of the orbital positions and flux ratios of 17 binary and triple systems in the Ophiuchus star-forming region and the Upper Centaurus–Lupus cluster based on adaptive optics imaging at the Keck Observatory. We report the detection of visual companions in MML 50 and MML 53 for the first time, as well as the possible detection of a third component in WSB 21. For six systems in our sample, our measurements provide a second orbital position following their initial discoveries over a decade ago. For eight systems with sufficient orbital coverage, we analyze the range of orbital solutions that fit the data. Ultimately, these observations will help provide the groundwork toward measuring precise masses for these pre-main-sequence stars and understanding the distribution of orbital parameters in young multiple systems.

  7. ADHD Symptoms in Preschool Children: Examining Psychometric Properties using IRT

    PubMed Central

    Purpura, David J.; Wilson, Shauna B.; Lonigan, Christopher J.

    2010-01-01

    Clear and empirically supported diagnostic symptoms are important for proper diagnosis and treatment of psychological disorders. Unfortunately, symptoms of many disorders presented in the DSM-IV-TR lack sufficient psychometric evaluation. In this study, an Item Response Theory analysis was applied to ratings of the 18 Attention-Deficit/Hyperactivity Disorder (ADHD) symptoms in 268 preschool children. Children (55% boys) in this sample ranged in age from 37 to 74 months; 80.4% were identified as African American, 15.1% Caucasian, and 4.5% other ethnicity. Dichotomous and polytomous scoring methods for rating ADHD symptoms were compared and psychometric properties of these symptoms were calculated. Symptom-level analyses revealed that, in general, the current symptoms provided useful information in diagnosing ADHD in preschool children; however, several symptoms provided redundant information and should be examined further. PMID:20822267
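    As a concrete illustration of the IRT machinery such an analysis rests on, the information an item contributes under a two-parameter logistic (2PL) model has a simple closed form. The item parameters below are hypothetical, not estimates from the study.

```python
import numpy as np

# Item information under a 2PL model:
#   I(theta) = a^2 * P(theta) * (1 - P(theta)),
# where P(theta) is the probability of endorsing the item at trait level theta.
def item_information(theta, a, b):
    p = 1.0 / (1.0 + np.exp(-a * (theta - b)))   # 2PL endorsement probability
    return a ** 2 * p * (1.0 - p)

theta = np.linspace(-3, 3, 121)                  # latent trait grid
info = item_information(theta, a=1.5, b=0.5)     # one hypothetical symptom item
peak = theta[np.argmax(info)]                    # information peaks near b
```

Items whose information curves overlap heavily are exactly the "redundant" items such a symptom-level analysis flags.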

  8. A prospective gating method to acquire a diverse set of free-breathing CT images for model-based 4DCT

    NASA Astrophysics Data System (ADS)

    O'Connell, D.; Ruan, D.; Thomas, D. H.; Dou, T. H.; Lewis, J. H.; Santhanam, A.; Lee, P.; Low, D. A.

    2018-02-01

    Breathing motion modeling requires observation of tissues at sufficiently distinct respiratory states for proper 4D characterization. This work proposes a method to improve sampling of the breathing cycle with limited imaging dose. We designed and tested a prospective free-breathing acquisition protocol with a simulation using datasets from five patients imaged with a model-based 4DCT technique. Each dataset contained 25 free-breathing fast helical CT scans with simultaneous breathing surrogate measurements. Tissue displacements were measured using deformable image registration. A correspondence model related tissue displacement to the surrogate. Model residual was computed by comparing predicted displacements to image registration results. To determine a stopping criterion for the prospective protocol, i.e. when the breathing cycle had been sufficiently sampled, subsets of N scans where 5 ⩽ N ⩽ 9 were used to fit reduced models for each patient. A previously published metric was employed to describe the phase coverage, or ‘spread’, of the respiratory trajectories of each subset. The minimum phase coverage necessary to achieve mean model residual within 0.5 mm of the full 25-scan model was determined and used as the stopping criterion. Using the patient breathing traces, a prospective acquisition protocol was simulated. In all patients, phase coverage greater than the threshold necessary for model accuracy within 0.5 mm of the 25-scan model was achieved in six or fewer scans. The prospectively selected respiratory trajectories ranked in the (97.5 ± 4.2)th percentile among subsets of the originally sampled scans on average. Simulation results suggest that the proposed prospective method provides an effective means to sample the breathing cycle with limited free-breathing scans.
One application of the method is to reduce the imaging dose of a previously published model-based 4DCT protocol to 25% of its original value while achieving mean model residual within 0.5 mm.
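    The stopping rule can be sketched compactly. The coverage metric below (fraction of the phase circle lying near a sampled phase) is a simplified stand-in for the published spread metric, and the surrogate phases are simulated rather than measured.

```python
import numpy as np

# Simplified sketch of the prospective protocol: keep acquiring scans until
# the sampled breathing phases cover the cycle well enough, then stop.
def phase_coverage(phases):
    """Fraction of the [0, 1) phase circle within 0.1 of a sampled phase."""
    grid = np.linspace(0.0, 1.0, 200, endpoint=False)
    d = np.abs(grid[:, None] - np.asarray(phases)[None, :])
    d = np.minimum(d, 1.0 - d)                   # circular (wrap-around) distance
    return float(np.mean(d.min(axis=1) < 0.1))

rng = np.random.default_rng(1)
acquired = []
for _ in range(25):                              # at most 25 candidate scans
    acquired.append(rng.random())                # surrogate phase of a new scan
    if phase_coverage(acquired) >= 0.9:          # stopping criterion
        break
n_scans = len(acquired)                          # scans needed before stopping
```

With phases scattered roughly uniformly, the threshold is typically reached well before all 25 scans are used, which is the dose-saving mechanism of the protocol.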

  9. Toward MRI microimaging of single biological cells

    NASA Astrophysics Data System (ADS)

    Seeber, Derek Allan

    There is a great advantage in signal-to-noise ratio (SNR) that can be obtained in nuclear magnetic resonance (NMR) on very small samples (having spatial dimensions of ~100 μm or less) if one employs NMR "microcoils" of similarly small dimensions. These gains in SNR could enable magnetic resonance imaging (MRI) microscopy with spatial resolutions of ~1-2 μm, much better than currently available. We report the design and testing of an NMR microcoil receiver apparatus, employing solenoidal microcoils with dimensions of tens to hundreds of microns, using an applied field of 9 Tesla (proton frequency 383 MHz). For the smallest receiver coils we attain sensitivity sufficient to observe proton NMR with an SNR of one in a single scan applied to a ~10 μm³ (10 fl) water sample, containing 7 × 10¹¹ total proton spins. In addition to the NMR applications, microcoils have been applied to MRI, producing images with spatial resolutions as low as 2 μm × 3.5 μm × 14.8 μm on phantom images of rods and beads. This resolution can be further improved. MRI imaging of small sample volumes requires significant hardware modifications and improvements, all of which are discussed. Specifically, MRI microscopy requires very strong (>10 T/m), rapidly switchable triaxial magnetic field gradients. We report the design and construction of such a triaxial gradient system, producing gradients substantially greater than 15 T/m in all three directions, x, y, and z (as high as 50 T/m for the x direction). The gradients are powered by a custom-designed power supply capable of providing currents in excess of 200 amps and switching times of less than 5 μs, corresponding to slew rates of greater than 10⁷ T/m/s. The gradients are adequately uniform (within 5% over a volume of 600 μm³) and sufficient for microcoil MRI of small samples.

  10. The effect of sampling rate on interpretation of the temporal characteristics of radiative and convective heating in wildland flames

    Treesearch

    David Frankman; Brent W. Webb; Bret W. Butler; Daniel Jimenez; Michael Harrington

    2012-01-01

    Time-resolved radiative and convective heating measurements were collected on a prescribed burn in coniferous fuels at a sampling frequency of 500 Hz. Evaluation of the data in the time and frequency domain indicate that this sampling rate was sufficient to capture the temporal fluctuations of radiative and convective heating. The convective heating signal contained...
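    A frequency-domain adequacy check of this kind can be sketched with a synthetic signal: if essentially no spectral power lies above some candidate cutoff frequency, then sampling at twice that cutoff already captures the fluctuations. The signal below is simulated; the actual heat-flux data are not reproduced here.

```python
import numpy as np

# Check whether a sampling rate captures essentially all signal power by
# measuring the fraction of spectral power above a candidate cutoff.
fs = 500.0                                   # sampling frequency, Hz
t = np.arange(0, 10, 1 / fs)                 # 10 s of samples
# Synthetic stand-in for a heating signal: slow and faster components.
signal = np.sin(2 * np.pi * 3 * t) + 0.3 * np.sin(2 * np.pi * 40 * t)

spectrum = np.abs(np.fft.rfft(signal)) ** 2  # one-sided power spectrum
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

cutoff = 100.0                               # candidate bandwidth limit, Hz
high_frac = spectrum[freqs > cutoff].sum() / spectrum.sum()
# If high_frac is negligible, sampling at 2 * cutoff would already suffice.
```

For the synthetic signal above, all power sits well below 100 Hz, so `high_frac` is effectively zero; real flame data would be judged the same way.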

  11. Review of Sample Size for Structural Equation Models in Second Language Testing and Learning Research: A Monte Carlo Approach

    ERIC Educational Resources Information Center

    In'nami, Yo; Koizumi, Rie

    2013-01-01

    The importance of sample size, although widely discussed in the literature on structural equation modeling (SEM), has not been widely recognized among applied SEM researchers. To narrow this gap, we focus on second language testing and learning studies and examine the following: (a) Is the sample size sufficient in terms of precision and power of…

  12. An Efficient Referencing And Sample Positioning System To Investigate Heterogeneous Substances With Combined Microfocused Synchrotron X-ray Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangenberg, Thomas; Goettlicher, Joerg; Steininger, Ralph

    2009-01-29

    A referencing and sample positioning system has been developed to transfer object positions measured with an offline microscope to a synchrotron experimental station. The accuracy should be sufficient to deal with heterogeneous samples on the micrometer scale. Together with an online fluorescence-mapping visualisation, the optical alignment helps to optimize measuring procedures for combined microfocused X-ray techniques.
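    The core coordinate-transfer step of such a system can be sketched as a least-squares affine fit between fiducial positions seen by the offline microscope and by the beamline stage. The point coordinates below are made up for illustration.

```python
import numpy as np

# Fit an affine transform mapping offline-microscope coordinates to stage
# coordinates from a few fiducial points, via linear least squares.
mic = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

# Hypothetical matching stage positions: rotated, scaled, and offset.
theta = np.deg2rad(5.0)
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
stage = 1.02 * mic @ R.T + np.array([3.0, -1.5])

A = np.hstack([mic, np.ones((4, 1))])        # design matrix [x, y, 1]
coeffs, *_ = np.linalg.lstsq(A, stage, rcond=None)
predicted = A @ coeffs                       # should reproduce stage coords
```

With three or more non-collinear fiducials, the fit is exact for a true affine relationship; extra fiducials average out measurement error.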

  13. Minimum Required Attention: A Human-Centered Approach to Driver Inattention.

    PubMed

    Kircher, Katja; Ahlstrom, Christer

    2017-05-01

    To propose a driver attention theory based on the notion of driving as a satisficing and partially self-paced task and, within this framework, present a definition for driver inattention. Many definitions of driver inattention and distraction have been proposed, but they are difficult to operationalize, and they are either unreasonably strict and inflexible or suffer from hindsight bias. Existing definitions of driver distraction are reviewed and their shortcomings identified. We then present the minimum required attention (MiRA) theory to overcome these shortcomings. Suggestions on how to operationalize MiRA are also presented. MiRA describes what role the driver's attention plays in the shared "situation awareness of the traffic system." A driver is considered attentive when sampling sufficient information to meet the demands of the system, namely, when he or she fulfills the preconditions to be able to form and maintain a good enough mental representation of the situation. A driver should only be considered inattentive when information sampling is not sufficient, regardless of whether the driver is concurrently executing an additional task or not. The MiRA theory builds on well-established driver attention theories. It goes beyond available driver distraction definitions by first defining what a driver needs to be attentive to, being free from hindsight bias, and allowing the driver to adapt to the current demands of the traffic situation through satisficing and self-pacing. MiRA has the potential to provide the stepping stone for unbiased and operationalizable inattention detection and classification.

  14. Selection of reliable reference genes for quantitative real-time PCR gene expression analysis in Jute (Corchorus capsularis) under stress treatments

    PubMed Central

    Niu, Xiaoping; Qi, Jianmin; Zhang, Gaoyang; Xu, Jiantang; Tao, Aifen; Fang, Pingping; Su, Jianguang

    2015-01-01

    To accurately measure gene expression using quantitative reverse transcription PCR (qRT-PCR), reliable reference gene(s) are required for data normalization. Corchorus capsularis, an annual herbaceous fiber crop with predominant biodegradability and renewability, has not been investigated for the stability of reference genes with qRT-PCR. In this study, 11 candidate reference genes were selected and their expression levels were assessed using qRT-PCR. To account for the influence of experimental approach and tissue type, 22 different jute samples were selected from abiotic and biotic stress conditions as well as three different tissue types. The stability of the candidate reference genes was evaluated using the geNorm, NormFinder, and BestKeeper programs, and comprehensive rankings of gene stability were generated by aggregate analysis. For the biotic stress and NaCl stress subsets, ACT7 and RAN were suitable as stable reference genes for gene expression normalization. For the PEG stress subset, UBC and DnaJ were sufficient for accurate normalization. For the tissue subset, four reference genes, TUBβ, UBI, EF1α, and RAN, were sufficient for accurate normalization. The selected genes were further validated by comparing expression profiles of WRKY15 in various samples, and two stable reference genes were recommended for accurate normalization of qRT-PCR data. Our results provide researchers with appropriate reference genes for qRT-PCR in C. capsularis, and will facilitate gene expression studies under these conditions. PMID:26528312

  15. Influenza forecasting with Google Flu Trends.

    PubMed

    Dugas, Andrea Freyer; Jalalpour, Mehdi; Gel, Yulia; Levin, Scott; Torcaso, Fred; Igusa, Takeru; Rothman, Richard E

    2013-01-01

    We developed a practical influenza forecast model based on real-time, geographically focused, and easy to access data, designed to provide individual medical centers with advanced warning of the expected number of influenza cases, thus allowing for sufficient time to implement interventions. Secondly, we evaluated the effects of incorporating a real-time influenza surveillance system, Google Flu Trends, and meteorological and temporal information on forecast accuracy. Forecast models designed to predict one week in advance were developed from weekly counts of confirmed influenza cases over seven seasons (2004-2011) divided into seven training and out-of-sample verification sets. Forecasting procedures using classical Box-Jenkins, generalized linear models (GLM), and generalized linear autoregressive moving average (GARMA) methods were employed to develop the final model and assess the relative contribution of external variables such as Google Flu Trends, meteorological data, and temporal information. A GARMA(3,0) forecast model with Negative Binomial distribution integrating Google Flu Trends information provided the most accurate influenza case predictions. The model, on average, predicts weekly influenza cases during 7 out-of-sample outbreaks within 7 cases for 83% of estimates. Google Flu Trends data was the only source of external information to provide statistically significant forecast improvements over the base model in four of the seven out-of-sample verification sets. Overall, the p-value of adding this external information to the model is 0.0005. The other exogenous variables did not yield a statistically significant improvement in any of the verification sets. Integer-valued autoregression of influenza cases provides a strong base forecast model, which is enhanced by the addition of Google Flu Trends, confirming the predictive capabilities of search query based syndromic surveillance.
This accessible and flexible forecast model can be used by individual medical centers to provide advanced warning of future influenza cases.

  16. Ancient DNA from marine mammals: studying long-lived species over ecological and evolutionary timescales.

    PubMed

    Foote, Andrew D; Hofreiter, Michael; Morin, Phillip A

    2012-01-20

    Marine mammals have long generation times and broad, difficult-to-sample distributions, which make inferring evolutionary and demographic changes using field studies of extant populations challenging. However, molecular analyses from sub-fossil or historical materials of marine mammals such as bone, tooth, baleen, skin, fur, whiskers and scrimshaw using ancient DNA (aDNA) approaches provide an opportunity for investigating such changes over evolutionary and ecological timescales. Here, we review the application of aDNA techniques to the study of marine mammals. Most of the studies have focused on detecting changes in genetic diversity following periods of exploitation and environmental change. To date, these studies have shown that even small sample sizes can provide useful information on historical genetic diversity. Ancient DNA has also been used in investigations of changes in distribution and range of marine mammal species; we review these studies and discuss the limitations of such 'presence only' studies. Combining aDNA data with stable isotopes can provide further insights into changes in ecology and we review past studies and suggest future potential applications. We also discuss studies reconstructing inter- and intra-specific phylogenies from aDNA sequences and discuss how aDNA sequences could be used to estimate mutation rates. Finally, we highlight some of the problems of aDNA studies on marine mammals, such as obtaining sufficient sample sizes and calibrating for the marine reservoir effect when radiocarbon-dating such wide-ranging species. Copyright © 2011 Elsevier GmbH. All rights reserved.

  17. Optimizing cord blood sample cryopreservation.

    PubMed

    Harris, David T

    2012-03-01

    Cord blood (CB) banking is becoming more and more commonplace throughout the medical community, both in the USA and elsewhere. It is now generally recognized that storage of CB samples in multiple aliquots is the preferred approach to banking because it allows the greatest number of uses of the sample. However, it is unclear which are the best methodologies for cryopreservation and storage of the sample aliquots. In the current study we analyzed variables that could affect these processes. CB samples were processed into mononuclear cells (MNC) and frozen in commercially available human serum albumin (HSA) or autologous CB plasma using cryovials of various sizes and cryobags. The bacteriophage phiX174 was used as a model virus to test for cross-contamination. We observed that cryopreservation of CB in HSA, undiluted autologous human plasma and 50% diluted plasma was equivalent in terms of cell recovery and cell viability. We also found that cryopreservation of CB samples in either cryovials or cryobags displayed equivalent thermal characteristics. Finally, we demonstrated that overwrapping the CB storage container in an impermeable plastic sheathing was sufficient to prevent cross-sample viral contamination during prolonged storage in the liquid phase of a liquid nitrogen dewar. CB may be cryopreserved in either vials or bags without concern for temperature stability. Sample overwrapping is sufficient to prevent microbiologic contamination of the samples while in liquid-phase liquid nitrogen storage.

  18. Likelihood inference of non-constant diversification rates with incomplete taxon sampling.

    PubMed

    Höhna, Sebastian

    2014-01-01

    Large-scale phylogenies provide a valuable source to study background diversification rates and investigate whether the rates have changed over time. Unfortunately most large-scale, dated phylogenies are sparsely sampled (fewer than 5% of the described species) and taxon sampling is not uniform. Instead, taxa are frequently sampled to obtain at least one representative per subgroup (e.g. family) and thus to maximize diversity (diversified sampling). So far, such complications have been ignored, potentially biasing the conclusions that have been reached. In this study I derive the likelihood of a birth-death process with non-constant (time-dependent) diversification rates and diversified taxon sampling. Using simulations I test whether the true parameters and the sampling method can be recovered when the trees are small or medium sized (fewer than 200 taxa). The results show that the diversification rates can be inferred and the estimates are unbiased for large trees but are biased for small trees (fewer than 50 taxa). Furthermore, model selection by means of Akaike's Information Criterion favors the true model if the true rates differ sufficiently from alternative models (e.g. the birth-death model is recovered over a pure-birth model if the extinction rate is sufficiently large). Finally, I applied six different diversification rate models (ranging from a constant-rate pure-birth process to a birth-death process with decreasing speciation rate, but excluding any rate-shift models) to three large-scale empirical phylogenies (ants, mammals and snakes with respectively 149, 164 and 41 sampled species). All three phylogenies were constructed by diversified taxon sampling, as stated by the authors. However, only the snake phylogeny supported diversified taxon sampling. Moreover, a parametric bootstrap test revealed that none of the tested models provided a good fit to the observed data. The model assumptions, such as homogeneous rates across species or no rate shifts, appear to be violated.
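    The simplest member of this model family, the constant-rate pure-birth (Yule) process with complete sampling, has a closed-form likelihood, which makes the maximum-likelihood and AIC machinery concrete. The waiting times below are invented for illustration.

```python
import numpy as np

# Pure-birth (Yule) log-likelihood: with k lineages present, the waiting
# time w_k until the next speciation is exponential with rate k * lambda.
def yule_loglik(lam, waiting_times):
    k = np.arange(1, len(waiting_times) + 1)    # lineage count per interval
    return float(np.sum(np.log(k * lam) - k * lam * waiting_times))

w = np.array([1.2, 0.7, 0.4, 0.35, 0.2])        # hypothetical waiting times
k = np.arange(1, len(w) + 1)
lam_mle = len(w) / np.sum(k * w)                # closed-form MLE for lambda
aic = 2 * 1 - 2 * yule_loglik(lam_mle, w)       # one free parameter
```

Richer models (extinction, time-dependent rates, diversified sampling) add parameters and replace this closed form with numerical optimization, but are compared by AIC in exactly this way.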

  19. Control of the repeatability of high frequency multibeam echosounder backscatter by using natural reference areas

    NASA Astrophysics Data System (ADS)

    Roche, Marc; Degrendele, Koen; Vrignaud, Christophe; Loyer, Sophie; Le Bas, Tim; Augustin, Jean-Marie; Lurton, Xavier

    2018-06-01

    The increased use of backscatter measurements in time series for environmental monitoring necessitates the comparability of individual results. With the current lack of pre-calibrated multibeam echosounder systems for absolute backscatter measurement, a pragmatic solution is the use of natural reference areas for ensuring regular assessment of the backscatter measurement repeatability. This method mainly relies on the assumption of a sufficiently stable reference area regarding its backscatter signature. The aptitude of a natural area to provide a stable and uniform backscatter response must be carefully considered and demonstrated by a sufficiently long time series of measurements. Furthermore, this approach requires strict control of the acquisition and processing parameters. If all these conditions are met, a stability check and relative calibration of a system are possible by comparison with the averaged backscatter values for the area. Based on a common multibeam echosounder and sampling campaign, complemented by available bathymetric and backscatter time series, the suitability of three different candidates as backscatter reference areas was evaluated. Two of them, Carré Renard and Kwinte, prove to be excellent choices, while the third one, Western Solent, lacks sufficient data over time but remains a valuable candidate. The case studies and the available backscatter data on these areas prove the applicability of this method. Expanding the number of commonly used reference areas and the number of multibeam echosounders controlled on them could greatly contribute to the further development of quantitative applications based on multibeam echosounder backscatter measurements.

  20. Characterization of compounds by time-of-flight measurement utilizing random fast ions

    DOEpatents

    Conzemius, R.J.

    1989-04-04

    An apparatus is described for characterizing the mass of sample and daughter particles, comprising a source for providing sample ions; a fragmentation region wherein a fraction of the sample ions may fragment to produce daughter ion particles; an electrostatic field region held at a voltage level sufficient to effect ion-neutral separation and ion-ion separation of fragments from the same sample ion and to separate ions of different kinetic energy; a detector system for measuring the relative arrival times of particles; and processing means operatively connected to the detector system to receive and store the relative arrival times and operable to compare the arrival times with times detected at the detector when the electrostatic field region is held at a different voltage level and to thereafter characterize the particles. Sample and daughter particles are characterized with respect to mass and other characteristics by detecting at a particle detector the relative time of arrival for fragments of a sample ion at two different electrostatic voltage levels. The two sets of particle arrival times are used in conjunction with the known altered voltage levels to mathematically characterize the sample and daughter fragments. In an alternative embodiment the present invention may be used as a detector for a conventional mass spectrometer. In this embodiment, conventional mass spectrometry analysis is enhanced due to further mass resolving of the detected ions. 8 figs.
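    As a toy illustration of the physics such an apparatus exploits (not the patented algorithm itself): an ion accelerated through a potential V over a drift length L arrives after t = L·sqrt(m/(2qV)), so flight times measured at two voltage settings must agree on the same mass-to-charge ratio. All values below are hypothetical.

```python
import math

# Toy time-of-flight relations for an ion of mass-to-charge ratio m/q
# accelerated through potential V and drifting over length L (SI units).
def flight_time(m_over_q, voltage, length=1.0):
    """Flight time for a given m/q, acceleration voltage, and drift length."""
    return length * math.sqrt(m_over_q / (2.0 * voltage))

def m_over_q_from_time(t, voltage, length=1.0):
    """Invert the relation above to recover m/q from a measured time."""
    return 2.0 * voltage * (t / length) ** 2

mq_true = 1.0e-6                       # hypothetical m/q in kg/C
t1 = flight_time(mq_true, 1000.0)      # flight time at the 1000 V setting
t2 = flight_time(mq_true, 4000.0)      # flight time at the 4000 V setting

# Both voltage settings should recover the same m/q:
mq1 = m_over_q_from_time(t1, 1000.0)
mq2 = m_over_q_from_time(t2, 4000.0)
```

Quadrupling the voltage halves the flight time; agreement between the two recovered m/q values is the kind of consistency a two-voltage measurement scheme exploits to characterize fragments.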

  2. A Comparison of RNA-Seq Results from Paired Formalin-Fixed Paraffin-Embedded and Fresh-Frozen Glioblastoma Tissue Samples

    PubMed Central

    Esteve-Codina, Anna; Arpi, Oriol; Martinez-García, Maria; Pineda, Estela; Mallo, Mar; Gut, Marta; Carrato, Cristina; Rovira, Anna; Lopez, Raquel; Tortosa, Avelina; Dabad, Marc; Del Barco, Sonia; Heath, Simon; Bagué, Silvia; Ribalta, Teresa; Alameda, Francesc; de la Iglesia, Nuria

    2017-01-01

    The molecular classification of glioblastoma (GBM) based on gene expression might better explain outcome and response to treatment than clinical factors. Whole transcriptome sequencing using next-generation sequencing platforms is rapidly becoming accepted as a tool for measuring gene expression for both research and clinical use. Fresh frozen (FF) tissue specimens of GBM are difficult to obtain since tumor tissue obtained at surgery is often scarce and necrotic and diagnosis is prioritized over freezing. After diagnosis, leftover tissue is usually stored as formalin-fixed paraffin-embedded (FFPE) tissue. However, RNA from FFPE tissues is usually degraded, which could hamper gene expression analysis. We compared RNA-Seq data obtained from matched pairs of FF and FFPE GBM specimens. Only three of the eleven FFPE samples from the matched FFPE-FF pairs yielded informative results. Several quality-control measurements showed that RNA from FFPE samples was highly degraded but maintained transcriptomic similarities to RNA from FF samples. Certain issues regarding mutation analysis and subtype prediction were detected. Nevertheless, our results suggest that RNA-Seq of FFPE GBM specimens provides reliable gene expression data that can be used in molecular studies of GBM if the RNA is sufficiently preserved. PMID:28122052

  3. In-Line Phase-Contrast X-ray Imaging and Tomography for Materials Science

    PubMed Central

    Mayo, Sheridan C.; Stevenson, Andrew W.; Wilkins, Stephen W.

    2012-01-01

    X-ray phase-contrast imaging and tomography make use of the refraction of X-rays by the sample in image formation. This provides considerable additional information in the image compared to conventional X-ray imaging methods, which rely solely on X-ray absorption by the sample. Phase-contrast imaging highlights edges and internal boundaries of a sample and is thus complementary to absorption contrast, which is more sensitive to the bulk of the sample. Phase-contrast can also be used to image low-density materials, which do not absorb X-rays sufficiently to form a conventional X-ray image. In the context of materials science, X-ray phase-contrast imaging and tomography have particular value in the 2D and 3D characterization of low-density materials, the detection of cracks and voids and the analysis of composites and multiphase materials where the different components have similar X-ray attenuation coefficients. Here we review the use of phase-contrast imaging and tomography for a wide variety of materials science characterization problems using both synchrotron and laboratory sources and further demonstrate the particular benefits of phase contrast in the laboratory setting with a series of case studies. PMID:28817018

  5. Human metabolic profiles are stably controlled by genetic and environmental variation

    PubMed Central

    Nicholson, George; Rantalainen, Mattias; Maher, Anthony D; Li, Jia V; Malmodin, Daniel; Ahmadi, Kourosh R; Faber, Johan H; Hallgrímsdóttir, Ingileif B; Barrett, Amy; Toft, Henrik; Krestyaninova, Maria; Viksna, Juris; Neogi, Sudeshna Guha; Dumas, Marc-Emmanuel; Sarkans, Ugis; The MolPAGE Consortium; Silverman, Bernard W; Donnelly, Peter; Nicholson, Jeremy K; Allen, Maxine; Zondervan, Krina T; Lindon, John C; Spector, Tim D; McCarthy, Mark I; Holmes, Elaine; Baunsgaard, Dorrit; Holmes, Chris C

    2011-01-01

    ¹H Nuclear Magnetic Resonance spectroscopy (¹H NMR) is increasingly used to measure metabolite concentrations in sets of biological samples for top-down systems biology and molecular epidemiology. For such purposes, knowledge of the sources of human variation in metabolite concentrations is valuable, but currently sparse. We conducted and analysed a study to create such a resource. In our unique design, identical and non-identical twin pairs donated plasma and urine samples longitudinally. We acquired ¹H NMR spectra on the samples, and statistically decomposed variation in metabolite concentration into familial (genetic and common-environmental), individual-environmental, and longitudinally unstable components. We estimate that stable variation, comprising familial and individual-environmental factors, accounts on average for 60% (plasma) and 47% (urine) of biological variation in ¹H NMR-detectable metabolite concentrations. Clinically predictive metabolic variation is likely nested within this stable component, so our results have implications for the effective design of biomarker-discovery studies. We provide a power-calculation method which reveals that sample sizes of a few thousand should offer sufficient statistical precision to detect ¹H NMR-based biomarkers quantifying predisposition to disease. PMID:21878913
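    For scale, a generic two-sample power calculation (a stand-in for the authors' published method, using a hypothetical weak biomarker effect of Cohen's d = 0.1) indeed lands in the few-thousand range.

```python
from statsmodels.stats.power import TTestIndPower

# Per-group sample size to detect a weak biomarker effect (Cohen's d = 0.1)
# at 80% power with a two-sided 5% significance level.
n_per_group = TTestIndPower().solve_power(effect_size=0.1, power=0.8, alpha=0.05)
# n_per_group is on the order of 1,500-1,600 subjects per group.
```

Two groups of this size total a few thousand subjects, consistent with the abstract's conclusion.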

  6. Sampling methods for terrestrial amphibians and reptiles.

    Treesearch

    Paul Stephen Corn; R. Bruce Bury

    1990-01-01

    Methods described for sampling amphibians and reptiles in Douglas-fir forests in the Pacific Northwest include pitfall trapping, time-constrained collecting, and surveys of coarse woody debris. The herpetofauna of this region differ in breeding and nonbreeding habitats and vagility, so that no single technique is sufficient for a community study. A combination of...

  7. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  8. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  9. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  10. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  11. 29 CFR 1926.752 - Site layout, site-specific erection plan and construction sequence.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... standard test method of field-cured samples, either 75 percent of the intended minimum compressive design... the basis of an appropriate ASTM standard test method of field-cured samples, either 75 percent of the intended minimum compressive design strength or sufficient strength to support the loads imposed during...

  12. Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.

    PubMed

    Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P

    2016-07-01

    Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid challenges with obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. PDMS fiber provided higher precision during calibration, while the use of Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied for analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m(-3), respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation.
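The calibration strategy described above, determining slope factors by standard addition into inherently polluted air, can be sketched as follows. The numbers are invented for illustration; this is not the paper's data or its exact procedure.

```python
# Standard-addition sketch: spike known analyte amounts into air that
# already contains the analyte, fit a line to response vs. added
# concentration, and recover the native concentration from the
# x-intercept (C_native = intercept / slope).

def standard_addition(added, response):
    """Ordinary least-squares fit; returns the native concentration."""
    n = len(added)
    mx = sum(added) / n
    my = sum(response) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(added, response))
             / sum((x - mx) ** 2 for x in added))
    intercept = my - slope * mx
    return intercept / slope  # same units as `added`

added = [0.0, 10.0, 20.0, 40.0]       # spiked benzene, ug/m^3 (hypothetical)
resp = [530.0, 630.0, 730.0, 930.0]   # detector counts (hypothetical)
print(standard_addition(added, resp))  # 53.0
```

The slope of this fit is the "slope factor" the abstract refers to; comparing slopes obtained in lab versus outdoor air is what bounds the 14% difference quoted above.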

  13. First year of practical experiences of the new Arctic AWIPEV-COSYNA cabled Underwater Observatory in Kongsfjorden, Spitsbergen

    NASA Astrophysics Data System (ADS)

    Fischer, Philipp; Schwanitz, Max; Loth, Reiner; Posner, Uwe; Brand, Markus; Schröder, Friedhelm

    2017-04-01

    A combined year-round assessment of selected oceanographic data and a macrobiotic community assessment was performed from October 2013 to November 2014 in the littoral zone of the Kongsfjorden polar fjord system on the western coast of Svalbard (Norway). State-of-the-art remote-controlled cabled underwater observatory technology was used for daily vertical profiles of temperature, salinity, and turbidity together with a stereo-optical assessment of the macrobiotic community, including fish. The results reveal a distinct seasonal cycle in total species abundance, with significantly higher total abundance and species richness during the polar winter, when no light is available underwater, than in the summer months, when light is available 24 h a day. During the winter months, a temporally highly segmented community was observed with respect to species occurrence, with single species dominating the winter community for restricted times. In contrast, the summer community showed an overall lower total abundance as well as a significantly lower number of species. The study clearly demonstrates the high potential of cable-connected, remote-controlled digital sampling devices, especially in remote areas such as polar fjord systems, with harsh environmental conditions and limited accessibility. A smart combination of such new digital sampling methods with classic sampling procedures can significantly extend sampling time and frequency, especially in remote and difficult-to-access areas. This can help to provide sufficient data density, and therefore statistical power, for sound scientific analysis without increasing invasive sampling pressure in ecologically sensitive environments.

  14. Reconciling PM10 analyses by different sampling methods for Iron King Mine tailings dust.

    PubMed

    Li, Xu; Félix, Omar I; Gonzales, Patricia; Sáez, Avelino Eduardo; Ela, Wendell P

    2016-03-01

    The overall project objective at the Iron King Mine Superfund site is to determine the level and potential risk associated with heavy metal exposure of the nearby population to dust emanating from the site's tailings pile. To provide sufficient size-fractioned dust for multi-discipline research studies, a dust generator was built and is now being used to generate size-fractioned dust samples for toxicity investigations using in vitro cell culture and animal exposure experiments, as well as studies on geochemical characterization and bioassay solubilization with simulated lung and gastric fluid extractants. The objective of this study is to provide a robust method for source identification by comparing the tailings sample produced by the dust generator with that collected by a MOUDI sampler. As and Pb concentrations in the PM10 fraction of the MOUDI sample were much lower than in tailings samples produced by the dust generator, indicating dilution of Iron King tailings dust by dust from other sources. For source apportionment purposes, a single-element concentration method was used, based on the assumption that the PM10 fraction comes from a background source plus the Iron King tailings source. The method's conclusion that nearly all arsenic and lead in the PM10 dust fraction originated from the tailings substantiates the conclusion of our previous Pb and Sr isotope study. As and Pb showed a similar mass fraction from Iron King at all sites, suggesting that As and Pb have the same major emission source. Further validation of this simple source apportionment method, based on other elements and sites, is needed.
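The single-element apportionment assumption above (ambient PM10 = background source + tailings source) reduces to a two-end-member mixing calculation. The sketch below uses invented concentrations purely to show the arithmetic, not the study's measurements.

```python
def source_fraction(c_mix, c_background, c_source):
    """Mass fraction of ambient PM10 attributable to the source in a
    simple two-end-member mixing model, from one element's
    concentration in the mixture, the background, and the pure source."""
    return (c_mix - c_background) / (c_source - c_background)

# Illustrative numbers only (hypothetical, ug/g):
# As in ambient PM10 = 45, regional background = 2, pure tailings PM10 = 90
f = source_fraction(45.0, 2.0, 90.0)
print(round(f, 3))  # 0.489 -> about half the PM10 mass from the tailings
```

A near-unity fraction for both As and Pb at a site is what would support the abstract's conclusion that those elements originate almost entirely from the tailings.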

  15. Direct trace-elemental analysis of urine samples by laser ablation-inductively coupled plasma mass spectrometry after sample deposition on clinical filter papers.

    PubMed

    Aramendía, Maite; Rello, Luis; Vanhaecke, Frank; Resano, Martín

    2012-10-16

    Collection of biological fluids on clinical filter papers offers important logistic advantages, although analysis of these specimens is far from straightforward. Concerning urine analysis, and particularly when direct trace elemental analysis by laser ablation-inductively coupled plasma mass spectrometry (LA-ICPMS) is the goal, several problems arise, such as lack of sensitivity or uneven distribution of the analytes on the filter paper, making it difficult to obtain reliable quantitative results. In this paper, a novel approach for urine collection is proposed, which circumvents many of these problems. This methodology consists of the use of precut filter paper discs on which large amounts of sample can be retained in a single deposition. This provides higher amounts of the target analytes and, thus, sufficient sensitivity, and allows addition of an adequate internal standard at the clinical lab prior to analysis, making it suitable for a strategy based on unsupervised sample collection and subsequent analysis at referral centers. On the basis of this sampling methodology, an analytical method was developed for the direct determination of several elements in urine (Be, Bi, Cd, Co, Cu, Ni, Sb, Sn, Tl, Pb, and V) at the low μg L(-1) level by means of LA-ICPMS. The method developed provides good results in terms of accuracy and LODs (≤1 μg L(-1) for most of the analytes tested), with a precision in the range of 15%, fit for purpose for clinical control analysis.
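The internal-standard correction mentioned above can be sketched generically: the analyte/IS signal ratio cancels ablation and transport variability, and a calibration-derived relative response factor converts the ratio to concentration. The function name, numbers, and response factor below are hypothetical, not the paper's calibration.

```python
def quantify_with_is(analyte_counts, is_counts, is_conc, rrf):
    """Internal-standard quantification: analyte concentration from the
    analyte/IS signal ratio, the known IS concentration, and a relative
    response factor (rrf) determined during calibration."""
    return (analyte_counts / is_counts) * is_conc / rrf

# e.g. Pb signal 8400 counts, IS signal 21000 counts, IS spiked at
# 10 ug/L, rrf 0.8 (all values illustrative)
print(quantify_with_is(8400, 21000, 10.0, 0.8))  # 5.0 ug/L
```

Because both signals come from the same ablated spot, shot-to-shot differences in ablated mass drop out of the ratio, which is why adding the IS at the clinical lab makes unsupervised collection workable.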

  16. Low temperature route to uranium nitride

    DOEpatents

    Burrell, Anthony K.; Sattelberger, Alfred P.; Yeamans, Charles; Hartmann, Thomas; Silva, G. W. Chinthaka; Cerefice, Gary; Czerwinski, Kenneth R.

    2009-09-01

    A method of preparing an actinide nitride fuel for nuclear reactors is provided. The method comprises the steps of a) providing at least one actinide oxide and optionally zirconium oxide; b) mixing the oxide with a source of hydrogen fluoride for a period of time and at a temperature sufficient to convert the oxide to a fluoride salt; c) heating the fluoride salt to remove water; d) heating the fluoride salt in a nitrogen atmosphere for a period of time and at a temperature sufficient to convert the fluorides to nitrides; and e) heating the nitrides under vacuum and/or inert atmosphere for a period of time sufficient to convert the nitrides to mononitrides.

  17. Electric-field-driven phase transition in vanadium dioxide

    NASA Astrophysics Data System (ADS)

    Wu, B.; Zimmers, A.; Aubin, H.; Ghosh, R.; Liu, Y.; Lopez, R.

    2011-12-01

    We report on local probe measurements of current-voltage and electrostatic force-voltage characteristics of the electric-field-induced insulator-to-metal transition in a VO2 thin film. In conducting AFM mode, switching from the insulating to the metallic state occurs at an electric-field threshold E ≈ 6.5 × 10^7 V m^-1 at 300 K. Upon lifting the tip above the sample surface, we find that the transition can also be observed through a change in electrostatic force and in tunneling current. In this noncontact regime, the transition is characterized by random telegraphic noise. These results show that the electric field alone is sufficient to induce the transition; however, the electronic current provides a positive feedback effect that amplifies the phenomenon.

  18. Vocational students' learning preferences: the interpretability of ipsative data.

    PubMed

    Smith, P J

    2000-02-01

    A number of researchers have argued that ipsative data are not suitable for statistical procedures designed for normative data. Others have argued that the interpretability of such analyses of ipsative data is little affected where the number of variables and the sample size are sufficiently large. The research reported here represents a factor analysis of the scores on the Canfield Learning Styles Inventory for 1,252 students in vocational education. The results of the factor analysis of these ipsative data were examined in the context of existing theory and research on vocational students and lend support to the argument that the factor analysis of ipsative data can provide sensibly interpretable results.

  19. Development of techniques for advanced optical contamination measurement with internal reflection spectroscopy, phase 1, volume 1

    NASA Technical Reports Server (NTRS)

    Hayes, J. D.

    1972-01-01

    The feasibility of monitoring volatile contaminants in a large space simulation chamber using techniques of internal reflection spectroscopy was demonstrated analytically and experimentally. The infrared spectral region was selected as the operational spectral range in order to provide unique identification of the contaminants along with sufficient sensitivity to detect trace contaminant concentrations. It was determined theoretically that a monolayer of the contaminants could be detected and identified using optimized experimental procedures. This ability was verified experimentally. Procedures were developed to correct the attenuated total reflectance spectra for thick sample distortion. However, by using two different element designs the need for such correction can be avoided.

  20. The 2010 ILSO-ISRU Field Test at Mauna Kea, Hawaii: Results from the Miniaturised Mossbauer Spectrometers Mimos II and Mimos IIA

    NASA Technical Reports Server (NTRS)

    Klingelhoefer, G.; Morris, R. V.; Blumers, M.; Bernhardt, B.; Graff, T.

    2011-01-01

    For the advanced Moessbauer instrument MIMOS IIA, new detector technologies and electronic components increase sensitivity and performance significantly. In combination with the high energy resolution of the SDD, it is possible to perform X-ray fluorescence analysis simultaneously with Moessbauer spectroscopy. In addition to the Fe mineralogy, information on the sample's elemental composition will be gathered. The ISRU 2010 field campaign demonstrated that in-situ Moessbauer spectroscopy is an effective tool for science, feedstock exploration, and process monitoring. Engineering tests showed that a compact nickel metal hydride battery provided sufficient power for over 12 hr of continuous operation of the MIMOS instruments.

  1. Application of laboratory permeability data

    USGS Publications Warehouse

    Johnson, A.I.

    1963-01-01

    Some of the basic material contained in this report originally was prepared in 1952 as instructional handouts for ground-water short courses and for training of foreign participants. The material has been revised and expanded and is presented in the present form to make it more readily available to the field hydrologist. Illustrations now present published examples of the applications suggested in the 1952 material. For small areas, a field pumping test is sufficient to predict the characteristics of an aquifer. With a large area under study, the aquifer properties must be determined at many different locations and it is not usually economically feasible to make sufficient field tests to define the aquifer properties in detail for the whole aquifer. By supplementing a few field tests with laboratory permeability data and geologic interpretation, more point measurements representative of the hydrologic properties of the aquifer may be obtained. A sufficient number of samples seldom can be obtained to completely identify the permeability or transmissibility in detail for a project area. However, a few judiciously chosen samples of high quality, combined with good geologic interpretation, often will permit the extrapolation of permeability information over a large area with a fair degree of reliability. The importance of adequate geologic information, as well as the importance of collecting samples representative of at least all major textural units lying within the section or area of study, cannot be overemphasized.
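The extrapolation strategy described above, combining a few lab permeability measurements with geologic interpretation, rests on the standard relation that aquifer transmissibility is permeability times saturated thickness, summed over the textural units in the section. The layer values below are hypothetical, for illustration only.

```python
def transmissibility(layers):
    """Aquifer transmissibility as the sum of (permeability x thickness)
    over the layers of a section, the way point lab measurements on
    samples from each major textural unit can be combined.
    Illustrative units: permeability in ft/day, thickness in ft."""
    return sum(k * b for k, b in layers)

# Hypothetical section: coarse sand, fine sand, silty layer
layers = [(120.0, 30.0), (40.0, 50.0), (2.0, 20.0)]
print(transmissibility(layers))  # 5640.0 ft^2/day
```

Note how the thin, low-permeability silty layer contributes almost nothing; this is why sampling at least every major textural unit, rather than every foot of section, is usually sufficient.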

  2. Trace analysis of energetic materials via direct analyte-probed nanoextraction coupled to direct analysis in real time mass spectrometry.

    PubMed

    Clemons, Kristina; Dake, Jeffrey; Sisco, Edward; Verbeck, Guido F

    2013-09-10

    Direct analysis in real time mass spectrometry (DART-MS) has proven to be a useful forensic tool for the trace analysis of energetic materials. While other techniques for detecting trace amounts of explosives involve extraction, derivatization, solvent exchange, or sample clean-up, DART-MS requires none of these. Typical DART-MS analyses directly from a solid sample or from a swab have been quite successful; however, these methods may not always be an optimal sampling technique in a forensic setting. For example, if the sample were only located in an area which included a latent fingerprint of interest, direct DART-MS analysis or the use of a swab would almost certainly destroy the print. To avoid ruining such potentially invaluable evidence, another method has been developed which will leave the fingerprint virtually untouched. Direct analyte-probed nanoextraction coupled to nanospray ionization-mass spectrometry (DAPNe-NSI-MS) has demonstrated excellent sensitivity and repeatability in forensic analyses of trace amounts of illicit drugs from various types of surfaces. This technique employs a nanomanipulator in conjunction with bright-field microscopy to extract single particles from a surface of interest and has provided a limit of detection of 300 attograms for caffeine. Combining DAPNe with DART-MS provides another level of flexibility in forensic analysis, and has proven to be a sufficient detection method for trinitrotoluene (TNT), RDX, and 1-methylaminoanthraquinone (MAAQ).

  3. Bi-PROF

    PubMed Central

    Gries, Jasmin; Schumacher, Dirk; Arand, Julia; Lutsik, Pavlo; Markelova, Maria Rivera; Fichtner, Iduna; Walter, Jörn; Sers, Christine; Tierling, Sascha

    2013-01-01

    The use of next generation sequencing has expanded our view of whole mammalian methylome patterns. In particular, it provides genome-wide insight into local DNA methylation diversity at the single-nucleotide level and enables the examination of single chromosome sequence sections at sufficient statistical power. We describe a bisulfite-based sequence profiling pipeline, Bi-PROF, which is based on 454 GS-FLX Titanium technology and allows up to one million sequence stretches to be obtained at single-base-pair resolution without laborious subcloning. To illustrate the performance of the experimental workflow connected to a bioinformatics program pipeline (BiQ Analyzer HT) we present a test analysis set of 68 different epigenetic marker regions (amplicons) in five individual patient-derived xenograft tissue samples of colorectal cancer and one healthy colon epithelium sample as a control. After the 454 GS-FLX Titanium run, sequence read processing and sample decoding, the obtained alignments are quality controlled and statistically evaluated. Comprehensive methylation pattern interpretation (profiling) assessed by analyzing 10^2-10^4 sequence reads per amplicon allows an unprecedentedly deep view of pattern formation and methylation marker heterogeneity in tissues affected by complex diseases like cancer. PMID:23803588

  4. Bayesian focalization: quantifying source localization with environmental uncertainty.

    PubMed

    Dosso, Stan E; Wilmut, Michael J

    2007-05-01

    This paper applies a Bayesian formulation to study ocean acoustic source localization as a function of uncertainty in environmental properties (water column and seabed) and of data information content [signal-to-noise ratio (SNR) and number of frequencies]. The approach follows that of the optimum uncertain field processor [A. M. Richardson and L. W. Nolte, J. Acoust. Soc. Am. 89, 2280-2284 (1991)], in that localization uncertainty is quantified by joint marginal probability distributions for source range and depth, integrated over uncertain environmental properties. The integration is carried out here using Metropolis Gibbs sampling for environmental parameters and heat-bath Gibbs sampling for source location, to provide efficient sampling over complicated parameter spaces. The approach is applied to acoustic data from a shallow-water site in the Mediterranean Sea where previous geoacoustic studies have been carried out. It is found that reliable localization requires a sufficient combination of prior (environmental) information and data information. For example, sources can be localized reliably from single-frequency data at low SNR (-3 dB) only with small environmental uncertainties, whereas successful localization with large environmental uncertainties requires higher SNR and/or multifrequency data.
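The MCMC machinery that the abstract's marginalization relies on can be illustrated with a minimal random-walk Metropolis sampler on a one-dimensional Gaussian target. This is a generic sketch of the sampling principle, not the paper's Metropolis Gibbs / heat-bath Gibbs implementation; all parameters are illustrative.

```python
import math
import random

def metropolis(log_target, x0, step, n, seed=0):
    """Minimal random-walk Metropolis sampler: propose a Gaussian step,
    accept with probability min(1, target_ratio), otherwise stay put."""
    rng = random.Random(seed)
    x, lp = x0, log_target(x0)
    samples = []
    for _ in range(n):
        xp = x + rng.gauss(0.0, step)
        lpp = log_target(xp)
        if lpp - lp > math.log(rng.random()):  # accept/reject in log space
            x, lp = xp, lpp
        samples.append(x)
    return samples

# Target: standard normal (log density up to a constant). Starting far
# from the mode, the chain should drift in and average near 0.
chain = metropolis(lambda x: -0.5 * x * x, x0=5.0, step=1.0, n=20000)
print(sum(chain[5000:]) / 15000)  # close to 0 after discarding burn-in
```

In the paper's setting the target is the joint posterior over environmental parameters and source location, and the marginal range-depth distribution is read off from the retained samples in the same way.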

  5. A parametric study of helium retention in beryllium and its effect on deuterium retention

    NASA Astrophysics Data System (ADS)

    Alegre, D.; Baldwin, M. J.; Simmonds, M.; Nishijima, D.; Hollmann, E. M.; Brezinsek, S.; Doerner, R. P.

    2017-12-01

    Beryllium samples have been exposed in the PISCES-B linear plasma device to conditions relevant to the International Thermonuclear Experimental Reactor (ITER) in pure He, pure D, and mixed D/He plasmas. Except at intermediate sample exposure temperatures (573-673 K), He addition to a D plasma is found to have a beneficial effect, as it reduces D retention in Be (by up to ~55%), although the mechanism is unclear. Retention of He is typically around 10^20-10^21 He m^-2, and is affected primarily by the Be surface temperature during exposure and by the ion fluence at <500 K exposure, but not by the ion impact energy at 573 K. Contamination of the Be surface with high-Z elements from the mask of the sample holder in pure He plasmas is also observed under certain conditions, and leads to unexpectedly large He retention values, as well as changes in the surface morphology. An estimate of the tritium retention in the Be first wall of ITER is provided; it is sufficiently low to allow safe operation of ITER.

  6. Active antioxidants in ex-vivo examination of burn wound healing by means of IR and Raman spectroscopies-Preliminary comparative research

    NASA Astrophysics Data System (ADS)

    Pielesz, Anna; Biniaś, Dorota; Sarna, Ewa; Bobiński, Rafał; Kawecki, Marek; Glik, Justyna; Klama-Baryła, Agnieszka; Kitala, Diana; Łabuś, Wojciech; Paluch, Jadwiga; Kraut, Małgorzata

    2017-02-01

    Being a complex traumatic event, burn injury also affects other organ systems apart from the skin. Wounds undergo various pathological changes which are accompanied by alterations in the molecular environment. Information about molecules may be obtained with the use of Raman spectroscopy and Fourier-transform infrared spectroscopy, and when combined, both methods are a powerful tool for providing material characterization. Alterations in the molecular environment may lead to identifying objective markers of acute wound healing. In general, incubation of samples in solutions of L-ascorbic acid and 5% and 7% orthosilicic acid organizes the collagen structure, whereas the increased intensity of the Raman bands in the region of 1500-800 cm^-1 reveals regeneration of the burn tissue. Since oxidative damage is one of the mechanisms responsible for local and distant pathophysiological events after burn, antioxidant therapy can prove to be beneficial in minimizing burn wounds, which was examined on the basis of human skin samples and chicken skin samples, the latter being subject to modification when heated to a temperature sufficient for the simulation of a burn incident.

  7. Accelerated Optical Projection Tomography Applied to In Vivo Imaging of Zebrafish

    PubMed Central

    Correia, Teresa; Yin, Jun; Ramel, Marie-Christine; Andrews, Natalie; Katan, Matilda; Bugeon, Laurence; Dallman, Margaret J.; McGinty, James; Frankel, Paul; French, Paul M. W.; Arridge, Simon

    2015-01-01

    Optical projection tomography (OPT) provides a non-invasive 3-D imaging modality that can be applied to longitudinal studies of live disease models, including in zebrafish. Current limitations include the requirement of a minimum number of angular projections for reconstruction of reasonable OPT images using filtered back projection (FBP), which is typically several hundred, leading to acquisition times of several minutes. It is highly desirable to decrease the number of required angular projections to decrease both the total acquisition time and the light dose to the sample. This is particularly important to enable longitudinal studies, which involve measurements of the same fish at different time points. In this work, we demonstrate that the use of an iterative algorithm to reconstruct sparsely sampled OPT data sets can provide useful 3-D images with 50 or fewer projections, thereby significantly decreasing the minimum acquisition time and light dose while maintaining image quality. A transgenic zebrafish embryo with fluorescent labelling of the vasculature was imaged to acquire densely sampled (800 projections) and under-sampled data sets of transmitted and fluorescence projection images. The under-sampled OPT data sets were reconstructed using an iterative total variation-based image reconstruction algorithm and compared against FBP reconstructions of the densely sampled data sets. To illustrate the potential for quantitative analysis following rapid OPT data acquisition, a Hessian-based method was applied to automatically segment the reconstructed images to select the vasculature network. Results showed that 3-D images of the zebrafish embryo and its vasculature of sufficient visual quality for quantitative analysis can be reconstructed using the iterative algorithm from only 32 projections—achieving up to 28 times improvement in imaging speed and leading to total acquisition times of a few seconds. PMID:26308086

  8. Methods of Soil Resampling to Monitor Changes in the Chemical Concentrations of Forest Soils.

    PubMed

    Lawrence, Gregory B; Fernandez, Ivan J; Hazlett, Paul W; Bailey, Scott W; Ross, Donald S; Villars, Thomas R; Quintana, Angelica; Ouimet, Rock; McHale, Michael R; Johnson, Chris E; Briggs, Russell D; Colter, Robert A; Siemion, Jason; Bartlett, Olivia L; Vargas, Olga; Antidormi, Michael R; Koppers, Mary M

    2016-11-25

    Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The objective of this protocol is therefore to synthesize the latest information on methods of soil resampling in a format that can be used to design and implement a soil monitoring program. Successful monitoring of forest soils requires that a study unit be defined within an area of forested land that can be characterized with replicate sampling locations. A resampling interval of 5 years is recommended, but if monitoring is done to evaluate a specific environmental driver, the rate of change expected in that driver should be taken into consideration. Here, we show that the sampling of the profile can be done by horizon where boundaries can be clearly identified and horizons are sufficiently thick to remove soil without contamination from horizons above or below. Otherwise, sampling can be done by depth interval. Archiving of samples for future reanalysis is a key step in avoiding analytical bias and providing the opportunity for additional analyses as new questions arise.

  9. Broad Surveys of DNA Viral Diversity Obtained through Viral Metagenomics of Mosquitoes

    PubMed Central

    Ng, Terry Fei Fan; Willner, Dana L.; Lim, Yan Wei; Schmieder, Robert; Chau, Betty; Nilsson, Christina; Anthony, Simon; Ruan, Yijun; Rohwer, Forest; Breitbart, Mya

    2011-01-01

    Viruses are the most abundant and diverse genetic entities on Earth; however, broad surveys of viral diversity are hindered by the lack of a universal assay for viruses and the inability to sample a sufficient number of individual hosts. This study utilized vector-enabled metagenomics (VEM) to provide a snapshot of the diversity of DNA viruses present in three mosquito samples from San Diego, California. The majority of the sequences were novel, suggesting that the viral community in mosquitoes, as well as the animal and plant hosts they feed on, is highly diverse and largely uncharacterized. Each mosquito sample contained a distinct viral community. The mosquito viromes contained sequences related to a broad range of animal, plant, insect and bacterial viruses. Animal viruses identified included anelloviruses, circoviruses, herpesviruses, poxviruses, and papillomaviruses, which mosquitoes may have obtained from vertebrate hosts during blood feeding. Notably, sequences related to human papillomaviruses were identified in one of the mosquito samples. Sequences similar to plant viruses were identified in all mosquito viromes, which were potentially acquired through feeding on plant nectar. Numerous bacteriophages and insect viruses were also detected, including a novel densovirus likely infecting Culex erythrothorax. Through sampling insect vectors, VEM enables broad survey of viral diversity and has significantly increased our knowledge of the DNA viruses present in mosquitoes. PMID:21674005

  10. Innovations in Sampling Pore Fluids From Deep-Sea Hydrate Sites

    NASA Astrophysics Data System (ADS)

    Lapham, L. L.; Chanton, J. P.; Martens, C. S.; Schaefer, H.; Chapman, N. R.; Pohlman, J. W.

    2003-12-01

    We have developed a sea-floor probe capable of collecting and returning undecompressed pore water samples at in situ pressures for determination of dissolved gas concentrations and isotopic values in deep-sea sediments. In the summer of 2003, we tested this instrument in sediments containing gas hydrates off Vancouver Island, Cascadia Margin from ROPOS (a remotely operated vehicle) and in the Gulf of Mexico from Johnson-Sea-Link I (a manned submersible). Sediment push cores were collected alongside the probe to compare methane concentrations and stable carbon isotope compositions in decompressed samples vs. in situ samples obtained by probe. When sufficient gas was available, ethane and propane concentrations and isotopes were also compared. Preliminary data show maximum concentrations of dissolved methane to be 5 mM at the Cascadia Margin Fish Boat site (850 m water depth) and 12 mM at the Gulf of Mexico Bush Hill hydrate site (550 m water depth). Methane concentrations were, on average, five times as high in probe samples as in the cores. Carbon isotopic values show a thermogenic input and oxidative effects approaching the sediment-water interface at both sites. This novel data set will provide information that is critical to the understanding of the in situ processes and environmental conditions controlling gas hydrate occurrences in sediments.

  11. Methods of Soil Resampling to Monitor Changes in the Chemical Concentrations of Forest Soils

    PubMed Central

    Lawrence, Gregory B.; Fernandez, Ivan J.; Hazlett, Paul W.; Bailey, Scott W.; Ross, Donald S.; Villars, Thomas R.; Quintana, Angelica; Ouimet, Rock; McHale, Michael R.; Johnson, Chris E.; Briggs, Russell D.; Colter, Robert A.; Siemion, Jason; Bartlett, Olivia L.; Vargas, Olga; Antidormi, Michael R.; Koppers, Mary M.

    2016-01-01

    Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The objective of this protocol is therefore to synthesize the latest information on methods of soil resampling in a format that can be used to design and implement a soil monitoring program. Successful monitoring of forest soils requires that a study unit be defined within an area of forested land that can be characterized with replicate sampling locations. A resampling interval of 5 years is recommended, but if monitoring is done to evaluate a specific environmental driver, the rate of change expected in that driver should be taken into consideration. Here, we show that the sampling of the profile can be done by horizon where boundaries can be clearly identified and horizons are sufficiently thick to remove soil without contamination from horizons above or below. Otherwise, sampling can be done by depth interval. Archiving of samples for future reanalysis is a key step in avoiding analytical bias and providing the opportunity for additional analyses as new questions arise. PMID:27911419

  12. Computational fragment-based screening using RosettaLigand: the SAMPL3 challenge

    NASA Astrophysics Data System (ADS)

    Kumar, Ashutosh; Zhang, Kam Y. J.

    2012-05-01

    The SAMPL3 fragment-based virtual screening challenge provides a valuable opportunity for researchers to test their programs, methods and screening protocols in a blind testing environment. We participated in the SAMPL3 challenge and evaluated our virtual fragment screening protocol, which involves RosettaLigand as the core component, by screening a 500-fragment Maybridge library against bovine pancreatic trypsin. Our study reaffirmed that the real test for any virtual screening approach is a blind testing environment. The analyses presented in this paper also showed that virtual screening performance can be improved if a set of known active compounds is available and the parameters and methods that yield better enrichment are selected. Our study also highlighted that selecting an appropriate method to calculate partial charges is important for achieving accurate orientation and conformation of ligands within a binding site. Another finding is that using multiple receptor ensembles in docking does not always yield better enrichment than individual receptors. On the basis of our results and retrospective analyses from the SAMPL3 fragment screening challenge, we anticipate that the chances of success in a fragment screening process could be increased significantly with careful selection of receptor structures, treatment of protein flexibility, sufficient conformational sampling within the binding pocket, and accurate assignment of ligand and protein partial charges.
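
    The "enrichment" figure of merit discussed above is conventionally the enrichment factor: the active-compound hit rate in the top-scored fraction of the ranked library divided by the hit rate expected at random. A minimal sketch of the standard calculation, with an invented ranking (not the SAMPL3 trypsin results):

```python
def enrichment_factor(ranked_is_active, fraction):
    """Enrichment factor at a given fraction of the ranked library.

    ranked_is_active: list of 0/1 activity flags, best-scored compound first.
    EF = (hit rate in the top fraction) / (hit rate in the whole library).
    EF = 1 means no better than random selection; higher is better.
    """
    n = len(ranked_is_active)
    n_top = max(1, int(n * fraction))          # compounds in the top slice
    actives_total = sum(ranked_is_active)
    actives_top = sum(ranked_is_active[:n_top])
    return (actives_top / n_top) / (actives_total / n)

# Illustrative: a perfect ranking of 10 actives in a 100-compound library.
perfect = [1] * 10 + [0] * 90
ef_10pct = enrichment_factor(perfect, 0.10)    # maximal enrichment at 10%
```

    Because a random ranking gives EF near 1 at any cutoff, enrichment at small fractions (1-10% of the library) is the usual way screening protocols such as the one above are compared.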

  13. Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology

    NASA Technical Reports Server (NTRS)

    Blackmore, Lars James C.; Acikmese, Behcet; Mandic, Milan

    2012-01-01

    A software tool is used to demonstrate the feasibility of Touch and Go (TAG) sampling for asteroid sample return missions. TAG is a concept whereby a spacecraft is in contact with the surface of a small body, such as a comet or asteroid, for a few seconds or less before ascending to a safe location away from the small body. Previous work at JPL developed the G-TAG simulation tool, which provides a software environment for fast, multi-body simulations of the TAG event. G-TAG is described in Multibody Simulation Software Testbed for Small-Body Exploration and Sampling, (NPO-47196) NASA Tech Briefs, Vol. 35, No. 11 (November 2011), p. 54. This innovation adapts the tool to a mission that intends to return a sample from the surface of an asteroid. To demonstrate the feasibility of the TAG concept, the new software tool was used to generate extensive simulations demonstrating that the designed spacecraft meets key requirements. These requirements state that the contact force and duration must be sufficient to ensure that enough material from the surface is collected in the brushwheel sampler (BWS), and that the spacecraft must survive the contact, recover, ascend to a safe position, and maintain velocity and orientation after the contact.

  14. Methods of soil resampling to monitor changes in the chemical concentrations of forest soils

    USGS Publications Warehouse

    Lawrence, Gregory B.; Fernandez, Ivan J.; Hazlett, Paul W.; Bailey, Scott W.; Ross, Donald S.; Villars, Thomas R.; Quintana, Angelica; Ouimet, Rock; McHale, Michael; Johnson, Chris E.; Briggs, Russell D.; Colter, Robert A.; Siemion, Jason; Bartlett, Olivia L.; Vargas, Olga; Antidormi, Michael; Koppers, Mary Margaret

    2016-01-01

    Recent soils research has shown that important chemical soil characteristics can change in less than a decade, often the result of broad environmental changes. Repeated sampling to monitor these changes in forest soils is a relatively new practice that is not well documented in the literature and has only recently been broadly embraced by the scientific community. The objective of this protocol is therefore to synthesize the latest information on methods of soil resampling in a format that can be used to design and implement a soil monitoring program. Successful monitoring of forest soils requires that a study unit be defined within an area of forested land that can be characterized with replicate sampling locations. A resampling interval of 5 years is recommended, but if monitoring is done to evaluate a specific environmental driver, the rate of change expected in that driver should be taken into consideration. Here, we show that the sampling of the profile can be done by horizon where boundaries can be clearly identified and horizons are sufficiently thick to remove soil without contamination from horizons above or below. Otherwise, sampling can be done by depth interval. Archiving of samples for future reanalysis is a key step in avoiding analytical bias and providing the opportunity for additional analyses as new questions arise.

  15. Ethical issues in consumer genome sequencing: Use of consumers' samples and data

    PubMed Central

    Niemiec, Emilia; Howard, Heidi Carmen

    2016-01-01

    High throughput approaches such as whole genome sequencing (WGS) and whole exome sequencing (WES) create an unprecedented amount of data providing powerful resources for clinical care and research. Recently, WGS and WES services have been made available by commercial direct-to-consumer (DTC) companies. The DTC offer of genetic testing (GT) has already brought attention to potentially problematic issues such as the adequacy of consumers' informed consent and transparency of companies' research activities. In this study, we analysed the websites of four DTC GT companies offering WGS and/or WES with regard to their policies governing storage and future use of consumers' data and samples. The results are discussed in relation to recommendations and guiding principles such as the “Statement of the European Society of Human Genetics on DTC GT for health-related purposes” (2010) and the “Framework for responsible sharing of genomic and health-related data” (Global Alliance for Genomics and Health, 2014). The analysis reveals that some companies may store and use consumers' samples or sequencing data for unspecified research and share the data with third parties. Moreover, the companies do not provide sufficient or clear information to consumers about this, which can undermine the validity of the consent process. Furthermore, while all companies state that they provide privacy safeguards for data and mention the limitations of these, information about the possibility of re-identification is lacking. Finally, although the companies that may conduct research do include information regarding proprietary claims and commercialisation of the results, it is not clear whether consumers are aware of the consequences of these policies. These results indicate that DTC GT companies still need to improve the transparency regarding handling of consumers' samples and data, including having an explicit and clear consent process for research activities. PMID:27047756

  16. Dentists' self-perceived role in offering tobacco cessation services: results from a nationally representative survey, United States, 2010-2011.

    PubMed

    Jannat-Khah, Deanna P; McNeely, Jennifer; Pereyra, Margaret R; Parish, Carrigan; Pollack, Harold A; Ostroff, Jamie; Metsch, Lisa; Shelley, Donna R

    2014-11-06

    Dental visits represent an opportunity to identify and help patients quit smoking, yet dental settings remain an untapped venue for treatment of tobacco dependence. The purpose of this analysis was to assess factors that may influence patterns of tobacco-use-related practice among a national sample of dental providers. We surveyed a representative sample of general dentists practicing in the United States (N = 1,802). Multivariable analysis was used to assess correlates of adherence to tobacco use treatment guidelines and to analyze factors that influence providers' willingness to offer tobacco cessation assistance if reimbursed for this service. More than 90% of dental providers reported that they routinely ask patients about tobacco use, 76% counsel patients, and 45% routinely offer cessation assistance, defined as referring patients for cessation counseling, providing a cessation prescription, or both. Results from multivariable analysis indicated that cessation assistance was associated with having a practice with 1 or more hygienists, having a chart system that includes a tobacco use question, having received training on treating tobacco dependence, and having positive attitudes toward treating tobacco use. Providers who did not offer assistance but who reported that they would change their practice patterns if sufficiently reimbursed were more likely to be in a group practice, treat patients insured through Medicaid, and have positive attitudes toward treating tobacco dependence. Findings indicate the potential benefit of increasing training opportunities and promoting system changes to increase involvement of dental providers in conducting tobacco use treatment. Reimbursement models should be tested to assess the effect on dental provider practice patterns.

  17. Preservation Methods Differ in Fecal Microbiome Stability, Affecting Suitability for Field Studies

    PubMed Central

    Amir, Amnon; Metcalf, Jessica L.; Amato, Katherine R.; Xu, Zhenjiang Zech; Humphrey, Greg

    2016-01-01

    ABSTRACT Immediate freezing at −20°C or below has been considered the gold standard for microbiome preservation, yet this approach is not feasible for many field studies, ranging from anthropology to wildlife conservation. Here we tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including such types of variation as freeze-thaw cycles and the high temperature fluctuations often encountered under field conditions. We found that three of the methods—95% ethanol, FTA cards, and the OMNIgene Gut kit—can preserve samples sufficiently well at ambient temperatures such that differences at 8 weeks are comparable to differences among technical replicates. However, even the worst methods, including those with no fixative, were able to reveal microbiome differences between species at 8 weeks and between individuals after a week, allowing meta-analyses of samples collected using various methods when the effect of interest is expected to be larger than interindividual variation (although use of a single method within a study is strongly recommended to reduce batch effects). Encouragingly for FTA cards, the differences caused by this method are systematic and can be detrended. As in other studies, we strongly caution against the use of 70% ethanol. The results, spanning 15 individuals and over 1,200 samples, provide our most comprehensive view to date of storage effects on stool and provide a paradigm for the future studies of other sample types that will be required to provide a global view of microbial diversity and its interaction among humans, animals, and the environment. IMPORTANCE Our study, spanning 15 individuals and over 1,200 samples, provides our most comprehensive view to date of storage and stabilization effects on stool. 
We tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including the types of variation often encountered under field conditions, such as freeze-thaw cycles and high temperature fluctuations. We show that several cost-effective methods provide excellent microbiome stability out to 8 weeks, opening up a range of field studies with humans and wildlife that would otherwise be cost-prohibitive. PMID:27822526

  18. Preservation Methods Differ in Fecal Microbiome Stability, Affecting Suitability for Field Studies.

    PubMed

    Song, Se Jin; Amir, Amnon; Metcalf, Jessica L; Amato, Katherine R; Xu, Zhenjiang Zech; Humphrey, Greg; Knight, Rob

    2016-01-01

    Immediate freezing at -20°C or below has been considered the gold standard for microbiome preservation, yet this approach is not feasible for many field studies, ranging from anthropology to wildlife conservation. Here we tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including such types of variation as freeze-thaw cycles and the high temperature fluctuations often encountered under field conditions. We found that three of the methods-95% ethanol, FTA cards, and the OMNIgene Gut kit-can preserve samples sufficiently well at ambient temperatures such that differences at 8 weeks are comparable to differences among technical replicates. However, even the worst methods, including those with no fixative, were able to reveal microbiome differences between species at 8 weeks and between individuals after a week, allowing meta-analyses of samples collected using various methods when the effect of interest is expected to be larger than interindividual variation (although use of a single method within a study is strongly recommended to reduce batch effects). Encouragingly for FTA cards, the differences caused by this method are systematic and can be detrended. As in other studies, we strongly caution against the use of 70% ethanol. The results, spanning 15 individuals and over 1,200 samples, provide our most comprehensive view to date of storage effects on stool and provide a paradigm for the future studies of other sample types that will be required to provide a global view of microbial diversity and its interaction among humans, animals, and the environment. IMPORTANCE Our study, spanning 15 individuals and over 1,200 samples, provides our most comprehensive view to date of storage and stabilization effects on stool. 
We tested five methods for preserving human and dog fecal specimens for periods of up to 8 weeks, including the types of variation often encountered under field conditions, such as freeze-thaw cycles and high temperature fluctuations. We show that several cost-effective methods provide excellent microbiome stability out to 8 weeks, opening up a range of field studies with humans and wildlife that would otherwise be cost-prohibitive.

  19. Prescribing exercise for older adults: A needs assessment comparing primary care physicians, nurse practitioners, and physician assistants.

    PubMed

    Dauenhauer, Jason A; Podgorski, Carol A; Karuza, Jurgis

    2006-01-01

    To inform the development of educational programming designed to teach providers appropriate methods of exercise prescription for older adults, the authors conducted a survey of 177 physicians, physician assistants, and nurse practitioners (39% response rate). The survey was designed to better understand the prevalence of exercise prescriptions, attitudes, barriers, and educational needs of primary care practitioners toward older adults. Forty-seven percent of primary care providers report not prescribing exercise for older adults; 85% of the sample report having no formal training in exercise prescription. Practitioner attitudes were positive toward exercise, but were not predictive of their exercise prescribing behavior, which indicates that education efforts aimed at changing attitudes as a way of increasing exercise-prescribing behaviors would not be sufficient. In order to facilitate and reinforce practice changes to increase exercise-prescribing behaviors of primary care providers, results suggest the need for specific skill training on how to write an exercise prescription and motivate older adults to follow these prescriptions.

  20. A Pharmacogenetics Service Experience for Pharmacy Students, Residents, and Fellows

    PubMed Central

    Drozda, Katarzyna; Labinov, Yana; Jiang, Ruixuan; Thomas, Margaret R.; Wong, Shan S.; Patel, Shitalben; Nutescu, Edith A.

    2013-01-01

    Objective. To utilize a comprehensive, pharmacist-led warfarin pharmacogenetics service to provide pharmacy students, residents, and fellows with clinical and research experiences involving genotype-guided therapy. Design. First-year (P1) through fourth-year (P4) pharmacy students, pharmacy residents, and pharmacy fellows participated in a newly implemented warfarin pharmacogenetics service in a hospital setting. Students, residents, and fellows provided genotype-guided dosing recommendations as part of clinical care, or analyzed samples and data collected from patients on the service for research purposes. Assessment. Students’, residents’, and fellows’ achievement of learning objectives was assessed using a checklist based on established core competencies in pharmacogenetics. The mean competency score of the students, residents, and fellows who completed a clinical and/or research experience with the service was 97% ±3%. Conclusion. A comprehensive warfarin pharmacogenetics service provided unique experiential and research opportunities for pharmacy students, residents, and fellows and sufficiently addressed a number of core competencies in pharmacogenetics. PMID:24159216

  1. An improved sampling method of complex network

    NASA Astrophysics Data System (ADS)

    Gao, Qi; Ding, Xintong; Pan, Feng; Li, Weixing

    2014-12-01

    Subnet sampling is an important topic in complex network research. Sampling methods influence the structure and characteristics of the resulting subnet. Random multiple snowball with Cohen (RMSC) process sampling, which combines the advantages of random sampling and snowball sampling, is proposed in this paper. It has the ability to explore global information and discover local structure at the same time. The experiments indicate that this novel sampling method preserves the similarity between the sampled subnet and the original network in degree distribution, connectivity rate, and average shortest path. This method is applicable when prior knowledge about the degree distribution of the original network is insufficient.
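
    The hybrid idea the abstract describes (random restarts for global reach, snowball waves for local structure) can be sketched generically as follows; this is an illustration of the combination, not the authors' exact RMSC procedure:

```python
import random

def hybrid_sample(adj, target_size, waves=2, seed=0):
    """Illustrative hybrid of random and snowball sampling over an
    adjacency dict {node: set(neighbors)}. Random restarts give global
    exploration; breadth-first 'snowball' waves capture local structure.
    A generic sketch, not the RMSC algorithm itself."""
    rng = random.Random(seed)
    nodes = list(adj)
    sampled = set()
    while len(sampled) < target_size:
        if len(sampled) >= len(nodes):      # nothing left to sample
            break
        start = rng.choice(nodes)           # random component: global reach
        sampled.add(start)
        frontier = {start}
        for _ in range(waves):              # snowball component: local growth
            frontier = set().union(*(adj[n] for n in frontier)) - sampled
            sampled |= frontier
            if len(sampled) >= target_size:
                break
    return sampled
```

    The sample may slightly overshoot the target (a whole wave is absorbed at once); comparing the degree distribution of the induced subgraph against the original network, as the paper does, is the natural check of fidelity.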

  2. 77 FR 31220 - Microloan Operating Loans

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-25

    ... sufficient credit from other sources; have sufficient applicable education, on-the-job training, or farming... Administration's Lets Move initiative, offering opportunities for niche-type urban farms to market directly to... in their farming ventures. FSA has the responsibility of providing credit counseling and supervision...

  3. Assessing sufficiency of thermal riverscapes for resilient salmon and steelhead populations

    EPA Science Inventory

    Resilient salmon populations require river networks that provide water temperature regimes sufficient to support a diversity of salmonid life histories across space and time. Efforts to protect, enhance and restore watershed thermal regimes for salmon may target specific location...

  4. Heteronuclear Correlation SSNMR Spectroscopy with Indirect Detection under Fast Magic-Angle Spinning [Book Chapter]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kobayashi, Takeshi; Nishiyama, Yusuke; Pruski, Marek

    The main focus of this chapter is to address experimental strategies on the subject by providing a hands-on guide to fast MAS experiments, with a particular focus on indirect detection. Although our experience is limited to our respective laboratories in Ames and Yokohama, we hope that our descriptions of experimental setups and optimization procedures are sufficiently general to be applicable to all modern instruments. The chapter is organized as follows. Section 2 below briefly introduces the fast MAS technology and its main advantages. In Section 3, we describe the hardware associated with this remarkable technology and provide practical advice on its use, including procedures for loading and unloading the samples, maintaining the probe, reducing t1 noise, etc. In Section 4, we describe the principles and hands-on aspects of experiments involving the indirect detection of spin-1/2 and 14N nuclei.

  5. An automated data management/analysis system for space shuttle orbiter tiles. [stress analysis

    NASA Technical Reports Server (NTRS)

    Giles, G. L.; Ballas, M.

    1982-01-01

    An engineering data management system was combined with a nonlinear stress analysis program to provide a capability for analyzing a large number of tiles on the space shuttle orbiter. Tile geometry data and all data necessary to define the tile loads environment are accessed automatically as needed for the analysis of a particular tile or a set of tiles. User documentation provided includes: (1) a description of the computer programs and data files contained in the system; (2) definitions of all engineering data stored in the data base; (3) characteristics of the tile analytical model; (4) instructions for preparation of user input; and (5) a sample problem to illustrate use of the system. Descriptions of the data, computer programs, and analytical models of the tile are sufficiently detailed to guide extension of the system to include additional zones of tiles and/or additional types of analyses.

  6. An Approach to In-Situ Observations of Volcanic Plumes

    NASA Technical Reports Server (NTRS)

    Smythe, W. D.; Lopes, M. C.; Pieri, D. C.; Hall, J. L.

    2005-01-01

    Volcanoes have long been recognized as playing a dominant role in the birth, and possibly the death, of biological populations. They are possible sources of primordial gases, provide conditions sufficient for creating amino acids, strongly affect the heat balance in the atmosphere, and have been shown to sustain life (in oceanic vents). Eruptions can have profound effects on local flora and fauna, and for very large eruptions, may alter global weather patterns and cause entire species to fail. Measurements of particulates, gases, and dynamics within a volcanic plume are critical to understanding both how volcanoes work and how plumes affect populations, environment, and aviation. Volcanic plumes and associated eruption columns are a miasma of toxic gases, corrosive condensates, and abrasive particulates that makes them hazardous to nearby populations and poses a significant risk to all forms of aviation. Plumes also provide a mechanism for sampling the volcanic interior, which, for hydrothermal environments, may host unique biological populations.

  7. Method for quantitative determination and separation of trace amounts of chemical elements in the presence of large quantities of other elements having the same atomic mass

    DOEpatents

    Miller, C.M.; Nogar, N.S.

    1982-09-02

    Photoionization via autoionizing atomic levels combined with conventional mass spectroscopy provides a technique for quantitative analysis of trace quantities of chemical elements in the presence of much larger amounts of other elements with substantially the same atomic mass. Ytterbium samples smaller than 10 ng have been detected using an ArF* excimer laser which provides the atomic ions for a time-of-flight mass spectrometer. Elemental selectivity of greater than 5:1 with respect to lutetium impurity has been obtained. Autoionization via a single photon process permits greater photon utilization efficiency because of its greater absorption cross section than bound-free transitions, while maintaining sufficient spectroscopic structure to allow significant photoionization selectivity between different atomic species. Separation of atomic species from others of substantially the same atomic mass is also described.

  8. New constraints on Lyman-α opacity using 92 quasar lines of sight

    NASA Astrophysics Data System (ADS)

    Bosman, Sarah E. I.; Fan, Xiaohui; Jiang, Linhua; Reed, Sophie; Matsuoka, Yoshiki; Becker, George; Rorai, Albert

    2018-05-01

    The large scatter in Lyman-α opacity at z > 5.3 has been an ongoing mystery, prompting a flurry of numerical models. A uniform ultraviolet background has been ruled out at those redshifts, but it is unclear whether any proposed models produce sufficient inhomogeneities. In this paper we provide an update on the measurement which first highlighted the issue: Lyman-α effective optical depth along high-z quasar lines of sight. We nearly triple the sample size of previous such studies thanks to the cooperation of the DES-VHS, SHELLQs, and SDSS collaborations as well as new reductions and spectra. We find that a uniform UVB model is ruled out at 5.1 < z < 5.3, as well as at higher redshifts, which is perplexing. We provide the first such measurements at z ~ 6. None of the numerical models we confronted with these data could reproduce the observed scatter.
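
    The quantity measured here is the standard effective optical depth, tau_eff = -ln<F>, where <F> is the mean continuum-normalized transmitted flux in a redshift bin along a sightline. A minimal sketch of the definition, with invented flux values (not the paper's data):

```python
import math

def effective_optical_depth(fluxes):
    """tau_eff = -ln(<F>), with <F> the mean transmitted flux in a bin.
    Standard definition in quasar-sightline Lyman-alpha opacity studies."""
    mean_flux = sum(fluxes) / len(fluxes)
    return -math.log(mean_flux)

# Illustrative: a more absorbed bin yields a larger effective optical depth.
tau_transmissive = effective_optical_depth([0.80, 0.70, 0.75])
tau_opaque = effective_optical_depth([0.05, 0.10, 0.08])   # tau_opaque > tau_transmissive
```

    Because of the logarithm, tau_eff is dominated by the rare transmissive pixels in a bin, which is why the sightline-to-sightline scatter in this statistic is such a sensitive probe of UVB inhomogeneity.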

  9. The multiple deficit model of dyslexia: what does it mean for identification and intervention?

    PubMed

    Ring, Jeremiah; Black, Jeffrey L

    2018-04-24

    Research demonstrates that phonological skills provide the basis of reading acquisition and are a primary processing deficit in dyslexia. This consensus has led to the development of effective methods of reading intervention. However, a single phonological deficit is not sufficient to account for the heterogeneity of individuals with dyslexia, and recent research provides evidence that supports a multiple-deficit model of reading disorders. Two studies are presented that investigate (1) the prevalence of phonological and cognitive processing deficit profiles in children with significant reading disability and (2) the effects of those same phonological and cognitive processing skills on reading development in a sample of children that received treatment for dyslexia. The results are discussed in the context of implications for identification and an intervention approach that accommodates multiple deficits within a comprehensive skills-based reading program.

  10. Myocardial Viability: From Proof of Concept to Clinical Practice

    PubMed Central

    Tan, Timothy C.; Hsu, Chijen; Denniss, Alan Robert

    2016-01-01

    Ischaemic left ventricular (LV) dysfunction can arise from myocardial stunning, hibernation, or necrosis. Imaging modalities have become front-line methods in the assessment of viable myocardial tissue, with the aim of stratifying patients into optimal treatment pathways. Initial studies, although favorable, lacked sufficient power and sample size to provide conclusive outcomes of viability assessment. Recent trials, including the STICH and HEART studies, have failed to confer prognostic benefits of revascularisation therapy over standard medical management in ischaemic cardiomyopathy. In light of these recent findings, assessment of myocardial viability should therefore not be the sole factor for therapy choice. Optimization of medical therapy is paramount, and physicians should feel comfortable in deferring coronary revascularisation in patients with coronary artery disease with reduced LV systolic function. Newer trials are currently underway and will hopefully provide a more complete understanding of the pathophysiology and management of ischaemic cardiomyopathy. PMID:27313943

  11. A study of optimum cowl shapes and flow port locations for minimum drag with effective engine cooling, volume 2

    NASA Technical Reports Server (NTRS)

    Fox, S. R.; Smetana, F. O.

    1980-01-01

    The listings, user's instructions, sample inputs, and sample outputs of two computer programs which are especially useful in obtaining an approximate solution of the viscous flow over an arbitrary nonlifting three-dimensional body are provided. The first program performs a potential flow solution by a well known panel method and readjusts this initial solution to account for the effects of the boundary layer displacement thickness, a nonuniform but unidirectional onset flow field, and the presence of air intakes and exhausts. The second program is effectively a geometry package which allows the user to change or refine the shape of a body to satisfy particular needs without a significant amount of human intervention. An effort to reduce the cruise drag of light aircraft through an analytical study of the contributions to the drag arising from the engine cowl shape and the forward fuselage area, and also that resulting from the cooling air mass flowing through intake and exhaust sites on the nacelle, is presented. The programs may be effectively used to determine the appropriate body modifications or flow port locations to reduce the cruise drag as well as to provide sufficient air flow for cooling the engine.

  12. The outlook for precipitation measurements from space

    NASA Technical Reports Server (NTRS)

    Atlas, D.; Eckerman, J.; Meneghini, R.; Moore, R. K.

    1981-01-01

    To provide useful precipitation measurements from space, two requirements must be met: adequate spatial and temporal sampling of the storm and sufficient accuracy in the estimate of precipitation intensity. Although presently no single instrument or method completely satisfies both requirements, the visible/IR, microwave radiometer and radar methods can be used in a complementary manner. Visible/IR instruments provide good temporal sampling and rain area depiction, but recourse must be made to microwave measurements for quantitative rainfall estimates. The inadequacy of microwave radiometer measurements over land suggests, in turn, the use of radar. Several recently developed attenuating-wavelength radar methods are discussed in terms of their accuracy, dynamic range and system implementation. Traditionally, the requirements of high resolution and adequate dynamic range led to fairly costly and complex radar systems. Some simplifications and cost reductions can be made, however, by using K-band wavelengths, which have the advantages of greater sensitivity at low rain rates and higher resolution capabilities. Several recently proposed methods of this kind are reviewed in terms of accuracy and system implementation. Finally, an adaptive-pointing multi-sensor instrument is described that would exploit certain advantages of the IR, radiometric and radar methods.

  13. Socioeconomic Factors Influence Physical Activity and Sport in Quebec Schools.

    PubMed

    Morin, Pascale; Lebel, Alexandre; Robitaille, Éric; Bisset, Sherri

    2016-11-01

    School environments providing a wide selection of physical activities and sufficient facilities are both essential and formative to ensure young people adopt active lifestyles. We describe the association between school opportunities for physical activity and socioeconomic factors measured by low-income cutoff index, school size (number of students), and neighborhood population density. A cross-sectional survey using a 2-stage stratified sampling method built a representative sample of 143 French-speaking public schools in Quebec, Canada. Self-administered questionnaires collected data describing the physical activities offered and schools' sports facilities. Descriptive and bivariate analyses were performed separately for primary and secondary schools. In primary schools, school size was positively associated with more intramural and extracurricular activities, more diverse interior facilities, and activities promoting active transportation. Low-income primary schools were more likely to offer a single gym. Low-income secondary schools offered lower diversity of intramural activities and fewer exterior sporting facilities. High-income secondary schools with a large school size provided a greater number of opportunities, larger infrastructures, and a wider selection of physical activities than smaller low-income schools. Results reveal an overall positive association between school availability of physical and sport activity and socioeconomic factors. © 2016, American School Health Association.

  14. 40 CFR Appendix B to Part 61 - Test Methods

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... below 28 liters/min (1.0 cfm). 8.2.2Perform test runs such that samples are obtained over a period or... cyclic operations, run sufficient tests for the accurate determination of the emissions that occur over... indicated by reddening (liberation of free iodine) in the first impinger. In these cases, the sample run may...

  15. 40 CFR Appendix B to Part 61 - Test Methods

    Code of Federal Regulations, 2013 CFR

    2013-07-01

... below 28 liters/min (1.0 cfm). 8.2.2 Perform test runs such that samples are obtained over a period or... cyclic operations, run sufficient tests for the accurate determination of the emissions that occur over... indicated by reddening (liberation of free iodine) in the first impinger. In these cases, the sample run may...

  16. 40 CFR Appendix B to Part 61 - Test Methods

    Code of Federal Regulations, 2012 CFR

    2012-07-01

... below 28 liters/min (1.0 cfm). 8.2.2 Perform test runs such that samples are obtained over a period or... cyclic operations, run sufficient tests for the accurate determination of the emissions that occur over... indicated by reddening (liberation of free iodine) in the first impinger. In these cases, the sample run may...

  17. 40 CFR Appendix B to Part 61 - Test Methods

    Code of Federal Regulations, 2011 CFR

    2011-07-01

... below 28 liters/min (1.0 cfm). 8.2.2 Perform test runs such that samples are obtained over a period or... cyclic operations, run sufficient tests for the accurate determination of the emissions that occur over... indicated by reddening (liberation of free iodine) in the first impinger. In these cases, the sample run may...

  18. 40 CFR Appendix B to Part 61 - Test Methods

    Code of Federal Regulations, 2014 CFR

    2014-07-01

... below 28 liters/min (1.0 cfm). 8.2.2 Perform test runs such that samples are obtained over a period or... cyclic operations, run sufficient tests for the accurate determination of the emissions that occur over... indicated by reddening (liberation of free iodine) in the first impinger. In these cases, the sample run may...

  19. Planning Community-Based Assessments of HIV Educational Intervention Programs in Sub-Saharan Africa

    ERIC Educational Resources Information Center

    Kelcey, Ben; Shen, Zuchao

    2017-01-01

    A key consideration in planning studies of community-based HIV education programs is identifying a sample size large enough to ensure a reasonable probability of detecting program effects if they exist. Sufficient sample sizes for community- or group-based designs are proportional to the correlation or similarity of individuals within communities.…
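The scaling the abstract refers to is captured by the standard design effect for cluster (group-based) samples. A minimal sketch in Python; the inflation formula is the conventional one for equal-sized clusters, and all numbers below are illustrative, not taken from the study:

```python
import math

def design_effect(cluster_size: int, icc: float) -> float:
    """Standard design effect for equal-sized clusters:
    DEFF = 1 + (m - 1) * ICC, where m is the cluster size and
    ICC is the intraclass correlation (similarity of individuals
    within a community)."""
    return 1.0 + (cluster_size - 1) * icc

def clustered_sample_size(n_srs: int, cluster_size: int, icc: float) -> int:
    """Inflate a simple-random-sampling sample size by the design
    effect (the small tolerance guards against float round-up)."""
    return math.ceil(n_srs * design_effect(cluster_size, icc) - 1e-9)

# Example: 400 individuals suffice under simple random sampling;
# with 20 respondents per community and ICC = 0.05, more are needed.
print(round(design_effect(20, 0.05), 4))     # 1.95
print(clustered_sample_size(400, 20, 0.05))  # 780
```

Even a modest intraclass correlation nearly doubles the required sample here, which is why community-based designs are so sensitive to the within-community similarity the abstract highlights.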

  20. Analysis of Invasion Dynamics of Matrix-Embedded Cells in a Multisample Format.

    PubMed

    Van Troys, Marleen; Masuzzo, Paola; Huyck, Lynn; Bakkali, Karima; Waterschoot, Davy; Martens, Lennart; Ampe, Christophe

    2018-01-01

    In vitro tests of cancer cell invasion are the "first line" tools of preclinical researchers for screening the multitude of chemical compounds or cell perturbations that may aid in halting or treating cancer malignancy. In order to have predictive value or to contribute to designing personalized treatment regimes, these tests need to take into account the cancer cell environment and measure effects on invasion in sufficient detail. The in vitro invasion assays presented here are a trade-off between feasibility in a multisample format and mimicking the complexity of the tumor microenvironment. They allow testing multiple samples and conditions in parallel using 3D-matrix-embedded cells and deal with the heterogeneous behavior of an invading cell population in time. We describe the steps to take, the technical problems to tackle and useful software tools for the entire workflow: from the experimental setup to the quantification of the invasive capacity of the cells. The protocol is intended to guide researchers to standardize experimental set-ups and to annotate their invasion experiments in sufficient detail. In addition, it provides options for image processing and a solution for storage, visualization, quantitative analysis, and multisample comparison of acquired cell invasion data.

  1. The X-Ray Light Curve of the Very Luminous Supernova SN 1978K in NGC 1313

    NASA Astrophysics Data System (ADS)

    Schlegel, Eric M.; Petre, R.; Colbert, E. J. M.

    1996-01-01

We present the 0.5-2.0 keV light curve of the X-ray luminous supernova SN 1978K in NGC 1313, based on six ROSAT observations spanning 1990 July to 1994 July. SN 1978K is one of a few supernovae or supernova remnants that are very luminous (~10^39-10^40 ergs s^-1) in the X-ray, optical, and radio bands, and the first, at a supernova age of 10-20 yr, for which sufficient data exist to create an X-ray light curve. The X-ray flux is approximately constant over the 4 yr sampled by our observations, which were obtained 12-16 yr after the initial explosion. Three models exist to explain the large X-ray luminosity: pulsar input, a reverse shock running back into the expanding debris of the supernova, and the outgoing shock crushing cloudlets in the debris field. Based upon calculations of Chevalier & Fransson, a pulsar cannot provide sufficient energy to produce the soft X-ray luminosity. Based upon the models and the light curve to date, it is not possible to discern the evolutionary phase of the supernova.

  2. A simplified field protocol for genetic sampling of birds using buccal swabs

    USGS Publications Warehouse

    Vilstrup, Julia T.; Mullins, Thomas D.; Miller, Mark P.; McDearman, Will; Walters, Jeffrey R.; Haig, Susan M.

    2018-01-01

    DNA sampling is an essential prerequisite for conducting population genetic studies. For many years, blood sampling has been the preferred method for obtaining DNA in birds because of their nucleated red blood cells. Nonetheless, use of buccal swabs has been gaining favor because they are less invasive yet still yield adequate amounts of DNA for amplifying mitochondrial and nuclear markers; however, buccal swab protocols often include steps (e.g., extended air-drying and storage under frozen conditions) not easily adapted to field settings. Furthermore, commercial extraction kits and swabs for buccal sampling can be expensive for large population studies. We therefore developed an efficient, cost-effective, and field-friendly protocol for sampling wild birds after comparing DNA yield among 3 inexpensive buccal swab types (2 with foam tips and 1 with a cotton tip). Extraction and amplification success was high (100% and 97.2% respectively) using inexpensive generic swabs. We found foam-tipped swabs provided higher DNA yields than cotton-tipped swabs. We further determined that omitting a drying step and storing swabs in Longmire buffer increased efficiency in the field while still yielding sufficient amounts of DNA for detailed population genetic studies using mitochondrial and nuclear markers. This new field protocol allows time- and cost-effective DNA sampling of juveniles or small-bodied birds for which drawing blood may cause excessive stress to birds and technicians alike.

  3. Rapid LC-MS/MS quantification of the major benzodiazepines and their metabolites on dried blood spots using a simple and cost-effective sample pretreatment.

    PubMed

    Déglon, Julien; Versace, François; Lauer, Estelle; Widmer, Christèle; Mangin, Patrice; Thomas, Aurélien; Staub, Christian

    2012-06-01

    Dried blood spots (DBS) sampling has gained popularity in the bioanalytical community as an alternative to conventional plasma sampling, as it provides numerous benefits in terms of sample collection and logistics. The aim of this work was to show that these advantages can be coupled with a simple and cost-effective sample pretreatment, with subsequent rapid LC-MS/MS analysis for quantitation of 15 benzodiazepines, six metabolites and three Z-drugs. For this purpose, a simplified offline procedure was developed that consisted of letting a 5-µl DBS infuse directly into 100 µl of MeOH, in a conventional LC vial. The parameters related to the DBS pretreatment, such as extraction time or internal standard addition, were investigated and optimized, demonstrating that passive infusion in a regular LC vial was sufficient to quantitatively extract the analytes of interest. The method was validated according to international criteria in the therapeutic concentration ranges of the selected compounds. The presented strategy proved to be efficient for the rapid analysis of the selected drugs. Indeed, the offline sample preparation was reduced to a minimum, using a small amount of organic solvent and consumables, without affecting the accuracy of the method. Thus, this approach enables simple and rapid DBS analysis, even when using a non-DBS-dedicated autosampler, while lowering the costs and environmental impact.

  4. A minimally invasive micro sampler for quantitative sampling with an ultrahigh-aspect-ratio microneedle and a PDMS actuator.

    PubMed

    Liu, Long; Wang, Yan; Yao, Jinyuan; Yang, Cuijun; Ding, Guifu

    2016-08-01

This study describes a novel micro sampler consisting of an ultrahigh-aspect-ratio microneedle and a PDMS actuator. The microneedle was fabricated by a new method that uses reshaped-photoresist technology to form an internal flow channel. The microneedle comprises two parts, a shaft and a pedestal: the shaft is 1500 μm long with a 45° taper angle at the tip, and the pedestal is 1000 μm. The shaft and pedestal are joined by an arc connection structure 600 μm in length. Mechanics tests showed that the microneedles have sufficient mechanical strength to be inserted into skin with a wide safety margin. A PDMS actuator with an internal chamber was also designed and fabricated in this study. The chamber, which acts as a reservoir during sampling as well as providing power, was optimized by finite element analysis (FEA) to decrease dead volume and improve sampling precision. The micro sampler requires only a finger press to activate sampling and can, to some extent, also be used for quantitative micro injection. A volume of 31.5 ± 0.8 μl of blood was successfully sampled from the ear artery of a rabbit. This micro sampler is therefore suitable for micro sampling for diagnosis or therapy in the biomedical field.

  5. Alchemical prediction of hydration free energies for SAMPL

    PubMed Central

Mobley, David L.; Liu, Shuai; Cerutti, David S.; Swope, William C.; Rice, Julia E.

    2013-01-01

Hydration free energy calculations have become important tests of force fields. Alchemical free energy calculations based on molecular dynamics simulations provide a rigorous way to calculate these free energies for a particular force field, given sufficient sampling. Here, we report results of alchemical hydration free energy calculations for the set of small molecules comprising the 2011 Statistical Assessment of Modeling of Proteins and Ligands (SAMPL) challenge. Our calculations are largely based on the Generalized Amber Force Field (GAFF) with several different charge models, and we achieved RMS errors in the 1.4-2.2 kcal/mol range depending on charge model, marginally higher than what we typically observed in previous studies [1-5]. The test set consists of ethane, biphenyl, and a dibenzyl dioxin, as well as a series of chlorinated derivatives of each. We found that, for this set, using high-quality partial charges from MP2/cc-PVTZ SCRF RESP fits provided marginally improved agreement with experiment over using AM1-BCC partial charges as we have more typically done, in keeping with our recent findings [5]. Switching to OPLS Lennard-Jones parameters with AM1-BCC charges also improves agreement with experiment. We also find a number of chemical trends within each molecular series which we can explain, but there are also some surprises, including some that are captured by the calculations and some that are not. PMID:22198475

  6. NMR spectroscopic and analytical ultracentrifuge analysis of membrane protein detergent complexes.

    PubMed

    Maslennikov, Innokentiy; Kefala, Georgia; Johnson, Casey; Riek, Roland; Choe, Senyon; Kwiatkowski, Witek

    2007-11-08

Structural studies of integral membrane proteins (IMPs) are hampered by inherent difficulties in their heterologous expression and in the purification of solubilized protein-detergent complexes (PDCs). The choice and concentrations of detergents used in an IMP preparation play a critical role in protein homogeneity and are thus important for successful crystallization. Seeking an effective and standardized means applicable to genomic approaches for the characterization of PDCs, we chose 1D-NMR spectroscopic analysis to monitor the detergent content throughout their purification: protein extraction, detergent exchange, and sample concentration. We demonstrate that a single NMR measurement combined with SDS-PAGE of a detergent-extracted sample provides a useful gauge of the detergent's extraction potential for a given protein. Furthermore, careful monitoring of the detergent content during the process of IMP production allows for a high level of reproducibility. We also show that in many cases a simple sedimentation velocity measurement provides sufficient data to estimate both the oligomeric state and the detergent-to-protein ratio in PDCs, as well as to evaluate the homogeneity of the samples prior to crystallization screening. The techniques presented here facilitate the screening and selection of the extraction detergent, as well as help to maintain reproducibility in the detergent exchange and PDC concentration procedures. Such reproducibility is particularly important for the optimization of initial crystallization conditions, for which multiple purifications are routinely required.

  7. NMR spectroscopic and analytical ultracentrifuge analysis of membrane protein detergent complexes

    PubMed Central

    Maslennikov, Innokentiy; Kefala, Georgia; Johnson, Casey; Riek, Roland; Choe, Senyon; Kwiatkowski, Witek

    2007-01-01

Background Structural studies of integral membrane proteins (IMPs) are hampered by inherent difficulties in their heterologous expression and in the purification of solubilized protein-detergent complexes (PDCs). The choice and concentrations of detergents used in an IMP preparation play a critical role in protein homogeneity and are thus important for successful crystallization. Results Seeking an effective and standardized means applicable to genomic approaches for the characterization of PDCs, we chose 1D-NMR spectroscopic analysis to monitor the detergent content throughout their purification: protein extraction, detergent exchange, and sample concentration. We demonstrate that a single NMR measurement combined with SDS-PAGE of a detergent-extracted sample provides a useful gauge of the detergent's extraction potential for a given protein. Furthermore, careful monitoring of the detergent content during the process of IMP production allows for a high level of reproducibility. We also show that in many cases a simple sedimentation velocity measurement provides sufficient data to estimate both the oligomeric state and the detergent-to-protein ratio in PDCs, as well as to evaluate the homogeneity of the samples prior to crystallization screening. Conclusion The techniques presented here facilitate the screening and selection of the extraction detergent, as well as help to maintain reproducibility in the detergent exchange and PDC concentration procedures. Such reproducibility is particularly important for the optimization of initial crystallization conditions, for which multiple purifications are routinely required. PMID:17988403

  8. The Biotechnology Facility for International Space Station

    NASA Technical Reports Server (NTRS)

    Goodwin, Thomas; Lundquist, Charles; Hurlbert, Katy; Tuxhorn, Jennifer

    2004-01-01

The primary mission of the Cellular Biotechnology Program is to advance microgravity as a tool in basic and applied cell biology. The microgravity environment can be used to study fundamental principles of cell biology and to achieve specific applications such as tissue engineering. The Biotechnology Facility (BTF) will provide a state-of-the-art facility to perform cellular biotechnology research onboard the International Space Station (ISS). The BTF will support continuous operation, which will allow performance of long-duration experiments and will significantly increase the on-orbit science throughput. With the BTF, dedicated ground support, and a community of investigators, the goals of the Cellular Biotechnology Program at Johnson Space Center are to: support approximately 400 typical investigator experiments during the nominal design life of the BTF (10 years); support a steady increase in investigations per year, starting with stationary bioreactor experiments and adding rotating bioreactor experiments at a later date; and support at least 80% of all new cellular biotechnology investigations selected through the NASA Research Announcement (NRA) process. Key capabilities include: modular components, to allow sequential and continuous experiment operations without cross-contamination; increased cold storage capability (+4 C, -80 C, -180 C), with storage of frozen cell culture inoculum to allow sequential investigations and storage of post-experiment samples for return of high-quality samples; and an increased number of cell cultures per investigation, with replicates, to provide a sufficient number of samples for data analysis and publication of results in peer-reviewed scientific journals.

  9. Who should be undertaking population-based surveys in humanitarian emergencies?

    PubMed Central

    Spiegel, Paul B

    2007-01-01

Background Timely and accurate data are necessary to prioritise and effectively respond to humanitarian emergencies. 30-by-30 cluster surveys are commonly used in humanitarian emergencies because of their purported simplicity and reasonable validity and precision. Agencies have increasingly used 30-by-30 cluster surveys to undertake measurements beyond immunisation coverage and nutritional status. Methodological errors in cluster surveys have likely occurred for decades in humanitarian emergencies, often with unknown or unevaluated consequences. Discussion Most surveys in humanitarian emergencies are done by non-governmental organisations (NGOs). Some undertake good quality surveys while others have an already overburdened staff with limited epidemiological skills. Manuals explaining cluster survey methodology are available and in use. However, it is debatable whether using standardised, 'cookbook' survey methodologies is appropriate. Coordination of surveys is often lacking. If a coordinating body is established, as recommended, it is questionable whether it should have sole authority to release surveys due to insufficient independence. Donors should provide sufficient funding for personnel, training, and survey implementation, and not solely for direct programme implementation. Summary A dedicated corps of trained epidemiologists needs to be identified and made available to undertake surveys in humanitarian emergencies. NGOs in the field may need to form an alliance with certain specialised agencies or pool technically capable personnel. If NGOs continue to do surveys by themselves, a simple training manual with sample survey questionnaires, methodology, standardised files for data entry and analysis, and manual for interpretation should be developed and modified locally for each situation. 
At the beginning of an emergency, a central coordinating body should be established that has sufficient authority to set survey standards, coordinate when and where surveys should be undertaken and act as a survey repository. Technical expertise is expensive and donors must pay for it. As donors increasingly demand evidence-based programming, they have an obligation to ensure that sufficient funds are provided so organisations have adequate technical staff. PMID:17543107

  10. Characterization and scaling of anisotropy of fabrics and fractures at laboratory scales: insights from volumetric analysis using computed tomography

    NASA Astrophysics Data System (ADS)

    Ketcham, Richard A.

    2017-04-01

Anisotropy in three-dimensional quantities such as geometric shape and orientation is commonly quantified using principal components analysis, in which a second order tensor determines the orientations of orthogonal components and their relative magnitudes. This approach has many advantages, such as simplicity, the ability to accommodate many forms of data, and resilience to data sparsity. However, when data are sufficiently plentiful and precise, they sometimes show that aspects of the principal components approach are oversimplifications that may affect how the data are interpreted or extrapolated for mathematical or physical modeling. High-resolution X-ray computed tomography (CT) can effectively extract thousands of measurements from a single sample, providing a data density sufficient to examine the ways in which anisotropy on the hand-sample scale and smaller can be quantified, and the extent to which the ways the data are simplified are faithful to the underlying distributions. Features within CT data can be considered as discrete objects or continuum fabrics; the latter can be characterized using a variety of metrics, such as the most commonly used mean intercept length, and also the more specialized star length and star volume distributions. Each method posits a different scaling among components that affects the measured degree of anisotropy. The star volume distribution is the most sensitive to anisotropy, and commonly distinguishes strong fabric components that are not orthogonal. Although these data are well-presented using a stereoplot, 3D rose diagrams are another visualization option that can often help identify these components. 
This talk presents examples from a number of cases, starting with trabecular bone and extending to geological features such as fractures and brittle and ductile fabrics, in which non-orthogonal principal components identified using CT provide some insight into the origin of the underlying structures, and how they should be interpreted and potentially up-scaled.
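The second-order tensor characterization described in the record above can be sketched with synthetic data. The orientation (fabric) tensor below is the generic construction (the mean of outer products of unit direction vectors), not the specific mean-intercept-length or star-volume estimators discussed in the talk, and the clustered test directions are made up for illustration:

```python
import numpy as np

def fabric_tensor(directions):
    """Second-order orientation tensor of a set of direction vectors
    (one per row): T = mean of outer products v v^T of the unit
    vectors. Returns eigenvalues (descending) and the matching
    eigenvectors (as columns)."""
    v = directions / np.linalg.norm(directions, axis=1, keepdims=True)
    t = np.einsum("ni,nj->ij", v, v) / len(v)
    w, e = np.linalg.eigh(t)            # ascending order for symmetric t
    order = np.argsort(w)[::-1]
    return w[order], e[:, order]

# Synthetic fabric: directions tightly clustered around the z-axis.
rng = np.random.default_rng(0)
d = rng.normal(loc=[0.0, 0.0, 1.0], scale=[0.2, 0.2, 0.05], size=(500, 3))
evals, evecs = fabric_tensor(d)
print(evals)                  # dominant eigenvalue carries most weight
print(evals[0] / evals[-1])   # a simple anisotropy ratio
```

Because the inputs are unit vectors, the three eigenvalues sum to one: an isotropic fabric gives eigenvalues near 1/3 each, while the clustered example concentrates weight on the eigenvector nearest the z-axis. As the abstract notes, strong non-orthogonal fabric components are precisely what such a single orthogonal eigen-decomposition can fail to separate.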

  11. Targeted stock identification using multilocus genotype 'familyprinting'

    USGS Publications Warehouse

    Letcher, B.H.; King, T.L.

    1999-01-01

We present an approach to stock identification of small, targeted populations that uses multilocus microsatellite genotypes of individual mating adults to uniquely identify first- and second-generation offspring in a mixture. We call the approach 'familyprinting'; unlike DNA fingerprinting where tissue samples of individuals are matched, offspring from various families are assigned to pairs of parents or sets of four grandparents with known genotypes. The basic unit of identification is the family, but families can be nested within a variety of stock units ranging from naturally reproducing groups of fish in a small tributary or pond from which mating adults can be sampled to large or small collections of families produced in hatcheries and stocked in specific locations. We show that, with as few as seven alleles per locus using four loci without error, first-generation offspring can be uniquely assigned to the correct family. For second-generation applications in a hatchery more alleles per locus (10) and loci (10) are required for correct assignment of all offspring to the correct set of grandparents. Using microsatellite DNA variation from an Atlantic salmon (Salmo salar) restoration river (Connecticut River, USA), we also show that this population contains sufficient genetic diversity in sea-run returns for 100% correct first-generation assignment and 97% correct second-generation assignment using 14 loci. We are currently using first- and second-generation familyprinting in this population with the ultimate goal of identifying stocking tributary. In addition to within-river familyprinting, there also appears to be sufficient genetic diversity within and between Atlantic salmon populations for identification of 'familyprinted' fish in a mixture of multiple populations. We also suggest that second-generation familyprinting with multiple populations may provide a tool for examining stock structure. 
Familyprinting with microsatellite DNA markers is a viable method for identification of offspring of randomly mating adults from small, targeted stocks and should provide a useful addition to current mixed stock analyses with genetic markers.

  12. Online Learning in Higher Education: Necessary and Sufficient Conditions

    ERIC Educational Resources Information Center

    Lim, Cher Ping

    2005-01-01

    The spectacular development of information and communication technologies through the Internet has provided opportunities for students to explore the virtual world of information. In this article, the author discusses the necessary and sufficient conditions for successful online learning in educational institutions. The necessary conditions…

  13. Predictive sufficiency and the use of stored internal state

    NASA Technical Reports Server (NTRS)

    Musliner, David J.; Durfee, Edmund H.; Shin, Kang G.

    1994-01-01

    In all embedded computing systems, some delay exists between sensing and acting. By choosing an action based on sensed data, a system is essentially predicting that there will be no significant changes in the world during this delay. However, the dynamic and uncertain nature of the real world can make these predictions incorrect, and thus, a system may execute inappropriate actions. Making systems more reactive by decreasing the gap between sensing and action leaves less time for predictions to err, but still provides no principled assurance that they will be correct. Using the concept of predictive sufficiency described in this paper, a system can prove that its predictions are valid, and that it will never execute inappropriate actions. In the context of our CIRCA system, we also show how predictive sufficiency allows a system to guarantee worst-case response times to changes in its environment. Using predictive sufficiency, CIRCA is able to build real-time reactive control plans which provide a sound basis for performance guarantees that are unavailable with other reactive systems.

  14. The K-KIDS Sample: K Dwarfs within 50 Parsecs and the Search for their Closest Companions with CHIRON

    NASA Astrophysics Data System (ADS)

    Paredes-Alvarez, Leonardo; Nusdeo, Daniel Anthony; Henry, Todd J.; Jao, Wei-Chun; Gies, Douglas R.; White, Russel; RECONS Team

    2017-01-01

To understand fundamental aspects of stellar populations, astronomers need carefully vetted, volume-complete samples. In our K-KIDS effort, our goal is to survey a large sample of K dwarfs for their "kids", companions that may be stellar, brown dwarf, or planetary in nature. Four surveys for companions orbiting an initial set of 1048 K dwarfs with declinations between +30 and -30 have begun. Companions are being detected with separations less than 1 AU out to 10000 AU. Fortuitously, the combination of Hipparcos and Gaia DR1 astrometry with optical photometry from APASS and infrared photometry from 2MASS now allows us to create an effectively volume-complete sample of K dwarfs to a horizon of 50 pc. This sample facilitates rigorous studies of the luminosity and mass functions, as well as comprehensive mapping of the companions orbiting K dwarfs that have never before been possible. Here we present two important results. First, we find that our initial sample of ~1000 K dwarfs can be expanded to 2000-3000 stars in what is an effectively volume-complete sample. This population is sufficiently large to provide superb statistics on the outcomes of star and planet formation processes. Second, initial results from our high-precision radial velocity survey of K dwarfs with the CHIRON spectrograph on the CTIO/SMARTS 1.5m reveal its short-term precision and indicate that stellar, brown dwarf and Jovian planets will be detectable. We present radial velocity curves for an initial sample of 8 K dwarfs with V = 7-10 using cross-correlation techniques on R=80,000 spectra, and illustrate the stability of CHIRON over hours, days, and weeks. Ultimately, the combination of all four surveys will provide an unprecedented portrait of K dwarfs and their kids. This effort has been supported by the NSF through grants AST-1412026 and AST-1517413, and via observations made possible by the SMARTS Consortium.
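As a rough illustration of the cross-correlation technique the record above mentions, the sketch below recovers the pixel shift between a template and an "observed" spectrum sampled on a common logarithmic wavelength grid, and converts it to a velocity. The spectra, line positions, and grid spacing are all synthetic stand-ins, not CHIRON data:

```python
import numpy as np

C_KMS = 299792.458  # speed of light in km/s

def ccf_velocity(template, observed, dlnlam):
    """Radial velocity from the peak of the cross-correlation of two
    spectra on a shared log-wavelength grid with step dlnlam; a lag
    of k pixels corresponds to v = c * k * dlnlam."""
    t = template - template.mean()
    o = observed - observed.mean()
    ccf = np.correlate(o, t, mode="full")
    lag = int(np.argmax(ccf)) - (len(t) - 1)   # lag in pixels
    return C_KMS * lag * dlnlam

# Synthetic spectrum: unit continuum with three Gaussian absorption
# lines, then a copy shifted redward by 3 pixels.
n, dlnlam = 2000, 2.0e-6
x = np.arange(n)
template = 1.0 - sum(0.5 * np.exp(-0.5 * ((x - c) / 3.0) ** 2)
                     for c in (300, 800, 1400))
observed = np.roll(template, 3)
print(ccf_velocity(template, observed, dlnlam))   # ~1.8 km/s
```

Real precision-RV pipelines fit the shape of the CCF peak to reach sub-pixel (m/s-level) shifts; this sketch only recovers integer-pixel lags, so it shows the principle rather than the achievable precision.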

  15. Development of a Novel Method for Temporal Analysis of Airborne Microbial Communities

    NASA Astrophysics Data System (ADS)

    Spring, A.; Domingue, K. D.; Mooney, M. M.; Kerber, T. V.; Lemmer, K. M.; Docherty, K. M.

    2017-12-01

    Microorganisms are ubiquitous in the atmosphere, which serves as an important vector for microbial dispersal to all terrestrial habitats. Very little is known about the mechanisms that control microbial dispersal, because sampling of airborne microbial communities beyond 2 m above the ground is limited. The goal of this study was to construct and test an airborne microbial sampling system to collect sufficient DNA for conducting next generation sequencing and microbial community analyses. The system we designed employs helium-filled helikites as a mechanism for launching samplers to various altitudes. The samplers use a passive collection dish system, weigh under 6 lbs and are operated by remote control from the ground. We conducted several troubleshooting experiments to test sampler functionality. We extracted DNA from sterile collection dish surfaces and examined communities using amplicons of the V4 region of 16S rRNA in bacteria using Illumina Mi-Seq. The results of these experiments demonstrate that the samplers we designed 1) remain decontaminated when closed and collect sufficient microbial biomass for DNA-based analyses when open for 6 hours; 2) are optimally decontaminated with 15 minutes of UV exposure; 3) require 8 collection dish surfaces to collect sufficient biomass. We also determined that DNA extraction conducted within 24 hours of collection has less of an impact on community composition than extraction after frozen storage. Using this sampling system, we collected samples from multiple altitudes in December 2016 and May 2017 at 3 sites in Kalamazoo and Pellston, Michigan. In Kalamazoo, areas sampled were primarily developed or agricultural, while in Pellston they were primarily forested. We observed significant differences between airborne bacterial communities collected at each location and time point. 
Additionally, bacterial communities did not differ with altitude, suggesting that terrestrial land use has an important influence on the upward distribution of bacteria. Proteobacteria were predominant in air samples from Kalamazoo, while Firmicutes were more prevalent in Pellston. Our results demonstrate that the sampling platform we designed is a useful tool for exploring ecological questions related to distribution of airborne microbial communities across a vertical transect.

  16. A Communication Framework for Collaborative Defense

    DTIC Science & Technology

    2009-02-28

We have been able to provide sufficient automation to build up the most extensive application signature database in the world with a fraction of... that are well understood in the context of databases. These techniques allow users to quickly scan for the existence of a key in a database...

  17. Quantifying the sources of variability in equine faecal egg counts: implications for improving the utility of the method.

    PubMed

    Denwood, M J; Love, S; Innocent, G T; Matthews, L; McKendrick, I J; Hillary, N; Smith, A; Reid, S W J

    2012-08-13

The faecal egg count (FEC) is the most widely used means of quantifying the nematode burden of horses, and is frequently used in clinical practice to inform treatment and prevention. The statistical process underlying the FEC is complex, comprising a Poisson counting error process for each sample, compounded with an underlying continuous distribution of means between samples. Being able to quantify the sources of variability contributing to this distribution of means is a necessary step towards providing estimates of statistical power for future FEC and FECRT studies, and may help to improve the usefulness of the FEC technique by identifying and minimising unwanted sources of variability. Obtaining such estimates requires a hierarchical statistical model coupled with repeated FEC observations from a single animal over a short period of time. Here, we use this approach to provide the first comparative estimate of multiple sources of within-horse FEC variability. The results demonstrate that a substantial proportion of the observed variation in FEC between horses occurs as a result of variation in FEC within an animal, with the major sources being aggregation of eggs within faeces and variation in egg concentration between faecal piles. The McMaster procedure itself is associated with a comparatively small coefficient of variation, and is therefore highly repeatable when a sufficiently large number of eggs are observed to reduce the error associated with the counting process. We conclude that the variation between samples taken from the same animal is substantial, but can be reduced through the use of larger homogenised faecal samples. Estimates are provided for the coefficient of variation (cv) associated with each within-animal source of variability in observed FEC, allowing the usefulness of individual FEC to be quantified, and providing a basis for future FEC and FECRT studies. Copyright © 2012 Elsevier B.V. All rights reserved.
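The compound error process the abstract describes (Poisson counting error around a continuous distribution of true means) can be sketched with a gamma mixing distribution, a common though not the only choice for such hierarchical count models. All parameter values below are illustrative, not the paper's estimates:

```python
import numpy as np

def simulate_fec(mean_epg, cv_between, grams_counted, n, rng):
    """Replicate faecal egg counts: the true eggs-per-gram (EPG) of
    each sample is gamma distributed with the given between-sample
    coefficient of variation, and the observed count over
    `grams_counted` grams adds Poisson counting error on top."""
    shape = 1.0 / cv_between**2
    true_epg = rng.gamma(shape, mean_epg / shape, size=n)
    return rng.poisson(true_epg * grams_counted)

rng = np.random.default_rng(42)
grams = 0.04                       # faeces effectively examined per count
counts = simulate_fec(500.0, 0.5, grams, 10_000, rng)
epg = counts / grams               # counts rescaled back to EPG
# Counting error inflates the observed cv above the true 0.5;
# examining more grams per sample shrinks it back towards 0.5.
print(epg.mean(), epg.std() / epg.mean())
```

This reproduces the paper's qualitative conclusion: when few eggs are actually counted, Poisson error adds appreciably to the between-sample cv, and larger homogenised samples (bigger `grams_counted`) reduce the observed variability.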

  18. Necessary and sufficient conditions for R₀ to be a sum of contributions of fertility loops.

    PubMed

    Rueffler, Claus; Metz, Johan A J

    2013-03-01

Recently, de-Camino-Beck and Lewis (Bull Math Biol 69:1341-1354, 2007) have presented a method that under certain restricted conditions allows computing the basic reproduction ratio R₀ in a simple manner from life cycle graphs, without, however, giving an explicit indication of these conditions. In this paper, we give various sets of sufficient and generically necessary conditions. To this end, we develop a fully algebraic counterpart of their graph-reduction method, which we actually found more useful in concrete applications. Both methods, if they work, give a simple algebraic formula that can be interpreted as the sum of contributions of all fertility loops. This formula can be used in, e.g., pest control and conservation biology, where it can complement sensitivity and elasticity analyses. The simplest of the necessary and sufficient conditions is that, for irreducible projection matrices, all paths from birth to reproduction have to pass through a common state. This state may be visible in the state representation for the chosen sampling time, but the passing may also occur in between sampling times, such as a seed stage in the case of sampling just before flowering. Note that there may be more than one birth state, as when plants in their first year can already have different sizes at the sampling time. Also, the common state may occur only later in life. However, in all cases R₀ allows a simple interpretation as the expected number of new individuals that in the next generation enter the common state, deriving from a single individual in this state. We end by pointing to some alternative algebraically simple quantities with properties similar to those of R₀ that may sometimes be used to good effect in cases where no simple formula for R₀ exists.
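
    The quantity at issue can also be computed without graph reduction, as the dominant eigenvalue of the next-generation matrix F(I - T)⁻¹, where the projection matrix A = T + F is split into transitions T and fertilities F. A minimal sketch, using a hypothetical two-state life cycle (juvenile and adult) with a single fertility loop, shows the eigenvalue agreeing with the loop contribution s·f:

```python
# Toy life cycle: state 1 = juvenile, state 2 = adult (hypothetical numbers).
s = 0.5   # probability a juvenile survives to adulthood
f = 3.0   # adult fertility (juveniles produced per adult per time step)

T = [[0.0, 0.0],   # transitions: juveniles become adults with probability s
     [s,   0.0]]
F = [[0.0, f],     # fertilities: adults produce juveniles
     [0.0, 0.0]]

def mat2_inv(M):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = M
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def mat2_mul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

# Next-generation matrix R = F (I - T)^(-1); R0 is its dominant eigenvalue.
I_minus_T = [[1.0 - T[0][0], -T[0][1]],
             [-T[1][0], 1.0 - T[1][1]]]
R = mat2_mul(F, mat2_inv(I_minus_T))

tr = R[0][0] + R[1][1]
det = R[0][0] * R[1][1] - R[0][1] * R[1][0]
r0 = (tr + (tr * tr - 4.0 * det) ** 0.5) / 2.0  # larger root, 2x2 char. poly.

# The single fertility loop juvenile -> adult -> juvenile contributes s * f,
# and here that loop contribution is the whole of R0.
print(r0, s * f)
```

    With more states, or several fertility loops without a common state, the eigenvalue computation still works while the simple sum-of-loops formula may not, which is exactly the situation the abstract's conditions delineate.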

  19. Simplified method for detecting tritium contamination in plants and soil

    USGS Publications Warehouse

    Andraski, Brian J.; Sandstrom, M.W.; Michel, R.L.; Radyk, J.C.; Stonestrom, David A.; Johnson, M.J.; Mayers, C.J.

    2003-01-01

    Cost-effective methods are needed to identify the presence and distribution of tritium near radioactive waste disposal and other contaminated sites. The objectives of this study were to (i) develop a simplified sample preparation method for determining tritium contamination in plants and (ii) determine if plant data could be used as an indicator of soil contamination. The method entailed collection and solar distillation of plant water from foliage, followed by filtration and adsorption of scintillation-interfering constituents on a graphite-based solid phase extraction (SPE) column. The method was evaluated using samples of creosote bush [Larrea tridentata (Sessé & Moc. ex DC.) Coville], an evergreen shrub, near a radioactive disposal area in the Mojave Desert. Laboratory tests showed that a 2-g SPE column was necessary and sufficient for accurate determination of known tritium concentrations in plant water. Comparisons of tritium concentrations in plant water determined with the solar distillation–SPE method and the standard (and more laborious) toluene-extraction method showed no significant difference between methods. Tritium concentrations in plant water and in water vapor of root-zone soil also showed no significant difference between methods. Thus, the solar distillation–SPE method provides a simple and cost-effective way to identify plant and soil contamination. The method is of sufficient accuracy to facilitate collection of plume-scale data and optimize placement of more sophisticated (and costly) monitoring equipment at contaminated sites. Although work to date has focused on one desert plant, the approach may be transferable to other species and environments after site-specific experiments.

  20. 3D-Printing for Analytical Ultracentrifugation

    PubMed Central

    Desai, Abhiksha; Krynitsky, Jonathan; Pohida, Thomas J.; Zhao, Huaying

    2016-01-01

    Analytical ultracentrifugation (AUC) is a classical technique of physical biochemistry providing information on size, shape, and interactions of macromolecules from the analysis of their migration in centrifugal fields while free in solution. A key mechanical element in AUC is the centerpiece, a component of the sample cell assembly that is mounted between the optical windows to allow imaging and to seal the sample solution column against high vacuum while exposed to gravitational forces in excess of 300,000 g. For sedimentation velocity it needs to be precisely sector-shaped to allow unimpeded radial macromolecular migration. During the history of AUC a great variety of centerpiece designs have been developed for different types of experiments. Here, we report that centerpieces can now be readily fabricated by 3D printing at low cost, from a variety of materials, and with customized designs. The new centerpieces can exhibit sufficient mechanical stability to withstand the gravitational forces at the highest rotor speeds and be sufficiently precise for sedimentation equilibrium and sedimentation velocity experiments. Sedimentation velocity experiments with bovine serum albumin as a reference molecule in 3D printed centerpieces with standard double-sector design result in sedimentation boundaries virtually indistinguishable from those in commercial double-sector epoxy centerpieces, with sedimentation coefficients well within the range of published values. The statistical error of the measurement is slightly above that obtained with commercial epoxy, but still below 1%. Facilitated by modern open-source design and fabrication paradigms, we believe 3D printed centerpieces and AUC accessories can spawn a variety of improvements in AUC experimental design, efficiency and resource allocation. PMID:27525659

  1. Joint modelling rationale for chained equations

    PubMed Central

    2014-01-01

    Background Chained equations imputation is widely used in medical research. It uses a set of conditional models, so is more flexible than joint modelling imputation for the imputation of different types of variables (e.g. binary, ordinal or unordered categorical). However, chained equations imputation does not correspond to drawing from a joint distribution when the conditional models are incompatible. Concurrently with our work, other authors have shown the equivalence of the two imputation methods in finite samples. Methods Taking a different approach, we prove, in finite samples, sufficient conditions for chained equations and joint modelling to yield imputations from the same predictive distribution. Further, we apply this proof in four specific cases and conduct a simulation study which explores the consequences when the conditional models are compatible but the conditions otherwise are not satisfied. Results We provide an additional “non-informative margins” condition which, together with compatibility, is sufficient. We show that the non-informative margins condition is not satisfied, despite compatible conditional models, in a situation as simple as two continuous variables and one binary variable. Our simulation study demonstrates that as a consequence of this violation order effects can occur; that is, systematic differences depending upon the ordering of the variables in the chained equations algorithm. However, the order effects appear to be small, especially when associations between variables are weak. Conclusions Since chained equations is typically used in medical research for datasets with different types of variables, researchers must be aware that order effects are likely to be ubiquitous, but our results suggest they may be small enough to be negligible. PMID:24559129


  2. Fast experiments for structure elucidation of small molecules: Hadamard NMR with multiple receivers.

    PubMed

    Gierth, Peter; Codina, Anna; Schumann, Frank; Kovacs, Helena; Kupče, Ēriks

    2015-11-01

We propose several significant improvements to the PANSY (Parallel NMR SpectroscopY) experiments, PANSY-COSY and PANSY-TOCSY. The improved versions of these experiments provide sufficient spectral information for structure elucidation of small organic molecules from just two 2D experiments. The PANSY-TOCSY-Q experiment has been modified to allow simultaneous acquisition of three different types of NMR spectra: a 1D C-13 spectrum of non-protonated carbon sites, a 2D TOCSY and a multiplicity-edited 2D HETCOR. In addition, the J-filtered 2D PANSY-gCOSY experiment records a 2D HH gCOSY spectrum in parallel with a (1)J-filtered HC long-range HETCOR spectrum, and offers simplified data processing. Beyond parallel acquisition, further time savings are feasible because of the significantly smaller F1 spectral windows compared with indirect-detection experiments. Use of cryoprobes and multiple receivers can significantly alleviate the sensitivity issues usually associated with so-called direct-detection experiments. In cases where experiments are sampling limited rather than sensitivity limited, further reduction of experiment time is achieved by using Hadamard encoding. In favorable cases the total recording time for the two PANSY experiments can be reduced to just 40 s. The proposed PANSY experiments provide sufficient information to allow the CMCse software package (Bruker) to solve structures of small organic molecules. Copyright © 2015 John Wiley & Sons, Ltd.

  3. Tensile stress induced depolarization in [001]-poled transverse mode Pb(Zn1/3Nb2/3)O3-(6-7)%PbTiO3 single crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shukla, Rahul; Department of Mechanical Engineering, National University of Singapore, Singapore 119260; Lim, Leong-Chew

    2011-04-01

This paper investigates the effects of electrically induced and direct tensile stress on the deformation and dielectric properties of Pb(Zn1/3Nb2/3)O3-(6-7)%PbTiO3 single crystals of [110]^L x [001]^T cut, by using a unimorph sample and a four-point-bend (FPB) sample, respectively. The results show a dip in tip displacement for the unimorph sample at sufficiently high electric field parallel to the poling field direction, and a sudden rise in capacitance for the FPB sample at sufficiently high tensile stress in the [110] crystal direction. These phenomena are attributed to the tensile stress induced rhombohedral-to-orthorhombic phase transition and associated depolarization events in the crystal. For the said crystal cut, the obtained tensile depoling stress is in the range of 15-20 MPa. The present work furthermore shows that tensile stress-induced depolarization can occur even when the direction of the applied electric field is parallel to the poling field direction, as in the unimorph sample examined.

  4. Validation of the German version of the insomnia severity index in adolescents, young adults and adult workers: results from three cross-sectional studies.

    PubMed

    Gerber, Markus; Lang, Christin; Lemola, Sakari; Colledge, Flora; Kalak, Nadeem; Holsboer-Trachsler, Edith; Pühse, Uwe; Brand, Serge

    2016-05-31

    A variety of objective and subjective methods exist to assess insomnia. The Insomnia Severity Index (ISI) was developed to provide a brief self-report instrument useful to assess people's perception of sleep complaints. The ISI was developed in English, and has been translated into several languages including German. Surprisingly, the psychometric properties of the German version have not been evaluated, although the ISI is often used with German-speaking populations. The psychometric properties of the ISI are tested in three independent samples: 1475 adolescents, 862 university students, and 533 police and emergency response service officers. In all three studies, participants provide information about insomnia (ISI), sleep quality (Pittsburgh Sleep Quality Index), and psychological functioning (diverse instruments). Descriptive statistics, gender differences, homogeneity and internal consistency, convergent validity, and factorial validity (including measurement invariance across genders) are examined in each sample. The findings show that the German version of the ISI has generally acceptable psychometric properties and sufficient concurrent validity. Confirmatory factor analyses show that a 1-factor solution achieves good model fit. Furthermore, measurement invariance across gender is supported in all three samples. While the ISI has been widely used in German-speaking countries, this study is the first to provide empirical evidence that the German version of this instrument has good psychometric properties and satisfactory convergent and factorial validity across various age groups and both men and women. Thus, the German version of the ISI can be recommended as a brief screening measure in German-speaking populations.
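
    Internal consistency of the kind examined here is commonly summarized by Cronbach's alpha, which is computed directly from item scores. The sketch below applies the standard formula to synthetic data (one latent trait plus item-specific noise), not to the ISI itself:

```python
import random
import statistics

random.seed(3)

def cronbach_alpha(items):
    """items: one inner list of scores per item, respondents aligned."""
    k = len(items)
    totals = [sum(vals) for vals in zip(*items)]            # per-respondent sums
    item_var = sum(statistics.variance(it) for it in items)
    return k / (k - 1) * (1 - item_var / statistics.variance(totals))

# Synthetic respondents: 7 items, each = latent trait + unit-variance noise.
n = 1000
latent = [random.gauss(0.0, 1.0) for _ in range(n)]
items = [[t + random.gauss(0.0, 1.0) for t in latent] for _ in range(7)]

alpha = cronbach_alpha(items)
print(round(alpha, 2))   # expected around 0.87 for these variances
```

    Stronger item-trait loadings or more items push alpha toward 1; uncorrelated items push it toward 0.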

  5. Revealing stellar brightness profiles by means of microlensing fold caustics

    NASA Astrophysics Data System (ADS)

    Dominik, M.

    2004-09-01

    With a handful of measurements of limb-darkening coefficients, galactic microlensing has already proven to be a powerful technique for studying atmospheres of distant stars. Survey campaigns such as OGLE-III are capable of providing ~10 suitable target stars per year that undergo microlensing events involving passages over the caustic created by a binary lens, which last from a few hours to a few days and allow us to resolve the stellar atmosphere by frequent broad-band photometry. For a caustic exit lasting 12 h and a photometric precision of 1.5 per cent, a moderate sampling interval of 30 min (corresponding to ~25-30 data points) is sufficient for providing a reliable measurement of the linear limb-darkening coefficient Γ with an uncertainty of ~8 per cent, which reduces to ~3 per cent for a reduced sampling interval of 6 min for the surroundings of the end of the caustic exit. While some additional points over the remaining parts of the light curve are highly valuable, a denser sampling in these regions provides little improvement. Unless an accuracy of less than 5 per cent is desired, limb-darkening coefficients for several filters can be obtained or observing time can be spent on other targets during the same night. The adoption of an inappropriate stellar brightness profile as well as the effect of acceleration between source and caustic yield distinguishable characteristic systematics in the model residuals. Acceleration effects are unlikely to affect the light curve significantly for most events, although a free acceleration parameter blurs the limb-darkening measurement if the passage duration cannot be accurately determined.

  6. Monitoring diesel particulate matter and calculating diesel particulate densities using Grimm model 1.109 real-time aerosol monitors in underground mines.

    PubMed

    Kimbal, Kyle C; Pahler, Leon; Larson, Rodney; VanDerslice, Jim

    2012-01-01

Currently, there is no Mine Safety and Health Administration (MSHA)-approved sampling method that provides real-time results for ambient concentrations of diesel particulates. This study investigated whether a commercially available aerosol spectrometer, the Grimm Portable Aerosol Spectrometer Model 1.109, could be used during underground mine operations to provide accurate real-time diesel particulate data relative to MSHA-approved cassette-based sampling methods. A secondary objective was to estimate size-specific diesel particle densities to potentially improve the diesel particulate concentration estimates using the aerosol monitor. Concurrent sampling was conducted during underground metal mine operations using six duplicate diesel particulate cassettes, according to the MSHA-approved method, and two identical Grimm Model 1.109 instruments. Linear regression was used to develop adjustment factors relating the Grimm results to the average of the cassette results. Statistical models using the Grimm data produced predicted diesel particulate concentrations that highly correlated with the time-weighted average cassette results (R² = 0.86, 0.88). Size-specific diesel particulate densities were not constant over the range of particle diameters observed. The variance of the calculated diesel particulate densities by particle diameter size supports the current understanding that diesel emissions are a mixture of particulate aerosols and a complex host of gases and vapors not limited to elemental and organic carbon. Finally, diesel particulate concentrations measured by the Grimm Model 1.109 can be adjusted to provide sufficiently accurate real-time air monitoring data for an underground mining environment.
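
    The regression-adjustment step can be sketched with synthetic paired readings (hypothetical numbers standing in for the cassette and Grimm results): fit cassette ~ Grimm by ordinary least squares, then use the fitted line to adjust later real-time readings.

```python
import random
import statistics

random.seed(7)

# Hypothetical concurrent pairs: time-weighted average cassette results
# (ug/m^3) and raw readings from the real-time aerosol monitor.
n = 40
cassette = [random.uniform(50.0, 300.0) for _ in range(n)]
grimm = [0.6 * c + 10.0 + random.gauss(0.0, 12.0) for c in cassette]

# OLS fit of cassette ~ grimm gives the adjustment factors.
mg, mc = statistics.fmean(grimm), statistics.fmean(cassette)
slope = sum((g - mg) * (c - mc) for g, c in zip(grimm, cassette)) / \
        sum((g - mg) ** 2 for g in grimm)
intercept = mc - slope * mg

predicted = [intercept + slope * g for g in grimm]
ss_res = sum((c - p) ** 2 for c, p in zip(cassette, predicted))
ss_tot = sum((c - mc) ** 2 for c in cassette)
r2 = 1.0 - ss_res / ss_tot
print(round(r2, 2))   # high R^2: adjusted monitor readings track the cassettes
```

    Applying `intercept + slope * reading` to subsequent monitor output yields the real-time estimates the study validated against the cassette method.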

  7. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data

    PubMed Central

    Denis, Jean-Baptiste; Vandenbogaert, Mathias; Caro, Valérie

    2016-01-01

The detection and characterization of emerging infectious agents has been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Most usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface that facilitates management and bioinformatics analysis of metagenomics data samples. It was engineered to run associated, dedicated Galaxy workflows for the detection and, eventually, classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows and facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users’ input data and its metadata through a set of bio-IT resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user’s input data from loading, indexing, mapping and assembly to DB searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy’s main features. Metadata about samples and runs, as well as the workflow results, are stored in the LIMS.
For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration of intuitive exploratory tools, like Krona for representation of taxonomic classification, can be achieved very easily. In the spirit of Galaxy, the interface enables the sharing of scientific results with fellow team members. PMID:28451381

  8. MetaGenSense: A web-application for analysis and exploration of high throughput sequencing metagenomic data.

    PubMed

    Correia, Damien; Doppelt-Azeroual, Olivia; Denis, Jean-Baptiste; Vandenbogaert, Mathias; Caro, Valérie

    2015-01-01

The detection and characterization of emerging infectious agents has been a continuing public health concern. High Throughput Sequencing (HTS) or Next-Generation Sequencing (NGS) technologies have proven to be promising approaches for efficient and unbiased detection of pathogens in complex biological samples, providing access to comprehensive analyses. As NGS approaches typically yield millions of putatively representative reads per sample, efficient data management and visualization resources have become mandatory. Most usually, those resources are implemented through a dedicated Laboratory Information Management System (LIMS), solely to provide perspective regarding the available information. We developed an easily deployable web interface that facilitates management and bioinformatics analysis of metagenomics data samples. It was engineered to run associated, dedicated Galaxy workflows for the detection and, eventually, classification of pathogens. The web application allows easy interaction with existing Galaxy metagenomic workflows and facilitates the organization, exploration and aggregation of the most relevant sample-specific sequences among millions of genomic sequences, allowing users to determine their relative abundance and associate them with the most closely related organism or pathogen. The user-friendly Django-based interface associates the users' input data and its metadata through a set of bio-IT resources (a Galaxy instance, and both sufficient storage and grid computing power). Galaxy is used to handle and analyze the user's input data from loading, indexing, mapping and assembly to DB searches. Interaction between our application and Galaxy is ensured by the BioBlend library, which gives API-based access to Galaxy's main features. Metadata about samples and runs, as well as the workflow results, are stored in the LIMS.
For metagenomic classification and exploration purposes, we show, as a proof of concept, that integration of intuitive exploratory tools, like Krona for representation of taxonomic classification, can be achieved very easily. In the spirit of Galaxy, the interface enables the sharing of scientific results with fellow team members.

  9. EBUS-Guided Cautery-Assisted Transbronchial Forceps Biopsies: Safety and Sensitivity Relative to Transbronchial Needle Aspiration

    PubMed Central

    Bramley, Kyle; Pisani, Margaret A.; Murphy, Terrence E.; Araujo, Katy; Homer, Robert; Puchalski, Jonathan

    2016-01-01

Background EBUS-guided transbronchial needle aspiration (TBNA) is important in the evaluation of thoracic lymphadenopathy. While it reliably provides excellent diagnostic yield for malignancy, its diagnosis of sarcoidosis is inconsistent. Furthermore, when larger “core” biopsy samples of malignant tissue are required, TBNA may not suffice. The primary objective of this study was to determine if the sequential use of TBNA and a novel technique called cautery-assisted transbronchial forceps biopsies (ca-TBFB) was safe. Secondary outcomes included sensitivity and successful acquisition of tissue. Methods Fifty unselected patients undergoing convex probe EBUS were prospectively enrolled. Under EBUS guidance, all lymph nodes ≥ 1 cm were sequentially biopsied using TBNA and ca-TBFB. Safety and sensitivity were assessed at the nodal level for 111 nodes. Results of each technique were also reported on a per-patient basis. Results There were no significant adverse events. In nodes determined to be malignant, TBNA provided higher sensitivity (100%) than ca-TBFB (78%). However, among nodes with granulomatous inflammation, ca-TBFB exhibited higher sensitivity (90%) than TBNA (33%). For analysis based on patients rather than nodes, 6 of the 31 patients with malignancy would have been missed or understaged if the diagnosis were based on samples obtained by ca-TBFB. On the other hand, 3 of 8 patients with sarcoidosis would have been missed if analysis were based only on TBNA samples. In some cases, only ca-TBFB acquired sufficient tissue for the core samples needed in clinical trials of malignancy. Conclusions The sequential use of TBNA and ca-TBFB appears to be safe. The larger samples obtained from ca-TBFB increased its sensitivity to detect granulomatous disease and provided specimens for clinical trials of malignancy when needle biopsies were insufficient. For thoracic surgeons and advanced bronchoscopists, we advocate ca-TBFB as an alternative to TBNA in select clinical scenarios.
PMID:26912301

  10. Endobronchial Ultrasound-Guided Cautery-Assisted Transbronchial Forceps Biopsies: Safety and Sensitivity Relative to Transbronchial Needle Aspiration.

    PubMed

    Bramley, Kyle; Pisani, Margaret A; Murphy, Terrence E; Araujo, Katy L; Homer, Robert J; Puchalski, Jonathan T

    2016-05-01

Endobronchial ultrasound (EBUS)-guided transbronchial needle aspiration (TBNA) is important in the evaluation of thoracic lymphadenopathy. While it reliably provides excellent diagnostic yield for malignancy, its diagnosis of sarcoidosis is inconsistent. Furthermore, TBNA may not suffice when larger "core biopsy" samples of malignant tissue are required. The primary objective of this study was to determine if the sequential use of TBNA and a novel technique called cautery-assisted transbronchial forceps biopsy (ca-TBFB) was safe. Secondary outcomes included sensitivity and successful acquisition of tissue. The study prospectively enrolled 50 unselected patients undergoing convex-probe EBUS. All lymph nodes exceeding 1 cm were sequentially biopsied under EBUS guidance using TBNA and ca-TBFB. Safety and sensitivity were assessed at the nodal level for 111 nodes. Results of each technique were also reported for each patient. There were no significant adverse events. In nodes determined to be malignant, TBNA provided higher sensitivity (100%) than ca-TBFB (78%). However, among nodes with granulomatous inflammation, ca-TBFB exhibited higher sensitivity (90%) than TBNA (33%). On the one hand, for analysis based on patients rather than nodes, 6 of the 31 patients with malignancy would have been missed or understaged if the diagnosis were based on samples obtained by ca-TBFB. On the other hand, 3 of 8 patients with sarcoidosis would have been missed if analysis were based only on TBNA samples. In some patients, only ca-TBFB acquired sufficient tissue for the core samples needed in clinical trials of malignancy. The sequential use of TBNA and ca-TBFB appears to be safe. The larger samples obtained from ca-TBFB increased its sensitivity to detect granulomatous disease and provided adequate specimens for clinical trials of malignancy when specimens from needle biopsies were insufficient.
For thoracic surgeons and advanced bronchoscopists, we advocate ca-TBFB as an alternative to TBNA in select clinical scenarios. Copyright © 2016 The Society of Thoracic Surgeons. Published by Elsevier Inc. All rights reserved.

  11. Entropy from State Probabilities: Hydration Entropy of Cations

    PubMed Central

    2013-01-01

    Entropy is an important energetic quantity determining the progression of chemical processes. We propose a new approach to obtain hydration entropy directly from probability density functions in state space. We demonstrate the validity of our approach for a series of cations in aqueous solution. Extensive validation of simulation results was performed. Our approach does not make prior assumptions about the shape of the potential energy landscape and is capable of calculating accurate hydration entropy values. Sampling times in the low nanosecond range are sufficient for the investigated ionic systems. Although the presented strategy is at the moment limited to systems for which a scalar order parameter can be derived, this is not a principal limitation of the method. The strategy presented is applicable to any chemical system where sufficient sampling of conformational space is accessible, for example, by computer simulations. PMID:23651109
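
    The core idea, obtaining entropy directly from estimated state probabilities of a scalar order parameter, can be sketched with a histogram estimator. The samples below come from a known Gaussian so the result can be checked against the analytic differential entropy; this is a generic illustration, not the authors' exact estimator.

```python
import math
import random

random.seed(11)

# Samples of a scalar order parameter; a unit Gaussian stands in for a
# simulated trajectory so the answer is known analytically.
samples = [random.gauss(0.0, 1.0) for _ in range(200000)]

lo, width, nbins = -6.0, 0.1, 120
hist = [0] * nbins
for s in samples:
    i = int((s - lo) / width)
    if 0 <= i < nbins:
        hist[i] += 1

n = len(samples)
# Differential entropy from state probabilities: S = -sum p * ln(p / width).
entropy = -sum(h / n * math.log(h / (n * width)) for h in hist if h > 0)

analytic = 0.5 * math.log(2.0 * math.pi * math.e)  # ~1.419 nats for sd = 1
print(round(entropy, 3), round(analytic, 3))
```

    The estimate needs enough samples per occupied bin, which is the practical meaning of the "sufficient sampling of conformational space" caveat in the abstract.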

  12. Image subsampling and point scoring approaches for large-scale marine benthic monitoring programs

    NASA Astrophysics Data System (ADS)

    Perkins, Nicholas R.; Foster, Scott D.; Hill, Nicole A.; Barrett, Neville S.

    2016-07-01

Benthic imagery is an effective tool for quantitative description of ecologically and economically important benthic habitats and biota. The recent development of autonomous underwater vehicles (AUVs) allows surveying of spatial scales that were previously unfeasible. However, an AUV collects a large number of images, the scoring of which is time and labour intensive. There is a need to optimise the way that subsamples of imagery are chosen and scored to gain meaningful inferences for ecological monitoring studies. We examine the trade-off between the number of images selected within transects and the number of random points scored within images on the percent cover of target biota, the typical output of such monitoring programs. We also investigate the efficacy of various image selection approaches, such as systematic or random, on the bias and precision of cover estimates. We use simulated biotas that have varying size, abundance and distributional patterns. We find that a relatively small sampling effort is required to minimise bias. Increased precision for groups that are likely to be the focus of monitoring programs is best gained through increasing the number of images sampled rather than the number of points scored within images. For rare species, sampling using point count approaches is unlikely to provide sufficient precision, and alternative sampling approaches may need to be employed. The approach by which images are selected (simple random sampling, regular spacing, etc.) had no discernible effect on mean and variance estimates, regardless of the distributional pattern of biota. Field validation of our findings is provided through Monte Carlo resampling analysis of a previously scored benthic survey from temperate waters. We show that point count sampling approaches are capable of providing relatively precise cover estimates for candidate groups that are not overly rare.
The amount of sampling required, in terms of both the number of images and number of points, varies with the abundance, size and distributional pattern of target biota. Therefore, we advocate either the incorporation of prior knowledge or the use of baseline surveys to establish key properties of intended target biota in the initial stages of monitoring programs.
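
    The images-versus-points trade-off can be reproduced with a small Monte Carlo experiment (hypothetical parameters): true cover varies between images and point scoring adds binomial error, so with total effort fixed, spreading points over more images gives the more precise cover estimate.

```python
import random
import statistics

random.seed(5)

def estimate_cover(n_images, n_points):
    """Mean scored cover over images whose true cover varies between images."""
    est = []
    for _ in range(n_images):
        p = random.betavariate(2.0, 8.0)            # true cover (~20% on average)
        hits = sum(random.random() < p for _ in range(n_points))
        est.append(hits / n_points)
    return sum(est) / n_images

def sd_of_design(n_images, n_points, reps=2000):
    """Spread of the cover estimate under a given subsampling design."""
    return statistics.stdev(estimate_cover(n_images, n_points)
                            for _ in range(reps))

# Same total effort (500 scored points) split two ways.
sd_many_images = sd_of_design(20, 25)   # many images, few points each
sd_few_images = sd_of_design(5, 100)    # few images, many points each
print(round(sd_many_images, 3), round(sd_few_images, 3))
```

    The between-image variance term shrinks only with the number of images, which is why extra points within an image quickly stop paying off.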

  13. Genotyping of ancient Mycobacterium tuberculosis strains reveals historic genetic diversity.

    PubMed

    Müller, Romy; Roberts, Charlotte A; Brown, Terence A

    2014-04-22

The evolutionary history of the Mycobacterium tuberculosis complex (MTBC) has previously been studied by analysis of sequence diversity in extant strains, but not addressed by direct examination of strain genotypes in archaeological remains. Here, we use ancient DNA sequencing to type 11 single nucleotide polymorphisms and two large sequence polymorphisms in the MTBC strains present in 10 archaeological samples from skeletons from Britain and Europe dating to the second to nineteenth centuries AD. The results enable us to assign the strains to groupings and lineages recognized in the extant MTBC. We show that at least during the eighteenth and nineteenth centuries AD, strains of M. tuberculosis belonging to different genetic groups were present in Britain at the same time, possibly even at a single location, and we present evidence for a mixed infection in at least one individual. Our study shows that ancient DNA typing applied to multiple samples can provide sufficiently detailed information to contribute to both archaeological and evolutionary knowledge of the history of tuberculosis.

  14. Rapid assessment of target species: Byssate bivalves in a large tropical port.

    PubMed

    Minchin, Dan; Olenin, Sergej; Liu, Ta-Kang; Cheng, Muhan; Huang, Sheng-Chih

    2016-11-15

    Rapid assessment sampling for target species is a fast cost-effective method aimed at determining the presence, abundance and distribution of alien and native harmful aquatic organisms and pathogens that may have been introduced by shipping. In this study, the method was applied within a large tropical port expected to have a high species diversity. The port of Kaohsiung was sampled for bivalve molluscan species that attach using a byssus. Such species, due to their biological traits, are spread by ships to ports worldwide. We estimated the abundance and distribution range of one dreissenid (Mytilopsis sallei) and four mytilids (Brachidontes variabilis, Arcuatula senhousa, Mytilus galloprovincialis, Perna viridis) known to be successful invaders and identified as potential pests, or high-risk harmful native or non-native species. We conclude that a rapid assessment of their abundance and distribution within a port, and its vicinity, is efficient and can provide sufficient information for decision making by port managers where IMO port exemptions may be sought. Copyright © 2016. Published by Elsevier Ltd.

  15. Towards real-time metabolic profiling of a biopsy specimen during a surgical operation by 1H high resolution magic angle spinning nuclear magnetic resonance: a case report

    PubMed Central

    2012-01-01

    Introduction Providing information on cancerous tissue samples during a surgical operation can help surgeons delineate the limits of a tumoral invasion more reliably. Here, we describe the use of metabolic profiling of a colon biopsy specimen by high resolution magic angle spinning nuclear magnetic resonance spectroscopy to evaluate tumoral invasion during a simulated surgical operation. Case presentation Biopsy specimens (n = 9) originating from the excised right colon of a 66-year-old Caucasian woman with an adenocarcinoma were automatically analyzed using a previously built statistical model. Conclusions Metabolic profiling results were in full agreement with those of a histopathological analysis. The time-response of the technique is sufficiently fast for it to be used effectively during a real operation (17 min/sample). Metabolic profiling has the potential to become a method to rapidly characterize cancerous biopsies in the operation theater. PMID:22257563

  16. A simple autocorrelation algorithm for determining grain size from digital images of sediment

    USGS Publications Warehouse

    Rubin, D.M.

    2004-01-01

    Autocorrelation between pixels in digital images of sediment can be used to measure average grain size of sediment on the bed, grain-size distribution of bed sediment, and vertical profiles in grain size in a cross-sectional image through a bed. The technique is less sensitive than traditional laboratory analyses to tails of a grain-size distribution, but it offers substantial other advantages: it is 100 times as fast; it is ideal for sampling surficial sediment (the part that interacts with a flow); it can determine vertical profiles in grain size on a scale finer than can be sampled physically; and it can be used in the field to provide almost real-time grain-size analysis. The technique can be applied to digital images obtained using any source with sufficient resolution, including digital cameras, digital video, or underwater digital microscopes (for real-time grain-size mapping of the bed). ?? 2004, SEPM (Society for Sedimentary Geology).
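    The idea of the record above can be sketched in a few lines: spatial autocorrelation of image intensity decays over a distance set by the dominant grain size, so coarse texture stays correlated at larger pixel offsets than fine texture. This is an illustrative sketch with synthetic data, not Rubin's published algorithm; all parameter choices are assumptions.

```python
import numpy as np

def autocorrelation_profile(image, max_offset):
    """Normalized autocorrelation of a grayscale image versus horizontal
    pixel offset. The decay distance scales with the dominant grain size,
    which is the basis of image-based grain sizing."""
    img = image.astype(float)
    img -= img.mean()
    var = (img * img).mean()
    profile = [1.0]  # zero offset: perfectly correlated with itself
    for k in range(1, max_offset + 1):
        # correlate each pixel with its neighbor k pixels to the right
        profile.append((img[:, :-k] * img[:, k:]).mean() / var)
    return np.array(profile)

# synthetic "sediment": coarse blobs (4x4 blocks) decorrelate more slowly
# than fine pixel-scale noise
rng = np.random.default_rng(0)
fine = rng.normal(size=(64, 64))
coarse = np.repeat(np.repeat(rng.normal(size=(16, 16)), 4, axis=0), 4, axis=1)
p_fine = autocorrelation_profile(fine, 3)
p_coarse = autocorrelation_profile(coarse, 3)
```

In a calibrated application, the measured profile would be compared against profiles from images of sieved sediment of known size.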

  17. Effect of cooking procedures of kiymali pide, a traditional Turkish fast-food, on destruction of Escherichia coli O157:H7.

    PubMed

    Ilhak, Osman İrfan; Dikici, Abdullah; Can, Ozlem Pelin; Seker, Pınar; Oksüztepe, Gülsüm; Calıcıoğlu, Mehmet

    2013-06-01

    The objective of the present study was to obtain data about cooking time and temperature of kiymali pide in restaurants and to investigate thermal inactivation of E. coli O157:H7 during experimental kiymali pide making. A field study was conducted in 23 restaurants randomly selected from the 87 pide restaurants. Processing parameters including oven temperature, cooking period and post-cooking temperature were determined. Kiymali pide samples were prepared using ground beef filling experimentally inoculated with E. coli O157:H7 (7.6 log10 CFU/g). Pide samples were cooked in a conventional oven at 180 °C for 180, 240, 270, 300 and 330 s. Results of the current study suggest that cooking kiymali pide at 180 °C for at least 330 s (5.5 min) may provide sufficient food safety assurance (≥6 log10 CFU/g reduction) for E. coli O157:H7. Copyright © 2013 Elsevier Ltd. All rights reserved.

  18. Wang-Landau Reaction Ensemble Method: Simulation of Weak Polyelectrolytes and General Acid-Base Reactions.

    PubMed

    Landsgesell, Jonas; Holm, Christian; Smiatek, Jens

    2017-02-14

    We present a novel method for the study of weak polyelectrolytes and general acid-base reactions in molecular dynamics and Monte Carlo simulations. The approach combines the advantages of the reaction ensemble and the Wang-Landau sampling method. Deprotonation and protonation reactions are simulated explicitly with the help of the reaction ensemble method, while the accurate sampling of the corresponding phase space is achieved by the Wang-Landau approach. The combination of both techniques provides sufficient statistical accuracy such that meaningful estimates for the density of states and the partition sum can be obtained. With regard to these estimates, several thermodynamic observables like the heat capacity or reaction free energies can be calculated. We demonstrate that the computation times for the calculation of titration curves with a high statistical accuracy can be significantly decreased when compared to the original reaction ensemble method. The applicability of our approach is validated by the study of weak polyelectrolytes and their thermodynamic properties.
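    The flat-histogram idea behind Wang-Landau sampling can be illustrated on a toy system (this sketch is not the reaction-ensemble coupling the abstract describes, and all parameters are illustrative assumptions): estimate the density of states g(E) for n independent two-state "coins", where E is the number of up states and the exact answer is the binomial coefficient C(n, E).

```python
import math
import random

def wang_landau_coins(n=10, f_final=1e-4, flat=0.8, seed=1):
    """Wang-Landau estimate of the density of states g(E) for a toy
    system: E = number of 'up' coins among n. Moves that lower the
    current g(E) estimate are favored, producing a flat histogram in E."""
    random.seed(seed)
    state = [0] * n
    E = 0
    lng = [0.0] * (n + 1)   # running log density of states
    hist = [0] * (n + 1)    # visit histogram at the current level
    lnf = 1.0               # modification factor, halved when hist is flat
    while lnf > f_final:
        i = random.randrange(n)                    # propose one coin flip
        E_new = E + (1 if state[i] == 0 else -1)
        # accept with probability min(1, g(E)/g(E_new))
        if math.log(random.random()) < lng[E] - lng[E_new]:
            state[i] ^= 1
            E = E_new
        lng[E] += lnf
        hist[E] += 1
        if min(hist) > flat * (sum(hist) / len(hist)):
            hist = [0] * (n + 1)
            lnf /= 2.0
    # normalize so that g(0) = 1 (exactly one all-down configuration)
    return [math.exp(v - lng[0]) for v in lng]

g = wang_landau_coins()
```

With n = 10 the estimate for g(5) should land near the exact C(10, 5) = 252, with accuracy governed by the final modification factor.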

  19. MPA Portable: A Stand-Alone Software Package for Analyzing Metaproteome Samples on the Go.

    PubMed

    Muth, Thilo; Kohrs, Fabian; Heyer, Robert; Benndorf, Dirk; Rapp, Erdmann; Reichl, Udo; Martens, Lennart; Renard, Bernhard Y

    2018-01-02

    Metaproteomics, the mass spectrometry-based analysis of proteins from multispecies samples, faces severe challenges concerning data analysis and results interpretation. To overcome these shortcomings, we here introduce the MetaProteomeAnalyzer (MPA) Portable software. In contrast to the original server-based MPA application, this newly developed tool no longer requires computational expertise for installation and is now independent of any relational database system. In addition, MPA Portable now supports state-of-the-art database search engines and a convenient command line interface for high-performance data processing tasks. While search engine results can easily be combined to increase the protein identification yield, an additional two-step workflow is implemented to provide sufficient analysis resolution for further postprocessing steps, such as protein grouping as well as taxonomic and functional annotation. Our new application has been developed with a focus on intuitive usability, adherence to data standards, and adaptation to Web-based workflow platforms. The open source software package can be found at https://github.com/compomics/meta-proteome-analyzer.

  20. Ghost Particle Velocimetry: Accurate 3D Flow Visualization Using Standard Lab Equipment

    NASA Astrophysics Data System (ADS)

    Buzzaccaro, Stefano; Secchi, Eleonora; Piazza, Roberto

    2013-07-01

    We describe and test a new approach to particle velocimetry, based on imaging and cross correlating the scattering speckle pattern generated on a near-field plane by flowing tracers with a size far below the diffraction limit, which allows reconstructing the velocity pattern in microfluidic channels without perturbing the flow. As a matter of fact, adding tracers is not even strictly required, provided that the sample displays sufficiently strong refractive-index fluctuations. For instance, phase separation in liquid mixtures in the presence of shear is suitable to be directly investigated by this “ghost particle velocimetry” technique, which just requires a microscope with standard lamp illumination equipped with a low-cost digital camera. As a further bonus, the peculiar spatial coherence properties of the illuminating source, which displays a finite longitudinal coherence length, allow for a 3D reconstruction of the profile with a resolution of a few tens of microns and make the technique suitable for investigating turbid samples with negligible multiple scattering effects.

  1. Optical Measurement of Radiocarbon below Unity Fraction Modern by Linear Absorption Spectroscopy.

    PubMed

    Fleisher, Adam J; Long, David A; Liu, Qingnan; Gameson, Lyn; Hodges, Joseph T

    2017-09-21

    High-precision measurements of radiocarbon (¹⁴C) near or below a fraction modern ¹⁴C of 1 (F¹⁴C ≤ 1) are challenging and costly. An accurate, ultrasensitive linear absorption approach to detecting ¹⁴C would provide a simple and robust benchtop alternative to off-site accelerator mass spectrometry facilities. Here we report the quantitative measurement of ¹⁴C in gas-phase samples of CO₂ with F¹⁴C < 1 using cavity ring-down spectroscopy in the linear absorption regime. Repeated analysis of CO₂ derived from the combustion of either biogenic or petrogenic sources revealed a robust ability to differentiate samples with F¹⁴C < 1. With a combined uncertainty of ¹⁴C/¹²C = 130 fmol/mol (F¹⁴C = 0.11), initial performance of the calibration-free instrument is sufficient to investigate a variety of applications in radiocarbon measurement science including the study of biofuels and bioplastics, illicitly traded specimens, bomb dating, and atmospheric transport.

  2. Blood platelet counts, morphology and morphometry in lions, Panthera leo.

    PubMed

    Du Plessis, L

    2009-09-01

    Due to logistical problems in obtaining sufficient blood samples from apparently healthy animals in the wild in order to establish normal haematological reference values, only limited information regarding the blood platelet count and morphology of free-living lions (Panthera leo) is available. This study provides information on platelet counts and describes their morphology with particular reference to size in two normal, healthy and free-ranging lion populations. Blood samples were collected from a total of 16 lions. Platelet counts, determined manually, ranged between 218 and 358 x 10(9)/l. Light microscopy showed mostly activated platelets of various sizes with prominent granules. At the ultrastructural level the platelets revealed typical mammalian platelet morphology. However, morphometric analysis revealed a significant difference (P < 0.001) in platelet size between the two groups of animals. Basic haematological information obtained in this study may be helpful in future comparative studies between animals of the same species as well as in other felids.

  3. The Army Communications Objectives Measurement System (ACOMS): Survey Design

    DTIC Science & Technology

    1988-04-01

    monthly basis so that the annual sample includes sufficient Hispanics to detect at the .80 power level: (1) Year-to-year changes of 3% in item...Hispanics. The requirements are listed in terms of power level and must be translated into requisite sample sizes. The requirements are expressed as the...annual samples needed to detect certain differences at the 80% power level. Differences in both directions are to be examined, so that a two-tailed

  4. Development of the Potassium-Argon Laser Experiment (KArLE) for In Situ Geochronology

    NASA Technical Reports Server (NTRS)

    Cohen, Barbara A.

    2012-01-01

    Absolute dating of planetary samples is an essential tool to establish the chronology of geological events, including crystallization history, magmatic evolution, and alteration. Thus far, radiometric geochronology of planetary samples has only been accomplished in terrestrial laboratories on samples from dedicated sample return missions and meteorites. In situ instruments to measure rock ages have been proposed, but none have yet reached TRL 6, because isotopic measurements with sufficient resolution are challenging. We have begun work under the NASA Planetary Instrument Definition and Development Program (PIDDP) to develop the Potassium (K) - Argon Laser Experiment (KArLE), a novel combination of several flight-proven components that will enable accurate K-Ar isochron dating of planetary rocks. KArLE will ablate a rock sample, measure the K in the plasma state using laser-induced breakdown spectroscopy (LIBS), measure the liberated Ar using quadrupole mass spectrometry (QMS), and relate the two by measuring the volume of the ablated pit using optical methods such as vertical scanning interferometry (VSI). Our preliminary work indicates that the KArLE instrument will be capable of determining the age of several kinds of planetary samples to within 100 Myr, sufficient to address a wide range of geochronology problems in planetary science. Additional benefits derive from the fact that each KArLE component achieves analyses common to most planetary surface missions.

  5. 75 FR 81789 - Third Party Testing for Certain Children's Products; Full-Size Baby Cribs and Non-Full-Size Baby...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-28

    ... sufficient samples of the product, or samples that are identical in all material respects to the product. The... 1220, Safety Standards for Full-Size Baby Cribs and Non-Full- Size Baby Cribs. A true copy, in English... assessment bodies seeking accredited status must submit to the Commission copies, in English, of their...

  6. Characterizing the Conformational Landscape of Flavivirus Fusion Peptides via Simulation and Experiment

    PubMed Central

    Marzinek, Jan K.; Lakshminarayanan, Rajamani; Goh, Eunice; Huber, Roland G.; Panzade, Sadhana; Verma, Chandra; Bond, Peter J.

    2016-01-01

    Conformational changes in the envelope proteins of flaviviruses help to expose the highly conserved fusion peptide (FP), a region which is critical to membrane fusion and host cell infection, and which represents a significant target for antiviral drugs and antibodies. In principle, extended timescale atomic-resolution simulations may be used to characterize the dynamics of such peptides. However, the resultant accuracy is critically dependent upon both the underlying force field and sufficient conformational sampling. In the present study, we report a comprehensive comparison of three simulation methods and four force fields comprising a total of more than 40 μs of sampling. Additionally, we describe the conformational landscape of the FP fold across all flavivirus family members. All investigated methods sampled conformations close to available X-ray structures, but exhibited differently populated ensembles. The best force field / sampling combination was sufficiently accurate to predict that the solvated peptide fold is less ordered than in the crystallographic state, which was subsequently confirmed via circular dichroism and spectrofluorometric measurements. Finally, the conformational landscape of a mutant incapable of membrane fusion was significantly shallower than wild-type variants, suggesting that dynamics should be considered when therapeutically targeting FP epitopes. PMID:26785994

  7. The Preignition and Autoignition Oxidation of Alternatives to Petroleum Derived JP-8 and their Surrogate Components in a Pressurized Flow Reactor and Single Cylinder Research Engine

    DTIC Science & Technology

    2009-09-01

    sample probe consisted of TIG welding the 3/8” sample probe shaft to the sample probe tip (Koert, 1990 and Lenhert, 2004b). Silver solder was...was performed in the Drexel University Machine Shop. Conventional TIG welding was sufficient for welding the 3/8” O.D. tube to the sample probe tip...However, to TIG weld the thermocouple and the glass lined tube to the sample probe tip, extreme care had to be taken so as not to damage the

  8. Continual in situ monitoring of pore water stable isotopes in the subsurface

    NASA Astrophysics Data System (ADS)

    Volkmann, T. H. M.; Weiler, M.

    2014-05-01

    Stable isotope signatures provide an integral fingerprint of origin, flow paths, transport processes, and residence times of water in the environment. However, the full potential of stable isotopes to quantitatively characterize subsurface water dynamics has yet to be realized due to the difficulty of obtaining extensive, detailed, and repeated measurements of pore water in the unsaturated and saturated zones. This paper presents a functional and cost-efficient system for non-destructive continual in situ monitoring of pore water stable isotope signatures with high resolution. Automatically controllable valve arrays are used to continuously extract diluted water vapor in soil air via a branching network of small microporous probes into a commercial laser-based isotope analyzer. Normalized liquid-phase isotope signatures are then obtained based on a specific on-site calibration approach along with basic corrections for instrument bias and temperature-dependent isotopic fractionation. The system was applied to sample depth profiles on three experimental plots with varied vegetation cover in southwest Germany. Two methods (i.e., based on advective versus diffusive vapor extraction) and two modes of sampling (i.e., using multiple permanently installed probes versus a single repeatedly inserted probe) were tested and compared. The results show that the isotope distribution along natural profiles could be resolved with sufficiently high accuracy and precision at sampling intervals of less than four minutes. The presented in situ approaches may thereby be used interchangeably with each other and with concurrent laboratory-based direct equilibration measurements of destructively collected samples. The introduced sampling techniques thus provide powerful tools towards a detailed quantitative understanding of dynamic and heterogeneous shallow subsurface and vadose zone processes.

  9. Improved analysis of SP and CoSaMP under total perturbations

    NASA Astrophysics Data System (ADS)

    Li, Haifeng

    2016-12-01

    Practically, in the underdetermined model y = Ax, where x is a K-sparse vector (i.e., it has no more than K nonzero entries), both y and A could be totally perturbed. A more relaxed condition means fewer measurements are needed to ensure sparse recovery from a theoretical aspect. In this paper, based on the restricted isometry property (RIP), two relaxed sufficient conditions are presented for subspace pursuit (SP) and compressed sampling matching pursuit (CoSaMP) under total perturbations to guarantee that the sparse vector x is recovered. Taking a random matrix as the measurement matrix, we also discuss the advantage of our condition. Numerical experiments validate that SP and CoSaMP can provide oracle-order recovery performance.
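    For orientation, a minimal noiseless CoSaMP iteration (signal proxy, support merge, least squares, prune) can be sketched as follows. This is a textbook sketch of the algorithm the record analyzes, not the paper's perturbed setting; the dimensions and the Gaussian measurement matrix are illustrative assumptions.

```python
import numpy as np

def cosamp(A, y, K, iters=20):
    """Minimal CoSaMP sketch for recovering a K-sparse x from y = A x."""
    n = A.shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        r = y - A @ x                               # current residual
        proxy = np.abs(A.T @ r)                     # signal proxy
        omega = np.argsort(proxy)[-2 * K:]          # 2K largest proxy entries
        T = np.union1d(omega, np.flatnonzero(x)).astype(int)
        b = np.zeros(n)
        b[T] = np.linalg.lstsq(A[:, T], y, rcond=None)[0]  # LS on merged support
        keep = np.argsort(np.abs(b))[-K:]           # prune back to K entries
        x = np.zeros(n)
        x[keep] = b[keep]
    return x

# noiseless demo: exact recovery is expected for a well-conditioned Gaussian A
rng = np.random.default_rng(0)
m, n, K = 40, 100, 4
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
support = rng.choice(n, size=K, replace=False)
x_true[support] = rng.uniform(1.0, 2.0, size=K) * rng.choice([-1.0, 1.0], size=K)
y = A @ x_true
x_hat = cosamp(A, y, K)
```

The RIP-based conditions discussed in the record are what guarantee that iterations like these converge even when y and A are perturbed.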

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Collister, R.; Gwinner, G.; Tandecki, M.

    We present the isotope shifts of the 7s 1/2 to 7p 1/2 transition for francium isotopes ²⁰⁶⁻²¹³Fr with reference to ²²¹Fr collected from two experimental periods. The shifts are measured on a sample of atoms prepared within a magneto-optical trap by a fast sweep of radio-frequency sidebands applied to a carrier laser. King plot analysis, which includes literature values for 7s 1/2 to 7p 3/2 isotope shifts, provides a field shift constant ratio of 1.0520(10) and a difference between the specific mass shift constants of 170(100) GHz amu between the D₁ and D₂ transitions, of sufficient precision to differentiate between ab initio calculations.

  11. Biogeochemical evidence for subsurface hydrocarbon occurrence, Recluse oil field, Wyoming; preliminary results

    USGS Publications Warehouse

    Dalziel, Mary C.; Donovan, Terrence J.

    1980-01-01

    Anomalously high manganese-to-iron ratios occurring in pine needles and sage leaves over the Recluse oil field, Wyoming, suggest effects of petroleum microseepage on the plants. This conclusion is supported by iron and manganese concentrations in soils and carbon and oxygen isotope ratios in rock samples. Seeping hydrocarbons provided reducing conditions sufficient to enable divalent iron and manganese to be organically complexed or adsorbed on solids in the soils. These bound or adsorbed elements in the divalent state are essential to plants, and the plants readily assimilate them. The magnitude of the plant anomalies, combined with the supportive isotopic and chemical evidence confirming petroleum leakage, makes a strong case for the use of plants as a biogeochemical prospecting tool.

  12. Cardiac auscultation in sports medicine: strategies to improve clinical care.

    PubMed

    Barrett, Michael J; Ayub, Bilal; Martinez, Matthew W

    2012-01-01

    Cardiac auscultation is an important part of the preparticipation physical examination of athletes. Sudden death remains a rare but tragic event among athletes. The most common cause of sudden death among young athletes in the United States continues to be hypertrophic cardiomyopathy, which may or may not present with a typical heart murmur. Many clinicians do not possess sufficient proficiency in recognizing abnormal heart murmurs. New insights in the field of auditory learning suggest that cardiac auscultation is more of a technical skill than an intellectual one. Intensive repetition of abnormal heart murmurs has been shown to improve proficiency in cardiac auscultation markedly. Sample audio files of two important murmurs, i.e., an innocent murmur and hypertrophic cardiomyopathy, are provided online with this review.

  13. HOW TO FIND YOUNG MASSIVE CLUSTER PROGENITORS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bressert, E.; Longmore, S.; Testi, L.

    2012-10-20

    We propose that bound, young massive stellar clusters form from dense clouds that have escape speeds greater than the sound speed in photo-ionized gas. In these clumps, radiative feedback in the form of gas ionization is bottled up, enabling star formation to proceed to sufficiently high efficiency so that the resulting star cluster remains bound even after gas removal. We estimate the observable properties of the massive proto-clusters (MPCs) for existing Galactic plane surveys and suggest how they may be sought in recent and upcoming extragalactic observations. These surveys will potentially provide a significant sample of MPC candidates that will allow us to better understand extreme star formation and massive cluster formation in the Local Universe.

  14. Indirect gonioscopy system for imaging iridocorneal angle of eye

    NASA Astrophysics Data System (ADS)

    Perinchery, Sandeep M.; Fu, Chan Yiu; Baskaran, Mani; Aung, Tin; Murukeshan, V. M.

    2017-08-01

    Current clinical optical imaging systems do not provide sufficient structural information about the trabecular meshwork (TM) in the iridocorneal angle (ICA) of the eye due to their low resolution. An increase in intraocular pressure (IOP) can occur due to abnormalities in the TM, which could subsequently lead to glaucoma. Here, we present an indirect gonioscopy-based imaging probe with significantly improved visualization of structures in the ICA, including the TM region, compared to currently available tools. The imaging quality of the developed system was tested on porcine samples. The improved direct high-quality visualization of the TM region through this system can be used for laser trabeculoplasty, a primary treatment for glaucoma. This system is expected to be used complementary to angle photography and gonioscopy.

  15. Linear decentralized learning control

    NASA Technical Reports Server (NTRS)

    Lee, Soo C.; Longman, Richard W.; Phan, Minh

    1992-01-01

    The new field of learning control develops controllers that learn to improve their performance at executing a given task, based on experience performing this task. The simplest forms of learning control are based on the same concept as integral control, but operating in the domain of the repetitions of the task. This paper studies the use of such controllers in a decentralized system, such as a robot with the controller for each link acting independently. The basic result of the paper is to show that stability of the learning controllers for all subsystems when the coupling between subsystems is turned off, assures stability of the decentralized learning in the coupled system, provided that the sample time in the digital learning controller is sufficiently short.
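    The repetition-domain integral idea can be sketched on a toy static-gain plant (the plant gain G and learning gain gamma below are illustrative assumptions, not values from the paper): each repetition applies the update u_{k+1} = u_k + gamma*e_k, and the tracking error contracts by |1 - gamma*G| per repetition.

```python
import numpy as np

def ilc_demo(G=1.5, gamma=0.5, reps=30):
    """Integral-type iterative learning control on a toy static-gain plant
    y = G*u. The update u_{k+1} = u_k + gamma*e_k is the repetition-domain
    analogue of integral control; it converges when |1 - gamma*G| < 1."""
    y_des = np.sin(np.linspace(0.0, np.pi, 50))  # desired output trajectory
    u = np.zeros_like(y_des)                     # first attempt: no input
    errs = []
    for _ in range(reps):
        y = G * u                                # execute the task
        e = y_des - y                            # tracking error this run
        errs.append(float(np.max(np.abs(e))))
        u = u + gamma * e                        # learn for the next run
    return errs

errs = ilc_demo()
```

Here |1 - 0.5*1.5| = 0.25, so the peak error shrinks by a factor of four on every repetition, mirroring the stability condition the paper extends to coupled, sampled subsystems.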

  16. A Thick Target for Synchrotrons and Betatrons

    DOE R&D Accomplishments Database

    McMillan, E. M.

    1950-09-19

    If a wide x-ray beam from an electron synchrotron or betatron is desired, in radiographic work with large objects for example, the usually very thin target may be replaced by a thick one, provided the resulting distortion of the x-ray spectrum due to multiple radiative processes is permissible. It is difficult to make the circulating electron beam traverse a thick target directly because of the small spacing between successive turns. Mounting a very thin beryllium, or other low-Z material, fin on the edge of the thick target so that the fin projects into the beam will cause the beam to lose sufficient energy, and therefore radius, to strike the thick target the next time around. Sample design calculations are given.

  17. Short-Term Intra-Subject Variation in Exhaled Volatile Organic Compounds (VOCs) in COPD Patients and Healthy Controls and Its Effect on Disease Classification

    PubMed Central

    Phillips, Christopher; Mac Parthaláin, Neil; Syed, Yasir; Deganello, Davide; Claypole, Timothy; Lewis, Keir

    2014-01-01

    Exhaled volatile organic compounds (VOCs) are of interest for their potential to diagnose disease non-invasively. However, most breath VOC studies have analyzed single breath samples from an individual and assumed them to be consistently representative of the person. This provided the motivation for an investigation of the variability of breath profiles when three breath samples are taken over a short time period (two-minute intervals between samples) for 118 stable patients with Chronic Obstructive Pulmonary Disease (COPD) and 63 healthy controls and analyzed by gas chromatography and mass spectrometry (GC/MS). The extent of the variation in VOC levels differed between COPD and healthy subjects and the patterns of variation differed for isoprene versus the bulk of other VOCs. In addition, machine learning approaches were applied to the breath data to establish whether these samples differed in their ability to discriminate COPD from healthy states and whether aggregation of multiple samples, into single data sets, could offer improved discrimination. The three breath samples gave similar classification accuracy to one another when evaluated separately (66.5% to 68.3% subjects classified correctly depending on the breath repetition used). Combining multiple breath samples into single data sets gave better discrimination (73.4% subjects classified correctly). Although accuracy is not sufficient for COPD diagnosis in a clinical setting, enhanced sampling and analysis may improve accuracy further. Variability in samples, and short-term effects of practice or exertion, need to be considered in any breath testing program to improve reliability and optimize discrimination. PMID:24957028

  18. Short-Term Intra-Subject Variation in Exhaled Volatile Organic Compounds (VOCs) in COPD Patients and Healthy Controls and Its Effect on Disease Classification.

    PubMed

    Phillips, Christopher; Mac Parthaláin, Neil; Syed, Yasir; Deganello, Davide; Claypole, Timothy; Lewis, Keir

    2014-05-09

    Exhaled volatile organic compounds (VOCs) are of interest for their potential to diagnose disease non-invasively. However, most breath VOC studies have analyzed single breath samples from an individual and assumed them to be consistently representative of the person. This provided the motivation for an investigation of the variability of breath profiles when three breath samples are taken over a short time period (two-minute intervals between samples) for 118 stable patients with Chronic Obstructive Pulmonary Disease (COPD) and 63 healthy controls and analyzed by gas chromatography and mass spectrometry (GC/MS). The extent of the variation in VOC levels differed between COPD and healthy subjects and the patterns of variation differed for isoprene versus the bulk of other VOCs. In addition, machine learning approaches were applied to the breath data to establish whether these samples differed in their ability to discriminate COPD from healthy states and whether aggregation of multiple samples, into single data sets, could offer improved discrimination. The three breath samples gave similar classification accuracy to one another when evaluated separately (66.5% to 68.3% subjects classified correctly depending on the breath repetition used). Combining multiple breath samples into single data sets gave better discrimination (73.4% subjects classified correctly). Although accuracy is not sufficient for COPD diagnosis in a clinical setting, enhanced sampling and analysis may improve accuracy further. Variability in samples, and short-term effects of practice or exertion, need to be considered in any breath testing program to improve reliability and optimize discrimination.

  19. Evaluating the sufficiency of protected lands for maintaining wildlife population connectivity in the northern Rocky Mountains

    Treesearch

    Samuel A. Cushman; Erin L. Landguth; Curtis H. Flather

    2012-01-01

    Aim: The goal of this study was to evaluate the sufficiency of the network of protected lands in the U.S. northern Rocky Mountains in providing protection for habitat connectivity for 105 hypothetical organisms. A large proportion of the landscape...

  20. Sample size determination for mediation analysis of longitudinal data.

    PubMed

    Pan, Haitao; Liu, Suyu; Miao, Danmin; Yuan, Ying

    2018-03-27

    Sample size planning for longitudinal data is crucial when designing mediation studies because sufficient statistical power is not only required in grant applications and peer-reviewed publications, but is essential to reliable research results. However, sample size determination is not straightforward for mediation analysis of longitudinal design. To facilitate planning the sample size for longitudinal mediation studies with a multilevel mediation model, this article provides the sample size required to achieve 80% power by simulations under various sizes of the mediation effect, within-subject correlations and numbers of repeated measures. The sample size calculation is based on three commonly used mediation tests: Sobel's method, the distribution of the product method and the bootstrap method. Among the three methods of testing the mediation effects, Sobel's method required the largest sample size to achieve 80% power. Bootstrapping and the distribution of the product method performed similarly and were more powerful than Sobel's method, as reflected by the relatively smaller sample sizes. For all three methods, the sample size required to achieve 80% power depended on the value of the ICC (i.e., within-subject correlation). A larger value of ICC typically required a larger sample size to achieve 80% power. Simulation results also illustrated the advantage of the longitudinal study design. Sample size tables for the most commonly encountered scenarios in practice have also been published for convenient use. Extensive simulation studies showed that the distribution of the product method and the bootstrapping method have superior performance to Sobel's method, but the product method is recommended for use in practice given its lower computational load compared to the bootstrapping method. An R package has been developed for the product method of sample size determination in longitudinal mediation study design.
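    The simulation-based logic can be sketched for the simplest case, a single-level cross-sectional mediation model rather than the article's multilevel longitudinal model: simulate data under M = a*X + e1, Y = b*M + e2, compute Sobel's z = a*b / sqrt(a²·SE_b² + b²·SE_a²) for each replicate, and take the rejection rate as power. The effect sizes and sample sizes below are illustrative assumptions.

```python
import numpy as np

def _slope_and_se(x, y):
    """OLS slope through the origin (variables are zero-mean by
    construction) and its standard error."""
    bhat = (x @ y) / (x @ x)
    resid = y - bhat * x
    se = np.sqrt((resid @ resid) / (len(x) - 1) / (x @ x))
    return bhat, se

def sobel_power(n, a=0.3, b=0.3, n_sim=500, z_crit=1.96, seed=42):
    """Monte Carlo power of Sobel's test for the indirect effect a*b."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(n_sim):
        x = rng.normal(size=n)
        m = a * x + rng.normal(size=n)   # mediator equation
        y = b * m + rng.normal(size=n)   # outcome equation
        a_hat, se_a = _slope_and_se(x, m)
        b_hat, se_b = _slope_and_se(m, y)
        # Sobel z-statistic for the indirect effect a*b
        z = (a_hat * b_hat) / np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
        hits += abs(z) > z_crit
    return hits / n_sim

p_small = sobel_power(50)
p_large = sobel_power(200)
```

Increasing n until the estimated power crosses 0.80 gives the required sample size for this design; the article's tables play the same role for the multilevel longitudinal case.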

  1. Diffusion cannot govern the discharge of neurotransmitter in fast synapses.

    PubMed Central

    Khanin, R; Parnas, H; Segel, L

    1994-01-01

    In the present work we show that diffusion cannot provide the observed fast discharge of neurotransmitter from a synaptic vesicle during neurotransmitter release, mainly because it is neither sufficiently rapid nor sufficiently temperature-dependent. Modeling the discharge from the vesicle into the cleft as a continuous point source, we have determined that discharge should occur in 50-75 microseconds to provide the observed high concentrations of transmitter at the critical zone. PMID:7811953

  2. Enabling two-dimensional Fourier transform electronic spectroscopy on quantum dots

    NASA Astrophysics Data System (ADS)

    Hill, Robert John, Jr.

    Colloidal semiconductor nanocrystals exhibit unique properties not seen in their bulk counterparts. Quantum confinement of carriers causes a size-tunable bandgap, making them attractive candidates for solar cells. Fundamental understanding of their spectra and carrier dynamics is obscured by inhomogeneous broadening arising from the size distribution. Because quantum dots have long excited state lifetimes and are sensitive to both air and moisture, there are many potential artifacts in femtosecond experiments. Two-dimensional electronic spectroscopy promises insight into the photophysics, but required key instrumental advances. Optics that can process a broad bandwidth without distortion are required for a two-dimensional optical spectrometer. To control pathlength differences for femtosecond time delays, hollow retro-reflectors are used on actively stabilized delay lines in interferometers. The fabrication of rigid, lightweight, precision hollow rooftop retroreflectors that allow beams to be stacked while preserving polarization is described. The rigidity and low mass enable active stabilization of an interferometer to within 0.6 nm rms displacement, while the return beam deviation is sufficiently small for Fourier transform spectroscopy with a frequency precision of better than 1 cm^-1. Keeping samples oxygen and moisture free while providing fresh sample between laser shots is challenging in an interferometer. A low-vibration spinning sample cell was designed and built to keep samples oxygen free for days while allowing active stabilization of interferometer displacement to ~1 nm. Combining these technologies has enabled 2D short-wave infrared spectroscopy on colloidal PbSe nanocrystals. 2D spectra demonstrate the advantages of this key instrumentation while providing valuable insight into the low-lying electronic states of colloidal quantum dots.

  3. The criterion of subscale sufficiency and its application to the relationship between static capillary pressure, saturation and interfacial areas.

    PubMed

    Kurzeja, Patrick

    2016-05-01

    Modern imaging techniques, increased simulation capabilities and extended theoretical frameworks naturally drive the development of multiscale modelling with the question: which new information should be considered? Given the need for concise constitutive relationships and efficient data evaluation, however, one important question is often neglected: which information is sufficient? For this reason, this work introduces the formalized criterion of subscale sufficiency. This criterion states whether a chosen constitutive relationship transfers all necessary information from micro to macroscale within a multiscale framework. It further provides a scheme to improve constitutive relationships. Direct application to static capillary pressure demonstrates usefulness and conditions for subscale sufficiency of saturation and interfacial areas.

  4. Methods for the synthesis of olefins and derivatives

    DOEpatents

    Burk, Mark J; Pharkya, Priti; Van Dien, Stephen J; Burgard, Anthony P; Schilling, Christophe H

    2013-06-04

    The invention provides a method of producing acrylic acid. The method includes contacting fumaric acid with a sufficient amount of ethylene in the presence of a cross-metathesis transformation catalyst to produce about two moles of acrylic acid per mole of fumaric acid. Also provided is a method of producing an acrylate ester, which includes contacting fumarate diester with a sufficient amount of ethylene in the presence of a cross-metathesis transformation catalyst to produce about two moles of acrylate ester per mole of fumarate diester. An integrated process for producing acrylic acid or acrylate ester is provided which couples bioproduction of fumaric acid with metathesis transformation. Acrylic acid and acrylate ester production processes are also provided.

  5. Methods for the synthesis of olefins and derivatives

    DOEpatents

    Burk, Mark J [San Diego, CA]; Pharkya, Priti [San Diego, CA]; Van Dien, Stephen J [Encinitas, CA]; Burgard, Anthony P [Bellefonte, PA]; Schilling, Christophe H [San Diego, CA]

    2011-09-27

    The invention provides a method of producing acrylic acid. The method includes contacting fumaric acid with a sufficient amount of ethylene in the presence of a cross-metathesis transformation catalyst to produce about two moles of acrylic acid per mole of fumaric acid. Also provided is a method of producing an acrylate ester, which includes contacting fumarate diester with a sufficient amount of ethylene in the presence of a cross-metathesis transformation catalyst to produce about two moles of acrylate ester per mole of fumarate diester. An integrated process for producing acrylic acid or acrylate ester is provided which couples bioproduction of fumaric acid with metathesis transformation. Acrylic acid and acrylate ester production processes are also provided.

  6. Methods for synthesis of olefins and derivatives

    DOEpatents

    Burk, Mark J.; Pharkya, Priti; Van Dien, Stephen J.; Burgard, Anthony P.; Schilling, Christophe H.

    2016-06-14

    The invention provides a method of producing acrylic acid. The method includes contacting fumaric acid with a sufficient amount of ethylene in the presence of a cross-metathesis transformation catalyst to produce about two moles of acrylic acid per mole of fumaric acid. Also provided is a method of producing an acrylate ester, which includes contacting fumarate diester with a sufficient amount of ethylene in the presence of a cross-metathesis transformation catalyst to produce about two moles of acrylate ester per mole of fumarate diester. An integrated process for producing acrylic acid or acrylate ester is provided which couples bioproduction of fumaric acid with metathesis transformation. Acrylic acid and acrylate ester production processes are also provided.

  7. 10 CFR 431.135 - Units to be tested.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... EQUIPMENT Automatic Commercial Ice Makers Test Procedures § 431.135 Units to be tested. For each basic model of automatic commercial ice maker selected for testing, a sample of sufficient size shall be selected...

  8. Modality-Driven Classification and Visualization of Ensemble Variance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bensema, Kevin; Gosink, Luke; Obermaier, Harald

    Paper for the IEEE Visualization Conference. Advances in computational power now enable domain scientists to address conceptual and parametric uncertainty by running simulations multiple times in order to sufficiently sample the uncertain input space.

  9. The Potassium-Argon Laser Experiment (KARLE): In Situ Geochronology for Planetary Robotic Missions

    NASA Technical Reports Server (NTRS)

    Cohen, B. A.; Devismes, D.; Miller, J. S.; Swindle, T. D.

    2014-01-01

    Isotopic dating is an essential tool to establish an absolute chronology for geological events, including crystallization history, magmatic evolution, and alteration events. The capability for in situ geochronology will allow dating to be accomplished as part of a lander or rover instrument complement, on multiple samples rather than only those returned. An in situ geochronology package can also complement sample return missions by identifying the most interesting rocks to cache or return to Earth. The K-Ar Laser Experiment (KArLE) brings together a novel combination of several flight-proven components to provide precise measurements of potassium (K) and argon (Ar) that will enable accurate isochron dating of planetary rocks. KArLE will ablate a rock sample, measure the K in the plasma state using laser-induced breakdown spectroscopy (LIBS), measure the liberated Ar using mass spectrometry (MS), and relate the two by measuring the volume of the ablated pit by optical imaging. Our work indicates that the KArLE instrument is capable of determining the age of planetary samples with sufficient accuracy to address a wide range of geochronology problems in planetary science. Additional benefits derive from the fact that each KArLE component performs analyses useful for most planetary surface missions.

  10. DMSO Assisted Electrospray Ionization for the Detection of Small Peptide Hormones in Urine by Dilute-and-Shoot-Liquid-Chromatography-High Resolution Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Judák, Péter; Grainger, Janelle; Goebel, Catrin; Van Eenoo, Peter; Deventer, Koen

    2017-08-01

    The mobile-phase additive dimethyl sulfoxide (DMSO) has been described as a useful tool to enhance electrospray ionization (ESI) of peptides and proteins. So far, this technique has mainly been used in proteomic/peptide research, and its applicability in a routine clinical laboratory setting (i.e., doping control analysis) has not yet been described. This work provides a simple, easy-to-implement screening method for the detection of doping-relevant small peptides (GHRPs, GnRHs, GHS, and vasopressin analogues) with molecular weight less than 2 kDa, applying DMSO in the mobile phase. The gain in sensitivity was sufficient to inject the urine samples after a 2-fold dilution step, omitting a time-consuming sample preparation. The employed analytical procedure was validated for the qualitative determination of 36 compounds, including 13 metabolites. The detection limits (LODs) ranged between 50 and 1000 pg/mL and were compliant with the 2 ng/mL minimum detection level required by the World Anti-Doping Agency (WADA) for all the target peptides. To demonstrate the feasibility of the work, urine samples obtained from patients treated with desmopressin or leuprolide, and urine samples that had been declared as adverse analytical findings, were analyzed.

  11. Successful isolation and PCR amplification of DNA from National Institute of Standards and Technology herbal dietary supplement standard reference material powders and extracts.

    PubMed

    Cimino, Matthew T

    2010-03-01

    Twenty-four herbal dietary supplement powder and extract reference standards provided by the National Institute of Standards and Technology (NIST) were investigated using three different commercially available DNA extraction kits to evaluate DNA availability for downstream nucleotide-based applications. The material included samples of Camellia, Citrus, Ephedra, Ginkgo, Hypericum, Serenoa, and Vaccinium. Protocols from Qiagen, MoBio, and Phytopure were used to isolate and purify DNA from the NIST standards. The resulting DNA concentration was quantified using SYBR Green fluorometry. Each of the 24 samples yielded DNA, though the concentration of DNA from each approach was notably different. The Phytopure method consistently yielded more DNA; the average yield ratio was 22 : 3 : 1 (ng/microL; Phytopure : Qiagen : MoBio). Amplification of the internal transcribed spacer II region using PCR was ultimately successful in 22 of the 24 samples. Direct sequencing chromatograms of the amplified material suggested that most of the samples comprised mixtures. However, the sequencing chromatograms of 12 of the 24 samples were sufficient to confirm the identity of the target material. The successful extraction, amplification, and sequencing of DNA from these herbal dietary supplement extracts and powders supports a continued effort to explore nucleotide sequence-based tools for the authentication and identification of plants in dietary supplements. (c) Georg Thieme Verlag KG Stuttgart · New York.

  12. Possibilities for serial femtosecond crystallography sample delivery at future light sources

    PubMed Central

    Chavas, L. M. G.; Gumprecht, L.; Chapman, H. N.

    2015-01-01

    Serial femtosecond crystallography (SFX) uses X-ray pulses from free-electron laser (FEL) sources that can outrun radiation damage and thereby overcome long-standing limits in the structure determination of macromolecular crystals. Intense X-ray FEL pulses of sufficiently short duration allow the collection of damage-free data at room temperature and give the opportunity to study irreversible time-resolved events. SFX may open the way to determine the structure of biological molecules that fail to crystallize readily into large well-diffracting crystals. Taking advantage of FELs with high pulse repetition rates could lead to short measurement times of just minutes. Automated delivery of sample suspensions for SFX experiments could potentially give rise to a much higher rate of obtaining complete measurements than at today's third generation synchrotron radiation facilities, as no crystal alignment or complex robotic motions are required. This capability will also open up extensive time-resolved structural studies. New challenges arise from the resulting high rate of data collection, and in providing reliable sample delivery. Various developments for fully automated high-throughput SFX experiments are being considered for evaluation, including new implementations for a reliable yet flexible sample environment setup. Here, we review the different methods developed so far that best achieve sample delivery for X-ray FEL experiments and present some considerations towards the goal of high-throughput structure determination with X-ray FELs. PMID:26798808

  13. Quantification of in-contact probe-sample electrostatic forces with dynamic atomic force microscopy.

    PubMed

    Balke, Nina; Jesse, Stephen; Carmichael, Ben; Okatan, M Baris; Kravchenko, Ivan I; Kalinin, Sergei V; Tselev, Alexander

    2017-01-04

    Atomic force microscopy (AFM) methods utilizing resonant mechanical vibrations of cantilevers in contact with a sample surface have shown sensitivities as high as a few picometers for detecting surface displacements. Such a high sensitivity is harnessed in several AFM imaging modes. Here, we demonstrate a cantilever-resonance-based method to quantify electrostatic forces on a probe in the probe-sample junction in the presence of a surface potential or when a bias voltage is applied to the AFM probe. We find that the electrostatic forces acting on the probe tip apex can produce signals equivalent to a few pm of surface displacement. In combination with modeling, the measurements of the force were used to assess the strength of the electric field at the probe tip apex in contact with a sample. We find evidence that the electric field strength in the junction can reach ca. 1 V nm^-1 at a bias voltage of a few volts and is limited by non-ideality of the tip-sample contact. This field is sufficiently strong to significantly influence material states and kinetic processes through charge injection, Maxwell stress, shifts of phase equilibria, and reduction of energy barriers for activated processes. Besides, the results provide a baseline for accounting for the effects of local electrostatic forces in electromechanical AFM measurements as well as offer additional means to probe ionic mobility and field-induced phenomena in solids.
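
The order of magnitude of the quoted field can be checked with a parallel-plate-style estimate E ≈ V/d; the gap value below is an illustrative assumption, not a measured quantity from the paper:

```python
V = 3.0      # volts, a bias of "a few volts"
d = 3.0e-9   # m, assumed effective tip-sample gap of a few nanometres
E = V / d    # field strength in V/m
# E is on the order of 1e9 V/m, i.e. ~1 V/nm, matching the magnitude reported
```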

  14. Incomplete reactions in nanothermite composites

    NASA Astrophysics Data System (ADS)

    Jacob, Rohit J.; Ortiz-Montalvo, Diana L.; Overdeep, Kyle R.; Weihs, Timothy P.; Zachariah, Michael R.

    2017-02-01

    Exothermic reactions between oxophilic metals and transition/post-transition metal-oxides have been well documented owing to their fast reaction time scales (≈10 μs). This article examines the extent of the reaction in nano-aluminum based thermite systems through a forensic inspection of the products formed during reaction. Three nanothermite systems (Al/CuO, Al/Bi2O3, and Al/WO3) were selected owing to their diverse combustion characteristics, thereby providing sufficient generality and breadth to the analysis. Microgram quantities of the sample were coated onto a fine platinum wire, which was resistively heated at high heating rates (≈10^5 K/s) to ignite the sample. The subsequent products were captured/quenched very rapidly (≈500 μs) in order to preserve the chemistry/morphology during initiation and subsequent reaction and were quantitatively analyzed using electron microscopy and focused ion beam cross-sectioning followed by energy dispersive X-ray spectroscopy. Elemental examination of the cross-section of the quenched particles shows that oxygen is predominantly localized in the regions containing aluminum, implying the occurrence of the redox reaction. The Al/CuO system, which has simultaneous gaseous oxygen release and ignition (T_ignition ≈ T_oxygen release), shows a substantially lower oxygen content within the product particles as opposed to Al/Bi2O3 and Al/WO3 thermites, which are postulated to undergo a condensed phase reaction (T_ignition ≪ T_oxygen release). An effective Al:O composition for the interior section was obtained for all the mixtures, with the smaller particles generally showing a higher oxygen content than the larger ones. The observed results were further corroborated with the reaction temperature, obtained using a high-speed spectro-pyrometer, and bomb calorimetry conducted on larger samples (≈15 mg). The results suggest that thermites that produce sufficient amounts of gaseous products generate smaller product particles and achieve higher extents of completion.

  15. Experimental and Computational Studies of Carbonyl Diazide (CON6) as a Precursor to Diazirinone (CON2)

    NASA Astrophysics Data System (ADS)

    Esselman, Brian J.; Amberger, Brent K.; Nolan, Alex M.; Woods, R. Claude; McMahon, R. J.

    2011-10-01

    Intrigued by the reported 2005 synthesis of diazirinone (1), we carried out further experimental and theoretical studies aimed at the detailed matrix-isolation and millimeter-wave spectroscopic characterizations of 1. Diazirinone (1) is a peculiar isoconjugate of two very stable molecules and may be of astrochemical interest. Unfortunately, the originally reported methods of diazirinone (1) generation did not yield this species, but rather its decomposition products. Inspired by a more recent gas-phase pyrolysis of CON6 (2) to yield CON2 (1), we proposed a new method of generating CON6 (2) in solution as a precursor of diazirinone (1). This new synthesis may allow us to generate larger quantities of both CON6 and CON2 for investigation by millimeter-wave spectroscopy. We are able to safely generate carbonyl diazide (2) in sufficient yield from the reaction of triphosgene (3) and tetrabutylammonium azide in diethyl ether. This has allowed us to obtain both matrix-isolation and gas-phase IR spectra of carbonyl diazide (2). After purification, it has a gas-phase lifetime that allows samples to be usable for up to several weeks. However, it is a shock-sensitive material that must be handled with care to prevent violent decomposition. In order to provide better mechanistic insight into the decomposition of carbonyl diazide (2) to diazirinone (1), we have engaged in a DFT and ab initio computational study. We have found a pathway between the two species via the triplet acylnitrene, CON4, and an oxaziridine CON2 species, but not at sufficiently low energies to allow for the trapping and detection of diazirinone (1). Preliminary millimeter-wave spectra have been obtained from several synthesized and purified samples of CON6 (2). However, the assignment of the spectral lines has been unexpectedly problematic. We have placed several CON6 (2) samples, confirmed by IR spectroscopy at the time of sample loading, into our instrument and obtained two different sets of rotational lines. This rotational puzzle will be investigated further with a significantly upgraded millimeter-wave spectrometer.

  16. Accuracy in parameter estimation for targeted effects in structural equation modeling: sample size planning for narrow confidence intervals.

    PubMed

    Lai, Keke; Kelley, Ken

    2011-06-01

    In addition to evaluating a structural equation model (SEM) as a whole, often the model parameters are of interest and confidence intervals for those parameters are formed. Given a model with a good overall fit, it is entirely possible for the targeted effects of interest to have very wide confidence intervals, thus giving little information about the magnitude of the population targeted effects. With the goal of obtaining sufficiently narrow confidence intervals for the model parameters of interest, sample size planning methods for SEM are developed from the accuracy in parameter estimation approach. One method plans for the sample size so that the expected confidence interval width is sufficiently narrow. An extended procedure ensures that the obtained confidence interval will be no wider than desired, with some specified degree of assurance. A Monte Carlo simulation study was conducted that verified the effectiveness of the procedures in realistic situations. The methods developed have been implemented in the MBESS package in R so that they can be easily applied by researchers. © 2011 American Psychological Association
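
The accuracy-in-parameter-estimation idea is easiest to see for a single normal mean, where the expected CI width shrinks as 1/sqrt(n). The sketch below is a simplified stand-in for the SEM machinery in MBESS, with assumed inputs, solving for the n that makes a 95% interval no wider than a target width ω:

```python
import math
from statistics import NormalDist

def n_for_ci_width(sigma, omega, conf=0.95):
    """Smallest n so the expected CI width 2*z*sigma/sqrt(n) is <= omega."""
    z = NormalDist().inv_cdf(1 - (1 - conf) / 2)
    return math.ceil((2 * z * sigma / omega) ** 2)

# assumed values: sigma = 1, target full width omega = 0.5
n = n_for_ci_width(sigma=1.0, omega=0.5)  # -> 62
```

The "assurance" extension described in the abstract additionally inflates n so that the observed (not just expected) width falls below ω with a specified probability.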

  17. Robust reliable sampled-data control for switched systems with application to flight control

    NASA Astrophysics Data System (ADS)

    Sakthivel, R.; Joby, Maya; Shi, P.; Mathiyalagan, K.

    2016-11-01

    This paper addresses the robust reliable stabilisation problem for a class of uncertain switched systems with random delays and norm-bounded uncertainties. The main aim of this paper is to obtain a reliable robust sampled-data control design, involving random time delay with an appropriate gain control matrix, that achieves robust exponential stabilisation of the uncertain switched system against actuator failures. In particular, the involved delays are assumed to be randomly time-varying, obeying mutually uncorrelated Bernoulli-distributed white noise sequences. By constructing an appropriate Lyapunov-Krasovskii functional (LKF) and employing an average-dwell-time approach, a new set of criteria is derived for ensuring the robust exponential stability of the closed-loop switched system. More precisely, the Schur complement and Jensen's integral inequality are used in the derivation of the stabilisation criteria. By considering the relationship among the random time-varying delay and its lower and upper bounds, a new set of sufficient conditions is established for the existence of reliable robust sampled-data control in terms of solutions to linear matrix inequalities (LMIs). Finally, an illustrative example based on the F-18 aircraft model is provided to show the effectiveness of the proposed design procedures.

  18. OBT analysis method using polyethylene beads for limited quantities of animal tissue.

    PubMed

    Kim, S B; Stuart, M

    2015-08-01

    This study presents a polyethylene beads method for OBT determination in animal tissues and animal products for cases where the amount of water recovered by combustion is limited by sample size or quantity. In the method, the amount of water recovered after combustion is enhanced by adding tritium-free polyethylene beads to the sample prior to combustion in an oxygen bomb. The method reduces process time by allowing the combustion water to be easily collected with a pipette. Sufficient water recovery was achieved using the polyethylene beads method when 2 g of dry animal tissue or animal product were combusted with 2 g of polyethylene beads. Correction factors, which account for the dilution due to the combustion water of the beads, are provided for beef, chicken, pork, fish and clams, as well as egg, milk and cheese. The method was tested by comparing its OBT results with those of the conventional method using animal samples collected on the Chalk River Laboratories (CRL) site. The results determined that the polyethylene beads method added no more than 25% uncertainty when appropriate correction factors are used. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
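
The correction factors compensate for a straightforward dilution: the beads add tritium-free combustion water, so the measured activity concentration understates the sample's. A hedged sketch of this mass-balance correction follows (a generic form I am assuming from the description, not the article's species-specific factors):

```python
def corrected_activity(a_measured, m_sample_water, m_bead_water):
    """Undo dilution by tritium-free water from the polyethylene beads.

    a_measured: tritium activity concentration of the combined combustion
    water; masses in grams. Assumes the beads contribute no tritium.
    """
    return a_measured * (m_sample_water + m_bead_water) / m_sample_water

# e.g. equal water yields from sample and beads double the measured value
print(corrected_activity(10.0, 2.0, 2.0))  # 20.0
```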

  19. Ion Clouds in the Inductively Coupled Plasma Torch: A Closer Look through Computations.

    PubMed

    Aghaei, Maryam; Lindner, Helmut; Bogaerts, Annemie

    2016-08-16

    We have computationally investigated the introduction of copper elemental particles in an inductively coupled plasma torch connected to a sampling cone, including for the first time the ionization of the sample. The sample is inserted as liquid particles, which are followed inside the entire torch, i.e., from the injector inlet up to the ionization and reaching the sampler. The spatial position of the ion clouds inside the torch as well as detailed information on the copper species fluxes at the position of the sampler orifice and the exhausts of the torch are provided. The effect of on- and off-axis injection is studied. We clearly show that the ion clouds of on-axis injected material are located closer to the sampler with less radial diffusion. This guarantees a higher transport efficiency through the sampler cone. Moreover, our model reveals the optimum ranges of applied power and flow rates, which ensure the proper position of ion clouds inside the torch, i.e., close enough to the sampler to increase the fraction that can enter the mass spectrometer and with minimum loss of material toward the exhausts as well as a sufficiently high plasma temperature for efficient ionization.

  20. Timelapse ultrasonic tomography for measuring damage localization in geomechanics laboratory tests.

    PubMed

    Tudisco, Erika; Roux, Philippe; Hall, Stephen A; Viggiani, Giulia M B; Viggiani, Gioacchino

    2015-03-01

    Variation of mechanical properties in materials can be detected non-destructively using ultrasonic measurements. In particular, changes in elastic wave velocity can occur due to damage, i.e., micro-cracking and particle debonding. Here the challenge of characterizing damage in geomaterials, i.e., rocks and soils, is addressed. Geomaterials are naturally heterogeneous media in which deformation can localize, so that a few measurements of acoustic velocity across the sample are not sufficient to capture the heterogeneities. Therefore, an ultrasonic tomography procedure has been implemented to map the spatial and temporal variations in propagation velocity, which provides information on the damage process. Moreover, double beamforming has been successfully applied to identify and isolate multiple arrivals that are caused by strong heterogeneities (natural or induced by the deformation process). The applicability of the developed experimental technique to laboratory geomechanics testing is illustrated using data acquired on a sample of natural rock before and after being deformed under triaxial compression. The approach is then validated and extended to time-lapse monitoring using data acquired during plane strain compression of a sample including a well-defined layer with different mechanical properties from the matrix.
