Sample records for high counting efficiency

  1. Effects of sodium salicylate on the determination of Lead-210/Bismuth-210 by Cerenkov counting.

    PubMed

    Wang, Yadong; Yang, Yonggang; Song, Lijuan; Ma, Yan; Luo, Maoyi; Dai, Xiongxin

    2018-05-21

    Due to the nature of Cerenkov radiation and instrumental limitations, detection efficiencies of ²¹⁰Bi by Cerenkov counting are generally quite low (~15%). Sodium salicylate, acting as a wavelength shifter, has been used to improve the detection efficiency of Cerenkov photons. In this study, we found that the addition of sodium salicylate could significantly increase the counting efficiencies of ²¹⁰Pb/²¹⁰Bi in aqueous samples. Meanwhile, a sharp increase of the counting efficiency for the alphas from ²¹⁰Po was also observed with the addition of a high concentration of sodium salicylate, implying that scintillation light rather than Cerenkov photons from the alphas has been produced. Detailed studies of the effects of sodium salicylate on the counting of ²¹⁰Pb, ²¹⁰Bi and ²¹⁰Po were conducted. At low concentrations (< 0.5 mg g⁻¹) of sodium salicylate, only a small increase in the Cerenkov counting efficiency for ²¹⁰Bi due to the wavelength-shifting effect could be observed, whereas the counting efficiency for ²¹⁰Bi at high concentrations (> 1 mg g⁻¹) of sodium salicylate increased significantly due to the scintillation effect. Copyright © 2018 Elsevier Ltd. All rights reserved.

  2. Performance characteristics of high-conductivity channel electron multipliers. [as UV and x ray detector]

    NASA Technical Reports Server (NTRS)

    Timothy, J. G.; Bybee, R. L.

    1978-01-01

    The paper describes a new type of channel electron multiplier (CEM) fabricated from a low-resistance glass to produce a high-conductivity channel section and thereby obtain a high count-rate capability. The flat-cone cathode configuration of the CEM is specifically designed for the detection of astigmatic exit images from grazing-incidence spectrometers at the optimum angle of illumination for high detection efficiencies at XUV wavelengths. Typical operating voltages are in the range of 2500-2900 V with stable counting plateau slopes in the range 3-6% per 100-V increment. The modal gain at 2800 V was typically in the range 50-80 million. The modal gain falls off at count rates in excess of about 20,000 per sec. The detection efficiency remains essentially constant to count rates in excess of 2 million per sec. Higher detection efficiencies (better than 20%) are obtained by coating the CEM with MgF2. In life tests of coated CEMs, no measurable change in detection efficiency was observed to a total accumulated signal of 2 × 10¹¹ counts.

  3. A novel algorithm for solving the true coincident counting issues in Monte Carlo simulations for radiation spectroscopy.

    PubMed

    Guan, Fada; Johns, Jesse M; Vasudevan, Latha; Zhang, Guoqing; Tang, Xiaobin; Poston, John W; Braby, Leslie A

    2015-06-01

    Coincident counts can be observed in experimental radiation spectroscopy. Accurate quantification of the radiation source requires the detection efficiency of the spectrometer, which is often experimentally determined. However, Monte Carlo analysis can be used to supplement experimental approaches to determine the detection efficiency a priori. The traditional Monte Carlo method overestimates the detection efficiency as a result of omitting coincident counts caused mainly by multiple cascade source particles. In this study, a novel "multi-primary coincident counting" algorithm was developed using the Geant4 Monte Carlo simulation toolkit. A high-purity Germanium detector for ⁶⁰Co gamma-ray spectroscopy problems was accurately modeled to validate the developed algorithm. The simulated pulse height spectrum agreed well qualitatively with the measured spectrum obtained using the high-purity Germanium detector. The developed algorithm can be extended to other applications, with a particular emphasis on challenging radiation fields, such as counting multiple types of coincident radiations released from nuclear fission or used nuclear fuel.
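
    A toy illustration of the coincidence-summing idea behind such an algorithm (this is not the authors' Geant4 code): when both cascade gammas of a ⁶⁰Co decay are treated as primaries of the same event, their energy depositions must be summed into a single pulse, which moves counts out of the individual photopeaks and into a sum peak. The per-gamma full-energy probability below is an arbitrary assumption for illustration.

```python
import random
from collections import Counter

random.seed(1)

# Toy model of coincidence summing for 60Co (two cascade gammas per decay).
# P_FULL is an assumed probability that a gamma deposits its full energy in
# the detector; partial depositions are ignored to keep the sketch short.
GAMMAS_KEV = (1173, 1332)
P_FULL = 0.05

def deposit(energy_kev):
    """Energy left in the detector by one gamma (toy response)."""
    return energy_kev if random.random() < P_FULL else 0

def simulate(n_decays):
    """Histogram of per-event summed energy (multi-primary treatment)."""
    spectrum = Counter()
    for _ in range(n_decays):
        pulse = sum(deposit(e) for e in GAMMAS_KEV)  # sum both cascade gammas
        if pulse:
            spectrum[pulse] += 1
    return spectrum

spec = simulate(1_000_000)
print("1173 keV peak:", spec[1173])
print("1332 keV peak:", spec[1332])
print("2505 keV sum peak (lost from the single peaks):", spec[1173 + 1332])
```

    A single-primary tally would book the two gammas independently and therefore overestimate the 1173 and 1332 keV photopeak efficiencies by roughly the sum-peak fraction, which is the overestimate the abstract refers to.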

  4. PPO-ethanol system as wavelength shifter for the Cherenkov counting technique using a liquid scintillation counter

    NASA Astrophysics Data System (ADS)

    Takiue, Makoto; Fujii, Haruo; Ishikawa, Hiroaki

    1984-12-01

    2,5-Diphenyloxazole (PPO) has been proposed as a wavelength shifter for Cherenkov counting. Since PPO cannot be incorporated directly into water, we have introduced the fluor into water in micellar form using a PPO-ethanol system. This technique makes it possible to obtain a high Cherenkov counting efficiency under stable sample conditions, owing to the favorable spectrometric features of PPO. The ³²P Cherenkov counting efficiency (68.4%) obtained with this technique is as large as that measured with a conventional Cherenkov technique.

  5. Historical review of lung counting efficiencies for low energy photon emitters

    DOE PAGES

    Jeffers, Karen L.; Hickman, David P.

    2014-03-01

    This publication reviews the measured efficiency and variability over time of a high purity planar germanium in vivo lung count system for multiple photon energies using increasingly thick overlays with the Lawrence Livermore Torso Phantom. Furthermore, the measured variations in efficiency are compared with the current requirement for in vivo bioassay performance as defined by the American National Standards Institute Standard.

  6. Technology Development for High Efficiency Optical Communications

    NASA Technical Reports Server (NTRS)

    Farr, William H.

    2012-01-01

    Deep space optical communications is a significantly more challenging operational domain than near Earth space optical communications, primarily due to effects resulting from the vastly increased range between transmitter and receiver. The NASA Game Changing Development Program Deep Space Optical Communications Project is developing four key technologies for the implementation of a high efficiency telecommunications system that will enable greater than 10X the data rate of a state-of-the-art deep space RF system (Ka-band) for similar transceiver mass and power burden on the spacecraft. These technologies are a low mass spacecraft disturbance isolation assembly, a flight qualified photon counting detector array, a high efficiency flight laser amplifier and a high efficiency photon counting detector array for the ground-based receiver.

  7. UV superconducting nanowire single-photon detectors with high efficiency, low noise, and 4 K operating temperature

    NASA Astrophysics Data System (ADS)

    Wollman, E. E.; Verma, V. B.; Beyer, A. D.; Briggs, R. M.; Korzh, B.; Allmaras, J. P.; Marsili, F.; Lita, A. E.; Mirin, R. P.; Nam, S. W.; Shaw, M. D.

    2017-10-01

    For photon-counting applications at ultraviolet wavelengths, there are currently no detectors that combine high efficiency (> 50%), sub-nanosecond timing resolution, and sub-Hz dark count rates. Superconducting nanowire single-photon detectors (SNSPDs) have seen success over the past decade for photon-counting applications in the near-infrared, but little work has been done to optimize SNSPDs for wavelengths below 400 nm. Here, we describe the design, fabrication, and characterization of UV SNSPDs operating at wavelengths between 250 and 370 nm. The detectors have active areas up to 56 μm in diameter, 70-80% efficiency, timing resolution down to 60 ps FWHM, blindness to visible and infrared photons, and dark count rates of ~0.25 counts/hr for a 56 μm diameter pixel. By using the amorphous superconductor MoSi, these UV SNSPDs are also able to operate at temperatures up to 4.2 K. These performance metrics make UV SNSPDs ideal for applications in trapped-ion quantum information processing, lidar studies of the upper atmosphere, UV fluorescent-lifetime imaging microscopy, and photon-starved UV astronomy.

  8. Waveguide integrated superconducting single-photon detectors with high internal quantum efficiency at telecom wavelengths

    PubMed Central

    Kahl, Oliver; Ferrari, Simone; Kovalyuk, Vadim; Goltsman, Gregory N.; Korneev, Alexander; Pernice, Wolfram H. P.

    2015-01-01

    Superconducting nanowire single-photon detectors (SNSPDs) provide high efficiency for detecting individual photons while keeping dark counts and timing jitter minimal. Besides superior detection performance over a broad optical bandwidth, compatibility with an integrated optical platform is a crucial requirement for applications in emerging quantum photonic technologies. Here we present SNSPDs embedded in nanophotonic integrated circuits which achieve internal quantum efficiencies close to unity at 1550 nm wavelength. This allows for the SNSPDs to be operated at bias currents far below the critical current where unwanted dark count events reach milli-Hz levels while on-chip detection efficiencies above 70% are maintained. The measured dark count rates correspond to noise-equivalent powers in the 10⁻¹⁹ W/√Hz range and the timing jitter is as low as 35 ps. Our detectors are fully scalable and interface directly with waveguide-based optical platforms. PMID:26061283

  9. Waveguide integrated superconducting single-photon detectors with high internal quantum efficiency at telecom wavelengths.

    PubMed

    Kahl, Oliver; Ferrari, Simone; Kovalyuk, Vadim; Goltsman, Gregory N; Korneev, Alexander; Pernice, Wolfram H P

    2015-06-10

    Superconducting nanowire single-photon detectors (SNSPDs) provide high efficiency for detecting individual photons while keeping dark counts and timing jitter minimal. Besides superior detection performance over a broad optical bandwidth, compatibility with an integrated optical platform is a crucial requirement for applications in emerging quantum photonic technologies. Here we present SNSPDs embedded in nanophotonic integrated circuits which achieve internal quantum efficiencies close to unity at 1550 nm wavelength. This allows for the SNSPDs to be operated at bias currents far below the critical current where unwanted dark count events reach milli-Hz levels while on-chip detection efficiencies above 70% are maintained. The measured dark count rates correspond to noise-equivalent powers in the 10⁻¹⁹ W/√Hz range and the timing jitter is as low as 35 ps. Our detectors are fully scalable and interface directly with waveguide-based optical platforms.

  10. Estimated Costs for Delivery of HIV Antiretroviral Therapy to Individuals with CD4+ T-Cell Counts >350 cells/uL in Rural Uganda

    PubMed Central

    Jain, Vivek; Chang, Wei; Byonanebye, Dathan M.; Owaraganise, Asiphas; Twinomuhwezi, Ellon; Amanyire, Gideon; Black, Douglas; Marseille, Elliot; Kamya, Moses R.; Havlir, Diane V.; Kahn, James G.

    2015-01-01

    Background: Evidence favoring earlier HIV ART initiation at high CD4+ T-cell counts (CD4>350/uL) has grown, and guidelines now recommend earlier HIV treatment. However, the cost of providing ART to individuals with CD4>350 in Sub-Saharan Africa has not been well estimated. This remains a major barrier to optimal global cost projections for accelerating the scale-up of ART. Our objective was to compute costs of ART delivery to high CD4+ count individuals in a typical rural Ugandan health center-based HIV clinic, and use these data to construct scenarios of efficient ART scale-up. Methods: Within a clinical study evaluating streamlined ART delivery to 197 individuals with CD4+ cell counts >350 cells/uL (EARLI Study: NCT01479634) in Mbarara, Uganda, we performed a micro-costing analysis of administrative records, ART prices, and time-and-motion analysis of staff work patterns. We computed observed per-person-per-year (ppy) costs, and constructed models estimating costs under several increasingly efficient ART scale-up scenarios using local salaries, lowest drug prices, optimized patient loads, and inclusion of viral load (VL) testing. Findings: Among 197 individuals enrolled in the EARLI Study, median pre-ART CD4+ cell count was 569/uL (IQR 451–716). Observed ART delivery cost was $628 ppy at steady state. Models using local salaries and only core laboratory tests estimated costs of $529/$445 ppy (+/-VL testing, respectively). Models with lower salaries, lowest ART prices, and optimized healthcare worker schedules reduced costs by $100–200 ppy. Costs in a maximally efficient scale-up model were $320/$236 ppy (+/- VL testing). This included $39 for personnel, $106 for ART, $130/$46 for laboratory tests, and $46 for administrative/other costs. A key limitation of this study is its derivation and extrapolation of costs from one large rural treatment program of high CD4+ count individuals. Conclusions: In a Ugandan HIV clinic, ART delivery costs—including VL testing—for individuals with CD4>350 were similar to estimates from high-efficiency programs. In higher efficiency scale-up models, costs were substantially lower. These favorable costs may be achieved because high CD4+ count patients are often asymptomatic, facilitating more efficient streamlined ART delivery. Our work provides a framework for calculating costs of efficient ART scale-up models using accessible data from specific programs and regions. PMID:26632823

  11. Estimated Costs for Delivery of HIV Antiretroviral Therapy to Individuals with CD4+ T-Cell Counts >350 cells/uL in Rural Uganda.

    PubMed

    Jain, Vivek; Chang, Wei; Byonanebye, Dathan M; Owaraganise, Asiphas; Twinomuhwezi, Ellon; Amanyire, Gideon; Black, Douglas; Marseille, Elliot; Kamya, Moses R; Havlir, Diane V; Kahn, James G

    2015-01-01

    Evidence favoring earlier HIV ART initiation at high CD4+ T-cell counts (CD4>350/uL) has grown, and guidelines now recommend earlier HIV treatment. However, the cost of providing ART to individuals with CD4>350 in Sub-Saharan Africa has not been well estimated. This remains a major barrier to optimal global cost projections for accelerating the scale-up of ART. Our objective was to compute costs of ART delivery to high CD4+ count individuals in a typical rural Ugandan health center-based HIV clinic, and use these data to construct scenarios of efficient ART scale-up. Within a clinical study evaluating streamlined ART delivery to 197 individuals with CD4+ cell counts >350 cells/uL (EARLI Study: NCT01479634) in Mbarara, Uganda, we performed a micro-costing analysis of administrative records, ART prices, and time-and-motion analysis of staff work patterns. We computed observed per-person-per-year (ppy) costs, and constructed models estimating costs under several increasingly efficient ART scale-up scenarios using local salaries, lowest drug prices, optimized patient loads, and inclusion of viral load (VL) testing. Among 197 individuals enrolled in the EARLI Study, median pre-ART CD4+ cell count was 569/uL (IQR 451-716). Observed ART delivery cost was $628 ppy at steady state. Models using local salaries and only core laboratory tests estimated costs of $529/$445 ppy (+/-VL testing, respectively). Models with lower salaries, lowest ART prices, and optimized healthcare worker schedules reduced costs by $100-200 ppy. Costs in a maximally efficient scale-up model were $320/$236 ppy (+/- VL testing). This included $39 for personnel, $106 for ART, $130/$46 for laboratory tests, and $46 for administrative/other costs. A key limitation of this study is its derivation and extrapolation of costs from one large rural treatment program of high CD4+ count individuals. In a Ugandan HIV clinic, ART delivery costs--including VL testing--for individuals with CD4>350 were similar to estimates from high-efficiency programs. In higher efficiency scale-up models, costs were substantially lower. These favorable costs may be achieved because high CD4+ count patients are often asymptomatic, facilitating more efficient streamlined ART delivery. Our work provides a framework for calculating costs of efficient ART scale-up models using accessible data from specific programs and regions.
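
    As a quick consistency check on the component breakdown reported in the abstract, the per-person-per-year totals with and without viral-load testing can be re-added directly; the small differences from the published $320/$236 figures are rounding in the reported components.

```python
# Cost components (USD per person-year) quoted in the abstract for the
# maximally efficient scale-up model.
personnel = 39
art_drugs = 106
labs_with_vl = 130      # laboratory tests including viral load
labs_without_vl = 46    # laboratory tests excluding viral load
admin_other = 46

total_with_vl = personnel + art_drugs + labs_with_vl + admin_other
total_without_vl = personnel + art_drugs + labs_without_vl + admin_other

print(total_with_vl, total_without_vl)   # 321 and 237, vs ~$320/$236 reported
```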

  12. Waveguide integrated low noise NbTiN nanowire single-photon detectors with milli-Hz dark count rate

    PubMed Central

    Schuck, Carsten; Pernice, Wolfram H. P.; Tang, Hong X.

    2013-01-01

    Superconducting nanowire single-photon detectors are an ideal match for integrated quantum photonic circuits due to their high detection efficiency for telecom wavelength photons. Quantum optical technology also requires single-photon detection with low dark count rate and high timing accuracy. Here we present very low noise superconducting nanowire single-photon detectors based on NbTiN thin films patterned directly on top of Si3N4 waveguides. We systematically investigate a large variety of detector designs and characterize their detection noise performance. Milli-Hz dark count rates are demonstrated over the entire operating range of the nanowire detectors which also feature low timing jitter. The ultra-low dark count rate, in combination with the high detection efficiency inherent to our travelling wave detector geometry, gives rise to a measured noise equivalent power at the 10⁻²⁰ W/√Hz level. PMID:23714696
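
    The noise-equivalent-power figure quoted above follows from the dark count rate and detection efficiency through the relation commonly used for single-photon detectors, NEP = (hν/η)·√(2·DCR). The sketch below evaluates it for illustrative values; the efficiency and dark count rate are assumptions chosen in the reported range, not the paper's exact numbers.

```python
from math import sqrt

h = 6.62607015e-34          # Planck constant, J*s
c = 2.99792458e8            # speed of light, m/s

def photon_counter_nep(wavelength_m, efficiency, dark_count_rate_hz):
    """Noise-equivalent power of a photon counter, in W/sqrt(Hz)."""
    photon_energy = h * c / wavelength_m
    return photon_energy / efficiency * sqrt(2.0 * dark_count_rate_hz)

# Assumed illustrative values: 1550 nm photons, 70% on-chip efficiency,
# a few milli-Hz dark count rate.
print(photon_counter_nep(1550e-9, 0.70, 3e-3))   # ~1.4e-20 W/sqrt(Hz)
```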

  13. High-Dose Neutron Detector Development Using 10B Coated Cells

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Menlove, Howard Olsen; Henzlova, Daniela

    2016-11-08

    During FY16 the boron-lined parallel-plate technology was optimized to fully benefit from its fast timing characteristics in order to enhance its high count rate capability. To facilitate high count rate capability, a novel fast amplifier with timing and operating properties matched to the detector characteristics was developed and implemented in the 8” boron plate detector that was purchased from PDT. Each of the 6 sealed cells was connected to a fast amplifier with corresponding List mode readout from each amplifier. The FY16 work focused on improvements in the boron-10 coating materials and procedures at PDT to significantly improve the neutron detection efficiency. An improvement in the efficiency of a factor of 1.5 was achieved without increasing the metal backing area for the boron coating. This improvement has allowed us to operate the detector in gamma-ray backgrounds that are four orders of magnitude higher than was previously possible while maintaining a relatively high counting efficiency for neutrons. This improvement in the gamma-ray rejection is a key factor in the development of the high dose neutron detector.

  14. Efficient statistical mapping of avian count data

    USGS Publications Warehouse

    Royle, J. Andrew; Wikle, C.K.

    2005-01-01

    We develop a spatial modeling framework for count data that is efficient to implement in high-dimensional prediction problems. We consider spectral parameterizations for the spatially varying mean of a Poisson model. The spectral parameterization of the spatial process is very computationally efficient, enabling effective estimation and prediction in large problems using Markov chain Monte Carlo techniques. We apply this model to creating avian relative abundance maps from North American Breeding Bird Survey (BBS) data. Variation in the ability of observers to count birds is modeled as spatially independent noise, resulting in over-dispersion relative to the Poisson assumption. This approach represents an improvement over existing approaches used for spatial modeling of BBS data which are either inefficient for continental scale modeling and prediction or fail to accommodate important distributional features of count data thus leading to inaccurate accounting of prediction uncertainty.
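
    A minimal sketch of the spectral idea (not the authors' implementation, which uses Markov chain Monte Carlo): the log of the Poisson mean is expanded in a small set of low-frequency Fourier basis functions over the map, so only a handful of coefficients must be estimated even when predictions are wanted on a large grid. Here the coefficients are fit by maximum likelihood with scipy purely for brevity; all data are simulated.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Survey-route locations in the unit square and a smooth "true" intensity.
n_sites = 400
xy = rng.uniform(0, 1, size=(n_sites, 2))

def fourier_basis(xy, k_max=2):
    """Low-frequency 2-D cosine/sine basis evaluated at each location."""
    cols = [np.ones(len(xy))]
    for kx in range(k_max + 1):
        for ky in range(k_max + 1):
            if kx == ky == 0:
                continue
            phase = 2 * np.pi * (kx * xy[:, 0] + ky * xy[:, 1])
            cols += [np.cos(phase), np.sin(phase)]
    return np.column_stack(cols)

X = fourier_basis(xy)
beta_true = rng.normal(0, 0.3, X.shape[1])
counts = rng.poisson(np.exp(X @ beta_true))          # simulated route counts

def neg_log_lik(beta):
    eta = X @ beta                                    # log of the Poisson mean
    return np.sum(np.exp(eta) - counts * eta)         # Poisson NLL (up to a constant)

fit = minimize(neg_log_lik, np.zeros(X.shape[1]), method="BFGS")
print("estimated vs true intercept:", fit.x[0], beta_true[0])
```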

  15. Direct measurement of carbon-14 in carbon dioxide by liquid scintillation counting

    NASA Technical Reports Server (NTRS)

    Horrocks, D. L.

    1969-01-01

    Liquid scintillation counting technique is applied to the direct measurement of carbon-14 in carbon dioxide. This method has high counting efficiency and eliminates many of the basic problems encountered with previous techniques. The technique can be used to achieve a percent substitution reaction and is of interest as an analytical technique.

  16. High Throughput, High Yield Fabrication of High Quantum Efficiency Back-Illuminated Photon Counting, Far UV, UV, and Visible Detector Arrays

    NASA Technical Reports Server (NTRS)

    Nikzad, Shouleh; Hoenk, M. E.; Carver, A. G.; Jones, T. J.; Greer, F.; Hamden, E.; Goodsall, T.

    2013-01-01

    In this paper we discuss the high throughput end-to-end post fabrication processing of high performance delta-doped and superlattice-doped silicon imagers for UV, visible, and NIR applications. As an example, we present our results on far ultraviolet and ultraviolet quantum efficiency (QE) in a photon counting, detector array. We have improved the QE by nearly an order of magnitude over microchannel plates (MCPs) that are the state-of-the-art UV detectors for many NASA space missions as well as defense applications. These achievements are made possible by precision interface band engineering of Molecular Beam Epitaxy (MBE) and Atomic Layer Deposition (ALD).

  17. Evaluation of counting methods for oceanic radium-228

    NASA Astrophysics Data System (ADS)

    Orr, James C.

    1988-07-01

    Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting method(s), conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of the 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (~30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (~12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation. After allowing for ingrowth and then counting the 224Ra great-granddaughter, 228Ra could be back-calculated, thereby yielding a method with high efficiency, where no sample processing is required. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples with just 30 L of seawater, to better than 5% precision. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
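
    The comparison at the end rests on two quantities that can be computed from each counter's efficiency ε and background rate B: the relative figure of merit ε²/B and a detection limit. The sketch below uses the efficiency/background pairs quoted in the abstract for the three conventional counters; the Currie formulation of the detection limit and the 1000-minute counting time are standard illustrative choices, not necessarily the exact ones used in the paper.

```python
from math import sqrt

def figure_of_merit(efficiency, background_cpm):
    """Relative figure of merit epsilon^2 / B (larger is better)."""
    return efficiency**2 / background_cpm

def currie_detection_limit(efficiency, background_cpm, count_minutes):
    """Minimum detectable disintegration rate (dpm), Currie's L_D formula."""
    b_counts = background_cpm * count_minutes
    l_d_counts = 2.71 + 4.65 * sqrt(b_counts)        # net counts above background
    return l_d_counts / (efficiency * count_minutes) # convert to dpm

# Efficiency and background (cpm) for the three conventional counters
# as quoted in the abstract (alpha efficiency taken as 6% from the 3-9% range).
counters = {
    "alpha spectrometry":     (0.060, 0.0015),
    "gamma spectrometry":     (0.048, 0.16),
    "beta-gamma coincidence": (0.053, 0.0054),
}
for name, (eff, bkg) in counters.items():
    print(f"{name:24s}  FOM={figure_of_merit(eff, bkg):7.3f}  "
          f"detection limit (1000 min)={currie_detection_limit(eff, bkg, 1000):.3f} dpm")
```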

  18. The piecewise-linear dynamic attenuator reduces the impact of count rate loss with photon-counting detectors

    NASA Astrophysics Data System (ADS)

    Hsieh, Scott S.; Pelc, Norbert J.

    2014-06-01

    Photon counting x-ray detectors (PCXDs) offer several advantages compared to standard energy-integrating x-ray detectors, but also face significant challenges. One key challenge is the high count rates required in CT. At high count rates, PCXDs exhibit count rate loss and show reduced detective quantum efficiency in signal-rich (or high flux) measurements. In order to reduce count rate requirements, a dynamic beam-shaping filter can be used to redistribute flux incident on the patient. We study the piecewise-linear attenuator in conjunction with PCXDs without energy discrimination capabilities. We examined three detector models: the classic nonparalyzable and paralyzable detector models, and a ‘hybrid’ detector model which is a weighted average of the two which approximates an existing, real detector (Taguchi et al 2011 Med. Phys. 38 1089-102 ). We derive analytic expressions for the variance of the CT measurements for these detectors. These expressions are used with raw data estimated from DICOM image files of an abdomen and a thorax to estimate variance in reconstructed images for both the dynamic attenuator and a static beam-shaping (‘bowtie’) filter. By redistributing flux, the dynamic attenuator reduces dose by 40% without increasing peak variance for the ideal detector. For non-ideal PCXDs, the impact of count rate loss is also reduced. The nonparalyzable detector shows little impact from count rate loss, but with the paralyzable model, count rate loss leads to noise streaks that can be controlled with the dynamic attenuator. With the hybrid model, the characteristic count rates required before noise streaks dominate the reconstruction are reduced by a factor of 2 to 3. We conclude that the piecewise-linear attenuator can reduce the count rate requirements of the PCXD in addition to improving dose efficiency. The magnitude of this reduction depends on the detector, with paralyzable detectors showing much greater benefit than nonparalyzable detectors.
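
    The three detector models named in the abstract have simple closed forms for the observed count rate m as a function of the true rate n and dead time τ: m = n/(1 + nτ) for the nonparalyzable model, m = n·exp(−nτ) for the paralyzable model, and a weighted average of the two for the hybrid model. The sketch below evaluates them; the dead time and hybrid weight are assumed values for illustration, not those of the detector cited in the abstract.

```python
import numpy as np

TAU = 100e-9      # assumed dead time, seconds
ALPHA = 0.5       # assumed hybrid weight between the two classic models

def nonparalyzable(n, tau=TAU):
    return n / (1.0 + n * tau)

def paralyzable(n, tau=TAU):
    return n * np.exp(-n * tau)

def hybrid(n, alpha=ALPHA, tau=TAU):
    return alpha * nonparalyzable(n, tau) + (1 - alpha) * paralyzable(n, tau)

for n in [1e5, 1e6, 5e6, 1e7, 2e7]:   # true rates, counts per second
    print(f"n={n:9.2e}  nonparalyzable={nonparalyzable(n):9.2e}  "
          f"paralyzable={paralyzable(n):9.2e}  hybrid={hybrid(n):9.2e}")
```

    The paralyzable term is what produces the severe count-rate loss, and hence the noise streaks, in signal-rich measurements; that is exactly the regime the dynamic attenuator is designed to avoid.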

  19. Development of the brain's structural network efficiency in early adolescence: A longitudinal DTI twin study.

    PubMed

    Koenis, Marinka M G; Brouwer, Rachel M; van den Heuvel, Martijn P; Mandl, René C W; van Soelen, Inge L C; Kahn, René S; Boomsma, Dorret I; Hulshoff Pol, Hilleke E

    2015-12-01

    The brain is a network and our intelligence depends in part on the efficiency of this network. The network of adolescents differs from that of adults suggesting developmental changes. However, whether the network changes over time at the individual level and, if so, how this relates to intelligence, is unresolved in adolescence. In addition, the influence of genetic factors in the developing network is not known. Therefore, in a longitudinal study of 162 healthy adolescent twins and their siblings (mean age at baseline 9.9 [range 9.0-15.0] years), we mapped local and global structural network efficiency of cerebral fiber pathways (weighted with mean FA and streamline count) and assessed intelligence over a three-year interval. We find that the efficiency of the brain's structural network is highly heritable (locally up to 74%). FA-based local and global efficiency increases during early adolescence. Streamline count based local efficiency both increases and decreases, and global efficiency reorganizes to a net decrease. Local FA-based efficiency was correlated to IQ. Moreover, increases in FA-based network efficiency (global and local) and decreases in streamline count based local efficiency are related to increases in intellectual functioning. Individual changes in intelligence and local FA-based efficiency appear to go hand in hand in frontal and temporal areas. More widespread local decreases in streamline count based efficiency (frontal cingulate and occipital) are correlated with increases in intelligence. We conclude that the teenage brain is a network in progress in which individual differences in maturation relate to level of intellectual functioning. © 2015 Wiley Periodicals, Inc.
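
    Global efficiency as used here is the average inverse shortest-path length between all node pairs of the weighted connectome; a common convention for FA- or streamline-weighted edges is to take distance as the reciprocal of the connection weight, so stronger connections are "shorter". A minimal sketch with scipy follows; the small example matrix is made up for illustration and is not the study's data.

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def global_efficiency(weights):
    """Average inverse shortest-path length of a weighted, undirected graph.

    `weights` is an (N, N) connectivity matrix (e.g., mean FA or streamline
    count per connection); 0 means no direct connection.
    """
    with np.errstate(divide="ignore"):
        dist = np.where(weights > 0, 1.0 / weights, np.inf)  # strong edge = short
    np.fill_diagonal(dist, 0.0)
    sp = shortest_path(dist, method="D", directed=False)
    n = weights.shape[0]
    inv = 1.0 / sp[~np.eye(n, dtype=bool)]                   # exclude self-pairs
    inv[~np.isfinite(inv)] = 0.0                             # disconnected pairs
    return inv.mean()

# Made-up 4-node FA-weighted connectome for illustration.
fa = np.array([[0.0, 0.5, 0.2, 0.0],
               [0.5, 0.0, 0.4, 0.1],
               [0.2, 0.4, 0.0, 0.3],
               [0.0, 0.1, 0.3, 0.0]])
print(global_efficiency(fa))
```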

  20. Single Photon Counting UV Solar-Blind Detectors Using Silicon and III-Nitride Materials

    PubMed Central

    Nikzad, Shouleh; Hoenk, Michael; Jewell, April D.; Hennessy, John J.; Carver, Alexander G.; Jones, Todd J.; Goodsall, Timothy M.; Hamden, Erika T.; Suvarna, Puneet; Bulmer, J.; Shahedipour-Sandvik, F.; Charbon, Edoardo; Padmanabhan, Preethi; Hancock, Bruce; Bell, L. Douglas

    2016-01-01

    Ultraviolet (UV) studies in astronomy, cosmology, planetary studies, biological and medical applications often require precision detection of faint objects and in many cases require photon-counting detection. We present an overview of two approaches for achieving photon counting in the UV. The first approach involves UV enhancement of photon-counting silicon detectors, including electron multiplying charge-coupled devices and avalanche photodiodes. The approach used here employs molecular beam epitaxy for delta doping and superlattice doping for surface passivation and high UV quantum efficiency. Additional UV enhancements include antireflection (AR) and solar-blind UV bandpass coatings prepared by atomic layer deposition. Quantum efficiency (QE) measurements show QE > 50% in the 100–300 nm range for detectors with simple AR coatings, and QE ≅ 80% at ~206 nm has been shown when more complex AR coatings are used. The second approach is based on avalanche photodiodes in III-nitride materials with high QE and intrinsic solar blindness. PMID:27338399

  1. FPGA-based photon-counting phase-modulation fluorometer and a brief comparison with that operated in a pulsed-excitation mode

    NASA Astrophysics Data System (ADS)

    Iwata, Tetsuo; Taga, Takanori; Mizuno, Takahiko

    2018-02-01

    We have constructed a high-efficiency, photon-counting phase-modulation fluorometer (PC-PMF) using a field-programmable gate array, which is a modified version of the photon-counting fluorometer (PCF) that works in a pulsed-excitation mode (Iwata and Mizuno in Meas Sci Technol 28:075501, 2017). The common working principle for both is the simultaneous detection of the photoelectron pulse train, which covers 64 ns with a 1.0-ns resolution time (1.0 ns/channel). The signal-gathering efficiency was improved more than 100 times over that of conventional time-correlated single-photon-counting at the expense of resolution time depending on the number of channels. The system dead time for building a histogram was eliminated, markedly shortening the measurement time for fluorescent samples with moderately high quantum yields. We describe the PC-PMF and make a brief comparison with the pulsed-excitation PCF in precision, demonstrating the potential advantage of PC-PMF.
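
    In the phase-modulation mode, the fluorescence lifetime of a single-exponential emitter is obtained from the phase shift φ and demodulation m of the photon-arrival histogram relative to the excitation, via tan φ = ωτ and m = 1/√(1 + (ωτ)²). A minimal sketch of extracting φ and m from a 64-channel, 1 ns/channel histogram using the fundamental FFT component follows; the synthetic histogram, modulation frequency, and lifetime are assumptions for illustration, not the instrument's data.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CH, DT = 64, 1e-9                     # 64 channels, 1 ns per channel
OMEGA = 2 * np.pi / (N_CH * DT)         # one modulation period per 64 ns window
TAU_TRUE = 4e-9                         # assumed fluorescence lifetime

# Synthetic photon-count histogram for a single-exponential emitter:
# phase shift tan(phi) = omega*tau, demodulation m = 1/sqrt(1+(omega*tau)^2).
phi = np.arctan(OMEGA * TAU_TRUE)
m = 1.0 / np.sqrt(1.0 + (OMEGA * TAU_TRUE) ** 2)
t = np.arange(N_CH) * DT
hist = rng.poisson(1000.0 * (1.0 + m * np.cos(OMEGA * t - phi)))

# Recover phase and modulation from the fundamental Fourier component.
fund = np.fft.rfft(hist)[1]
phi_est = -np.angle(fund)
m_est = 2.0 * np.abs(fund) / hist.sum()
print("tau from phase (ns):     ", np.tan(phi_est) / OMEGA * 1e9)
print("tau from modulation (ns):", np.sqrt(1.0 / m_est**2 - 1.0) / OMEGA * 1e9)
```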

  2. Single Photon Counting UV Solar-Blind Detectors Using Silicon and III-Nitride Materials.

    PubMed

    Nikzad, Shouleh; Hoenk, Michael; Jewell, April D; Hennessy, John J; Carver, Alexander G; Jones, Todd J; Goodsall, Timothy M; Hamden, Erika T; Suvarna, Puneet; Bulmer, J; Shahedipour-Sandvik, F; Charbon, Edoardo; Padmanabhan, Preethi; Hancock, Bruce; Bell, L Douglas

    2016-06-21

    Ultraviolet (UV) studies in astronomy, cosmology, planetary studies, biological and medical applications often require precision detection of faint objects and in many cases require photon-counting detection. We present an overview of two approaches for achieving photon counting in the UV. The first approach involves UV enhancement of photon-counting silicon detectors, including electron multiplying charge-coupled devices and avalanche photodiodes. The approach used here employs molecular beam epitaxy for delta doping and superlattice doping for surface passivation and high UV quantum efficiency. Additional UV enhancements include antireflection (AR) and solar-blind UV bandpass coatings prepared by atomic layer deposition. Quantum efficiency (QE) measurements show QE > 50% in the 100-300 nm range for detectors with simple AR coatings, and QE ≅ 80% at ~206 nm has been shown when more complex AR coatings are used. The second approach is based on avalanche photodiodes in III-nitride materials with high QE and intrinsic solar blindness.

  3. Material screening with HPGe counting station for PandaX experiment

    NASA Astrophysics Data System (ADS)

    Wang, X.; Chen, X.; Fu, C.; Ji, X.; Liu, X.; Mao, Y.; Wang, H.; Wang, S.; Xie, P.; Zhang, T.

    2016-12-01

    A gamma counting station based on a high-purity germanium (HPGe) detector was set up for material screening for the PandaX dark matter experiments in the China Jinping Underground Laboratory. A low background gamma rate of 2.6 counts/min within the energy range of 20 to 2700 keV is achieved thanks to the well-designed passive shield. The sensitivities of the HPGe detector reach the mBq/kg level for isotopes such as K, U, and Th, and are even better for Co and Cs, resulting from the low background rate and the high relative detection efficiency of 175%. The structure and performance of the counting station are described in this article. Detailed counting results for the radioactivity in materials used by the PandaX dark-matter experiment are presented. The upgrade plan for the counting station is also discussed.

  4. High quantum efficiency and low dark count rate in multi-layer superconducting nanowire single-photon detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jafari Salim, A., E-mail: ajafaris@uwaterloo.ca; Eftekharian, A.; University of Waterloo, Waterloo, Ontario N2L 3G1

    In this paper, we theoretically show that a multi-layer superconducting nanowire single-photon detector (SNSPD) is capable of approaching characteristics of an ideal SNSPD in terms of the quantum efficiency, dark count, and bandwidth. A multi-layer structure improves the performance in two ways. First, the potential barrier for thermally activated vortex crossing, which is the major source of dark counts and the reduction of the critical current in SNSPDs, is elevated. In a multi-layer SNSPD, a vortex is made of 2D-pancake vortices that form a stack. It will be shown that the stack of pancake vortices effectively experiences a larger potential barrier compared to a vortex in a single-layer SNSPD. This leads to an increase in the experimental critical current as well as a significant decrease in the dark count rate. In consequence, an increase in the quantum efficiency for photons of the same energy or an increase in the sensitivity to photons of lower energy is achieved. Second, a multi-layer structure improves the efficiency of single-photon absorption by increasing the effective optical thickness without compromising the single-photon sensitivity.

  5. Design considerations of high-performance InGaAs/InP single-photon avalanche diodes for quantum key distribution.

    PubMed

    Ma, Jian; Bai, Bing; Wang, Liu-Jun; Tong, Cun-Zhu; Jin, Ge; Zhang, Jun; Pan, Jian-Wei

    2016-09-20

    InGaAs/InP single-photon avalanche diodes (SPADs) are widely used in practical applications requiring near-infrared photon counting such as quantum key distribution (QKD). Photon detection efficiency and dark count rate are the intrinsic parameters of InGaAs/InP SPADs, due to the fact that their performances cannot be improved using different quenching electronics given the same operation conditions. After modeling these parameters and developing a simulation platform for InGaAs/InP SPADs, we investigate the semiconductor structure design and optimization. The parameters of photon detection efficiency and dark count rate highly depend on the variables of absorption layer thickness, multiplication layer thickness, excess bias voltage, and temperature. By evaluating the decoy-state QKD performance, the variables for SPAD design and operation can be globally optimized. Such optimization from the perspective of specific applications can provide an effective approach to design high-performance InGaAs/InP SPADs.

  6. Photon counting microstrip X-ray detectors with GaAs sensors

    NASA Astrophysics Data System (ADS)

    Ruat, M.; Andrä, M.; Bergamaschi, A.; Barten, R.; Brückner, M.; Dinapoli, R.; Fröjdh, E.; Greiffenberg, D.; Lopez-Cuenca, C.; Lozinskaya, A. D.; Mezza, D.; Mozzanica, A.; Novikov, V. A.; Ramilli, M.; Redford, S.; Ruder, C.; Schmitt, B.; Shi, X.; Thattil, D.; Tinti, G.; Tolbanov, O. P.; Tyazhev, A.; Vetter, S.; Zarubin, A. N.; Zhang, J.

    2018-01-01

    High-Z sensors are increasingly used to overcome the poor efficiency of Si sensors above 15 keV, and further extend the energy range of synchrotron and FEL experiments. Detector-grade GaAs sensors of 500 μm thickness offer 98% absorption efficiency at 30 keV and 50% at 50 keV. In this work we assess the usability of GaAs sensors in combination with the MYTHEN photon-counting microstrip readout chip developed at PSI. Different strip lengths and pitches are compared, and the detector performance is evaluated with regard to the sensor material properties. Despite increased leakage current and noise, photon-counting strips mounted with GaAs sensors can be used with photons of energy as low as 5 keV, and exhibit excellent linearity with energy. The charge sharing is doubled as compared to silicon strips, due to the high diffusion coefficient of electrons in GaAs.

  7. LOW ENERGY COUNTING CHAMBERS

    DOEpatents

    Hayes, P.M.

    1960-02-16

    A beta particle counter adapted to use an end window made of polyethylene terephthalate was designed. The extreme thinness of the film results in a correspondingly high transmission of incident low-energy beta particles by the window. As a consequence, the counting efficiency of the present counter is over 40% greater than counters using conventional mica end windows.

  8. SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach.

    PubMed

    Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang

    2017-01-01

    As genomic data are usually at large scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage on untrusted public cloud. Counting query of genotypes is a basic function for many downstream applications in biomedical research (e.g., computing allele frequency, calculating chi-squared statistics, etc.). Previous solutions show promise on secure counting of outsourced data but the efficiency is still a big limitation for real world applications. In this paper, we propose a novel hybrid solution to combine a rigorous theoretical model (homomorphic encryption) and the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrated efficiency by using the real data from the personal genome project.
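
    A toy sketch of the homomorphic-encryption half of such a hybrid design (textbook Paillier with tiny parameters, purely illustrative and in no way the paper's actual scheme, nor secure): each contributor encrypts a 0/1 genotype indicator, the untrusted server multiplies ciphertexts to add the plaintexts without ever decrypting, and only the key holder decrypts the final count.

```python
from math import gcd
import random

# Toy Paillier cryptosystem (tiny primes, for illustration only -- not secure).
p, q = 293, 433
n = p * q
n2 = n * n
g = n + 1
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)      # lcm(p-1, q-1)

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)               # modular inverse (Python 3.8+)

def encrypt(m):
    r = random.randrange(1, n)
    while gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# Secure counting: each party encrypts a 0/1 genotype indicator; the server
# multiplies ciphertexts, which adds the plaintexts homomorphically.
genotype_indicators = [1, 0, 1, 1, 0, 1]          # e.g., "carries allele of interest"
aggregate = 1
for c in (encrypt(b) for b in genotype_indicators):
    aggregate = (aggregate * c) % n2              # homomorphic addition

print(decrypt(aggregate), "==", sum(genotype_indicators))
```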

  9. SCOTCH: Secure Counting Of encrypTed genomiC data using a Hybrid approach

    PubMed Central

    Chenghong, Wang; Jiang, Yichen; Mohammed, Noman; Chen, Feng; Jiang, Xiaoqian; Al Aziz, Md Momin; Sadat, Md Nazmus; Wang, Shuang

    2017-01-01

    As genomic data are usually at large scale and highly sensitive, it is essential to enable both efficient and secure analysis, by which the data owner can securely delegate both computation and storage on untrusted public cloud. Counting query of genotypes is a basic function for many downstream applications in biomedical research (e.g., computing allele frequency, calculating chi-squared statistics, etc.). Previous solutions show promise on secure counting of outsourced data but the efficiency is still a big limitation for real world applications. In this paper, we propose a novel hybrid solution to combine a rigorous theoretical model (homomorphic encryption) and the latest hardware-based infrastructure (i.e., Software Guard Extensions) to speed up the computation while preserving the privacy of both data owners and data users. Our results demonstrated efficiency by using the real data from the personal genome project. PMID:29854245

  10. Measuring Transmission Efficiencies Of Mass Spectrometers

    NASA Technical Reports Server (NTRS)

    Srivastava, Santosh K.

    1989-01-01

    Coincidence counts yield absolute efficiencies. The system measures mass-dependent transmission efficiencies of mass spectrometers, using coincidence-counting techniques reminiscent of those used for many years in the calibration of detectors for subatomic particles. Coincidences between detected ions and the electrons producing them are counted during operation of the mass spectrometer. Under certain assumptions regarding inelastic scattering of electrons, the electron/ion-coincidence count is a direct measure of the transmission efficiency of the spectrometer. When fully developed, the system will be compact, portable, and usable routinely to calibrate mass spectrometers.
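
    The underlying relation is simple: if every detected electron tags an ionization event, then the fraction of those tagged ions that arrive in coincidence at the ion detector measures the spectrometer transmission (times the ion-detector efficiency), independent of the absolute ionization rate. A back-of-the-envelope sketch with made-up counts:

```python
# Coincidence estimate of mass-spectrometer transmission efficiency.
# All numbers below are invented for illustration.
electron_counts = 200_000        # detected electrons (each tags one ion created)
coincidence_counts = 14_500      # ion detections time-correlated with an electron
ion_detector_efficiency = 0.30   # assumed, must be known or calibrated separately

transmission = coincidence_counts / (electron_counts * ion_detector_efficiency)
print(f"transmission efficiency ~ {transmission:.2%}")   # ~24%
```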

  11. Evaluation of large format electron bombarded virtual phase CCDs as ultraviolet imaging detectors

    NASA Technical Reports Server (NTRS)

    Opal, Chet B.; Carruthers, George R.

    1989-01-01

    In conjunction with an external UV-sensitive cathode, an electron-bombarded CCD may be used as a high quantum efficiency/wide dynamic range photon-counting UV detector. Results are presented for the case of a 1024 x 1024, 18-micron square pixel virtual phase CCD used with an electromagnetically focused f/2 Schmidt camera, which yields excellent single-photoevent discrimination and counting efficiency. Attention is given to the vacuum-chamber arrangement used to conduct system tests and the CCD electronics and data-acquisition systems employed.

  12. High Efficiency, 100 mJ per pulse, Nd:YAG Oscillator Optimized for Space-Based Earth and Planetary Remote Sensing

    NASA Technical Reports Server (NTRS)

    Coyle, D. Barry; Stysley, Paul R.; Poulios, Demetrios; Fredrickson, Robert M.; Kay, Richard B.; Cory, Kenneth C.

    2014-01-01

    We report on a newly developed solid-state laser transmitter, designed and packaged for Earth and planetary space-based remote sensing applications with high efficiency, low part count, high pulse-energy scalability/stability, and long life. We have also completed a long-term operational test which surpassed 2 billion pulses with no measured decay in pulse energy.

  13. 32-channel single photon counting module for ultrasensitive detection of DNA sequences

    NASA Astrophysics Data System (ADS)

    Gudkov, Georgiy; Dhulla, Vinit; Borodin, Anatoly; Gavrilov, Dmitri; Stepukhovich, Andrey; Tsupryk, Andrey; Gorbovitski, Boris; Gorfinkel, Vera

    2006-10-01

    We continue our work on the design and implementation of multi-channel single photon detection systems for highly sensitive detection of ultra-weak fluorescence signals, for high-performance, multi-lane DNA sequencing instruments. A fiberized, 32-channel single photon detection (SPD) module based on single photon avalanche diodes (SPADs), model C30902S-DTC, from Perkin Elmer Optoelectronics (PKI) has been designed and implemented. The unavailability of high-performance, large-area SPAD arrays and our desire to design high-performance photon counting systems drive us to use individual diodes. Slight modifications in our quenching circuit have doubled the linear range of our system from 1 MHz to 2 MHz, which is the upper limit for these devices, and the maximum saturation count rate has increased to 14 MHz. The detector module comprises a single board computer PC-104 that enables data visualization, recording, processing, and transfer. Very low dark count (300-1000 counts/s), robustness, efficiency, simple data collection and processing, ease of connectivity to any other application with similar requirements, and performance comparable to the best commercially available single photon counting module (SPCM from PKI) are some of the features of this system.

  14. On Approaching the Ultimate Limits of Communication Using a Photon-Counting Detector

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris I.; Moision, Bruce E.; Dolinar, Samuel J.; Birnbaum, Kevin M.; Divsalar, Dariush

    2012-01-01

    Coherent states achieve the Holevo capacity of a pure-loss channel when paired with an optimal measurement, but a physical realization of this measurement scheme is as of yet unknown, and it is also likely to be of high complexity. In this paper, we focus on the photon-counting measurement and study the photon and dimensional efficiencies attainable with modulations over classical- and nonclassical-state alphabets. We analyze two binary modulation architectures that improve upon the dimensional versus photon efficiency tradeoff achievable with the state-of-the-art coherent-state on-off keying modulation. We show that at high photon efficiency these architectures achieve an efficiency tradeoff that differs from the best possible tradeoff--determined by the Holevo capacity--by only a constant factor. The first architecture we analyze is a coherent-state transmitter that relies on feedback from the receiver to control the transmitted energy. The second architecture uses a single-photon number-state source.
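
    For reference, the Holevo limit of the pure-loss channel that these architectures are compared against has a closed form: with mean received photon number n̄ per mode, C = (1 + n̄)·log₂(1 + n̄) − n̄·log₂ n̄ bits per mode, so the dimensional efficiency is C and the photon (information) efficiency is C/n̄. A short sketch of this tradeoff follows; the grid of n̄ values is arbitrary.

```python
import numpy as np

def holevo_capacity(n_bar):
    """Holevo capacity of a pure-loss channel, bits per mode, for mean
    received photon number n_bar per mode."""
    return (1 + n_bar) * np.log2(1 + n_bar) - n_bar * np.log2(n_bar)

for n_bar in [1.0, 0.1, 0.01, 0.001]:
    c = holevo_capacity(n_bar)
    print(f"n_bar={n_bar:6.3f}  bits/mode={c:7.4f}  bits/photon={c/n_bar:6.2f}")
```

    The printout shows the tradeoff the abstract discusses: as the mean photon number per mode shrinks, the photon efficiency (bits/photon) grows without bound while the dimensional efficiency (bits/mode) falls.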

  15. The darkest EMCCD ever

    NASA Astrophysics Data System (ADS)

    Daigle, Olivier; Quirion, Pierre-Olivier; Lessard, Simon

    2010-07-01

    EMCCDs are devices capable of sub-electron read-out noise at high pixel rate, together with a high quantum efficiency (QE). However, they are plagued by an excess noise factor (ENF) which has the same effect on photometric measurement as if the QE were halved. In order to get rid of the ENF, the photon counting (PC) operation is mandatory, with the drawback of counting only one photon per pixel per frame. The high frame rate capability of the EMCCDs comes to the rescue, at the price of increased clock induced charges (CIC), which dominate the noise budget of the EMCCD. The CIC can be greatly reduced with an appropriate clocking, which renders the PC operation of the EMCCD very efficient for faint flux photometry or spectroscopy, adaptive optics, ultrafast imaging and Lucky Imaging. This clocking is achievable with a new EMCCD controller: CCCP, the CCD Controller for Counting Photons. This new controller, which is now commercialized by Nüvü Cameras Inc., was integrated into an EMCCD camera and tested at the Observatoire du Mont-Mégantic. The results are presented in this paper.

  16. Photon Counting System for High-Sensitivity Detection of Bioluminescence at Optical Fiber End.

    PubMed

    Iinuma, Masataka; Kadoya, Yutaka; Kuroda, Akio

    2016-01-01

    The technique of photon counting is widely used in various fields and is also applicable to high-sensitivity detection of luminescence. Thanks to the recent development of single-photon detectors with avalanche photodiodes (APDs), the photon counting system with an optical fiber has become a powerful tool for detection of bioluminescence at an optical fiber end, because it allows us to fully exploit the compactness, simple operation, and high quantum efficiency of the APD detectors. This optical fiber-based system also offers the possibility of improving the sensitivity of local detection of adenosine triphosphate (ATP) through high-sensitivity detection of the bioluminescence. In this chapter, we introduce the basic concept of the optical fiber-based system and explain how to construct and use it.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovard R. Perry; David L. Georgeson

    This report describes the April 2011 calibration of the Accuscan II HPGe In Vivo system for high energy lung counting. The source used for the calibration was a NIST traceable lung set manufactured at the University of Cincinnati UCLL43AMEU & UCSL43AMEU containing Am-241 and Eu-152 with energies from 26 keV to 1408 keV. The lung set was used in conjunction with a Realistic Torso phantom. The phantom was placed on the RMC II counting table (with pins removed) between the v-ridges on the backwall of the Accuscan II counter. The top of the detector housing was positioned perpendicular to the junction of the phantom clavicle with the sternum. This position aligns the approximate center line of the detector housing with the center of the lungs. The energy and efficiency calibrations were performed using a Realistic Torso phantom (Appendix I) and the University of Cincinnati lung set. This report includes an overview introduction and records for the energy/FWHM and efficiency calibration including performance verification and validation counting. The Accuscan II system was successfully calibrated for high energy lung counting and verified in accordance with ANSI/HPS N13.30-1996 criteria.
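
    For each photopeak energy, the efficiency calibration reduces to the ratio of the net peak count rate to the phantom's known emission rate; a curve fitted through those points then gives the efficiency at any energy. A minimal sketch of the per-line computation follows; the activity, live time, and counts are placeholders, not the calibration data from this report (the 59.5 keV line and its 35.9% emission probability are standard Am-241 values).

```python
# Per-line counting efficiency for an in-vivo lung-count calibration.
# All numeric inputs are illustrative placeholders.
def peak_efficiency(net_counts, live_time_s, activity_bq, gamma_yield):
    """Counts detected per photon emitted at this energy."""
    emitted = activity_bq * gamma_yield * live_time_s
    return net_counts / emitted

# Example: Am-241 59.5 keV line from a lung set of assumed 10 kBq activity,
# 35.9% emission probability, counted for 3600 s live time.
eff = peak_efficiency(net_counts=2.6e4, live_time_s=3600,
                      activity_bq=1.0e4, gamma_yield=0.359)
print(f"efficiency at 59.5 keV ~ {eff:.4f} counts/photon")
```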

  18. Microradiography with Semiconductor Pixel Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakubek, Jan; Cejnarova, Andrea; Dammer, Jiri

    High-resolution radiography (with X-rays, neutrons, heavy charged particles, ...), often also exploited in tomographic mode to provide 3D images, is a powerful imaging technique for instant and nondestructive visualization of the fine internal structure of objects. Novel types of semiconductor single particle counting pixel detectors offer many advantages for radiation imaging: high detection efficiency, energy discrimination or direct energy measurement, noiseless digital integration (counting), high frame rate and virtually unlimited dynamic range. This article shows the application and potential of pixel detectors (such as Medipix2 or TimePix) in different fields of radiation imaging.

  19. Photon Counting Energy Dispersive Detector Arrays for X-ray Imaging

    PubMed Central

    Iwanczyk, Jan S.; Nygård, Einar; Meirav, Oded; Arenson, Jerry; Barber, William C.; Hartsough, Neal E.; Malakhov, Nail; Wessel, Jan C.

    2009-01-01

    The development of an innovative detector technology for photon-counting in X-ray imaging is reported. This new generation of detectors, based on pixellated cadmium telluride (CdTe) and cadmium zinc telluride (CZT) detector arrays electrically connected to application specific integrated circuits (ASICs) for readout, will produce fast and highly efficient photon-counting and energy-dispersive X-ray imaging. There are a number of applications that can greatly benefit from these novel imagers including mammography, planar radiography, and computed tomography (CT). Systems based on this new detector technology can provide compositional analysis of tissue through spectroscopic X-ray imaging, significantly improve overall image quality, and may significantly reduce X-ray dose to the patient. A very high X-ray flux is utilized in many of these applications. For example, CT scanners can produce ~100 Mphotons/mm²/s in the unattenuated beam. High flux is required in order to collect sufficient photon statistics in the measurement of the transmitted flux (attenuated beam) during the very short time frame of a CT scan. This high count rate combined with a need for high detection efficiency requires the development of detector structures that can provide a response signal much faster than the transit time of carriers over the whole detector thickness. We have developed CdTe and CZT detector array structures which are 3 mm thick with 16×16 pixels and a 1 mm pixel pitch. These structures, in the two different implementations presented here, utilize either a small pixel effect or a drift phenomenon. An energy resolution of 4.75% at 122 keV has been obtained with a 30 ns peaking time using discrete electronics and a ⁵⁷Co source. An output rate of 6×10⁶ counts per second per individual pixel has been obtained with our ASIC readout electronics and a clinical CT X-ray tube. Additionally, the first clinical CT images, taken with several of our prototype photon-counting and energy-dispersive detector modules, are shown. PMID:19920884

  20. Photon Counting Energy Dispersive Detector Arrays for X-ray Imaging.

    PubMed

    Iwanczyk, Jan S; Nygård, Einar; Meirav, Oded; Arenson, Jerry; Barber, William C; Hartsough, Neal E; Malakhov, Nail; Wessel, Jan C

    2009-01-01

    The development of an innovative detector technology for photon-counting in X-ray imaging is reported. This new generation of detectors, based on pixellated cadmium telluride (CdTe) and cadmium zinc telluride (CZT) detector arrays electrically connected to application specific integrated circuits (ASICs) for readout, will produce fast and highly efficient photon-counting and energy-dispersive X-ray imaging. There are a number of applications that can greatly benefit from these novel imagers including mammography, planar radiography, and computed tomography (CT). Systems based on this new detector technology can provide compositional analysis of tissue through spectroscopic X-ray imaging, significantly improve overall image quality, and may significantly reduce X-ray dose to the patient. A very high X-ray flux is utilized in many of these applications. For example, CT scanners can produce ~100 Mphotons/mm²/s in the unattenuated beam. High flux is required in order to collect sufficient photon statistics in the measurement of the transmitted flux (attenuated beam) during the very short time frame of a CT scan. This high count rate combined with a need for high detection efficiency requires the development of detector structures that can provide a response signal much faster than the transit time of carriers over the whole detector thickness. We have developed CdTe and CZT detector array structures which are 3 mm thick with 16×16 pixels and a 1 mm pixel pitch. These structures, in the two different implementations presented here, utilize either a small pixel effect or a drift phenomenon. An energy resolution of 4.75% at 122 keV has been obtained with a 30 ns peaking time using discrete electronics and a ⁵⁷Co source. An output rate of 6×10⁶ counts per second per individual pixel has been obtained with our ASIC readout electronics and a clinical CT X-ray tube. Additionally, the first clinical CT images, taken with several of our prototype photon-counting and energy-dispersive detector modules, are shown.

  1. Highly efficient entanglement swapping and teleportation at telecom wavelength

    PubMed Central

    Jin, Rui-Bo; Takeoka, Masahiro; Takagi, Utako; Shimizu, Ryosuke; Sasaki, Masahide

    2015-01-01

    Entanglement swapping at telecom wavelengths is at the heart of quantum networking in optical fiber infrastructures. Although entanglement swapping has been demonstrated experimentally so far using various types of entangled photon sources both in near-infrared and telecom wavelength regions, the rate of swapping operation has been too low to be applied to practical quantum protocols, due to limited efficiency of entangled photon sources and photon detectors. Here we demonstrate drastic improvement of the efficiency at telecom wavelength by using two ultra-bright entangled photon sources and four highly efficient superconducting nanowire single photon detectors. We have attained a four-fold coincidence count rate of 108 counts per second, which is three orders higher than the previous experiments at telecom wavelengths. A raw (net) visibility in a Hong-Ou-Mandel interference between the two independent entangled sources was 73.3 ± 1.0% (85.1 ± 0.8%). We performed the teleportation and entanglement swapping, and obtained a fidelity of 76.3% in the swapping test. Our results on the coincidence count rates are comparable with the ones ever recorded in teleportation/swapping and multi-photon entanglement generation experiments at around 800 nm wavelengths. Our setup opens the way to practical implementation of device-independent quantum key distribution and its distance extension by the entanglement swapping as well as multi-photon entangled state generation in telecom band infrastructures with both space and fiber links. PMID:25791212

  2. Highly efficient entanglement swapping and teleportation at telecom wavelength.

    PubMed

    Jin, Rui-Bo; Takeoka, Masahiro; Takagi, Utako; Shimizu, Ryosuke; Sasaki, Masahide

    2015-03-20

    Entanglement swapping at telecom wavelengths is at the heart of quantum networking in optical fiber infrastructures. Although entanglement swapping has been demonstrated experimentally so far using various types of entangled photon sources both in near-infrared and telecom wavelength regions, the rate of swapping operation has been too low to be applied to practical quantum protocols, due to limited efficiency of entangled photon sources and photon detectors. Here we demonstrate drastic improvement of the efficiency at telecom wavelength by using two ultra-bright entangled photon sources and four highly efficient superconducting nanowire single photon detectors. We have attained a four-fold coincidence count rate of 108 counts per second, which is three orders higher than the previous experiments at telecom wavelengths. A raw (net) visibility in a Hong-Ou-Mandel interference between the two independent entangled sources was 73.3 ± 1.0% (85.1 ± 0.8%). We performed the teleportation and entanglement swapping, and obtained a fidelity of 76.3% in the swapping test. Our results on the coincidence count rates are comparable with the ones ever recorded in teleportation/swapping and multi-photon entanglement generation experiments at around 800 nm wavelengths. Our setup opens the way to practical implementation of device-independent quantum key distribution and its distance extension by the entanglement swapping as well as multi-photon entangled state generation in telecom band infrastructures with both space and fiber links.

  3. Approaching the Ultimate Limits of Communication Efficiency with a Photon-Counting Detector

    NASA Technical Reports Server (NTRS)

    Erkmen, Baris; Moision, Bruce; Dolinar, Samuel J.; Birnbaum, Kevin M.; Divsalar, Dariush

    2012-01-01

    Coherent states achieve the Holevo capacity of a pure-loss channel when paired with an optimal measurement, but a physical realization of this measurement is as of yet unknown, and it is also likely to be of high complexity. In this paper, we focus on the photon-counting measurement and study the photon and dimensional efficiencies attainable with modulations over classical- and nonclassical-state alphabets. We first review the state-of-the-art coherent on-off-keying (OOK) with a photon-counting measurement, illustrating its asymptotic inefficiency relative to the Holevo limit. We show that a commonly made Poisson approximation in thermal noise leads to unbounded photon information efficiencies, violating the conjectured Holevo limit. We analyze two binary-modulation architectures that improve upon the dimensional versus photon efficiency tradeoff achievable with conventional OOK. We show that at high photon efficiency these architectures achieve an efficiency tradeoff that differs from the best possible tradeoff--determined by the Holevo capacity--by only a constant factor. The first architecture we analyze is a coherent-state transmitter that relies on feedback from the receiver to control the transmitted energy. The second architecture uses a single-photon number-state source.
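
    As a toy illustration of the dimensional-versus-photon efficiency trade-off discussed above, the sketch below evaluates an idealized, noiseless OOK link read out by a photon-counting detector (a Z-channel). The mean photon number and duty cycles are illustrative assumptions, and this is not the coherent-state feedback or number-state architecture analyzed in the paper.

```python
import numpy as np

def binary_entropy(p):
    p = np.clip(p, 1e-12, 1 - 1e-12)
    return -(p * np.log2(p) + (1 - p) * np.log2(1 - p))

def ook_tradeoff(n_bar, q):
    """Bits per slot and bits per photon for OOK with duty cycle q, mean photon
    number n_bar in the 'on' slots, and an ideal noiseless photon-counting
    receiver (a Z-channel: a slot is read as 'on' iff >= 1 photon is counted)."""
    p_click = 1.0 - np.exp(-n_bar)
    bits_per_slot = binary_entropy(q * p_click) - q * binary_entropy(p_click)
    return bits_per_slot, bits_per_slot / (q * n_bar)

# Lowering the duty cycle buys photon efficiency at the cost of dimensional
# (per-slot) efficiency -- the trade-off examined in the paper.
for q in (0.5, 0.1, 0.01, 0.001):
    bits_slot, bits_photon = ook_tradeoff(1.0, q)
    print(f"q={q:6.3f}: {bits_slot:.4f} bits/slot, {bits_photon:5.2f} bits/photon")
```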

  4. Quantitative basis for component factors of gas flow proportional counting efficiencies

    NASA Astrophysics Data System (ADS)

    Nichols, Michael C.

    This dissertation investigates the counting efficiency calibration of a gas flow proportional counter with beta-particle emitters in order to (1) determine by measurements and simulation the values of the component factors of beta-particle counting efficiency for a proportional counter, (2) compare the simulation results and measured counting efficiencies, and (3) determine the uncertainty of the simulation and measurements. Monte Carlo simulation results by the MCNP5 code were compared with measured counting efficiencies as a function of sample thickness for 14C, 89Sr, 90Sr, and 90Y. The Monte Carlo model simulated strontium carbonate with areal thicknesses from 0.1 to 35 mg cm-2. The samples were precipitated as strontium carbonate with areal thicknesses from 3 to 33 mg cm-2, mounted on membrane filters, and counted on a low background gas flow proportional counter. The estimated fractional standard deviation was 2-4% (except 6% for 14C) for efficiency measurements of the radionuclides. The Monte Carlo simulations have uncertainties estimated to be 5 to 6 percent for carbon-14 and 2.4 percent for strontium-89, strontium-90, and yttrium-90. The curves of simulated counting efficiency vs. sample areal thickness agreed within 3% of the curves of best fit drawn through the 25-49 measured points for each of the four radionuclides. Contributions from this research include development of uncertainty budgets for the analytical processes; evaluation of alternative methods for determining chemical yield critical to the measurement process; correcting a bias found in the MCNP normalization of beta-spectrum histograms; clarifying the interpretation of the commonly used ICRU beta-particle spectra for use by MCNP; and evaluation of instrument parameters as applied to the simulation model to obtain estimates of the counting efficiency from simulated pulse height tallies.
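
    A common empirical complement to the Monte Carlo approach above is to fit the measured efficiency-versus-areal-thickness curve with a uniform self-absorption model. The sketch below shows that idea under stated assumptions; the data points and starting values are hypothetical placeholders, not values from the dissertation.

```python
import numpy as np
from scipy.optimize import curve_fit

def self_absorption_eff(t, eps0, mu):
    """Counting efficiency vs areal thickness t (mg/cm^2) for activity spread
    uniformly through the source: eps0 * (1 - exp(-mu*t)) / (mu*t)."""
    return eps0 * (1.0 - np.exp(-mu * t)) / (mu * t)

# Hypothetical calibration points (areal thickness in mg/cm^2, efficiency);
# real data would come from the mounted strontium carbonate sources.
t_meas = np.array([3.0, 8.0, 15.0, 25.0, 33.0])
eff_meas = np.array([0.42, 0.38, 0.33, 0.28, 0.25])

popt, pcov = curve_fit(self_absorption_eff, t_meas, eff_meas, p0=(0.45, 0.02))
eps0, mu = popt
print(f"zero-thickness efficiency ~ {eps0:.3f}, absorption coefficient ~ {mu:.4f} cm^2/mg")
```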

  5. How to squeeze high quantum efficiency and high time resolution out of a SPAD

    NASA Technical Reports Server (NTRS)

    Lacaita, A.; Zappa, F.; Cova, Sergio; Ripamonti, Giancarlo; Spinelli, A.

    1993-01-01

    We address the issue of whether Single-Photon Avalanche Diodes (SPADs) can be suitably designed to achieve a trade-off between quantum efficiency and time resolution performance. We briefly recall the physical mechanisms setting the time resolution of avalanche photodiodes operated in single-photon counting, and we give some criteria for the design of SPADs with a quantum efficiency better than 10 percent at 1064 nm together with a time resolution below 50 ps rms.

  6. Design and evaluation of a nondestructive fissile assay device for HTGR fuel samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeany, S. R.; Knoll, R. W.; Jenkins, J. D.

    1979-02-01

    Nondestructive assay of fissile material plays an important role in nuclear fuel processing facilities. Information for product quality control, plant criticality safety, and nuclear materials accountability can be obtained from assay devices. All of this is necessary for a safe, efficient, and orderly operation of a production plant. Presented here is a design description and an operational evaluation of a device developed to nondestructively assay small samples of High-Temperature Gas-Cooled Reactor (HTGR) fuel. The measurement technique employed consists of thermal-neutron irradiation of a sample followed by pneumatic transfer to a high-efficiency neutron detector where delayed neutrons are counted. In general, samples undergo several irradiation and count cycles during a measurement. The total number of delayed-neutron counts accumulated is translated into grams of fissile mass through comparison with the counts accumulated in an identical irradiation and count sequence of calibration standards. Successful operation of the device through many experiments over a one-year period indicates high operational reliability. Tests of assay precision show this to be better than 0.25% for measurements of 10 min. Assay biases may be encountered if calibration standards are not representative of unknown samples, but reasonable care in construction and control of standards should lead to no more than 0.2% bias in the measurements. Nondestructive fissile assay of HTGR fuel samples by thermal-neutron irradiation and delayed-neutron detection has been demonstrated to be a rapid and accurate analysis technique. However, careful attention and control must be given to calibration standards to see that they remain representative of unknown samples.

  7. Multiple-Event, Single-Photon Counting Imaging Sensor

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for photon count number registration. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time cannot be made very short, this leads to very low dynamic range and makes the sensor useful only in very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency will substantially ruin any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register as many as one million (or more) photon-counting events during a frame time. Because of a consequently boosted dynamic range, the imaging array of the invention is capable of performing single-photon counting under ultra-low light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.

  8. Differential Die-Away Instrument: Report on Neutron Detector Recovery Performance and Proposed Improvements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goodsell, Alison Victoria; Swinhoe, Martyn Thomas; Henzl, Vladimir

    2014-09-22

    Four helium-3 ( 3He) detector/preamplifier packages (¾”/KM200, DDSI/PDT-A111, DDA/PDT-A111, and DDA/PDT10A) were experimentally tested to determine the deadtime effects at different DT neutron generator output settings. At very high count rates, the ¾”/KM200 package performed best. At high count rates, the ¾”/KM200 and the DDSI/PDT-A111 packages performed very well, with the DDSI/PDT-A111 operating with slightly higher efficiency. All of the packages performed similarly at mid to low count rates. Proposed improvements include using a fast recovery LANL-made dual channel preamplifier, testing smaller diameter 3He tubes, and further investigating quench gases.

  9. Analytical method for measuring cosmogenic 35S in natural waters

    DOE PAGES

    Uriostegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.; ...

    2015-05-18

    Here, cosmogenic sulfur-35 in water as dissolved sulfate (35SO4) has successfully been used as an intrinsic hydrologic tracer in low-SO4, high-elevation basins. Its application in environmental waters containing high SO4 concentrations has been limited because only small amounts of SO4 can be analyzed using current liquid scintillation counting (LSC) techniques. We present a new analytical method for analyzing large amounts of BaSO4 for 35S. We quantify efficiency gains when suspending BaSO4 precipitate in Insta-Gel Plus cocktail, purify BaSO4 precipitate to remove dissolved organic matter, mitigate interference of radium-226 and its daughter products by selection of high purity barium chloride, and optimize LSC counting parameters for 35S determination in larger masses of BaSO4. Using this improved procedure, we achieved counting efficiencies that are comparable to published LSC techniques despite a 10-fold increase in the SO4 sample load. 35SO4 was successfully measured in high SO4 surface waters and groundwaters containing low ratios of 35S activity to SO4 mass, demonstrating that this new analytical method expands the analytical range of 35SO4 and broadens the utility of 35SO4 as an intrinsic tracer in hydrologic settings.

  10. Advances in photon counting for bioluminescence

    NASA Astrophysics Data System (ADS)

    Ingle, Martin B.; Powell, Ralph

    1998-11-01

    Photon counting systems were originally developed by the astronomical community for use in astronomy. However, a major application area is now the study of luminescent probes in living plants, fishes and cell cultures. For these applications, it has been necessary to develop camera system capability at very low light levels -- a few photons occasionally -- and also at reasonably high light levels to enable the systems to be focused and to collect quality images of the object under study. The paper presents new data on MTF at extremely low photon flux and under conventional ICCD illumination, as well as counting efficiency and dark noise as a function of temperature.

  11. CMOS image sensors as an efficient platform for glucose monitoring.

    PubMed

    Devadhasan, Jasmine Pramila; Kim, Sanghyo; Choi, Cheol Soo

    2013-10-07

    Complementary metal oxide semiconductor (CMOS) image sensors have been used previously in the analysis of biological samples. In the present study, a CMOS image sensor was used to monitor the concentration of oxidized mouse plasma glucose (86-322 mg dL-1) based on photon count variation. Measurement of the concentration of oxidized glucose was dependent on changes in color intensity; color intensity increased with increasing glucose concentration. The high color density of glucose strongly reduced the number of photons passing through the polydimethylsiloxane (PDMS) chip, which suggests that the photon count was determined by color intensity. Photons were detected by a photodiode in the CMOS image sensor and converted to digital numbers by an analog to digital converter (ADC). Additionally, UV-spectral analysis and time-dependent photon analysis proved the efficiency of the detection system. This simple, effective, and consistent method for glucose measurement shows that CMOS image sensors are efficient devices for monitoring glucose in point-of-care applications.

  12. Measurement of X-ray emission efficiency for K-lines.

    PubMed

    Procop, M

    2004-08-01

    Results for the X-ray emission efficiency (counts per C per sr) of K-lines for selected elements (C, Al, Si, Ti, Cu, Ge) and for the first time also for compounds and alloys (SiC, GaP, AlCu, TiAlC) are presented. An energy dispersive X-ray spectrometer (EDS) of known detection efficiency (counts per photon) has been used to record the spectra at a takeoff angle of 25 degrees determined by the geometry of the scanning electron microscope's specimen chamber. Overall uncertainty in measurement could be reduced to 5 to 10%, depending on the line intensity and energy. Measured emission efficiencies have been compared with calculated efficiencies based on models applied in standardless analysis. The widely used XPP and PROZA models give emission efficiencies that are somewhat too low. The best agreement between measured and calculated efficiencies could be achieved by replacing in the modular PROZA96 model the original expression for the ionization cross section by the formula given by Casnati et al. (1982). A discrepancy remains for carbon, probably due to the high overvoltage ratio.

  13. Single photon detection using Geiger mode CMOS avalanche photodiodes

    NASA Astrophysics Data System (ADS)

    Lawrence, William G.; Stapels, Christopher; Augustine, Frank L.; Christian, James F.

    2005-10-01

    Geiger mode Avalanche Photodiodes fabricated using complementary metal-oxide-semiconductor (CMOS) fabrication technology combine high sensitivity detectors with pixel-level auxiliary circuitry. Radiation Monitoring Devices has successfully implemented CMOS manufacturing techniques to develop prototype detectors with active diameters ranging from 5 to 60 microns and measured detection efficiencies of up to 60%. CMOS active quenching circuits are included in the pixel layout. The actively quenched pixels have a quenching time less than 30 ns and a maximum count rate greater than 10 MHz. The actively quenched Geiger mode avalanche photodiode (GPD) has linear response at room temperature over six orders of magnitude. When operating in Geiger mode, these GPDs act as single photon-counting detectors that produce a digital output pulse for each photon with no associated read noise. Thermoelectrically cooled detectors have less than 1 Hz dark counts. The detection efficiency, dark count rate, and after-pulsing of two different pixel designs are measured and demonstrate the differences in the device operation. Additional applications for these devices include nuclear imaging and replacement of photomultiplier tubes in dosimeters.

  14. Performance Evaluation of Solar Blind NLOS Ultraviolet Communication Systems

    DTIC Science & Technology

    2008-12-01

    noise and signal count statistical distributions. Then we further link key system parameters such as path loss and communication bit error rate (BER... quantum noise limited photon-counting detection. These benefits can now begin to be realized based on technological advances in both miniaturized... multiplication gain of 10^5-10^7, high responsivity of 62 A/W, large detection area of a few cm2, reasonable quantum efficiency of 15%, and low dark current

  15. Improving the counting efficiency in time-correlated single photon counting experiments by dead-time optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peronio, P.; Acconcia, G.; Rech, I.

    Time-Correlated Single Photon Counting (TCSPC) has long been recognized as the most sensitive method for fluorescence lifetime measurements, but it often requires "long" data acquisition times. This drawback is related to the limited counting capability of the TCSPC technique, due to pile-up and counting loss effects. In recent years, multi-module TCSPC systems have been introduced to overcome this issue. Splitting the light into several detectors connected to independent TCSPC modules proportionally increases the counting capability. Of course, multi-module operation also increases the system cost and can cause space and power supply problems. In this paper, we propose an alternative approach based on a new detector and processing electronics designed to reduce the overall system dead time, thus enabling efficient photon collection at high excitation rate. We present a fast active quenching circuit for single-photon avalanche diodes which features a minimum dead time of 12.4 ns. We also introduce a new Time-to-Amplitude Converter (TAC) able to attain extra-short dead time thanks to the combination of a scalable array of monolithically integrated TACs and a sequential router. The fast TAC (F-TAC) makes it possible to operate the system towards the upper limit of detector count rate capability (∼80 Mcps) with reduced pile-up losses, addressing one of the historic criticisms of TCSPC. Preliminary measurements on the F-TAC are presented and discussed.
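
    The pile-up limitation that motivates the reduced dead time can be illustrated with a standard back-of-the-envelope estimate: assuming Poisson photon statistics per excitation cycle, the fraction of recorded events that actually contain more than one photon grows quickly with the detected-to-excitation rate ratio. This is a generic textbook estimate, not the authors' electronics model; the 80 MHz excitation rate below simply echoes the ~80 Mcps figure.

```python
import numpy as np

def pileup_fraction(detected_rate, excitation_rate):
    """Fraction of recorded events containing more than one photon, assuming
    Poisson photon statistics in each excitation cycle."""
    mu = detected_rate / excitation_rate          # mean detected photons per cycle
    p_at_least_one = 1.0 - np.exp(-mu)
    p_more_than_one = 1.0 - np.exp(-mu) * (1.0 + mu)
    return p_more_than_one / p_at_least_one

# Classic TCSPC keeps the detected rate at a few percent of the excitation
# rate to limit pile-up; higher duty cycles distort the lifetime histogram.
for frac in (0.01, 0.05, 0.2, 0.5):
    piled = 100 * pileup_fraction(frac * 80e6, 80e6)
    print(f"detected/excitation = {frac:4.2f}: {piled:.1f}% of events piled up")
```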

  16. Efficient high-performance ultrasound beamforming using oversampling

    NASA Astrophysics Data System (ADS)

    Freeman, Steven R.; Quick, Marshall K.; Morin, Marc A.; Anderson, R. C.; Desilets, Charles S.; Linnenbrink, Thomas E.; O'Donnell, Matthew

    1998-05-01

    High-performance and efficient beamforming circuitry is very important in large channel count clinical ultrasound systems. Current state-of-the-art digital systems using multi-bit analog to digital converters (A/Ds) have matured to provide exquisite image quality with moderate levels of integration. A simplified oversampling beamforming architecture has been proposed that may allow integration of delta-sigma A/Ds onto the same chip as digital delay and processing circuitry to form a monolithic ultrasound beamformer. Such a beamformer may enable low-power handheld scanners as well as high-end systems with very large channel-count arrays. This paper presents an oversampling beamformer architecture that generates high-quality images using very simple digitization, delay, and summing circuits. Additional performance may be obtained with this oversampled system for narrow bandwidth excitations by mixing the RF signal down in frequency to a range where the electronic signal-to-noise ratio of the delta-sigma A/D is optimized. An oversampled transmit beamformer uses the same delay circuits as the receive beamformer and eliminates the need for separate transmit function generators.

  17. A novel concentration and viability detection method for Brettanomyces using the Cellometer image cytometry.

    PubMed

    Martyniak, Brian; Bolton, Jason; Kuksin, Dmitry; Shahin, Suzanne M; Chan, Leo Li-Ying

    2017-01-01

    Brettanomyces spp. can present unique cell morphologies comprised of excessive pseudohyphae and budding, leading to difficulties in enumerating cells. The current cell counting methods include manual counting of methylene blue-stained yeasts or measuring optical densities using a spectrophotometer. However, manual counting can be time-consuming and has high operator-dependent variations due to subjectivity. Optical density measurement can also introduce uncertainties where instead of individual cells counted, an average of a cell population is measured. In contrast, by utilizing the fluorescence capability of an image cytometer to detect acridine orange and propidium iodide viability dyes, individual cell nuclei can be counted directly in the pseudohyphae chains, which can improve the accuracy and efficiency of cell counting, as well as eliminating the subjectivity from manual counting. In this work, two experiments were performed to demonstrate the capability of Cellometer image cytometer to monitor Brettanomyces concentrations, viabilities, and budding/pseudohyphae percentages. First, a yeast propagation experiment was conducted to optimize software counting parameters for monitoring the growth of Brettanomyces clausenii, Brettanomyces bruxellensis, and Brettanomyces lambicus, which showed increasing cell concentrations, and varying pseudohyphae percentages. The pseudohyphae formed during propagation were counted either as multiple nuclei or a single multi-nuclei organism, where the results of counting the yeast as a single multi-nuclei organism were directly compared to manual counting. Second, a yeast fermentation experiment was conducted to demonstrate that the proposed image cytometric analysis method can monitor the growth pattern of B. lambicus and B. clausenii during beer fermentation. The results from both experiments displayed different growth patterns, viability, and budding/pseudohyphae percentages for each Brettanomyces species. The proposed Cellometer image cytometry method can improve efficiency and eliminate operator-dependent variations of cell counting compared with the traditional methods, which can potentially improve the quality of beverage products employing Brettanomyces yeasts.

  18. An analysis of dependency of counting efficiency on worker anatomy for in vivo measurements: whole-body counting

    NASA Astrophysics Data System (ADS)

    Zhang, Binquan; Mille, Matthew; Xu, X. George

    2008-07-01

    In vivo radiobioassay is integral to many health physics and radiological protection programs dealing with internal exposures. The Bottle Manikin Absorber (BOMAB) physical phantom has been widely used for whole-body counting calibrations. However, the shape of BOMAB phantoms—a collection of plastic, cylindrical shells which contain no bones or internal organs—does not represent realistic human anatomy. Furthermore, workers who come in contact with radioactive materials have rather different body shape and size. To date, there is a lack of understanding about how the counting efficiency would change when the calibrated counter is applied to a worker with complicated internal organs or tissues. This paper presents a study on various in vivo counting efficiencies obtained from Monte Carlo simulations of two BOMAB phantoms and three tomographic image-based models (VIP-Man, NORMAN and CNMAN) for a scenario involving homogeneous whole-body radioactivity contamination. The results reveal that a phantom's counting efficiency is strongly dependent on the shape and size of a phantom. Contrary to what was expected, it was found that only small differences in efficiency were observed when the density and material composition of all internal organs and tissues of the tomographic phantoms were changed to water. The results of this study indicate that BOMAB phantoms with appropriately adjusted size and shape can be sufficient for whole-body counting calibrations when the internal contamination is homogeneous.

  19. Efficiency of synaptic transmission of single-photon events from rod photoreceptor to rod bipolar dendrite.

    PubMed

    Schein, Stan; Ahmad, Kareem M

    2006-11-01

    A rod transmits absorption of a single photon by what appears to be a small reduction in the small number of quanta of neurotransmitter (Qcount) that it releases within the integration period (approximately 0.1 s) of a rod bipolar dendrite. Due to the quantal and stochastic nature of release, discrete distributions of Qcount for darkness versus one isomerization of rhodopsin (R*) overlap. We suggested that release must be regular to narrow these distributions, reduce overlap, reduce the rate of false positives, and increase transmission efficiency (the fraction of R* events that are identified as light). Unsurprisingly, higher quantal release rates (Qrates) yield higher efficiencies. Focusing here on the effect of small changes in Qrate, we find that a slightly higher Qrate yields greatly reduced efficiency, due to a necessarily fixed quantal-count threshold. To stabilize efficiency in the face of drift in Qrate, the dendrite needs to regulate the biochemical realization of its quantal-count threshold with respect to its Qcount. These considerations reveal the mathematical role of calcium-based negative feedback and suggest a helpful role for spontaneous R*. In addition, to stabilize efficiency in the face of drift in degree of regularity, efficiency should be approximately 50%, similar to measurements.
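
    A toy version of the threshold argument above can be written down with Poisson statistics (the paper argues real release is more regular than Poisson, which narrows the distributions further). The rates, suppressed-quanta value, and threshold below are illustrative assumptions, not the paper's numbers; the point is how a fixed quantal-count threshold makes efficiency sensitive to small drifts in Qrate.

```python
from scipy.stats import poisson

def detection_tradeoff(q_rate, dt=0.1, suppressed=5.0, threshold=3):
    """Toy single-photon transmission calculation at the rod -> rod-bipolar
    synapse. Quantal counts in one integration window dt are modelled as
    Poisson; one R* suppresses `suppressed` quanta; the dendrite signals
    'light' when the count falls at or below `threshold`."""
    dark_mean = q_rate * dt
    light_mean = max(dark_mean - suppressed, 0.0)
    false_positive = poisson.cdf(threshold, dark_mean)   # dark window read as light
    efficiency = poisson.cdf(threshold, light_mean)      # R* correctly detected
    return efficiency, false_positive

# A small drift in the quantal release rate, with the threshold held fixed,
# changes the efficiency dramatically.
for q_rate in (80.0, 100.0, 120.0):   # quanta per second (illustrative)
    eff, fp = detection_tradeoff(q_rate)
    print(f"Q_rate={q_rate:5.1f}/s: efficiency={eff:.2f}, false positives={fp:.3f}")
```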

  20. High-speed and high-efficiency travelling wave single-photon detectors embedded in nanophotonic circuits

    PubMed Central

    Pernice, W.H.P.; Schuck, C.; Minaeva, O.; Li, M.; Goltsman, G.N.; Sergienko, A.V.; Tang, H.X.

    2012-01-01

    Ultrafast, high-efficiency single-photon detectors are among the most sought-after elements in modern quantum optics and quantum communication. However, imperfect modal matching and finite photon absorption rates have usually limited their maximum attainable detection efficiency. Here we demonstrate superconducting nanowire detectors atop nanophotonic waveguides, which enable a drastic increase of the absorption length for incoming photons. This allows us to achieve high on-chip single-photon detection efficiency up to 91% at telecom wavelengths, repeatable across several fabricated chips. We also observe remarkably low dark count rates without significant compromise of the on-chip detection efficiency. The detectors are fully embedded in scalable silicon photonic circuits and provide ultrashort timing jitter of 18 ps. Exploiting this high temporal resolution, we demonstrate ballistic photon transport in silicon ring resonators. Our direct implementation of a high-performance single-photon detector on chip overcomes a major barrier in integrated quantum photonics. PMID:23271658

  1. STEFFY-software to calculate nuclide-specific total counting efficiency in well-type γ-ray detectors.

    PubMed

    Pommé, S

    2012-09-01

    A software package is presented to calculate the total counting efficiency for the decay of radionuclides in a well-type γ-ray detector. It is specifically applied to primary standardisation of activity by means of 4πγ-counting with a NaI(Tl) well-type scintillation detector. As an alternative to Monte Carlo simulations, the software combines good accuracy with superior speed and ease-of-use. It is also well suited to investigate uncertainties associated with the 4πγ-counting method for a variety of radionuclides and detector dimensions. In this paper, the underlying analytical models for the radioactive decay and subsequent counting efficiency of the emitted radiation in the detector are summarised. Copyright © 2012 Elsevier Ltd. All rights reserved.
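
    The combinatorics at the core of a nuclide-specific total counting efficiency can be sketched as follows: in a well detector, a decay that emits several coincident photons is counted once if any of them interacts. The minimal sketch below ignores branching ratios, beta particles, and angular correlations, all of which the analytical models summarised in the paper handle; the efficiencies used are illustrative only.

```python
def total_efficiency_cascade(photon_effs):
    """Probability that a decay emitting several coincident photons produces at
    least one count, given each photon's individual total detection probability
    (coincident photons sum in the detector, so the decay is counted once)."""
    p_miss = 1.0
    for eps in photon_effs:
        p_miss *= (1.0 - eps)
    return 1.0 - p_miss

# Example: two cascade gammas with 90% and 85% individual total efficiencies
# in the well give a decay-counting efficiency of 98.5%.
print(total_efficiency_cascade([0.90, 0.85]))
```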

  2. Photon counting detector for the personal radiography inspection system "SIBSCAN"

    NASA Astrophysics Data System (ADS)

    Babichev, E. A.; Baru, S. E.; Grigoriev, D. N.; Leonov, V. V.; Oleynikov, V. P.; Porosev, V. V.; Savinov, G. A.

    2017-02-01

    X-ray detectors operating in the energy integrating mode are successfully used in many different applications. Nevertheless, direct photon counting detectors, which have superior parameters compared with integrating ones, are still rarely used. One of the reasons for this is the low value of the electrical signal generated by a detected photon. Silicon photomultiplier (SiPM) based scintillation counters have a high detection efficiency, high electronic gain and compact dimensions. This makes them a very attractive candidate to replace routinely used detectors in many fields. More than 10 years ago, a digital scanning radiography system based on a multistrip ionization chamber (MIC) was proposed at the Budker Institute of Nuclear Physics. The detector demonstrates excellent radiation resistance and parameter stability after 5 years of operation and imaging of up to 1000 persons per day. Currently, the installations operate at several Russian airports and at subway stations in some cities. We are now designing a new detector, based on scintillator-SiPM assemblies, that operates in the photon counting mode and has superior parameters to those of the gas detector. This detector has close to zero noise, higher quantum efficiency and a count rate capability of more than 5 MHz per channel (20% losses), which leads to better image quality and improved detection capability. The suggested detector technology could be extended to medical applications.
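
    As a rough consistency check of the quoted operating point, the arithmetic below infers a per-channel dead time from the "5 MHz per channel (20% losses)" figure, assuming a non-paralyzable dead-time model with 5 MHz taken as the true incident rate; the actual counting electronics may behave differently.

```python
# Non-paralyzable dead-time model (an assumption): measured rate
# m = n / (1 + n*tau) for true rate n, so the loss fraction
# (n - m)/n = n*tau / (1 + n*tau). Solving 20% losses at n = 5 MHz for tau:
true_rate = 5e6            # counts per second entering the channel (assumed true rate)
losses = 0.20
tau = losses / (1.0 - losses) / true_rate
measured = true_rate / (1.0 + true_rate * tau)
print(f"implied dead time ~ {tau * 1e9:.0f} ns, measured rate ~ {measured / 1e6:.1f} Mcps")
```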

  3. A randomized approach to speed up the analysis of large-scale read-count data in the application of CNV detection.

    PubMed

    Wang, WeiBo; Sun, Wei; Wang, Wei; Szatkiewicz, Jin

    2018-03-01

    The application of high-throughput sequencing in a broad range of quantitative genomic assays (e.g., DNA-seq, ChIP-seq) has created a high demand for the analysis of large-scale read-count data. Typically, the genome is divided into tiling windows and windowed read-count data is generated for the entire genome from which genomic signals are detected (e.g. copy number changes in DNA-seq, enrichment peaks in ChIP-seq). For accurate analysis of read-count data, many state-of-the-art statistical methods use generalized linear models (GLM) coupled with the negative-binomial (NB) distribution by leveraging its ability for simultaneous bias correction and signal detection. However, although statistically powerful, the GLM+NB method has a quadratic computational complexity and therefore suffers from slow running time when applied to large-scale windowed read-count data. In this study, we aimed to substantially speed up the GLM+NB method by using a randomized algorithm, and we demonstrate here the utility of our approach in the application of detecting copy number variants (CNVs) using a real example. We propose an efficient estimator, the randomized GLM+NB coefficients estimator (RGE), for speeding up the GLM+NB method. RGE samples the read-count data and solves the estimation problem on a smaller scale. We first theoretically validated the consistency and the variance properties of RGE. We then applied RGE to GENSENG, a GLM+NB based method for detecting CNVs. We named the resulting method "R-GENSENG". Based on extensive evaluation using both simulated and empirical data, we concluded that R-GENSENG is ten times faster than the original GENSENG while maintaining GENSENG's accuracy in CNV detection. Our results suggest that the RGE strategy developed here could be applied to other GLM+NB based read-count analyses, e.g., ChIP-seq data analysis, to substantially improve their computational efficiency while preserving the analytic power.
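
    The general idea of the randomized estimator, fitting the GLM+NB regression on a random subsample of the windowed read counts, can be sketched with statsmodels as below. This is only an illustration of the concept, not the authors' RGE implementation, which comes with specific sampling, consistency and variance guarantees.

```python
import numpy as np
import statsmodels.api as sm

def subsampled_nb_glm(y, X, sample_frac=0.1, seed=0):
    """Fit a negative-binomial GLM on a random subsample of the windowed
    read-count data instead of the full genome (a sketch of the randomized
    GLM+NB idea, not the authors' RGE estimator)."""
    rng = np.random.default_rng(seed)
    n = len(y)
    idx = rng.choice(n, size=max(1, int(sample_frac * n)), replace=False)
    model = sm.GLM(y[idx], X[idx], family=sm.families.NegativeBinomial())
    return model.fit()

# Usage sketch: y = windowed read counts, X = design matrix of bias covariates
# (e.g. GC content, mappability), both numpy arrays.
# res = subsampled_nb_glm(y, sm.add_constant(X)); print(res.params)
```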

  4. Pulse shape discrimination of Cs2LiYCl6:Ce3+ detectors at high count rate based on triangular and trapezoidal filters

    NASA Astrophysics Data System (ADS)

    Wen, Xianfei; Enqvist, Andreas

    2017-09-01

    Cs2LiYCl6:Ce3+ (CLYC) detectors have demonstrated the capability to simultaneously detect γ-rays and thermal and fast neutrons with medium energy resolution, reasonable detection efficiency, and high pulse shape discrimination performance. A disadvantage of CLYC detectors is their long scintillation decay times, which cause pulse pile-up at moderate input count rates. Pulse processing algorithms were developed based on triangular and trapezoidal filters to discriminate between neutrons and γ-rays at high count rates. The algorithms were first tested using low-rate data, where they exhibit pulse-shape discrimination performance comparable to that of the charge comparison method. Then, they were evaluated at high count rates. Neutrons and γ-rays were adequately identified with high throughput at rates of up to 375 kcps. The algorithm based on the triangular filter exhibits discrimination capability marginally higher than that of the trapezoidal-filter-based algorithm at both low and high rates. The algorithms have low computational complexity and are executable on an FPGA in real time. They are also suitable for other radiation detectors whose pulses pile up at high rates owing to long scintillation decay times.
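
    One simple way a shaping-filter-based PSD parameter of this kind can be formed is sketched below: the ratio of peak amplitudes after a short and a long shaping filter separates pulses by their fast/slow scintillation mix. The kernel shapes, decay constants and synthetic pulses are illustrative assumptions, not the authors' FPGA algorithm.

```python
import numpy as np

def triangular_kernel(rise):
    up = np.arange(1, rise + 1, dtype=float)
    return np.concatenate([up, up[::-1]])

def trapezoidal_kernel(rise, flat):
    up = np.arange(1, rise + 1, dtype=float)
    return np.concatenate([up, np.full(flat, float(rise)), up[::-1]])

def shaped_peak(pulse, kernel):
    """Peak amplitude of the digitized pulse convolved with a normalized
    shaping kernel (a simple stand-in for a recursive shaping filter)."""
    return np.max(np.convolve(pulse, kernel / kernel.sum(), mode="full"))

def psd_parameter(pulse, short_kernel, long_kernel):
    """Ratio of short- to long-shaping peak amplitudes: pulses with a larger
    slow component give a smaller ratio."""
    return shaped_peak(pulse, short_kernel) / shaped_peak(pulse, long_kernel)

# Illustrative synthetic CLYC-like pulses (arbitrary units and decay constants,
# chosen only for the demo): gamma pulses carry more fast component.
t = np.arange(2000, dtype=float)
gamma_pulse = 0.7 * np.exp(-t / 50) + 0.3 * np.exp(-t / 1000)
neutron_pulse = 0.2 * np.exp(-t / 50) + 0.8 * np.exp(-t / 1000)

short_k, long_k = triangular_kernel(32), trapezoidal_kernel(512, 128)
print("gamma   PSD:", round(psd_parameter(gamma_pulse, short_k, long_k), 3))
print("neutron PSD:", round(psd_parameter(neutron_pulse, short_k, long_k), 3))
```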

  5. SU-E-I-79: Source Geometry Dependence of Gamma Well-Counter Measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, M; Belanger, A; Kijewski, M

    Purpose: To determine the effect of liquid sample volume and geometry on counting efficiency in a gamma well-counter, and to assess the relative contributions of sample geometry and self-attenuation. Gamma well-counters are standard equipment in clinical and preclinical studies, for measuring patient blood radioactivity and quantifying animal tissue uptake for tracer development and other purposes. Accurate measurements are crucial. Methods: Count rates were measured for aqueous solutions of 99mTc at four liquid volume values in a 1-cm-diam tube and at six volume values in a 2.2-cm-diam vial. Total activity was constant for all volumes, and data were corrected for decay. Count rates from a point source in air, supported by a filter paper, were measured at seven heights between 1.3 and 5.7 cm from the bottom of a tube. Results: Sample volume effects were larger for the tube than for the vial. For the tube, count efficiency relative to a 1-cc volume ranged from 1.05 at 0.05 cc to 0.84 at 3 cc. For the vial, relative count efficiency ranged from 1.02 at 0.05 cc to 0.87 at 15 cc. For the point source, count efficiency relative to 1.3 cm from the tube bottom ranged from 0.98 at 1.8 cm to 0.34 at 5.7 cm. The relative efficiency of a 3-cc liquid sample in a tube compared to a 1-cc sample is 0.84; the average relative efficiency for the solid sample in air between heights in the tube corresponding to the surfaces of those volumes (1.3 and 4.8 cm) is 0.81, implying that the major contribution to efficiency loss is geometry, rather than attenuation. Conclusion: Volume-dependent correction factors should be used for accurate quantitation of radioactive liquid samples. Solid samples should be positioned at the bottom of the tube for maximum count efficiency.
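
    A minimal sketch of the volume-dependent correction recommended in the conclusion, using the relative efficiencies quoted for the 1-cm tube and simple interpolation for intermediate volumes (an assumption; in practice one would fit the full measured curve). The absolute 1-cc efficiency in the example is a placeholder, not a value from the abstract.

```python
import numpy as np

# Relative counting efficiencies quoted for the 1-cm tube, referenced to 1 cc.
volumes_cc = np.array([0.05, 1.0, 3.0])
rel_eff    = np.array([1.05, 1.00, 0.84])

def volume_corrected_activity(net_counts, volume_cc, eff_1cc):
    """Activity corrected for sample volume: divide by the 1-cc efficiency
    scaled by the interpolated relative efficiency at this volume."""
    rel = np.interp(volume_cc, volumes_cc, rel_eff)
    return net_counts / (eff_1cc * rel)

# Example: the same net counts from a 3-cc sample imply ~19% more activity
# than a naive calculation using the 1-cc efficiency (placeholder value 0.5).
print(volume_corrected_activity(1.0e5, 3.0, 0.5) / (1.0e5 / 0.5))
```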

  6. Turtle: identifying frequent k-mers with cache-efficient algorithms.

    PubMed

    Roy, Rajat Shuvro; Bhattacharya, Debashish; Schliep, Alexander

    2014-07-15

    Counting the frequencies of k-mers in read libraries is often a first step in the analysis of high-throughput sequencing data. Infrequent k-mers are assumed to be a result of sequencing errors. The frequent k-mers constitute a reduced but error-free representation of the experiment, which can inform read error correction or serve as the input to de novo assembly methods. Ideally, the memory requirement for counting should be linear in the number of frequent k-mers and not in the, typically much larger, total number of k-mers in the read library. We present a novel method that balances time, space and accuracy requirements to efficiently extract frequent k-mers even for high-coverage libraries and large genomes such as human. Our method minimizes cache misses by using a pattern-blocked Bloom filter to remove infrequent k-mers from consideration, in combination with a novel sort-and-compact scheme, instead of a hash, for the actual counting. Although this increases theoretical complexity, the savings in cache misses reduce the empirical running times. A variant of the method can resort to a counting Bloom filter for even larger savings in memory at the expense of false-negative rates in addition to the false-positive rates common to all Bloom filter-based approaches. A comparison with the state-of-the-art shows reduced memory requirements and running times. The tools are freely available for download at http://bioinformatics.rutgers.edu/Software/Turtle and http://figshare.com/articles/Turtle/791582. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
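
    The Bloom-filter prefiltering idea can be illustrated with a plain (not pattern-blocked, not cache-optimized) sketch: a k-mer enters the exact counter only after the filter has seen it once, so most singleton, likely erroneous, k-mers never consume counter memory. This is a generic illustration, not Turtle's implementation.

```python
import hashlib
from collections import Counter

class BloomFilter:
    """Small illustrative Bloom filter (not Turtle's cache-blocked variant)."""
    def __init__(self, n_bits=1 << 20, n_hashes=3):
        self.n_bits, self.n_hashes = n_bits, n_hashes
        self.bits = bytearray(n_bits // 8)

    def _positions(self, item):
        digest = hashlib.sha256(item.encode()).digest()
        for i in range(self.n_hashes):
            yield int.from_bytes(digest[8 * i:8 * i + 8], "little") % self.n_bits

    def add_and_check(self, item):
        """Set the item's bits; return True if all of them were already set."""
        seen = True
        for pos in self._positions(item):
            byte, mask = pos // 8, 1 << (pos % 8)
            if not self.bits[byte] & mask:
                seen = False
                self.bits[byte] |= mask
        return seen

def frequent_kmers(reads, k=21):
    """A k-mer is counted exactly only once the Bloom filter has seen it
    before, so most singleton k-mers never enter the exact counter; the
    returned counts are therefore (occurrences - 1) for repeated k-mers."""
    bloom, counts = BloomFilter(), Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            kmer = read[i:i + k]
            if bloom.add_and_check(kmer):
                counts[kmer] += 1
    return counts

# reads = ["ACGTACGT...", ...]  # e.g. sequences parsed from a FASTQ file
```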

  7. Characterization of spectrometric photon-counting X-ray detectors at different pitches

    NASA Astrophysics Data System (ADS)

    Jurdit, M.; Brambilla, A.; Moulin, V.; Ouvrier-Buffet, P.; Radisson, P.; Verger, L.

    2017-09-01

    There is growing interest in energy-sensitive photon-counting detectors for high-flux X-ray imaging. Their potential applications include medical imaging, non-destructive testing and security. Innovative detectors of this type will need to count individual photons and sort them into selected energy bins, at several million counts per second and per mm2. Cd(Zn)Te detector-grade materials with a thickness of 1.5 to 3 mm and pitches from 800 μm down to 200 μm were assembled onto interposer boards. These devices were tested using in-house-developed full-digital fast readout electronics. The 16-channel demonstrators, with 256 energy bins, were experimentally characterized by determining spectral resolution, count rate, and charge sharing, which becomes challenging at low pitch. Charge sharing correction was found to efficiently correct X-ray spectra up to 40 × 10^6 incident photons s-1 mm-2.

  8. Delayed gamma-ray spectroscopy with lanthanum bromide detector for non-destructive assay of nuclear material

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Favalli, Andrea; Iliev, Metodi; Ianakiev, Kiril

    High-energy delayed γ-ray spectroscopy is a potential technique for directly assaying spent fuel assemblies and achieving the safeguards goal of quantifying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Requirements for the γ-ray detection system, up to ~6 MeV, can be summarized as follows: high efficiency at high γ-ray energies, high energy resolution, good linearity between γ-ray energy and output signal amplitude, ability to operate at very high count rates, and ease of use in industrial environments such as nuclear facilities. High Purity Germanium Detectors (HPGe) are the state of the art and provide excellent energy resolution but are limited in their count rate capability. Lanthanum Bromide (LaBr3) scintillation detectors offer significantly higher count rate capabilities at lower energy resolution. Thus, LaBr3 detectors may be an effective alternative for nuclear spent-fuel applications, where count-rate capability is a requirement. This paper documents the measured performance of a 2” (length) × 2” (diameter) LaBr3 scintillation detector system, coupled to a negatively biased PMT and a tapered active high voltage divider, with count-rates up to ~3 Mcps. An experimental methodology was developed that uses the average current from the PMT’s anode and a dual source method to characterize the detector system at specific very high count rate values. Delayed γ-ray spectra were acquired with the LaBr3 detector system at the Idaho Accelerator Center, Idaho State University, where samples of ~3g of 235U were irradiated with moderated neutrons from a photo-neutron source. Results of the spectroscopy characterization and analysis of the delayed γ-ray spectra acquired indicate the possible use of LaBr3 scintillation detectors when high count rate capability may outweigh the lower energy resolution.

  9. Delayed gamma-ray spectroscopy with lanthanum bromide detector for non-destructive assay of nuclear material

    DOE PAGES

    Favalli, Andrea; Iliev, Metodi; Ianakiev, Kiril; ...

    2017-10-09

    High-energy delayed γ-ray spectroscopy is a potential technique for directly assaying spent fuel assemblies and achieving the safeguards goal of quantifying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Requirements for the γ-ray detection system, up to ~6 MeV, can be summarized as follows: high efficiency at high γ-ray energies, high energy resolution, good linearity between γ-ray energy and output signal amplitude, ability to operate at very high count rates, and ease of use in industrial environments such as nuclear facilities. High Purity Germanium Detectors (HPGe) are the state of the art and provide excellent energy resolution but are limited in their count rate capability. Lanthanum Bromide (LaBr3) scintillation detectors offer significantly higher count rate capabilities at lower energy resolution. Thus, LaBr3 detectors may be an effective alternative for nuclear spent-fuel applications, where count-rate capability is a requirement. This paper documents the measured performance of a 2” (length) × 2” (diameter) LaBr3 scintillation detector system, coupled to a negatively biased PMT and a tapered active high voltage divider, with count-rates up to ~3 Mcps. An experimental methodology was developed that uses the average current from the PMT’s anode and a dual source method to characterize the detector system at specific very high count rate values. Delayed γ-ray spectra were acquired with the LaBr3 detector system at the Idaho Accelerator Center, Idaho State University, where samples of ~3g of 235U were irradiated with moderated neutrons from a photo-neutron source. Results of the spectroscopy characterization and analysis of the delayed γ-ray spectra acquired indicate the possible use of LaBr3 scintillation detectors when high count rate capability may outweigh the lower energy resolution.

  10. Delayed gamma-ray spectroscopy with lanthanum bromide detector for non-destructive assay of nuclear material

    NASA Astrophysics Data System (ADS)

    Favalli, Andrea; Iliev, Metodi; Ianakiev, Kiril; Hunt, Alan W.; Ludewigt, Bernhard

    2018-01-01

    High-energy delayed γ-ray spectroscopy is a potential technique for directly assaying spent fuel assemblies and achieving the safeguards goal of quantifying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Requirements for the γ-ray detection system, up to ∼6 MeV, can be summarized as follows: high efficiency at high γ-ray energies, high energy resolution, good linearity between γ-ray energy and output signal amplitude, ability to operate at very high count rates, and ease of use in industrial environments such as nuclear facilities. High Purity Germanium Detectors (HPGe) are the state of the art and provide excellent energy resolution but are limited in their count rate capability. Lanthanum Bromide (LaBr3) scintillation detectors offer significantly higher count rate capabilities at lower energy resolution. Thus, LaBr3 detectors may be an effective alternative for nuclear spent-fuel applications, where count-rate capability is a requirement. This paper documents the measured performance of a 2" (length) × 2" (diameter) LaBr3 scintillation detector system, coupled to a negatively biased PMT and a tapered active high voltage divider, with count-rates up to ∼3 Mcps. An experimental methodology was developed that uses the average current from the PMT's anode and a dual source method to characterize the detector system at specific very high count rate values. Delayed γ-ray spectra were acquired with the LaBr3 detector system at the Idaho Accelerator Center, Idaho State University, where samples of ∼3g of 235U were irradiated with moderated neutrons from a photo-neutron source. Results of the spectroscopy characterization and analysis of the delayed γ-ray spectra acquired indicate the possible use of LaBr3 scintillation detectors when high count rate capability may outweigh the lower energy resolution.
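
    For reference, the textbook two-source dead-time estimate (first order in the dead time, non-paralyzable model) is sketched below; whether the authors' anode-current-assisted dual-source procedure reduces to exactly this form is an assumption, and the rates shown are illustrative, not measured values.

```python
def two_source_dead_time(n1, n2, n12, nb=0.0):
    """First-order two-source dead-time estimate (non-paralyzable model):
    since the true rates are additive, N1 + N2 = N12 + Nb, and with
    N ~ n*(1 + n*tau) this gives
    n1 + n2 - n12 - nb = tau * (n12**2 + nb**2 - n1**2 - n2**2)."""
    return (n1 + n2 - n12 - nb) / (n12**2 + nb**2 - n1**2 - n2**2)

# Illustrative measured rates in counts/s (not values from the paper):
tau = two_source_dead_time(1.2e6, 1.5e6, 2.55e6)
print(f"estimated dead time ~ {tau * 1e9:.0f} ns")
```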

  11. Performance and Characterization of a Modular Superconducting Nanowire Single Photon Detector System for Space-to-Earth Optical Communications Links

    NASA Technical Reports Server (NTRS)

    Vyhnalek, Brian E.; Tedder, Sarah A.; Nappier, Jennifer M.

    2018-01-01

    Space-to-ground photon-counting optical communication links supporting high data rates over large distances require enhanced ground receiver sensitivity in order to reduce the mass and power burden on the spacecraft transmitter. Superconducting nanowire single-photon detectors (SNSPDs) have been demonstrated to offer superior performance in detection efficiency, timing resolution, and count rates over semiconductor photodetectors, and are a suitable technology for high photon efficiency links. Recently photon detectors based on superconducting nanowires have become commercially available, and we have assessed the characteristics and performance of one such commercial system as a candidate for potential utilization in ground receiver designs. The SNSPD system features independent channels which can be added modularly, and we analyze the scalability of the system to support different data rates, as well as consider coupling concepts and issues as the number of channels increases.

  12. Particle and Photon Detection: Counting and Energy Measurement

    PubMed Central

    Janesick, James; Tower, John

    2016-01-01

    Fundamental limits for photon counting and photon energy measurement are reviewed for CCD and CMOS imagers. The challenges to extend photon counting into the visible/nIR wavelengths and achieve energy measurement in the UV with specific read noise requirements are discussed. Pixel flicker and random telegraph noise sources are highlighted along with various methods used in reducing their contribution to the sensor’s read noise floor. Practical requirements for quantum efficiency, charge collection efficiency, and charge transfer efficiency that interfere with photon counting performance are discussed. Lastly, we review current efforts to reduce flicker noise head-on, in the hope of driving read noise substantially below 1 carrier rms. PMID:27187398

  13. webpic: A flexible web application for collecting distance and count measurements from images

    PubMed Central

    2018-01-01

    Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows more high-quality data to be collected in less time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592

  14. Further developments of series-connected superconducting tunnel junction to radiation detection

    NASA Astrophysics Data System (ADS)

    Kurakado, Masahiko; Ohsawa, Daisuke; Katano, Rintaro; Ito, Shin; Isozumi, Yasuhito

    1997-10-01

    One of the promising radiation detection devices for various practical applications is the series-connected superconducting tunnel junction (STJ) detector. In this article, developments in these detectors since our previous work are described: e.g., detection efficiency more than two orders of magnitude higher than that of single STJ detectors, high-count-rate detection, and position resolution. Detectors were cooled to 0.35-0.4 K by means of a convenient 3He cryostat. The 5.9 and 6.5 keV x rays from 55Fe are separated by a detector specially designed for x-ray detection. The possible count rate of the series-junction detector, estimated from the shaping-time constant applied in the measurements, is high, e.g., over 10^4 counts per second. A series-junction detector equipped with a position sensing mechanism has shown a position resolution of about 35 μm in a sensing area with a radius of 1.1 mm. The position resolution of series junctions improves the energy resolution. A new type of series-connected STJ detector, the dispersed multitrap series-junction detector, is also proposed for further improvement of detection efficiency and energy resolution.

  15. Analog synthetic biology.

    PubMed

    Sarpeshkar, R

    2014-03-28

    We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog-digital computational approaches; otherwise, even relatively simple synthetic computations in cells such as addition will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA-protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions obey Boltzmann exponential laws of thermodynamics and are described by astoundingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using three transcription factors, nearly two orders of magnitude more efficient in parts than prior digital implementations.

  16. Analog synthetic biology

    PubMed Central

    Sarpeshkar, R.

    2014-01-01

    We analyse the pros and cons of analog versus digital computation in living cells. Our analysis is based on fundamental laws of noise in gene and protein expression, which set limits on the energy, time, space, molecular count and part-count resources needed to compute at a given level of precision. We conclude that analog computation is significantly more efficient in its use of resources than deterministic digital computation even at relatively high levels of precision in the cell. Based on this analysis, we conclude that synthetic biology must use analog, collective analog, probabilistic and hybrid analog–digital computational approaches; otherwise, even relatively simple synthetic computations in cells such as addition will exceed energy and molecular-count budgets. We present schematics for efficiently representing analog DNA–protein computation in cells. Analog electronic flow in subthreshold transistors and analog molecular flux in chemical reactions obey Boltzmann exponential laws of thermodynamics and are described by astoundingly similar logarithmic electrochemical potentials. Therefore, cytomorphic circuits can help to map circuit designs between electronic and biochemical domains. We review recent work that uses positive-feedback linearization circuits to architect wide-dynamic-range logarithmic analog computation in Escherichia coli using three transcription factors, nearly two orders of magnitude more efficient in parts than prior digital implementations. PMID:24567476

  17. 76 FR 28214 - UChicago Argonne, LLC, et al.; Notice of Decision on Applications for Duty-Free Entry of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-16

    .... Applicant: UChicago Argonne, LLC, Lemont, IL 60439. Instrument: Mythen 1K Detector System. Manufacturer... highly correlated systems. This instrument is unique in that it has a small pixel pitch (50 microns); high detection efficiency, single photon counting with high dynamic range; and a small, lightweight and...

  18. 2016 NIST (133Xe) and Transfer (131mXe, 133mXe, 135Xe) Calibration Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, Troy A.

    A significantly improved calibration of the High Purity Germanium detectors used by the Idaho National Laboratory Noble Gas Laboratory was performed during the annual NIST calibration. New sample spacers provide reproducible and secure support of samples at distances of 4, 12, 24, 50 and 100 cm. Bean, 15 mL, and 50 mL Schlenk tube geometries were calibrated. Also included in this year’s calibration was a correlation of detector dead-time with sample activity that can be used to predict the schedule of counting the samples at each distance for each geometry. This schedule prediction will help staff members set calendar reminders so that collection of calibration data at each geometry will not be missed. This report also correlates the counting efficiencies between detectors, so that if the counting efficiency on one detector is not known, it can be estimated from the same geometry on another detector.

  19. Characterization of a neutron sensitive MCP/Timepix detector for quantitative image analysis at a pulsed neutron source

    NASA Astrophysics Data System (ADS)

    Watanabe, Kenichi; Minniti, Triestino; Kockelmann, Winfried; Dalgliesh, Robert; Burca, Genoveva; Tremsin, Anton S.

    2017-07-01

    The uncertainties and the stability of a neutron sensitive MCP/Timepix detector operating in the event timing mode for quantitative image analysis at a pulsed neutron source were investigated. The dominant component of the uncertainty arises from the counting statistics. The contribution of the overlap correction to the uncertainty was concluded to be negligible, on the basis of error propagation, even when the pixel occupation probability exceeds 50%. The multiple-counting effect was additionally taken into account when evaluating the counting statistics. Furthermore, the detection efficiency of this detector system changes under relatively high neutron fluxes due to ageing effects of the microchannel plates. Since this efficiency change is position-dependent, it induces a memory image. The memory effect can be significantly reduced with correction procedures using the rate equations describing the permanent gain degradation and the scrubbing effect on the inner surfaces of the MCP pores.
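
    One common form of the event-overlap correction for a detector that can register at most one event per pixel per readout frame, together with its propagated statistical uncertainty, is sketched below. Whether this matches the correction used in the paper is an assumption, and the numbers are illustrative only.

```python
import numpy as np

def overlap_corrected_counts(m, n_frames):
    """If m counts are recorded in n_frames frames by a pixel that can register
    at most one event per frame, one common estimate of the true number of
    incident events is n = -n_frames * ln(1 - m/n_frames)."""
    p = m / n_frames                        # pixel occupation probability
    n = -n_frames * np.log(1.0 - p)
    dn = np.sqrt(m) / (1.0 - p)             # Poisson-like uncertainty on m, propagated
    return n, dn

# Example at 50% pixel occupation probability: the corrected result's relative
# uncertainty is still governed mainly by the counting statistics of m.
n, dn = overlap_corrected_counts(5.0e4, 1.0e5)
print(f"corrected counts = {n:.0f} +/- {dn:.0f} ({100 * dn / n:.2f}%)")
```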

  20. Theoretical assessment of whole body counting performances using numerical phantoms of different gender and sizes.

    PubMed

    Marzocchi, O; Breustedt, B; Mostacci, D; Zankl, M; Urban, M

    2011-03-01

    A goal of whole body counting (WBC) is the estimation of the total body burden of radionuclides disregarding the actual position within the body. To achieve the goal, the detectors need to be placed in regions where the photon flux is as independent as possible from the distribution of the source. At the same time, the detectors need high photon fluxes in order to achieve better efficiency and lower minimum detectable activities. This work presents a method able to define the layout of new WBC systems and to study the behaviour of existing ones using both detection efficiency and its dependence on the position of the source within the body of computational phantoms.

  1. Muon counting using silicon photomultipliers in the AMIGA detector of the Pierre Auger observatory

    NASA Astrophysics Data System (ADS)

    Aab, A.; Abreu, P.; Aglietta, M.; Ahn, E. J.; Samarai, I. Al; Albuquerque, I. F. M.; Allekotte, I.; Allison, P.; Almela, A.; Alvarez Castillo, J.; Alvarez-Muñiz, J.; Ambrosio, M.; Anastasi, G. A.; Anchordoqui, L.; Andrada, B.; Andringa, S.; Aramo, C.; Arqueros, F.; Arsene, N.; Asorey, H.; Assis, P.; Aublin, J.; Avila, G.; Badescu, A. M.; Balaceanu, A.; Baus, C.; Beatty, J. J.; Becker, K. H.; Bellido, J. A.; Berat, C.; Bertaina, M. E.; Bertou, X.; Biermann, P. L.; Billoir, P.; Biteau, J.; Blaess, S. G.; Blanco, A.; Blazek, J.; Bleve, C.; Boháčová, M.; Boncioli, D.; Bonifazi, C.; Borodai, N.; Botti, A. M.; Brack, J.; Brancus, I.; Bretz, T.; Bridgeman, A.; Briechle, F. L.; Buchholz, P.; Bueno, A.; Buitink, S.; Buscemi, M.; Caballero-Mora, K. S.; Caccianiga, B.; Caccianiga, L.; Cancio, A.; Canfora, F.; Caramete, L.; Caruso, R.; Castellina, A.; Cataldi, G.; Cazon, L.; Cester, R.; Chavez, A. G.; Chiavassa, A.; Chinellato, J. A.; Chudoba, J.; Clay, R. W.; Colalillo, R.; Coleman, A.; Collica, L.; Coluccia, M. R.; Conceição, R.; Contreras, F.; Cooper, M. J.; Coutu, S.; Covault, C. E.; Cronin, J.; Dallier, R.; D'Amico, S.; Daniel, B.; Dasso, S.; Daumiller, K.; Dawson, B. R.; de Almeida, R. M.; de Jong, S. J.; De Mauro, G.; de Mello Neto, J. R. T.; De Mitri, I.; de Oliveira, J.; de Souza, V.; Debatin, J.; del Peral, L.; Deligny, O.; Di Giulio, C.; Di Matteo, A.; Díaz Castro, M. L.; Diogo, F.; Dobrigkeit, C.; D'Olivo, J. C.; Dorofeev, A.; dos Anjos, R. C.; Dova, M. T.; Dundovic, A.; Ebr, J.; Engel, R.; Erdmann, M.; Erfani, M.; Escobar, C. O.; Espadanal, J.; Etchegoyen, A.; Falcke, H.; Fang, K.; Farrar, G.; Fauth, A. C.; Fazzini, N.; Fick, B.; Figueira, J. M.; Filevich, A.; Filipčič, A.; Fratu, O.; Freire, M. M.; Fujii, T.; Fuster, A.; García, B.; Garcia-Pinto, D.; Gaté, F.; Gemmeke, H.; Gherghel-Lascu, A.; Ghia, P. L.; Giaccari, U.; Giammarchi, M.; Giller, M.; Głas, D.; Glaser, C.; Glass, H.; Golup, G.; Gómez Berisso, M.; Gómez Vitale, P. F.; González, N.; Gookin, B.; Gordon, J.; Gorgi, A.; Gorham, P.; Gouffon, P.; Grillo, A. F.; Grubb, T. D.; Guarino, F.; Guedes, G. P.; Hampel, M. R.; Hansen, P.; Harari, D.; Harrison, T. A.; Harton, J. L.; Hasankiadeh, Q.; Haungs, A.; Hebbeker, T.; Heck, D.; Heimann, P.; Herve, A. E.; Hill, G. C.; Hojvat, C.; Holt, E.; Homola, P.; Hörandel, J. R.; Horvath, P.; Hrabovský, M.; Huege, T.; Hulsman, J.; Insolia, A.; Isar, P. G.; Jandt, I.; Jansen, S.; Johnsen, J. A.; Josebachuili, M.; Kääpä, A.; Kambeitz, O.; Kampert, K. H.; Kasper, P.; Katkov, I.; Keilhauer, B.; Kemp, E.; Kieckhafer, R. M.; Klages, H. O.; Kleifges, M.; Kleinfeller, J.; Krause, R.; Krohm, N.; Kuempel, D.; Kukec Mezek, G.; Kunka, N.; Kuotb Awad, A.; LaHurd, D.; Latronico, L.; Lauscher, M.; Lebrun, P.; Legumina, R.; Leigui de Oliveira, M. A.; Letessier-Selvon, A.; Lhenry-Yvon, I.; Link, K.; Lopes, L.; López, R.; López Casado, A.; Luce, Q.; Lucero, A.; Malacari, M.; Mallamaci, M.; Mandat, D.; Mantsch, P.; Mariazzi, A. G.; Mariş, I. C.; Marsella, G.; Martello, D.; Martinez, H.; Martínez Bravo, O.; Masías Meza, J. J.; Mathes, H. J.; Mathys, S.; Matthews, J.; Matthews, J. A. J.; Matthiae, G.; Mayotte, E.; Mazur, P. O.; Medina, C.; Medina-Tanco, G.; Melo, D.; Menshikov, A.; Messina, S.; Micheletti, M. I.; Middendorf, L.; Minaya, I. A.; Miramonti, L.; Mitrica, B.; Mockler, D.; Molina-Bueno, L.; Mollerach, S.; Montanet, F.; Morello, C.; Mostafá, M.; Müller, G.; Muller, M. A.; Müller, S.; Naranjo, I.; Navas, S.; Nellen, L.; Neuser, J.; Nguyen, P. 
H.; Niculescu-Oglinzanu, M.; Niechciol, M.; Niemietz, L.; Niggemann, T.; Nitz, D.; Nosek, D.; Novotny, V.; Nožka, H.; Núñez, L. A.; Ochilo, L.; Oikonomou, F.; Olinto, A.; Pakk Selmi-Dei, D.; Palatka, M.; Pallotta, J.; Papenbreer, P.; Parente, G.; Parra, A.; Paul, T.; Pech, M.; Pedreira, F.; Pȩkala, J.; Pelayo, R.; Peña-Rodriguez, J.; Pereira, L. A. S.; Perrone, L.; Peters, C.; Petrera, S.; Phuntsok, J.; Piegaia, R.; Pierog, T.; Pieroni, P.; Pimenta, M.; Pirronello, V.; Platino, M.; Plum, M.; Porowski, C.; Prado, R. R.; Privitera, P.; Prouza, M.; Quel, E. J.; Querchfeld, S.; Quinn, S.; Ramos-Pollant, R.; Rautenberg, J.; Ravignani, D.; Reinert, D.; Revenu, B.; Ridky, J.; Risse, M.; Ristori, P.; Rizi, V.; Rodrigues de Carvalho, W.; Rodriguez Fernandez, G.; Rodriguez Rojo, J.; Rodríguez-Frías, M. D.; Rogozin, D.; Rosado, J.; Roth, M.; Roulet, E.; Rovero, A. C.; Saffi, S. J.; Saftoiu, A.; Salazar, H.; Saleh, A.; Salesa Greus, F.; Salina, G.; Sanabria Gomez, J. D.; Sánchez, F.; Sanchez-Lucas, P.; Santos, E. M.; Santos, E.; Sarazin, F.; Sarkar, B.; Sarmento, R.; Sarmiento-Cano, C.; Sato, R.; Scarso, C.; Schauer, M.; Scherini, V.; Schieler, H.; Schmidt, D.; Scholten, O.; Schovánek, P.; Schröder, F. G.; Schulz, A.; Schulz, J.; Schumacher, J.; Sciutto, S. J.; Segreto, A.; Settimo, M.; Shadkam, A.; Shellard, R. C.; Sigl, G.; Silli, G.; Sima, O.; Śmiałkowski, A.; Šmída, R.; Snow, G. R.; Sommers, P.; Sonntag, S.; Sorokin, J.; Squartini, R.; Stanca, D.; Stanič, S.; Stasielak, J.; Strafella, F.; Suarez, F.; Suarez Durán, M.; Sudholz, T.; Suomijärvi, T.; Supanitsky, A. D.; Sutherland, M. S.; Swain, J.; Szadkowski, Z.; Taborda, O. A.; Tapia, A.; Tepe, A.; Theodoro, V. M.; Timmermans, C.; Todero Peixoto, C. J.; Tomankova, L.; Tomé, B.; Tonachini, A.; Torralba Elipe, G.; Torres Machado, D.; Torri, M.; Travnicek, P.; Trini, M.; Ulrich, R.; Unger, M.; Urban, M.; Valbuena-Delgado, A.; Valdés Galicia, J. F.; Valiño, I.; Valore, L.; van Aar, G.; van Bodegom, P.; van den Berg, A. M.; van Vliet, A.; Varela, E.; Vargas Cárdenas, B.; Varner, G.; Vázquez, J. R.; Vázquez, R. A.; Veberič, D.; Verzi, V.; Vicha, J.; Villaseñor, L.; Vorobiov, S.; Wahlberg, H.; Wainberg, O.; Walz, D.; Watson, A. A.; Weber, M.; Weindl, A.; Wiencke, L.; Wilczyński, H.; Winchen, T.; Wittkowski, D.; Wundheiler, B.; Wykes, S.; Yang, L.; Yelos, D.; Yushkov, A.; Zas, E.; Zavrtanik, D.; Zavrtanik, M.; Zepeda, A.; Zimmermann, B.; Ziolkowski, M.; Zong, Z.; Zuccarello, F.

    2017-03-01

    AMIGA (Auger Muons and Infill for the Ground Array) is an upgrade of the Pierre Auger Observatory designed to extend its energy range of detection and to directly measure the muon content of the cosmic ray primary particle showers. The array will be formed by an infill of surface water-Cherenkov detectors associated with buried scintillation counters employed for muon counting. Each counter is composed of three scintillation modules, with a 10 m2 detection area per module. In this paper, a new generation of detectors, replacing the current multi-pixel photomultiplier tube (PMT) with silicon photo sensors (aka. SiPMs), is proposed. The selection of the new device and its front-end electronics is explained. A method to calibrate the counting system that ensures the performance of the detector is detailed. This method has the advantage of being able to be carried out in a remote place such as the one where the detectors are deployed. High efficiency results, i.e. 98% efficiency for the highest tested overvoltage, combined with a low probability of accidental counting (~2%), show a promising performance for this new system.
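
    As a rough illustration of the quantities quoted above, the sketch below (Python, not the AMIGA calibration procedure) relates a hypothetical SiPM dark-count rate and counting window to the probability of an accidental count, assuming Poisson-distributed dark counts, and computes a simple binomial counting efficiency; all numerical values are assumptions chosen only to give figures of the same order as those reported.

        # Minimal sketch (not the AMIGA calibration procedure): relates a SiPM
        # channel's dark-count rate to the probability of an accidental count
        # within a fixed counting window, assuming Poisson-distributed dark counts.
        import math

        def accidental_probability(dark_count_rate_hz, window_s):
            """P(at least one dark count in the window) for a Poisson process."""
            return 1.0 - math.exp(-dark_count_rate_hz * window_s)

        def counting_efficiency(detected, expected):
            """Simple binomial estimate of muon counting efficiency."""
            return detected / expected

        # Hypothetical numbers for illustration only.
        print(accidental_probability(dark_count_rate_hz=50e3, window_s=400e-9))  # ~0.02
        print(counting_efficiency(detected=980, expected=1000))                  # 0.98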

  2. Muon counting using silicon photomultipliers in the AMIGA detector of the Pierre Auger observatory

    DOE PAGES

    Aab, A.; Abreu, P.; Aglietta, M.; ...

    2017-03-03

    Here, AMIGA (Auger Muons and Infill for the Ground Array) is an upgrade of the Pierre Auger Observatory designed to extend its energy range of detection and to directly measure the muon content of the cosmic ray primary particle showers. The array will be formed by an infill of surface water-Cherenkov detectors associated with buried scintillation counters employed for muon counting. Each counter is composed of three scintillation modules, with a 10 m2 detection area per module. In this paper, a new generation of detectors, replacing the current multi-pixel photomultiplier tube (PMT) with silicon photo sensors (aka. SiPMs), is proposed. The selection of the new device and its front-end electronics is explained. A method to calibrate the counting system that ensures the performance of the detector is detailed. This method has the advantage of being able to be carried out in a remote place such as the one where the detectors are deployed. High efficiency results, i.e. 98% efficiency for the highest tested overvoltage, combined with a low probability of accidental counting (~2%), show a promising performance for this new system.

  3. Design Study of an Incinerator Ash Conveyor Counting System - 13323

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jaederstroem, Henrik; Bronson, Frazier

    A design study has been performed for a system that should measure the Cs-137 activity in ash from an incinerator. Radioactive ash, expected to consist of both Cs-134 and Cs-137, will be transported on a conveyor belt at 0.1 m/s. The objective of the counting system is to determine the Cs-137 activity and direct the ash to the correct stream after a diverter. The decision levels range from 8000 to 400000 Bq/kg, and the decision error should be as low as possible. The decision error depends on the total measurement uncertainty, which depends on the counting statistics and the uncertainty in the efficiency of the geometry. For the low activity decision it is necessary to know the efficiency to be able to determine if the signal from the Cs-137 is above the minimum detectable activity and that it generates enough counts to reach the desired precision. For the higher activity decision the uncertainty of the efficiency needs to be understood to minimize decision errors. The total efficiency of the detector is needed to determine if the detector will be able to operate at the count rate of the highest expected activity. The design study presented in this paper describes how the objectives of the monitoring system were addressed, how the choice of detector was made, and how ISOCS (In Situ Object Counting System) mathematical modeling was used to calculate the efficiency. The ISOCS uncertainty estimator (IUE) was used to determine which parameters of the ash were important to know accurately in order to minimize the uncertainty of the efficiency. The examined parameters include the height of the ash on the conveyor belt, the matrix composition and density, and the relative efficiency of the detector. (authors)

  4. Isotopic analysis of uranium in natural waters by alpha spectrometry

    USGS Publications Warehouse

    Edwards, K.W.

    1968-01-01

    A method is described for the determination of U234/U238 activity ratios for uranium present in natural waters. The uranium is coprecipitated from solution with aluminum phosphate, extracted into ethyl acetate, further purified by ion exchange, and finally electroplated on a titanium disc for counting. The individual isotopes are determined by measurement of the alpha-particle energy spectrum using a high resolution low-background alpha spectrometer. Overall chemical recovery of about 90 percent and a counting efficiency of 25 percent allow analyses of water samples containing as little as 0.10 µg/l of uranium. The accuracy of the method is limited, on most samples, primarily by counting statistics.
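
    A minimal worked example of the implied activity calculation, assuming the reported ~25% counting efficiency and ~90% chemical recovery; the net counts, counting time, and sample volume below are hypothetical placeholders, and the sketch stops at activity concentration rather than mass concentration.

        # A minimal worked example (not from the paper) of converting net alpha
        # counts to activity, using the reported ~25% counting efficiency and
        # ~90% chemical recovery. Count data and sample volume are hypothetical.

        def activity_bq(net_counts, count_time_s, counting_eff, chemical_yield):
            """Activity (Bq) of the isotope in the original sample aliquot."""
            return net_counts / (count_time_s * counting_eff * chemical_yield)

        net_counts_u238 = 450        # hypothetical net counts in the U-238 peak
        count_time_s = 80_000        # hypothetical counting time
        a_u238 = activity_bq(net_counts_u238, count_time_s,
                             counting_eff=0.25, chemical_yield=0.90)
        sample_volume_l = 1.0        # hypothetical sample volume
        print(f"U-238: {a_u238:.4f} Bq, {a_u238 / sample_volume_l:.4f} Bq/L")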

  5. An efficient, movable single-particle detector for use in cryogenic ultra-high vacuum environments.

    PubMed

    Spruck, Kaija; Becker, Arno; Fellenberger, Florian; Grieser, Manfred; von Hahn, Robert; Klinkhamer, Vincent; Novotný, Oldřich; Schippers, Stefan; Vogel, Stephen; Wolf, Andreas; Krantz, Claude

    2015-02-01

    A compact, highly efficient single-particle counting detector for ions of keV/u kinetic energy, movable by a long-stroke mechanical translation stage, has been developed at the Max-Planck-Institut für Kernphysik (Max Planck Institute for Nuclear Physics, MPIK). Both detector and translation mechanics can operate at ambient temperatures down to ∼10 K and consist fully of ultra-high vacuum compatible, high-temperature bakeable, and non-magnetic materials. The set-up is designed to meet the technical demands of MPIK's Cryogenic Storage Ring. We present a series of functional tests that demonstrate full suitability for this application and characterise the set-up with regard to its particle detection efficiency.

  6. A Monte Carlo study of lung counting efficiency for female workers of different breast sizes using deformable phantoms

    NASA Astrophysics Data System (ADS)

    Hegenbart, L.; Na, Y. H.; Zhang, J. Y.; Urban, M.; Xu, X. George

    2008-10-01

    There are currently no physical phantoms available for calibrating in vivo counting devices that represent women with different breast sizes because such phantoms are difficult, time consuming and expensive to fabricate. In this work, a feasible alternative involving computational phantoms was explored. A series of new female voxel phantoms with different breast sizes were developed and ported into a Monte Carlo radiation transport code for performing virtual lung counting efficiency calibrations. The phantoms are based on the RPI adult female phantom, a boundary representation (BREP) model. They were created with novel deformation techniques and then voxelized for the Monte Carlo simulations. Eight models have been selected with cup sizes ranging from AA to G according to brassiere industry standards. Monte Carlo simulations of a lung counting system were performed with these phantoms to study the effect of breast size on lung counting efficiencies, which are needed to determine the activity of a radionuclide deposited in the lung and hence to estimate the resulting dose to the worker. Contamination scenarios involving three different radionuclides, namely Am-241, Cs-137 and Co-60, were considered. The results show that detector efficiencies considerably decrease with increasing breast size, especially for low energy photon emitting radionuclides. When the counting efficiencies of models with cup size AA were compared to those with cup size G, a difference of up to 50% was observed. The detector efficiencies for each radionuclide can be approximated by curve fits as a function of total breast mass (second-order polynomial) or cup size (power law).
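
    The curve-fitting step described above can be sketched with a quadratic fit of counting efficiency versus breast mass; the data points in this example are invented placeholders, not values from the study.

        # Illustrative sketch only: fitting a second-order polynomial to counting
        # efficiency versus total breast mass, as the abstract describes. The data
        # points below are invented placeholders, not results from the study.
        import numpy as np

        breast_mass_kg = np.array([0.2, 0.4, 0.6, 0.9, 1.3, 1.8, 2.4, 3.1])  # hypothetical
        efficiency = np.array([2.10e-3, 1.90e-3, 1.70e-3, 1.50e-3,
                               1.35e-3, 1.20e-3, 1.10e-3, 1.05e-3])          # hypothetical

        coeffs = np.polyfit(breast_mass_kg, efficiency, deg=2)  # quadratic fit
        fit = np.poly1d(coeffs)
        print("fitted efficiency at 1.0 kg:", fit(1.0))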

  7. Optical Design Considerations for Efficient Light Collection from Liquid Scintillation Counters

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bernacki, Bruce E.; Douglas, Matthew; Erchinger, Jennifer L.

    2015-01-01

    Liquid scintillation counters measure charged particle-emitting radioactive isotopes and are used for environmental studies, nuclear chemistry, and life science. Alpha and beta emissions arising from the material under study interact with the scintillation cocktail to produce light. The prototypical liquid scintillation counter employs low-level photon-counting detectors to measure the arrival of the scintillation light produced as a result of the dissolved material under study interacting with the scintillation cocktail. For reliable operation the counting instrument must convey the scintillation light to the detectors efficiently and predictably. Current best practices employ the use of two or more detectors for coincidence processing to discriminate true scintillation events from background events due to instrumental effects such as photomultiplier tube dark rates, tube flashing, or other light emission not generated in the scintillation cocktail vial. In low background liquid scintillation counters additional attention is paid to shielding the scintillation cocktail from naturally occurring radioactive material (NORM) present in the laboratory and within the instrument's construction materials. Low background design is generally at odds with optimal light collection. This study presents the evolution of a light collection design for liquid scintillation counting in a low background shield. The basic approach to achieve both good light collection and a low background measurement is described. The baseline signals arising from the scintillation vial are modeled and methods to efficiently collect scintillation light are presented as part of the development of a customized low-background, high-sensitivity liquid scintillation counting system.

  8. [Automated analyser of organ cultured corneal endothelial mosaic].

    PubMed

    Gain, P; Thuret, G; Chiquet, C; Gavet, Y; Turc, P H; Théillère, C; Acquart, S; Le Petit, J C; Maugery, J; Campos, L

    2002-05-01

    Until now, organ-cultured corneal endothelial mosaic has been assessed in France by cell counting using a calibrated graticule, or by drawing cells on a computerized image. The former method is unsatisfactory because it lacks an objective evaluation of cell surface and hexagonality and requires an experienced technician. The latter method is time-consuming and requires careful attention. We aimed to make an efficient, fast, and easy-to-use automated digital analyzer of video images of the corneal endothelium. The hardware included a PC (Pentium III, 800 MHz, 256 MB RAM), a Data Translation 3155 acquisition card, a Sony SC 75 CE CCD camera, and a 22-inch screen. Special functions for automated cell boundary determination consisted of plug-in programs included in the ImageTool software. Calibration was performed using a calibrated micrometer. Cell densities of 40 organ-cultured corneas measured by both manual and automated counting were compared using parametric tests (Student's t test for paired variables and the Pearson correlation coefficient). All steps were considered more ergonomic, i.e., endothelial image capture, image selection, thresholding of multiple areas of interest, automated cell count, automated detection of errors in cell boundary drawing, and presentation of the results in an HTML file including the number of counted cells, cell density, coefficient of variation of cell area, cell surface histogram and cell hexagonality. The device was efficient because the global process lasted on average 7 minutes and did not require an experienced technician. The correlation between cell densities obtained with both methods was high (r=+0.84, p<0.001). The results showed an underestimation with manual counting (2191 ± 322 vs. 2273 ± 457 cells/mm², p=0.046) compared with the automated method. Our automated endothelial cell analyzer is efficient and gives reliable results quickly and easily. A multicentric validation would allow us to standardize cell counts among cornea banks in our country.

  9. Rectifying full-counting statistics in a spin Seebeck engine

    NASA Astrophysics Data System (ADS)

    Tang, Gaomin; Chen, Xiaobin; Ren, Jie; Wang, Jian

    2018-02-01

    In terms of the nonequilibrium Green's function framework, we formulate the full-counting statistics of conjugate thermal spin transport in a spin Seebeck engine, which is made by a metal-ferromagnet insulator interface driven by a temperature bias. We obtain general expressions of scaled cumulant generating functions of both heat and spin currents that hold special fluctuation symmetry relations, and demonstrate intriguing properties, such as rectification and negative differential effects of high-order fluctuations of thermal excited spin current, maximum output spin power, and efficiency. The transport and noise depend on the strongly fluctuating electron density of states at the interface. The results are relevant for designing an efficient spin Seebeck engine and can broaden our view in nonequilibrium thermodynamics and the nonlinear phenomenon in quantum transport systems.

  10. Sampling characteristics and calibration of snorkel counts to estimate stream fish populations

    USGS Publications Warehouse

    Weaver, D.; Kwak, Thomas J.; Pollock, Kenneth

    2014-01-01

    Snorkeling is a versatile technique for estimating lotic fish population characteristics; however, few investigators have evaluated its accuracy at population or assemblage levels. We evaluated the accuracy of snorkeling using prepositioned areal electrofishing (PAE) for estimating fish populations in a medium-sized Appalachian Mountain river during fall 2008 and summer 2009. Strip-transect snorkel counts were calibrated with PAE counts in identical locations among macrohabitats, fish species or taxa, and seasons. Mean snorkeling efficiency (i.e., the proportion of individuals counted from the true population) among all taxa and seasons was 14.7% (SE, 2.5%), and the highest efficiencies were for River Chub Nocomis micropogon at 21.1% (SE, 5.9%), Central Stoneroller Campostoma anomalum at 20.3% (SE, 9.6%), and darters (Percidae) at 17.1% (SE, 3.7%), whereas efficiencies were lower for shiners (Notropis spp., Cyprinella spp., Luxilus spp.) at 8.2% (SE, 2.2%) and suckers (Catostomidae) at 6.6% (SE, 3.2%). Macrohabitat type, fish taxon, or sampling season did not significantly explain variance in snorkeling efficiency. Mean snorkeling detection probability (i.e., probability of detecting at least one individual of a taxon) among fish taxa and seasons was 58.4% (SE, 6.1%). We applied the efficiencies from our calibration study to adjust snorkel counts from an intensive snorkeling survey conducted in a nearby reach. Total fish density estimates from strip-transect counts adjusted for snorkeling efficiency were 7,288 fish/ha (SE, 1,564) during summer and 15,805 fish/ha (SE, 4,947) during fall. Precision of fish density estimates is influenced by variation in snorkeling efficiency and sample size and may be increased with additional sampling effort. These results demonstrate the sampling properties and utility of snorkeling to characterize lotic fish assemblages with acceptable efficiency and detection probability, less effort, and no mortality, compared with traditional sampling methods.
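
    A minimal sketch of the kind of efficiency adjustment described above (not the authors' exact estimator): a strip-transect count is divided by the calibration efficiency, with an approximate standard error from simple ratio error propagation. The efficiency and its SE follow the means reported above; the count and area are hypothetical.

        # A minimal sketch (not the authors' estimator) of adjusting a snorkel
        # count by a calibration-derived efficiency, with an approximate standard
        # error from ratio error propagation. Count and area are hypothetical.
        import math

        def adjusted_density(count, efficiency, se_efficiency, area_ha):
            """Efficiency-corrected fish density (fish/ha) with approximate SE."""
            n_hat = count / efficiency
            # Delta-method approximation, treating the raw count as Poisson.
            rel_var = (math.sqrt(count) / count) ** 2 + (se_efficiency / efficiency) ** 2
            se_n = n_hat * math.sqrt(rel_var)
            return n_hat / area_ha, se_n / area_ha

        density, se = adjusted_density(count=120, efficiency=0.147,
                                       se_efficiency=0.025, area_ha=0.1)
        print(f"{density:.0f} fish/ha (SE {se:.0f})")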

  11. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    NASA Technical Reports Server (NTRS)

    Lu, Wei; Krainak, Michael A.; Yang, Guangning; Sun, Xiaoli; Merritt, Scott

    2016-01-01

    We present performance data for a low-noise free-running high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the 2 detected-photon threshold and logic AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing and background noise effects without using other methods such as time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (greater than 50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application for space communications and ranging. We developed and compared their performances using both the 2 detected-photon threshold and coincidence methods.

  12. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    NASA Technical Reports Server (NTRS)

    Lu, Wei; Krainak, Michael A.; Yang, Guan; Sun, Xiaoli; Merritt, Scott

    2016-01-01

    We present performance data for a low-noise free-running high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the 2 detected-photon threshold and logic AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing and background noise effects without using other methods such as time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (greater than 50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application for space communications and ranging. We developed and compared their performances using both the 2 detected-photon threshold and coincidence methods.

  13. A quartz nanopillar hemocytometer for high-yield separation and counting of CD4+ T lymphocytes

    NASA Astrophysics Data System (ADS)

    Kim, Dong-Joo; Seol, Jin-Kyeong; Wu, Yu; Ji, Seungmuk; Kim, Gil-Sung; Hyung, Jung-Hwan; Lee, Seung-Yong; Lim, Hyuneui; Fan, Rong; Lee, Sang-Kwon

    2012-03-01

    We report the development of a novel quartz nanopillar (QNP) array cell separation system capable of selectively capturing and isolating a single cell population including primary CD4+ T lymphocytes from the whole pool of splenocytes. Integrated with a photolithographically patterned hemocytometer structure, the streptavidin (STR)-functionalized-QNP (STR-QNP) arrays allow for direct quantitation of captured cells using high content imaging. This technology exhibits an excellent separation yield (efficiency) of ~95.3 +/- 1.1% for the CD4+ T lymphocytes from the mouse splenocyte suspensions and good linear response for quantitating captured CD4+ T-lymphoblasts, which is comparable to flow cytometry and outperforms any non-nanostructured surface capture techniques, i.e. cell panning. This nanopillar hemocytometer represents a simple, yet efficient cell capture and counting technology and may find immediate applications for diagnosis and immune monitoring in the point-of-care setting.

  14. Comparison of Dry Medium Culture Plates for Mesophilic Aerobic Bacteria in Milk, Ice Cream, Ham, and Codfish Fillet Products

    PubMed Central

    Park, Junghyun; Kim, Myunghee

    2013-01-01

    This study was performed to compare the performance of the Sanita-Kun dry medium culture plate with those of traditional culture medium and the Petrifilm dry medium culture plate for the enumeration of mesophilic aerobic bacteria in milk, ice cream, ham, and codfish fillet. Mesophilic aerobic bacteria were comparatively evaluated in milk, ice cream, ham, and codfish fillet using Sanita-Kun aerobic count (SAC), Petrifilm aerobic count (PAC), and traditional plate count agar (PCA) media. According to the results, all methods showed high correlations of 0.989~1.000 and no significant differences were observed for enumerating the mesophilic aerobic bacteria in the tested food products. The SAC method was easier to perform and allowed colonies to be counted more efficiently than the PCA and PAC methods. Therefore, we concluded that the SAC method offers an acceptable alternative to the PCA and PAC methods for counting the mesophilic aerobic bacteria in milk, ice cream, ham, and codfish fillet products. PMID:24551829

  15. UAS-based automatic bird count of a common gull colony

    NASA Astrophysics Data System (ADS)

    Grenzdörffer, G. J.

    2013-08-01

    The standard procedure for counting birds is a manual one. However, a manual bird count is a time-consuming and cumbersome process, requiring several people going from nest to nest counting the birds and the clutches. High resolution imagery generated with a UAS (Unmanned Aircraft System) offers an interesting alternative. Experiences and results of UAS surveys for automatic bird counts over the last two years are presented for the bird reserve island Langenwerder. For 2011, 1568 birds (± 5%) were detected on the image mosaic, based on multispectral image classification and GIS-based post-processing. Based on the experience gained in 2011, the automatic bird count in 2012 became more efficient and accurate: 1938 birds were counted with an accuracy of approximately ± 3%. Additionally, a separation of breeding and non-breeding birds was performed under the assumption that standing birds cast a visible shadow. The final section of the paper is devoted to the analysis of the 3D point cloud, which was used to determine the height of the vegetation and the extent and depth of closed sinks, which are unsuitable for breeding birds.

  16. Comparison of Relative Bias, Precision, and Efficiency of Sampling Methods for Natural Enemies of Soybean Aphid (Hemiptera: Aphididae).

    PubMed

    Bannerman, J A; Costamagna, A C; McCornack, B P; Ragsdale, D W

    2015-06-01

    Generalist natural enemies play an important role in controlling soybean aphid, Aphis glycines (Hemiptera: Aphididae), in North America. Several sampling methods are used to monitor natural enemy populations in soybean, but there has been little work investigating their relative bias, precision, and efficiency. We compare five sampling methods: quadrats, whole-plant counts, sweep-netting, walking transects, and yellow sticky cards to determine the most practical methods for sampling the three most prominent species, which included Harmonia axyridis (Pallas), Coccinella septempunctata L. (Coleoptera: Coccinellidae), and Orius insidiosus (Say) (Hemiptera: Anthocoridae). We show an important time by sampling method interaction indicated by diverging community similarities within and between sampling methods as the growing season progressed. Similarly, correlations between sampling methods for the three most abundant species over multiple time periods indicated differences in relative bias between sampling methods and suggests that bias is not consistent throughout the growing season, particularly for sticky cards and whole-plant samples. Furthermore, we show that sticky cards produce strongly biased capture rates relative to the other four sampling methods. Precision and efficiency differed between sampling methods and sticky cards produced the most precise (but highly biased) results for adult natural enemies, while walking transects and whole-plant counts were the most efficient methods for detecting coccinellids and O. insidiosus, respectively. Based on bias, precision, and efficiency considerations, the most practical sampling methods for monitoring in soybean include walking transects for coccinellid detection and whole-plant counts for detection of small predators like O. insidiosus. Sweep-netting and quadrat samples are also useful for some applications, when efficiency is not paramount. © The Authors 2015. Published by Oxford University Press on behalf of Entomological Society of America. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  17. Photoreceptor counting and montaging of en-face retinal images from an adaptive optics fundus camera

    PubMed Central

    Xue, Bai; Choi, Stacey S.; Doble, Nathan; Werner, John S.

    2008-01-01

    A fast and efficient method for quantifying photoreceptor density in images obtained with an en-face flood-illuminated adaptive optics (AO) imaging system is described. To improve accuracy of cone counting, en-face images are analyzed over extended areas. This is achieved with two separate semiautomated algorithms: (1) a montaging algorithm that joins retinal images with overlapping common features without edge effects and (2) a cone density measurement algorithm that counts the individual cones in the montaged image. The accuracy of the cone density measurement algorithm is high, with >97% agreement for a simulated retinal image (of known density, with low contrast) and for AO images from normal eyes when compared with previously reported histological data. Our algorithms do not require spatial regularity in cone packing and are, therefore, useful for counting cones in diseased retinas, as demonstrated for eyes with Stargardt’s macular dystrophy and retinitis pigmentosa. PMID:17429482

  18. Photoreceptor counting and montaging of en-face retinal images from an adaptive optics fundus camera

    NASA Astrophysics Data System (ADS)

    Xue, Bai; Choi, Stacey S.; Doble, Nathan; Werner, John S.

    2007-05-01

    A fast and efficient method for quantifying photoreceptor density in images obtained with an en-face flood-illuminated adaptive optics (AO) imaging system is described. To improve accuracy of cone counting, en-face images are analyzed over extended areas. This is achieved with two separate semiautomated algorithms: (1) a montaging algorithm that joins retinal images with overlapping common features without edge effects and (2) a cone density measurement algorithm that counts the individual cones in the montaged image. The accuracy of the cone density measurement algorithm is high, with >97% agreement for a simulated retinal image (of known density, with low contrast) and for AO images from normal eyes when compared with previously reported histological data. Our algorithms do not require spatial regularity in cone packing and are, therefore, useful for counting cones in diseased retinas, as demonstrated for eyes with Stargardt's macular dystrophy and retinitis pigmentosa.

  19. Muon counting using silicon photomultipliers in the AMIGA detector of the Pierre Auger observatory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aab, A.; Abreu, P.; Aglietta, M.

    Here, AMIGA (Auger Muons and Infill for the Ground Array) is an upgrade of the Pierre Auger Observatory designed to extend its energy range of detection and to directly measure the muon content of the cosmic ray primary particle showers. The array will be formed by an infill of surface water-Cherenkov detectors associated with buried scintillation counters employed for muon counting. Each counter is composed of three scintillation modules, with a 10 m2 detection area per module. In this paper, a new generation of detectors, replacing the current multi-pixel photomultiplier tube (PMT) with silicon photo sensors (aka. SiPMs), is proposed. The selection of the new device and its front-end electronics is explained. A method to calibrate the counting system that ensures the performance of the detector is detailed. This method has the advantage of being able to be carried out in a remote place such as the one where the detectors are deployed. High efficiency results, i.e. 98% efficiency for the highest tested overvoltage, combined with a low probability of accidental counting (~2%), show a promising performance for this new system.

  20. Optimization of single photon detection model based on GM-APD

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Yang, Yi; Hao, Peiyu

    2017-11-01

    High-precision laser ranging over one hundred kilometers requires a detector with very strong detection ability for very weak light. At present, the Geiger-mode avalanche photodiode (GM-APD) is widely used; it has high sensitivity and high photoelectric conversion efficiency. Selecting and designing the detector parameters according to the system requirements is of great importance for improving photon detection efficiency, and design optimization requires a good model. In this paper, we study the existing Poisson distribution model and consider the important detector parameters of dark count rate, dead time, quantum efficiency, and so on. We improve the detection model and select appropriate parameters to achieve optimal photon detection efficiency. The simulation is carried out using Matlab and compared with actual test results, and the rationality of the model is verified. It has certain reference value for engineering applications.
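
    A minimal sketch of a Poisson-statistics detection model in the spirit of the one discussed above: the probability of at least one avalanche in a range gate given mean signal photons, quantum efficiency, dark-count rate, and a crude dead-time correction. All parameter values are assumptions for illustration, not the paper's optimized values.

        # A minimal sketch of a Poisson-statistics GM-APD detection model:
        # probability of at least one avalanche in a range gate from signal
        # photons plus dark counts. Dead time is treated crudely as a rate
        # correction; all parameter values are assumptions for illustration.
        import math

        def trigger_probability(mean_signal_photons, quantum_eff,
                                dark_rate_hz, gate_s, dead_time_s=0.0):
            """P(at least one count in the gate) for Poisson-distributed primaries."""
            effective_dark = dark_rate_hz / (1.0 + dark_rate_hz * dead_time_s)
            mean_primaries = quantum_eff * mean_signal_photons + effective_dark * gate_s
            return 1.0 - math.exp(-mean_primaries)

        print(trigger_probability(mean_signal_photons=2.0, quantum_eff=0.3,
                                  dark_rate_hz=5e3, gate_s=1e-6, dead_time_s=50e-9))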

  1. Novel Photon-Counting Detectors for Free-Space Communication

    NASA Technical Reports Server (NTRS)

    Krainak, M. A.; Yang, G.; Sun, X.; Lu, W.; Merritt, S.; Beck, J.

    2016-01-01

    We present performance data for novel photon-counting detectors for free space optical communication. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We present and compare dark count, photon-detection efficiency, wavelength response and communication performance data for these detectors. We successfully measured real-time communication performance using both the 2 detected-photon threshold and AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing and background noise effects. The HgCdTe APD array routinely demonstrated photon detection efficiencies of greater than 50% across 5 arrays, with one array reaching a maximum PDE of 70%. We performed high-resolution pixel-surface spot scans and measured the junction diameters of its diodes. We found that decreasing the junction diameter from 31 micrometers to 25 micrometers doubled the e-APD gain, from 470 for an array produced in 2010 to 1100 for an array delivered to NASA GSFC recently. The mean single-photon SNR was over 12 and the excess noise factor measurements were 1.2-1.3. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output.

  2. Photon Counting Detectors for the 1.0 - 2.0 Micron Wavelength Range

    NASA Technical Reports Server (NTRS)

    Krainak, Michael A.

    2004-01-01

    We describe results on the development of greater than 200 micron diameter, single-element photon-counting detectors for the 1-2 micron wavelength range. The technical goals include quantum efficiency in the range 10-70%; detector diameter greater than 200 microns; dark count rate below 100 kilo counts-per-second (cps), and maximum count rate above 10 Mcps.

  3. Deriving the Contribution of Blazars to the Fermi-LAT Extragalactic γ-ray Background at E > 10 GeV with Efficiency Corrections and Photon Statistics

    NASA Astrophysics Data System (ADS)

    Di Mauro, M.; Manconi, S.; Zechlin, H.-S.; Ajello, M.; Charles, E.; Donato, F.

    2018-04-01

    The Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indexes of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ∼10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.
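
    The broken power-law source-count distribution can be integrated numerically to estimate the flux contributed by sources above a threshold, as sketched below; the break flux and indices follow the abstract, while the normalisation and integration limits are hypothetical placeholders.

        # Illustrative sketch: integrating a broken power-law dN/dS to estimate
        # the total flux from sources above a threshold. Break flux and indices
        # follow the abstract; the normalisation A and flux range are invented.
        import numpy as np

        S_break = 3.5e-11            # ph cm^-2 s^-1, break flux from the abstract
        gamma_hi, gamma_lo = 2.09, 1.07
        A = 1.0e12                   # hypothetical normalisation at the break

        def dN_dS(S):
            """Broken power-law differential source counts."""
            S = np.asarray(S, dtype=float)
            index = np.where(S >= S_break, gamma_hi, gamma_lo)
            return A * (S / S_break) ** (-index)

        # Integrate S * dN/dS over a hypothetical flux range (trapezoid rule).
        S = np.logspace(-12, -7, 4000)
        integrand = S * dN_dS(S)
        total_flux = np.sum(0.5 * (integrand[1:] + integrand[:-1]) * np.diff(S))
        print(f"flux from sources above 1e-12: {total_flux:.3e} ph cm^-2 s^-1")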

  4. Ultra High Work, High Efficiency Turbines For UAVs

    DTIC Science & Technology

    2006-06-01

    same amount of work, thereby reducing the weight of the LPT. Howell et al. and Arts and Coton [15-17] estimated that a 34% reduction in blade count... dimple a few mm in diameter and 0.1 mm to 0.3 mm deep. A typical surface finish on newly manufactured blading is typically 1-2 μm. In-use LP

  5. Receptor binding assay for paralytic shellfish poisoning toxins: optimization and interlaboratory comparison.

    PubMed

    Ruberu, Shryamalie R; Liu, Yun-Gang; Wong, Carolyn T; Perera, S Kusum; Langlois, Gregg W; Doucette, Gregory J; Powell, Christine L

    2003-01-01

    A receptor binding assay (RBA) for detection of paralytic shellfish poisoning (PSP) toxins was formatted for use in a high throughput detection system using microplate scintillation counting. The RBA technology was transferred from the National Ocean Service, which uses a Wallac TriLux 1450 MicroBeta microplate scintillation counter, to the California Department of Health Services, which uses a Packard TopCount scintillation counter. Due to differences in the detector arrangement between these 2 counters, markedly different counting efficiencies were exhibited, requiring optimization of the RBA protocol for the TopCount instrument. Precision, accuracy, and sensitivity [limit of detection = 0.2 microg saxitoxin (STX) equiv/100 g shellfish tissue] of the modified protocol were equivalent to those of the original protocol. The RBA robustness and adaptability were demonstrated by an interlaboratory study, in which STX concentrations in shellfish generated by the TopCount were consistent with MicroBeta-derived values. Comparison of STX reference standards obtained from the U.S. Food and Drug Administration and the National Research Council, Canada, showed no observable differences. This study confirms the RBA's value as a rapid, high throughput screen prior to testing by the conventional mouse bioassay (MBA) and its suitability for providing an early warning of increasing PSP toxicity when toxin levels are below the MBA limit of detection.

  6. Tutorial on X-ray photon counting detector characterization.

    PubMed

    Ren, Liqiang; Zheng, Bin; Liu, Hong

    2018-01-01

    Recent advances in photon counting detection technology have led to significant research interest in X-ray imaging. As a tutorial-level review, this paper covers a wide range of aspects related to X-ray photon counting detector characterization. The tutorial begins with a detailed description of the working principle and operating modes of a pixelated X-ray photon counting detector with basic architecture and detection mechanism. Currently available methods and techniques for characterizing major aspects including energy response, noise floor, energy resolution, count rate performance (detector efficiency), and charge sharing effect of photon counting detectors are comprehensively reviewed. Other characterization aspects such as point spread function (PSF), line spread function (LSF), contrast transfer function (CTF), modulation transfer function (MTF), noise power spectrum (NPS), detective quantum efficiency (DQE), bias voltage, radiation damage, and polarization effect are also remarked upon. A cadmium telluride (CdTe) pixelated photon counting detector is employed for part of the characterization demonstration and the results are presented. This review can serve as a tutorial for X-ray imaging researchers and investigators to understand, operate, characterize, and optimize photon counting detectors for a variety of applications.

  7. Comparison between Phase-Shift Full-Bridge Converters with Noncoupled and Coupled Current-Doubler Rectifier

    PubMed Central

    Tsai, Cheng-Tao; Tseng, Sheng-Yu

    2013-01-01

    This paper presents a comparison between phase-shift full-bridge converters with noncoupled and coupled current-doubler rectifiers. In high-current, high step-down voltage conversion, a phase-shift full-bridge converter with a conventional current-doubler rectifier has the common limitations of an extremely low duty ratio and high component stresses. To overcome these limitations, a phase-shift full-bridge converter with a noncoupled current-doubler rectifier (NCDR) or a coupled current-doubler rectifier (CCDR) is, respectively, proposed and implemented. In this study, the performance and efficiency obtained from a 500 W phase-shift full-bridge converter with the two improved current-doubler rectifiers are presented and compared. Experimental results from their prototypes have verified that the phase-shift full-bridge converter with the NCDR has an optimal duty ratio, lower component stresses, and lower output current ripple. In terms of component count and efficiency, the CCDR has fewer components and higher efficiency at full-load condition. For small size and high efficiency requirements, the CCDR is relatively suitable for high step-down voltage and high efficiency applications. PMID:24381521

  8. Comparison between phase-shift full-bridge converters with noncoupled and coupled current-doubler rectifier.

    PubMed

    Tsai, Cheng-Tao; Su, Jye-Chau; Tseng, Sheng-Yu

    2013-01-01

    This paper presents a comparison between phase-shift full-bridge converters with noncoupled and coupled current-doubler rectifiers. In high-current, high step-down voltage conversion, a phase-shift full-bridge converter with a conventional current-doubler rectifier has the common limitations of an extremely low duty ratio and high component stresses. To overcome these limitations, a phase-shift full-bridge converter with a noncoupled current-doubler rectifier (NCDR) or a coupled current-doubler rectifier (CCDR) is, respectively, proposed and implemented. In this study, the performance and efficiency obtained from a 500 W phase-shift full-bridge converter with the two improved current-doubler rectifiers are presented and compared. Experimental results from their prototypes have verified that the phase-shift full-bridge converter with the NCDR has an optimal duty ratio, lower component stresses, and lower output current ripple. In terms of component count and efficiency, the CCDR has fewer components and higher efficiency at full-load condition. For small size and high efficiency requirements, the CCDR is relatively suitable for high step-down voltage and high efficiency applications.

  9. Efficient single photon detection by quantum dot resonant tunneling diodes.

    PubMed

    Blakesley, J C; See, P; Shields, A J; Kardynał, B E; Atkinson, P; Farrer, I; Ritchie, D A

    2005-02-18

    We demonstrate that the resonant tunnel current through a double-barrier structure is sensitive to the capture of single photoexcited holes by an adjacent layer of quantum dots. This phenomenon could allow the detection of single photons with low dark count rates and high quantum efficiencies. The magnitude of the sensing current may be controlled via the thickness of the tunnel barriers. Larger currents give improved signal to noise and allow sub-μs photon time resolution.

  10. High-dimensional cluster analysis with the Masked EM Algorithm

    PubMed Central

    Kadir, Shabnam N.; Goodman, Dan F. M.; Harris, Kenneth D.

    2014-01-01

    Cluster analysis faces two problems in high dimensions: first, the “curse of dimensionality” that can lead to overfitting and poor generalization performance; and second, the sheer time taken for conventional algorithms to process large amounts of high-dimensional data. We describe a solution to these problems, designed for the application of “spike sorting” for next-generation high channel-count neural probes. In this problem, only a small subset of features provide information about the cluster membership of any one data vector, but this informative feature subset is not the same for all data points, rendering classical feature selection ineffective. We introduce a “Masked EM” algorithm that allows accurate and time-efficient clustering of up to millions of points in thousands of dimensions. We demonstrate its applicability to synthetic data, and to real-world high-channel-count spike sorting data. PMID:25149694

  11. On the use of positron counting for radio-Assay in nuclear pharmaceutical production.

    PubMed

    Maneuski, D; Giacomelli, F; Lemaire, C; Pimlott, S; Plenevaux, A; Owens, J; O'Shea, V; Luxen, A

    2017-07-01

    Current techniques for the measurement of radioactivity at various points during PET radiopharmaceutical production and R&D are based on the detection of the annihilation gamma rays from the radionuclide in the labelled compound. The detection systems to measure these gamma rays are usually variations of NaI or CsF scintillation based systems requiring costly and heavy lead shielding to reduce background noise. These detectors inherently suffer from low detection efficiency, high background noise and very poor linearity. They are also unable to provide any reasonably useful position information. A novel positron counting technique is proposed for the radioactivity assay during radiopharmaceutical manufacturing that overcomes these limitations. Detection of positrons instead of gammas offers an unprecedented level of position resolution of the radiation source (down to sub-mm) thanks to the nature of the positron interaction with matter. Counting capability instead of charge integration in the detector brings the sensitivity down to the statistical limits at the same time as offering very high dynamic range and linearity from zero to any arbitrarily high activity. This paper reports on a quantitative comparison between conventional detector systems and the proposed positron counting detector. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Logistic regression for dichotomized counts.

    PubMed

    Preisser, John S; Das, Kalyan; Benecha, Habtamu; Stamm, John W

    2016-12-01

    Sometimes there is interest in a dichotomized outcome indicating whether a count variable is positive or zero. Under this scenario, the application of ordinary logistic regression may result in efficiency loss, which is quantifiable under an assumed model for the counts. In such situations, a shared-parameter hurdle model is investigated for more efficient estimation of regression parameters relating to overall effects of covariates on the dichotomous outcome, while handling count data with many zeroes. One model part provides a logistic regression containing marginal log odds ratio effects of primary interest, while an ancillary model part describes the mean count of a Poisson or negative binomial process in terms of nuisance regression parameters. Asymptotic efficiency of the logistic model parameter estimators of the two-part models is evaluated with respect to ordinary logistic regression. Simulations are used to assess the properties of the models with respect to power and Type I error, the latter investigated under both misspecified and correctly specified models. The methods are applied to data from a randomized clinical trial of three toothpaste formulations to prevent incident dental caries in a large population of Scottish schoolchildren. © The Author(s) 2014.
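
    A small simulation sketch of the dichotomization step discussed above (not the authors' shared-parameter hurdle estimator): counts are generated from a Poisson model, recoded as zero versus positive, and an ordinary logistic regression is fitted to the indicator. The simulation parameters are assumptions for illustration.

        # Sketch of dichotomizing counts and fitting ordinary logistic regression
        # to the zero/positive indicator; not the authors' hurdle-model estimator.
        # All simulation parameters are assumptions.
        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        n = 2000
        x = rng.normal(size=n)
        mu = np.exp(-0.5 + 0.6 * x)       # Poisson mean under a log link
        counts = rng.poisson(mu)

        y = (counts > 0).astype(int)       # dichotomized outcome: any count vs none
        X = sm.add_constant(x)
        fit = sm.Logit(y, X).fit(disp=0)   # ordinary logistic regression on the indicator
        print(fit.params)                  # compare with a hurdle-model fit in practice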

  13. Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET

    NASA Astrophysics Data System (ADS)

    Tetrault, M.-A.; Oliver, J. F.; Bergeron, M.; Lecomte, R.; Fontaine, R.

    2010-02-01

    Coincidence engines follow two main implementation approaches: timestamp based systems and AND-gate based systems. The latter have been more widespread in recent years because of their lower cost and high efficiency. However, they are highly dependent on the selected electronic components, they have limited flexibility once assembled, and they are customized to fit a specific scanner's geometry. Timestamp based systems are gathering more attention lately, especially with high channel count fully digital systems. These new systems must however cope with important singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily use systems, a real time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp based coincidence engine for the LabPET™, a small animal PET scanner with up to 4608 individual readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted for any detector geometry or channel count, can be ported to newer, faster programmable devices, and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between full processing mode for imaging protocols and minimum processing mode to study different approaches for coincidence detection with offline software.
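
    The core idea of a timestamp-based coincidence engine can be sketched in software (the LabPET engine itself runs in programmable logic): singles from all channels are merged in time order and paired when they fall within a fixed coincidence window. The event list and window below are hypothetical.

        # A minimal software sketch of timestamp-based coincidence sorting (not
        # the FPGA implementation described above): merge singles in time order
        # and pair events within a fixed coincidence window. Data are hypothetical.
        def find_coincidences(singles, window):
            """singles: iterable of (timestamp, channel); window: same time units."""
            events = sorted(singles)                      # time-ordered stream
            pairs = []
            i = 0
            while i < len(events) - 1:
                t0, ch0 = events[i]
                t1, ch1 = events[i + 1]
                if t1 - t0 <= window and ch0 != ch1:      # prompt pair on different channels
                    pairs.append((events[i], events[i + 1]))
                    i += 2                                # consume both singles
                else:
                    i += 1
            return pairs

        singles = [(10, 3), (12, 41), (300, 7), (305, 7), (420, 12), (421, 30)]
        print(find_coincidences(singles, window=5))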

  14. Monitoring planktivorous seabird populations: Validating surface counts of crevice-nesting auklets using mark-resight techniques

    USGS Publications Warehouse

    Sheffield, L.M.; Gall, Adrian E.; Roby, D.D.; Irons, D.B.; Dugger, K.M.

    2006-01-01

    Least Auklets (Aethia pusilla (Pallas, 1811)) are the most abundant species of seabird in the Bering Sea and offer a relatively efficient means of monitoring secondary productivity in the marine environment. Counting auklets on surface plots is the primary method used to track changes in numbers of these crevice-nesters, but counts can be highly variable and may not be representative of the number of nesting individuals. We compared average maximum counts of Least Auklets on surface plots with density estimates based on mark–resight data at a colony on St. Lawrence Island, Alaska, during 2001–2004. Estimates of breeding auklet abundance from mark–resight averaged 8 times greater than those from maximum surface counts. Our results also indicate that average maximum surface counts are poor indicators of breeding auklet abundance and do not vary consistently with auklet nesting density across the breeding colony. Estimates of Least Auklet abundance from mark–resight were sufficiently precise to meet management goals for tracking changes in seabird populations. We recommend establishing multiple permanent banding plots for mark–resight studies on colonies selected for intensive long-term monitoring. Mark–resight is more likely to detect biologically significant changes in size of auklet breeding colonies than traditional surface count techniques.

  15. An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets.

    PubMed

    Hosseini, Parsa; Tremblay, Arianne; Matthews, Benjamin F; Alkharouf, Nadim W

    2010-07-02

    The data produced by an Illumina flow cell with all eight lanes occupied is well over a terabyte worth of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance. Very easily, one can get flooded with such a great volume of textual, unannotated data irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, enables INDEL detection, SNP information, and allele calling. To extract from such analysis a measure of gene expression in the form of tag-counts, and furthermore to annotate such reads, is therefore of significant value. We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag-counts while annotating sequenced reads with the gene's presumed function, from any given CASAVA build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag-counting and annotation. The end result produces output containing the homology-based functional annotation and the respective gene expression measure signifying how many times sequenced reads were found within the genomic ranges of functional annotations. TASE is a powerful tool to facilitate the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deep into a given CASAVA build and maximize information extraction from a sequencing dataset. TASE is specially designed to translate sequence data in a CASAVA build into functional annotations while producing corresponding gene expression measurements. Such analysis is executed in an ultrafast and highly efficient manner, whether the analysis is a single-read or paired-end sequencing experiment. TASE is a user-friendly and freely available application, allowing rapid analysis and annotation of any given Illumina Solexa sequencing dataset with ease.
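
    The central tag-counting idea, tallying how many aligned reads fall inside each annotated genomic range, can be sketched as below; this is an illustrative stand-in, not TASE's Java/SQL implementation, and the annotations and read positions are invented.

        # Illustrative sketch only (not TASE's implementation): tally how many
        # aligned reads fall inside each annotated genomic range. Annotations and
        # read positions are invented placeholders.
        from collections import Counter
        from bisect import bisect_right

        annotations = [            # (start, end, gene_id), sorted and non-overlapping
            (100, 500, "geneA"),
            (800, 1200, "geneB"),
            (1500, 2100, "geneC"),
        ]
        starts = [a[0] for a in annotations]

        def count_tags(read_positions):
            counts = Counter()
            for pos in read_positions:
                i = bisect_right(starts, pos) - 1        # candidate annotation to the left
                if i >= 0 and annotations[i][0] <= pos <= annotations[i][1]:
                    counts[annotations[i][2]] += 1
            return counts

        print(count_tags([150, 160, 900, 1999, 3000]))   # e.g. Counter({'geneA': 2, ...})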

  16. Portable Neutron Sensors for Emergency Response Operations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    ,

    2012-06-24

    This article presents the experimental work performed in the area of neutron detector development at the Remote Sensing Laboratory–Andrews Operations (RSL-AO) sponsored by the U.S. Department of Energy, National Nuclear Security Administration (NNSA) in the last four years. During the 1950s neutron detectors were developed mostly to characterize nuclear reactors where the neutron flux is high. Due to the indirect nature of neutron detection via interaction with other particles, neutron counting and neutron energy measurements have never been as precise as gamma-ray counting measurements and gamma-ray spectroscopy. This indirect nature is intrinsic to all neutron measurement endeavors (except perhaps for neutron spin-related experiments, viz. neutron spin-echo measurements where one obtains μeV energy resolution). In emergency response situations generally the count rates are low, and neutrons may be scattered around in inhomogeneous intervening materials. It is also true that neutron sensors are most efficient for the lowest energy neutrons, so it is not as easy to detect and count energetic neutrons. Most of the emergency response neutron detectors are offshoots of nuclear device diagnostics tools and special nuclear materials characterization equipment, because that is what is available commercially. These instruments mostly are laboratory equipment, and not field-deployable gear suited for mobile teams. Our goal is to design and prototype field-deployable, ruggedized, lightweight, efficient neutron detectors.

  17. A new colorimetrically-calibrated automated video-imaging protocol for day-night fish counting at the OBSEA coastal cabled observatory.

    PubMed

    del Río, Joaquín; Aguzzi, Jacopo; Costa, Corrado; Menesatti, Paolo; Sbragaglia, Valerio; Nogueras, Marc; Sarda, Francesc; Manuèl, Antoni

    2013-10-30

    Field measurements of the swimming activity rhythms of fishes are scant due to the difficulty of counting individuals at a high frequency over a long period of time. Cabled observatory video monitoring allows such sampling at a high frequency over unlimited periods of time. Unfortunately, automation of the extraction of biological information (i.e., animals' visual counts per unit of time) is still a major bottleneck. In this study, we describe a new automated video-imaging protocol for the 24-h continuous counting of fishes in colorimetrically calibrated time-lapse photographic outputs, taken by a shallow water (20 m depth) cabled video-platform, the OBSEA. The spectral reflectance value for each patch was measured between 400 and 700 nm and then converted into standard RGB, used as a reference for all subsequent calibrations. All the images were acquired within a standardized Region Of Interest (ROI), represented by a 2 × 2 m methacrylate panel, endowed with a 9-colour calibration chart, and calibrated using the recently implemented "3D Thin-Plate Spline" warping approach in order to numerically define color by its coordinates in n-dimensional space. That operation was repeated on a training set of 500 images, manually selected because they were acquired under optimum visibility conditions. All images plus those of the training set were ordered together through Principal Component Analysis, allowing the selection of 614 images (67.6%) out of a total of 908, corresponding to 18 days (at 30 min frequency). The Roberts operator (used in image processing and computer vision for edge detection) was used to highlight regions of high spatial colour gradient corresponding to fishes' bodies. Time series from manual and automated counting were compared for efficiency evaluation. Periodogram and waveform analysis outputs provided very similar results, although the quantified parameters for the strength of the respective rhythms differed. Results indicate that automation efficiency is limited by the need for optimum visibility conditions. Data sets from manual counting present larger day-night fluctuations than those derived from automation. This comparison indicates that the automation protocol underestimates fish numbers but is nevertheless suitable for the study of community activity rhythms.
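
    The Roberts cross operator mentioned above is simple to reproduce; the sketch below is a generic implementation applied to a synthetic grayscale array, not the OBSEA processing chain.

        # A minimal sketch of the Roberts cross operator applied to a grayscale
        # image array to highlight regions of high spatial gradient. Generic
        # implementation, not the OBSEA pipeline; the test image is synthetic.
        import numpy as np

        def roberts_edges(image):
            """Gradient magnitude from the two 2x2 Roberts cross kernels."""
            img = image.astype(float)
            gx = img[:-1, :-1] - img[1:, 1:]      # kernel [[1, 0], [0, -1]]
            gy = img[:-1, 1:] - img[1:, :-1]      # kernel [[0, 1], [-1, 0]]
            return np.hypot(gx, gy)

        test = np.zeros((6, 6))
        test[2:5, 2:5] = 1.0                      # a bright square on a dark background
        print(roberts_edges(test))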

  18. A New Colorimetrically-Calibrated Automated Video-Imaging Protocol for Day-Night Fish Counting at the OBSEA Coastal Cabled Observatory

    PubMed Central

    del Río, Joaquín; Aguzzi, Jacopo; Costa, Corrado; Menesatti, Paolo; Sbragaglia, Valerio; Nogueras, Marc; Sarda, Francesc; Manuèl, Antoni

    2013-01-01

    Field measurements of the swimming activity rhythms of fishes are scant due to the difficulty of counting individuals at a high frequency over a long period of time. Cabled observatory video monitoring allows such sampling at a high frequency over unlimited periods of time. Unfortunately, automation of the extraction of biological information (i.e., animals' visual counts per unit of time) is still a major bottleneck. In this study, we describe a new automated video-imaging protocol for the 24-h continuous counting of fishes in colorimetrically calibrated time-lapse photographic outputs, taken by a shallow water (20 m depth) cabled video-platform, the OBSEA. The spectral reflectance value for each patch was measured between 400 and 700 nm and then converted into standard RGB, used as a reference for all subsequent calibrations. All the images were acquired within a standardized Region Of Interest (ROI), represented by a 2 × 2 m methacrylate panel endowed with a 9-colour calibration chart, and calibrated using the recently implemented “3D Thin-Plate Spline” warping approach in order to numerically define color by its coordinates in n-dimensional space. That operation was repeated on a training set of 500 images, manually selected because they were acquired under optimum visibility conditions. All images, together with the training set, were ordered through Principal Component Analysis, allowing the selection of 614 images (67.6%) out of a total of 908, corresponding to 18 days of sampling at 30-min frequency. The Roberts operator (used in image processing and computer vision for edge detection) was used to highlight regions of high spatial colour gradient corresponding to fishes' bodies. Time series of automated and manual counts were compared to evaluate efficiency. Periodogram and waveform analysis outputs provided very similar results, although the quantified parameters describing the strength of the respective rhythms differed. Results indicate that automation efficiency is limited by optimum visibility conditions. Data sets from manual counting show larger day-night fluctuations than those derived from automation. This comparison indicates that the automation protocol underestimates fish numbers but is nevertheless suitable for the study of community activity rhythms. PMID:24177726

  19. Combined steam-ultrasound treatment of 2 seconds achieves significant high aerobic count and Enterobacteriaceae reduction on naturally contaminated food boxes, crates, conveyor belts, and meat knives.

    PubMed

    Musavian, Hanieh S; Butt, Tariq M; Larsen, Annette Baltzer; Krebs, Niels

    2015-02-01

    Food contact surfaces require rigorous sanitation procedures for decontamination, although these methods very often fail to efficiently clean and disinfect surfaces that are visibly contaminated with food residues and possible biofilms. In this study, the results of a short treatment (1 to 2 s) of combined steam (95°C) and ultrasound (SonoSteam) of industrial fish and meat transportation boxes and live-chicken transportation crates naturally contaminated with food and fecal residues were investigated. Aerobic counts of 5.0 to 6.0 log CFU/24 cm(2) and an Enterobacteriaceae spp. level of 2.0 log CFU/24 cm(2) were found on the surfaces prior to the treatment. After 1 s of treatment, the aerobic counts were significantly (P < 0.0001) reduced, and within 2 s, reductions below the detection limit (<10 CFU) were reached. Enterobacteriaceae spp. were reduced to a level below the detection limit with only 1 s of treatment. Two seconds of steam-ultrasound treatment was also applied to two different types of plastic modular conveyor belts with hinge pins and one type of flat flexible rubber belt, all visibly contaminated with food residues. The aerobic counts of 3.0 to 5.0 log CFU/50 cm(2) were significantly (P < 0.05) reduced, while Enterobacteriaceae spp. were reduced to a level below the detection limit. Industrial meat knives were contaminated with aerobic counts of 6.0 log CFU/5 cm(2) on the handle and 5.2 log CFU/14 cm(2) on the steel. The level of Enterobacteriaceae spp. contamination was approximately 2.5 log CFU on the handle and steel. Two seconds of steam-ultrasound treatment reduced the aerobic counts and Enterobacteriaceae spp. to levels below the detection limit on both handle and steel. This study shows that the steam-ultrasound treatment may be an effective replacement for disinfection processes and that it can be used for continuous disinfection at fast process lines. However, the treatment may not be able to replace efficient cleaning processes used to remove high loads of debris.

  20. Limits on Achievable Dimensional and Photon Efficiencies with Intensity-Modulation and Photon-Counting Due to Non-Ideal Photon-Counter Behavior

    NASA Technical Reports Server (NTRS)

    Moision, Bruce; Erkmen, Baris I.; Farr, William; Dolinar, Samuel J.; Birnbaum, Kevin M.

    2012-01-01

    An ideal intensity-modulated photon-counting channel can achieve unbounded photon information efficiencies (PIEs). However, a number of limitations of a physical system limit the practically achievable PIE. In this paper, we discuss several of these limitations and illustrate their impact on the channel. We show that, for the Poisson channel, noise does not strictly bound PIE, although there is an effective limit, as the dimensional information efficiency falls off exponentially with PIE beyond a threshold PIE. Since the Holevo limit is bounded in the presence of noise, this illustrates that the Poisson approximation is invalid at large PIE for any number of noise modes. We show that a finite transmitter extinction ratio bounds the achievable PIE to a maximum that is logarithmic in the extinction ratio. We show how detector jitter limits the ability to mitigate noise in the PPM signaling framework. We illustrate a method to model detector blocking when the number of detectors is large, and illustrate mitigation of blocking with spatial spreading and altering. Finally, we illustrate the design of a high photon efficiency system using state-of-the-art photo-detectors and taking all these effects into account.

  1. Forced-air warming design: evaluation of intake filtration, internal microbial buildup, and airborne-contamination emissions.

    PubMed

    Reed, Mike; Kimberger, Oliver; McGovern, Paul D; Albrecht, Mark C

    2013-08-01

    Forced-air warming devices are effective for the prevention of surgical hypothermia. However, these devices intake nonsterile floor-level air, and it is unknown whether they have adequate filtration measures to prevent the internal buildup or emission of microbial contaminants. We rated the intake filtration efficiency of a popular current-generation forced-air warming device (Bair Hugger model 750, Arizant Healthcare) using a monodisperse sodium chloride aerosol in the laboratory. We further sampled 23 forced-air warming devices (same model) in daily hospital use for internal microbial buildup and airborne-contamination emissions via swabbing and particle counting. Laboratory testing found the intake filter to be 63.8% efficient. Swabbing detected microorganisms within 100% of the forced-air warming blowers sampled, with isolates of coagulase-negative staphylococci, mold, and micrococci identified. Particle counting showed 96% of forced-air warming blowers to be emitting significant levels of internally generated airborne contaminants out of the hose end. These findings highlight the need for upgraded intake filtration, preferably high-efficiency particulate air filtration (99.97% efficient), on current-generation forced-air warming devices to reduce contamination buildup and emission risks.

  2. Parametric normalization for full-energy peak efficiency of HPGe γ-ray spectrometers at different counting positions for bulky sources.

    PubMed

    Peng, Nie; Bang-Fa, Ni; Wei-Zhi, Tian

    2013-02-01

    Application of the effective interaction depth (EID) principle for parametric normalization of full-energy peak efficiencies at different counting positions, originally developed for quasi-point sources, has been extended to bulky sources (within ∅30 mm×40 mm) with arbitrary matrices. It is also shown that the EID function for a quasi-point source can be used directly for cylindrical bulky sources (within ∅30 mm×40 mm), with the geometric center taken as the effective point source, for low atomic number (Z) and low density (D) media and high-energy γ-rays. It is further found that, in general, the EID for bulky sources depends on the Z and D of the medium and on the energy of the γ-rays in question. In addition, the EID principle was theoretically verified by MCNP calculations. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
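
    The record above describes threshold-and-count estimation only at a high level. As a minimal illustrative sketch (not the HRMS implementation), assume the noise-only spectral power bins are exponentially distributed, so the exceedance probability P(X > T) = exp(-T/sigma^2) can be inverted from a single-pass count; the function name, the single fixed threshold, and the toy numbers below are assumptions, and the operational system uses several parallel threshold-and-count devices to cover its dynamic range, which this sketch omits.

        import numpy as np

        def threshold_and_count_noise_power(power_bins, threshold):
            """Single-pass noise-power estimate from the fraction of bins above a threshold.

            Assumes exponentially distributed noise-only power bins, so that
            P(X > T) = exp(-T / sigma2)  =>  sigma2 = -T / ln(p_exceed).
            """
            p_exceed = np.count_nonzero(power_bins > threshold) / power_bins.size
            if p_exceed == 0 or p_exceed == 1:
                raise ValueError("threshold outside the usable dynamic range")
            return -threshold / np.log(p_exceed)

        # Toy usage: 100,000 noise bins with true power 2.0, threshold 3.0.
        rng = np.random.default_rng(0)
        print(threshold_and_count_noise_power(rng.exponential(scale=2.0, size=100_000), 3.0))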

  4. Software defined photon counting system for time resolved x-ray experiments.

    PubMed

    Acremann, Y; Chembrolu, V; Strachan, J P; Tyliszczak, T; Stöhr, J

    2007-01-01

    The time structure of synchrotron radiation allows time resolved experiments with sub-100 ps temporal resolution using a pump-probe approach. However, the relaxation time of the samples may require a lower repetition rate of the pump pulse compared to the full repetition rate of the x-ray pulses from the synchrotron. Using only the x-ray pulse immediately following the pump pulse is not efficient and often requires special operation modes where only a few buckets of the storage ring are filled. We designed a novel software defined photon counting system that makes it possible to implement a variety of pump-probe schemes at the full repetition rate. The high number of photon counters allows the response of the sample to be detected at multiple time delays simultaneously, thus improving the efficiency of the experiment. The system has been successfully applied to time resolved scanning transmission x-ray microscopy. However, this technique is applicable more generally.
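
    A generic sketch of the multi-delay counting idea described above (not the authors' software-defined counter): each detected photon is assigned to the x-ray bucket index elapsed since the most recent pump pulse, so every delay accumulates counts in parallel. The function name, the assumption of a strictly periodic pump, and the example numbers are illustrative only.

        def bin_photons_by_delay(photon_times_ns, pump_period_ns, bucket_spacing_ns):
            """Accumulate photon counts by x-ray bucket index since the last pump pulse."""
            n_buckets = int(pump_period_ns // bucket_spacing_ns)
            counters = [0] * n_buckets
            for t in photon_times_ns:
                delay = t % pump_period_ns                  # time since the last pump pulse
                counters[int(delay // bucket_spacing_ns)] += 1
            return counters

        # e.g. a 1 MHz pump (1000 ns period) with x-ray buckets every 2 ns
        print(bin_photons_by_delay([3.1, 5.0, 503.7, 999.9], 1000.0, 2.0)[:5])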

  5. Deriving the Contribution of Blazars to the Fermi-LAT Extragalactic γ-ray Background at E > 10 GeV with Efficiency Corrections and Photon Statistics

    DOE PAGES

    Di Mauro, M.; Manconi, S.; Zechlin, H. -S.; ...

    2018-03-29

    Here, the Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indexes of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ~10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.

  6. Deriving the Contribution of Blazars to the Fermi-LAT Extragalactic γ-ray Background at E > 10 GeV with Efficiency Corrections and Photon Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Mauro, M.; Manconi, S.; Zechlin, H. -S.

    Here, the Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indexes of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ~10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.

  7. Avalanche photodiode photon counting receivers for space-borne lidars

    NASA Technical Reports Server (NTRS)

    Sun, Xiaoli; Davidson, Frederic M.

    1991-01-01

    Avalanche photodiodes (APDs) are studied for use as photon counting detectors in spaceborne lidars. Non-breakdown APD photon counters, in which the APDs are biased below the breakdown point, are shown to outperform both conventional APD photon counters biased above the breakdown point and APDs operated in analog mode when the received optical signal is extremely weak. Non-breakdown APD photon counters were shown experimentally to achieve an effective photon counting quantum efficiency of 5.0 percent at lambda = 820 nm with a dead time of 15 ns and a dark count rate of 7000/s, which agreed with the theoretically predicted values. The interarrival times of the counts followed an exponential distribution, and the counting statistics appeared to follow a Poisson distribution with no afterpulsing. It is predicted that the effective photon counting quantum efficiency can be improved to 18.7 percent at lambda = 820 nm and 1.46 percent at lambda = 1060 nm with a dead time of a few nanoseconds by using more advanced commercially available electronic components.

  8. Read count-based method for high-throughput allelic genotyping of transposable elements and structural variants.

    PubMed

    Kuhn, Alexandre; Ong, Yao Min; Quake, Stephen R; Burkholder, William F

    2015-07-08

    Like other structural variants, transposable element insertions can be highly polymorphic across individuals. Their functional impact, however, remains poorly understood. Current genome-wide approaches for genotyping insertion-site polymorphisms based on targeted or whole-genome sequencing remain very expensive and can lack accuracy, hence new large-scale genotyping methods are needed. We describe a high-throughput method for genotyping transposable element insertions and other types of structural variants that can be assayed by breakpoint PCR. The method relies on next-generation sequencing of multiplex, site-specific PCR amplification products and read count-based genotype calls. We show that this method is flexible, efficient (it does not require rounds of optimization), cost-effective and highly accurate. This method can benefit a wide range of applications from the routine genotyping of animal and plant populations to the functional study of structural variants in humans.
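
    As a toy illustration of a read count-based genotype call at a single breakpoint (the thresholds, depth cutoff, and function name below are illustrative assumptions, not the published method's parameters):

        def call_genotype(insertion_reads, reference_reads, min_depth=10, het_band=(0.2, 0.8)):
            """Call a genotype from breakpoint-spanning read counts (illustrative thresholds)."""
            depth = insertion_reads + reference_reads
            if depth < min_depth:
                return "no_call"
            frac = insertion_reads / depth
            if frac < het_band[0]:
                return "ref/ref"          # essentially only reference-allele reads
            if frac > het_band[1]:
                return "ins/ins"          # essentially only insertion-allele reads
            return "ref/ins"              # mixed support -> heterozygous

        print(call_genotype(insertion_reads=45, reference_reads=40))   # -> ref/ins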

  9. Performance of coincidence-based PSD on LiF/ZnS Detectors for Multiplicity Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robinson, Sean M.; Stave, Sean C.; Lintereur, Azaree

    Mass accountancy measurement is a nuclear nonproliferation application which utilizes coincidence and multiplicity counters to verify special nuclear material declarations. With a well-designed and efficient detector system, several relevant parameters of the material can be verified simultaneously. 6LiF/ZnS scintillating sheets may be used for this purpose due to a combination of high efficiency and short die-away times in systems designed with this material, but they involve choices of detector geometry and exact material composition (e.g., the addition of Ni-quenching in the material) that must be optimized for the application. Multiplicity counting for verification of declared nuclear fuel mass involves neutron detection in conditions where several neutrons arrive in a short time window, with confounding gamma rays. This paper considers coincidence-based Pulse-Shape Discrimination (PSD) techniques developed to work under conditions of high pileup, and the performance of these algorithms with different detection materials. Simulated and real data from modern LiF/ZnS scintillator systems are evaluated with these techniques, and the relationship between performance under pileup and material characteristics (e.g., neutron peak width and total light collection efficiency) is determined, to allow for an optimal choice of detector and material.

  10. Optimizing autologous nonmobilized mononuclear cell collections for cellular therapy in pediatric patients with high-risk leukemia.

    PubMed

    Even-Or, Ehud; Di Mola, Maria; Ali, Muhammad; Courtney, Sarah; McDougall, Elizabeth; Alexander, Sarah; Schechter, Tal; Whitlock, James A; Licht, Christoph; Krueger, Joerg

    2017-06-01

    The manufacturing of cellular products for immunotherapy, such as chimeric antigen receptor T cells, requires successful collection of mononuclear cells. Collections from children with high-risk leukemia present a challenge, especially because the established COBE Spectra apheresis device is being replaced by the novel Spectra Optia device (Optia) in many institutions. Published experience with mononuclear cell collections in children using the Optia is lacking. Our aim was to compare the two collection devices and describe modified settings on the Optia to optimize mononuclear cell collections. As a quality initiative, we retrospectively collected and compared data from mononuclear cell collections on both devices. Collected data included patients' clinical characteristics; collection parameters, including precollection lymphocyte/CD3 counts, total blood volumes processed, runtimes, and side effects (including complete blood count and electrolyte changes); and product characteristics, including volumes and cell counts. Collection efficiencies and collection ratios were calculated. Twenty-six mononuclear cell collections were performed on 20 pediatric patients: 11 with the COBE and 15 with the Optia. Adequate mononuclear cell products were successfully collected with a single procedure from all patients except one, with a mean calculated mononuclear cell collection efficiency significantly higher for Optia than for COBE collections (57.9 ± 4.6% vs 40.3 ± 6.2%, respectively; p = 0.04). CD3-positive yields were comparable on both machines (p = 0.34), with significantly smaller blood volumes processed on the Optia. Collected products had larger volumes on the Optia. No significant side effects attributed to the procedure were noted. Mononuclear cell apheresis using the Optia device in children is more efficient and is as safe as that with the COBE device. © 2017 AABB.
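
    For context, a common way to express mononuclear cell (MNC) collection efficiency in the apheresis literature is the "CE2" form, cells collected divided by cells passed through the separator; the abstract does not state which definition the authors used, so the sketch below, including its function name and example numbers, is an assumption for illustration only:

        def collection_efficiency_pct(product_mnc_e9, preapheresis_mnc_e9_per_l, blood_volume_processed_l):
            """CE2-style collection efficiency in percent.

            product_mnc_e9: MNC yield in the product (x10^9 cells)
            preapheresis_mnc_e9_per_l: peripheral MNC count before the run (x10^9 cells/L)
            blood_volume_processed_l: total blood volume processed (L)
            """
            cells_offered = preapheresis_mnc_e9_per_l * blood_volume_processed_l
            return 100.0 * product_mnc_e9 / cells_offered

        print(collection_efficiency_pct(2.3, 1.5, 2.6))   # hypothetical numbers -> ~59%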

  11. Effects of Image Compression on Automatic Count of Immunohistochemically Stained Nuclei in Digital Images

    PubMed Central

    López, Carlos; Lejeune, Marylène; Escrivà, Patricia; Bosch, Ramón; Salvadó, Maria Teresa; Pons, Lluis E.; Baucells, Jordi; Cugat, Xavier; Álvaro, Tomás; Jaén, Joaquín

    2008-01-01

    This study investigates the effects of digital image compression on automatic quantification of immunohistochemical nuclear markers. We examined 188 images with a previously validated computer-assisted analysis system. A first group was composed of 47 images captured in TIFF format, and the other three groups contained the same images converted from TIFF to JPEG format with 3×, 23× and 46× compression. Counts from the TIFF images were compared with those from the other three groups. Overall, differences in the counts increased with the degree of compression. Low-complexity images (≤100 cells/field, without clusters or with small-area clusters) had small differences (<5 cells/field in 95–100% of cases) and high-complexity images showed substantial differences (<35–50 cells/field in 95–100% of cases). Compression does not compromise the accuracy of immunohistochemical nuclear marker counts obtained by computer-assisted analysis systems for digital images with low complexity and could be an efficient method for storing these images. PMID:18755997

  12. WDR-PK-AK-018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hollister, R

    2009-08-26

    Method - CES SOP-HW-P556 'Field and Bulk Gamma Analysis'. Detector - High-purity germanium, 40% relative efficiency. Calibration - The detector was calibrated on February 8, 2006 using a NIST-traceable sealed source, and the calibration was verified using an independent sealed source. Count Time and Geometry - The sample was counted for 20 minutes at 72 inches from the detector. A lead collimator was used to limit the field-of-view to the region of the sample. The drum was rotated 180 degrees halfway through the count time. Date and Location of Scans - June 1, 2006 in Building 235 Room 1136. Spectral Analysis - Spectra were analyzed with ORTEC GammaVision software. Matrix and geometry corrections were calculated using ORTEC Isotopic software. A background spectrum was measured at the counting location. No man-made radioactivity was observed in the background. Results were determined from the sample spectra without background subtraction. Minimum detectable activities were calculated by the NUREG 4.16 method. Results - Detected Pu-238, Pu-239, Am-241 and Am-243.

  13. Safeguards Technology Development Program 1st Quarter FY 2018 Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Manoj K.

    LLNL will evaluate the performance of a stilbene-based scintillation detector array for IAEA neutron multiplicity counting (NMC) applications. This effort will combine newly developed modeling methodologies and recently acquired high-efficiency stilbene detector units to quantitatively compare the prototype system performance with the conventional He-3 counters and liquid scintillator alternatives.

  14. Extrusion die and method

    DOEpatents

    Lipp, G. Daniel

    1994-05-03

    A method and die apparatus for manufacturing a honeycomb body of triangular cell cross-section and high cell density, the die having a combination of (i) feedholes feeding slot intersections and (ii) feedholes feeding slot segments not supplied from slot intersections, whereby a reduction in feedhole count is achieved while still retaining good extrusion efficiency and extrudate uniformity.

  15. Monte Carlo simulations and measurements for efficiency determination of lead shielded plastic scintillator detectors

    NASA Astrophysics Data System (ADS)

    Yasin, Zafar; Negoita, Florin; Tabbassum, Sana; Borcea, Ruxandra; Kisyov, Stanimir

    2017-12-01

    Plastic scintillators are used in different areas of science and technology. One use of these scintillator detectors is as beam loss monitors (BLMs) for a new generation of high-intensity heavy-ion superconducting linear accelerators. Operated in pulse counting mode with rather high thresholds and shielded by a few centimeters of lead in order to cope with radiofrequency noise and the X-ray background emitted by accelerator cavities, they preserve high efficiency for the high-energy gamma rays and neutrons produced in nuclear reactions of lost beam particles with accelerator components. Efficiency calculation and calibration of detectors are very important before their practical use. In the present work, the efficiency of plastic scintillator detectors is simulated using FLUKA for different gamma and neutron sources, such as 60Co, 137Cs and 238Pu-Be. The sources are placed at different positions around the detector. Calculated values are compared with the measured values, and a reasonable agreement is observed.

  16. Real-Time Microfluidic Blood-Counting System for PET and SPECT Preclinical Pharmacokinetic Studies.

    PubMed

    Convert, Laurence; Lebel, Réjean; Gascon, Suzanne; Fontaine, Réjean; Pratte, Jean-François; Charette, Paul; Aimez, Vincent; Lecomte, Roger

    2016-09-01

    Small-animal nuclear imaging modalities have become essential tools in the development process of new drugs, diagnostic procedures, and therapies. Quantification of metabolic or physiologic parameters is based on pharmacokinetic modeling of radiotracer biodistribution, which requires the blood input function in addition to tissue images. Such measurements are challenging in small animals because of their small blood volume. In this work, we propose a microfluidic counting system to monitor rodent blood radioactivity in real time, with high efficiency and small detection volume (∼1 μL). A microfluidic channel is built directly above unpackaged p-i-n photodiodes to detect β-particles with maximum efficiency. The device is embedded in a compact system comprising dedicated electronics, shielding, and pumping unit controlled by custom firmware to enable measurements next to small-animal scanners. Data corrections required to use the input function in pharmacokinetic models were established using calibrated solutions of the most common PET and SPECT radiotracers. Sensitivity, dead time, propagation delay, dispersion, background sensitivity, and the effect of sample temperature were characterized. The system was tested for pharmacokinetic studies in mice by quantifying myocardial perfusion and oxygen consumption with (11)C-acetate (PET) and by measuring the arterial input function using (99m)TcO4 (-) (SPECT). Sensitivity for PET isotopes reached 20%-47%, a 2- to 10-fold improvement relative to conventional catheter-based geometries. Furthermore, the system detected (99m)Tc-based SPECT tracers with an efficiency of 4%, an outcome not possible through a catheter. Correction for dead time was found to be unnecessary for small-animal experiments, whereas propagation delay and dispersion within the microfluidic channel were accurately corrected. Background activity and sample temperature were shown to have no influence on measurements. Finally, the system was successfully used in animal studies. A fully operational microfluidic blood-counting system for preclinical pharmacokinetic studies was developed. Microfluidics enabled reliable and high-efficiency measurement of the blood concentration of most common PET and SPECT radiotracers with high temporal resolution in small blood volume. © 2016 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
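
    A back-of-envelope sketch of how such a detector's count rate maps to blood activity concentration follows (the function name, unit choices, example numbers, and the optional non-paralyzable dead-time form are assumptions for illustration; the study itself found dead-time correction unnecessary for small-animal work):

        def blood_activity_kbq_per_ml(count_rate_cps, sensitivity, detection_volume_ul, dead_time_s=0.0):
            """Convert a measured count rate to activity concentration in the sensed volume.

            sensitivity: detected counts per decay (e.g. 0.20-0.47 quoted for PET isotopes)
            detection_volume_ul: sensitive channel volume in microliters (~1 uL in this device)
            dead_time_s: optional non-paralyzable dead time
            """
            if dead_time_s > 0.0:
                count_rate_cps = count_rate_cps / (1.0 - count_rate_cps * dead_time_s)
            becquerel_in_volume = count_rate_cps / sensitivity            # decays per second
            return (becquerel_in_volume / 1000.0) / (detection_volume_ul / 1000.0)   # kBq/mL

        print(blood_activity_kbq_per_ml(count_rate_cps=500.0, sensitivity=0.3, detection_volume_ul=1.0))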

  17. Ultrafast time measurements by time-correlated single photon counting coupled with superconducting single photon detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shcheslavskiy, V., E-mail: vis@becker-hickl.de; Becker, W.; Morozov, P.

    Time resolution is one of the main characteristics of single photon detectors, besides quantum efficiency and dark count rate. We demonstrate here an ultrafast time-correlated single photon counting (TCSPC) setup consisting of a newly developed single photon counting board, the SPC-150NX, and a superconducting NbN single photon detector with a sensitive area of 7 × 7 μm. The combination delivers a record instrument response function with a full width at half maximum of 17.8 ps and a system quantum efficiency of ∼15% at a wavelength of 1560 nm. A calculation of the root mean square value of the timing jitter for channels with counts above 1% of the peak value yielded about 7.6 ps. The setup also shows good timing stability of the detector–TCSPC board combination.

  18. Simulation of HLNC and NCC measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lu, Ming-Shih; Teichmann, T.; De Ridder, P.

    1994-03-01

    This report discusses an automatic method of simulating the results of High Level Neutron Coincidence Counting (HLNC) and Neutron Collar Coincidence Counting (NCC) measurements to facilitate the safeguards inspectors' understanding and use of these instruments under realistic conditions. This would otherwise be expensive and time-consuming, except at sites designed to handle radioactive materials and having the necessary variety of fuel elements and other samples. The simulation must thus include the behavior of the instruments for variably constituted and composed fuel elements (including poison rods and Gd loading), and must display the changes in the count rates as a function of these characteristics, as well as of various instrumental parameters. Such a simulation is an efficient way of accomplishing the required familiarization and training of the inspectors by providing a realistic reproduction of the results of such measurements.

  19. Empirical assessment of the detection efficiency of CR-39 at high proton fluence and a compact, proton detector for high-fluence applications

    DOE PAGES

    Rosenberg, M. J.; Séguin, F. H.; Waugh, C. J.; ...

    2014-04-14

    CR-39 solid-state nuclear track detectors are widely used in physics and in many inertial confinement fusion (ICF) experiments, and under ideal conditions these detectors have 100% detection efficiency for ~0.5–8 MeV protons. When the fluence of incident particles becomes too high, the overlap of particle tracks leads to under-counting at typical processing conditions (5 h etch in 6N NaOH at 80°C). Short etch times required to avoid overlap can cause under-counting as well, as tracks are not fully developed. Experiments have determined the minimum etch times for 100% detection of 1.7–4.3-MeV protons and established that for 2.4-MeV protons, relevant for detection of DD protons, the maximum fluence that can be detected using normal processing techniques is ≲3 × 10⁶ cm⁻². A CR-39-based proton detector has been developed to mitigate issues related to high particle fluences on ICF facilities. Using a pinhole and scattering foil several mm in front of the CR-39, proton fluences at the CR-39 are reduced by more than a factor of ~50, increasing the operating yield upper limit by a comparable amount.

  20. Evaluating point count efficiency relative to territory mapping in cropland birds

    Treesearch

    Andre Cyr; Denis Lepage; Kathryn Freemark

    1995-01-01

    Species richness, composition, and abundance of farmland birds were compared between point counts (50-m, 100-m, and 150-m radius half circles) and territory mapping on three 40-ha plots in Québec, Canada. Point counts of smaller radii tended to have larger density estimates than counts of larger radii. Territory mapping detected 10 species more than 150-m...

  1. Photon-counting image sensors for the ultraviolet

    NASA Technical Reports Server (NTRS)

    Jenkins, E. B.

    1985-01-01

    An investigation of specific performance details of photon-counting, ultraviolet image sensors having 2-dimensional formats is reviewed. In one study, controlled experiments were performed which compare the quantum efficiencies, in pulse counting mode, of CsI photocathodes deposited on: (1) the front surface of a microchannel plate (MCP), (2) a solid surface in front of an MCP, and (3) an intensified CCD image sensor (ICCD) where a CCD is directly bombarded by accelerated photoelectrons. Tests indicated that the detection efficiency of the CsI-coated MCP at 1026 A is lower by a factor of 2.5 than that of the MCP with a separate, opaque CsI photocathode, and that the detection efficiency ratio increases substantially at longer wavelengths (the ratio is 5 at 1216 A and 20 at 1608 A).

  2. Investigation of the Performance of an Ultralow-Dark-Count Superconducting Nanowire Single-Photon Detector

    NASA Astrophysics Data System (ADS)

    Subashchandran, Shanthi; Okamoto, Ryo; Zhang, Labao; Tanaka, Akira; Okano, Masayuki; Kang, Lin; Chen, Jian; Wu, Peiheng; Takeuchi, Shigeki

    2013-10-01

    The realization of an ultralow-dark-count rate (DCR) along with the conservation of high detection efficiency (DE) is critical for many applications using single photon detectors in quantum information technologies, material sciences, and biological sensing. For this purpose, a fiber-coupled superconducting nanowire single-photon detector (SNSPD) with a meander-type niobium nitride nanowire (width: 50 nm) is studied. Precise measurements of the bias current dependence of DE are carried out for a wide spectral range (from 500 to 1650 nm in steps of 50 nm) using a white light source and a laser line Bragg tunable band-pass filter. An ultralow DCR (0.0015 cps) and high DE (32%) are simultaneously achieved by the SNSPD at a wavelength of 500 nm.

  3. Detector motion method to increase spatial resolution in photon-counting detectors

    NASA Astrophysics Data System (ADS)

    Lee, Daehee; Park, Kyeongjin; Lim, Kyung Taek; Cho, Gyuseong

    2017-03-01

    Medical imaging requires high spatial resolution to identify fine lesions. Photon-counting detectors in medical imaging have recently been rapidly replacing energy-integrating detectors due to the former's high spatial resolution, high efficiency and low noise. Spatial resolution in a photon counting image is determined by the pixel size. Therefore, the smaller the pixel size, the higher the spatial resolution that can be obtained in an image. However, detector redesign is required to reduce pixel size, and an expensive fine process is required to integrate a signal processing unit with reduced pixel size. Furthermore, as the pixel size decreases, charge sharing severely deteriorates spatial resolution. To increase spatial resolution, we propose a detector motion method using a large-pixel detector that is less affected by charge sharing. To verify the proposed method, we utilized a UNO-XRI photon-counting detector (1-mm CdTe, Timepix chip) at the maximum X-ray tube voltage of 80 kVp. A spatial resolution similar to that of a 55-μm-pixel image was achieved by applying the proposed method to a 110-μm-pixel detector, with a higher signal-to-noise ratio. The proposed method could be a way to increase spatial resolution without a pixel redesign when pixels severely suffer from charge sharing as pixel size is reduced.
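
    The general dithering idea behind detector-motion acquisition can be sketched as follows (a generic shift-and-average illustration, not the authors' reconstruction algorithm; the function name, use of numpy, and the example pitch ratio are assumptions):

        import numpy as np

        def shift_and_interleave(frames, shifts, upsample=2):
            """Combine frames taken at known sub-pixel detector offsets onto a finer grid.

            frames: 2-D count images from the coarse-pixel detector
            shifts: matching (dy, dx) offsets in fine-grid pixels
            upsample: coarse-to-fine pitch ratio (e.g. 2 for 110 um -> 55 um sampling)
            """
            acc = None
            for img, (dy, dx) in zip(frames, shifts):
                up = np.kron(img, np.ones((upsample, upsample)))   # replicate coarse pixels
                up = np.roll(np.roll(up, dy, axis=0), dx, axis=1)  # apply the known detector shift
                acc = up if acc is None else acc + up
            return acc / len(frames)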

  4. Micro-computed tomography in murine models of cerebral cavernous malformations as a paradigm for brain disease.

    PubMed

    Girard, Romuald; Zeineddine, Hussein A; Orsbon, Courtney; Tan, Huan; Moore, Thomas; Hobson, Nick; Shenkar, Robert; Lightle, Rhonda; Shi, Changbin; Fam, Maged D; Cao, Ying; Shen, Le; Neander, April I; Rorrer, Autumn; Gallione, Carol; Tang, Alan T; Kahn, Mark L; Marchuk, Douglas A; Luo, Zhe-Xi; Awad, Issam A

    2016-09-15

    Cerebral cavernous malformations (CCMs) are hemorrhagic brain lesions for which murine models have enabled major mechanistic discoveries, genetic manipulations, and preclinical assessment of therapies. Histology for lesion counting and morphometry is essential yet tedious and time consuming. We herein describe the application and validation of X-ray micro-computed tomography (micro-CT), a non-destructive technique allowing three-dimensional CCM lesion counts and volumetric measurements, in transgenic murine brains. We also describe a new contrast soaking technique not previously applied to murine models of CCM disease. A volumetric segmentation and image processing paradigm allowed for histologic correlations and quantitative validations not previously reported with the micro-CT technique in brain vascular disease. Twenty-two hyper-dense areas on micro-CT images, identified as CCM lesions, were matched by histology. The inter-rater reliability analysis showed strong consistency in CCM lesion identification and staging (K=0.89, p<0.0001) between the two techniques. Micro-CT provided 29% greater CCM lesion detection efficiency and an 80% improvement in time efficiency. Serial integrated lesional area by histology showed a strong positive correlation with micro-CT estimated volume (r(2)=0.84, p<0.0001). Micro-CT allows high throughput assessment of lesion count and volume in pre-clinical murine models of CCM. This approach complements histology with improved accuracy and efficiency, and can be applied to lesion burden assessment in other brain diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  5. Rapid enumeration of viable bacteria by image analysis

    NASA Technical Reports Server (NTRS)

    Singh, A.; Pyle, B. H.; McFeters, G. A.

    1989-01-01

    A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 degrees C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 micrograms ml-1) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.

  6. Design Aids for Real-Time Systems (DARTS)

    NASA Technical Reports Server (NTRS)

    Szulewski, P. A.

    1982-01-01

    Design-Aids for Real-Time Systems (DARTS) is a tool that assists in defining embedded computer systems through tree structured graphics, military standard documentation support, and various analyses including automated Software Science parameter counting and metrics calculation. These analyses provide both static and dynamic design quality feedback which can potentially aid in producing efficient, high quality software systems.

  7. Novel Photon-Counting Detectors for Free-Space Communication

    NASA Technical Reports Server (NTRS)

    Krainak, Michael A.; Yang, Guan; Sun, Xiaoli; Lu, Wei; Merritt, Scott; Beck, Jeff

    2016-01-01

    We present performance data for novel photon counting detectors for free space optical communication. NASA GSFC is testing the performance of three novel photon counting detectors: 1) a 2x8 mercury cadmium telluride avalanche array made by DRS Inc., 2) a commercial 2880 silicon avalanche photodiode array, and 3) a prototype resonant cavity silicon avalanche photodiode array. We will present and compare dark count, photon detection efficiency, wavelength response and communication performance data for these detectors. We discuss system wavelength trades and architectures for optimizing overall communication link sensitivity, data rate and cost performance. For the HgCdTe APD array, photon detection efficiencies of greater than 50% were routinely demonstrated across 5 arrays, with one array reaching a maximum PDE of 70%. High resolution pixel-surface spot scans were performed and the junction diameters of the diodes were measured. The junction diameter was decreased from 31 μm to 25 μm, resulting in a 2x increase in e-APD gain from 470 on the 2010 array to 1100 on the array delivered to NASA GSFC. Mean single photon SNRs of over 12 were demonstrated at excess noise factors of 1.2-1.3. The commercial silicon APD array has a fast output with rise times of 300 ps and pulse widths of 600 ps. Received and filtered signals from the entire array are multiplexed onto this single fast output. The prototype resonant cavity silicon APD array is being developed for use at 1 micron wavelength.

  8. Fundamentals of Free-Space Optical Communications

    NASA Technical Reports Server (NTRS)

    Dolinar, Sam; Moision, Bruce; Erkmen, Baris

    2012-01-01

    Free-space optical communication systems potentially gain many dBs over RF systems. There is no upper limit on the theoretically achievable photon efficiency when the system is quantum-noise-limited: a) Intensity modulations plus photon counting can achieve arbitrarily high photon efficiency, but with sub-optimal spectral efficiency. b) Quantum-ideal number states can achieve the ultimate capacity in the limit of perfect transmissivity. Appropriate error correction codes are needed to communicate reliably near the capacity limits. Poisson-modeled noises, detector losses, and atmospheric effects must all be accounted for: a) Theoretical models are used to analyze performance degradations. b) Mitigation strategies derived from this analysis are applied to minimize these degradations.
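
    The point about unbounded photon efficiency at sub-optimal spectral efficiency can be made concrete with idealized (noiseless, lossless) M-ary pulse-position modulation and photon counting; the function below is an illustrative back-of-envelope calculation under those assumptions, not a result from the article:

        import math

        def ppm_photon_and_spectral_efficiency(M, mean_detected_photons_per_pulse):
            """Idealized M-ary PPM: bits/photon grows with M while bits/slot shrinks."""
            bits_per_symbol = math.log2(M)
            photon_efficiency = bits_per_symbol / mean_detected_photons_per_pulse   # bits per photon
            spectral_efficiency = bits_per_symbol / M                               # bits per slot
            return photon_efficiency, spectral_efficiency

        print(ppm_photon_and_spectral_efficiency(M=1024, mean_detected_photons_per_pulse=1.0))
        # -> (10.0, ~0.0098): high photon efficiency, low spectral efficiency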

  9. Polarization entangled photons from quantum dots embedded in nanowires.

    PubMed

    Huber, Tobias; Predojević, Ana; Khoshnegar, Milad; Dalacu, Dan; Poole, Philip J; Majedi, Hamed; Weihs, Gregor

    2014-12-10

    In this Letter, we present entanglement generated from a novel structure: a single InAsP quantum dot embedded in an InP nanowire. These structures can grow in a site-controlled way and exhibit high collection efficiency; we detect 0.5 million biexciton counts per second coupled into a single mode fiber with a standard commercial avalanche photo diode. If we correct for the known setup losses and detector efficiency, we get an extraction efficiency of 15(3) %. For the measured polarization entanglement, we observe a fidelity of 0.76(2) to a reference maximally entangled state as well as a concurrence of 0.57(6).

  10. ESTIMATION OF RADIOACTIVE CALCIUM-45 BY LIQUID SCINTILLATION COUNTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutwak, L.

    1959-03-01

    A liquid scintillation counting method is developed for determining radioactive calcium-45 in biological materials. The calcium-45 is extracted, concentrated, and dissolved in absolute ethyl alcohol to which is added 0.4% diphenyloxazol in toluene. Counting efficiency is about 65 percent, with a standard deviation of 7.36 percent. (auth)

  11. Evaluating the Performance of a Commercial Silicon Drift Detector for X-ray Microanalysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kenik, Edward A

    2011-01-01

    Silicon drift detectors (SDDs) are rapidly becoming the energy dispersive spectrometer (EDS) of choice, especially for scanning electron microscopy x-ray microanalysis. The complementary features of large active areas (i.e., high collection angle) and high count rate capability contribute to the popularity of these detectors, as do the absence of liquid nitrogen cooling and their good energy resolution. The performance of an EDAX Apollo 40 SDD on a JEOL 6500F SEM is discussed. The larger detector resulted in a significant increase (~3.5x) in geometric collection efficiency compared to the original 10 mm2 Si(Li) detector that it replaced. The SEM can provide high beam currents (up to 200 nA in some conditions) at small probe diameters. The high count rate capability of the SDD and the high current capability of the SEM complement each other and provide excellent EDS analytical capabilities for both single point and spectrum imaging applications.
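
    The quoted gain in geometric collection efficiency follows from the solid angle subtended by the detector's active area; in the small-detector approximation the collected fraction of 4π steradians is A/(4πd²). The area and distance in the example below are assumed values for illustration, not figures from the record:

        import math

        def fractional_solid_angle(active_area_mm2, distance_mm):
            """Small-detector approximation of geometric collection efficiency."""
            return active_area_mm2 / (4.0 * math.pi * distance_mm ** 2)

        # e.g. a 40 mm^2 SDD versus a 10 mm^2 Si(Li) at the same hypothetical 50 mm distance
        print(fractional_solid_angle(40, 50) / fractional_solid_angle(10, 50))   # -> 4.0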

  12. Increasing point-count duration increases standard error

    USGS Publications Warehouse

    Smith, W.P.; Twedt, D.J.; Hamel, P.B.; Ford, R.P.; Wiedenfeld, D.A.; Cooper, R.J.

    1998-01-01

    We examined data from point counts of varying duration in bottomland forests of west Tennessee and the Mississippi Alluvial Valley to determine if counting interval influenced sampling efficiency. Estimates of standard error increased as point count duration increased both for cumulative number of individuals and species in both locations. Although point counts appear to yield data with standard errors proportional to means, a square root transformation of the data may stabilize the variance. Using long (>10 min) point counts may reduce sample size and increase sampling error, both of which diminish statistical power and thereby the ability to detect meaningful changes in avian populations.
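
    A minimal illustration of the variance-stabilization remark, assuming Poisson-distributed counts (one common model for point-count data; the simulation below is not from the study):

        import numpy as np

        rng = np.random.default_rng(1)
        for lam in (2, 8, 32):                         # increasing mean counts per point
            counts = rng.poisson(lam, size=10_000)
            print(lam, round(counts.std(), 2), round(np.sqrt(counts).std(), 2))
        # The raw SD grows with the mean, while the SD of sqrt-transformed counts stays near 0.5.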

  13. Detective quantum efficiency of photon-counting x-ray detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanguay, Jesse, E-mail: jessetan@mail.ubc.ca; Yun, Seungman; Kim, Ho Kyung

    Purpose: Single-photon-counting (SPC) x-ray imaging has the potential to improve image quality and enable novel energy-dependent imaging methods. Similar to conventional detectors, optimizing image SPC quality will require systems that produce the highest possible detective quantum efficiency (DQE). This paper builds on the cascaded-systems analysis (CSA) framework to develop a comprehensive description of the DQE of SPC detectors that implement adaptive binning. Methods: The DQE of SPC systems can be described using the CSA approach by propagating the probability density function (PDF) of the number of image-forming quanta through simple quantum processes. New relationships are developed to describe PDF transfer through serial and parallel cascades to accommodate scatter reabsorption. Results are applied to hypothetical silicon and selenium-based flat-panel SPC detectors including the effects of reabsorption of characteristic/scatter photons from photoelectric and Compton interactions, stochastic conversion of x-ray energy to secondary quanta, depth-dependent charge collection, and electronic noise. Results are compared with a Monte Carlo study. Results: Depth-dependent collection efficiency can result in substantial broadening of photopeaks that in turn may result in reduced DQE at lower x-ray energies (20–45 keV). Double-counting interaction events caused by reabsorption of characteristic/scatter photons may result in falsely inflated image signal-to-noise ratio and potential overestimation of the DQE. Conclusions: The CSA approach is extended to describe signal and noise propagation through photoelectric and Compton interactions in SPC detectors, including the effects of escape and reabsorption of emission/scatter photons. High-performance SPC systems can be achieved but only for certain combinations of secondary conversion gain, depth-dependent collection efficiency, electronic noise, and reabsorption characteristics.

  14. Detective quantum efficiency of photon-counting x-ray detectors.

    PubMed

    Tanguay, Jesse; Yun, Seungman; Kim, Ho Kyung; Cunningham, Ian A

    2015-01-01

    Single-photon-counting (SPC) x-ray imaging has the potential to improve image quality and enable novel energy-dependent imaging methods. Similar to conventional detectors, optimizing image SPC quality will require systems that produce the highest possible detective quantum efficiency (DQE). This paper builds on the cascaded-systems analysis (CSA) framework to develop a comprehensive description of the DQE of SPC detectors that implement adaptive binning. The DQE of SPC systems can be described using the CSA approach by propagating the probability density function (PDF) of the number of image-forming quanta through simple quantum processes. New relationships are developed to describe PDF transfer through serial and parallel cascades to accommodate scatter reabsorption. Results are applied to hypothetical silicon and selenium-based flat-panel SPC detectors including the effects of reabsorption of characteristic/scatter photons from photoelectric and Compton interactions, stochastic conversion of x-ray energy to secondary quanta, depth-dependent charge collection, and electronic noise. Results are compared with a Monte Carlo study. Depth-dependent collection efficiency can result in substantial broadening of photopeaks that in turn may result in reduced DQE at lower x-ray energies (20-45 keV). Double-counting interaction events caused by reabsorption of characteristic/scatter photons may result in falsely inflated image signal-to-noise ratio and potential overestimation of the DQE. The CSA approach is extended to describe signal and noise propagation through photoelectric and Compton interactions in SPC detectors, including the effects of escape and reabsorption of emission/scatter photons. High-performance SPC systems can be achieved but only for certain combinations of secondary conversion gain, depth-dependent collection efficiency, electronic noise, and reabsorption characteristics.
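
    For orientation, the DQE figure of merit referred to above is conventionally defined as the squared output signal-to-noise ratio over the squared input (photon-limited) SNR; the tiny helper below states only that textbook zero-frequency definition, not the cascaded-systems model developed in the paper:

        def dqe(snr_out, snr_in):
            """Zero-frequency detective quantum efficiency: (SNR_out / SNR_in)**2."""
            return (snr_out / snr_in) ** 2

        # Ideal counting of N incident quanta detected with efficiency eta:
        # SNR_in = sqrt(N), SNR_out = sqrt(eta * N), so DQE = eta.
        print(dqe(snr_out=(0.5 * 10000) ** 0.5, snr_in=10000 ** 0.5))   # -> 0.5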

  15. Construction of Chinese adult male phantom library and its application in the virtual calibration of in vivo measurement.

    PubMed

    Chen, Yizheng; Qiu, Rui; Li, Chunyan; Wu, Zhen; Li, Junli

    2016-03-07

    In vivo measurement is a main method of internal contamination evaluation, particularly for large numbers of people after a nuclear accident. Before the practical application, it is necessary to obtain the counting efficiency of the detector by calibration. The virtual calibration based on Monte Carlo simulation usually uses the reference human computational phantom, and the morphological difference between the monitored personnel and the calibrated phantom may lead to a deviation of the counting efficiency. Therefore, a phantom library containing a wide range of heights and total body masses is needed. In this study, a Chinese reference adult male polygon surface (CRAM_S) phantom was constructed based on the CRAM voxel phantom, with the organ models adjusted to match the Chinese reference data. The CRAM_S phantom was then transformed to a sitting posture for convenience in practical monitoring. Referring to the mass and height distribution of the Chinese adult male, a phantom library containing 84 phantoms was constructed by deforming the reference surface phantom. Phantoms in the library have 7 different heights ranging from 155 cm to 185 cm, and there are 12 phantoms with different total body masses at each height. As an example of application, organ specific and total counting efficiencies of Ba-133 were calculated using the MCNPX code, with two series of phantoms selected from the library. The influence of morphological variation on the counting efficiency was analyzed. The results show that using only the reference phantom in virtual calibration may lead to an error of 68.9% in total counting efficiency. Thus the influence of morphological difference on virtual calibration can be greatly reduced by using the phantom library with a wide range of masses and heights instead of a single reference phantom.

  16. Construction of Chinese adult male phantom library and its application in the virtual calibration of in vivo measurement

    NASA Astrophysics Data System (ADS)

    Chen, Yizheng; Qiu, Rui; Li, Chunyan; Wu, Zhen; Li, Junli

    2016-03-01

    In vivo measurement is a main method of internal contamination evaluation, particularly for large numbers of people after a nuclear accident. Before the practical application, it is necessary to obtain the counting efficiency of the detector by calibration. The virtual calibration based on Monte Carlo simulation usually uses the reference human computational phantom, and the morphological difference between the monitored personnel and the calibrated phantom may lead to a deviation of the counting efficiency. Therefore, a phantom library containing a wide range of heights and total body masses is needed. In this study, a Chinese reference adult male polygon surface (CRAM_S) phantom was constructed based on the CRAM voxel phantom, with the organ models adjusted to match the Chinese reference data. The CRAM_S phantom was then transformed to a sitting posture for convenience in practical monitoring. Referring to the mass and height distribution of the Chinese adult male, a phantom library containing 84 phantoms was constructed by deforming the reference surface phantom. Phantoms in the library have 7 different heights ranging from 155 cm to 185 cm, and there are 12 phantoms with different total body masses at each height. As an example of application, organ specific and total counting efficiencies of Ba-133 were calculated using the MCNPX code, with two series of phantoms selected from the library. The influence of morphological variation on the counting efficiency was analyzed. The results show that using only the reference phantom in virtual calibration may lead to an error of 68.9% in total counting efficiency. Thus the influence of morphological difference on virtual calibration can be greatly reduced by using the phantom library with a wide range of masses and heights instead of a single reference phantom.

  17. An efficient annotation and gene-expression derivation tool for Illumina Solexa datasets

    PubMed Central

    2010-01-01

    Background The data produced by an Illumina flow cell with all eight lanes occupied amount to well over a terabyte of images, with gigabytes of reads following sequence alignment. The ability to translate such reads into meaningful annotation is therefore of great concern and importance. One can very easily be flooded with such a great volume of textual, unannotated data irrespective of read quality or size. CASAVA, an optional analysis tool for Illumina sequencing experiments, provides INDEL detection, SNP information, and allele calling. Not only extracting from such analysis a measure of gene expression in the form of tag-counts, but furthermore annotating such reads, is therefore of significant value. Findings We developed TASE (Tag counting and Analysis of Solexa Experiments), a rapid tag-counting and annotation software tool specifically designed for Illumina CASAVA sequencing datasets. Developed in Java and deployed using the jTDS JDBC driver and a SQL Server backend, TASE provides an extremely fast means of calculating gene expression through tag-counts while annotating sequenced reads with the gene's presumed function, from any given CASAVA-build. Such a build is generated for both DNA and RNA sequencing. Analysis is broken into two distinct components: DNA sequence or read concatenation, followed by tag-counting and annotation. The end result is output containing the homology-based functional annotation and the respective gene expression measure, signifying how many times sequenced reads were found within the genomic ranges of functional annotations. Conclusions TASE is a powerful tool that facilitates the process of annotating a given Illumina Solexa sequencing dataset. Our results indicate that both homology-based annotation and tag-count analysis are achieved in very efficient times, allowing researchers to delve deep into a given CASAVA-build and maximize information extraction from a sequencing dataset. TASE is specially designed to translate sequence data in a CASAVA-build into functional annotations while producing corresponding gene expression measurements. Such analysis is executed in an ultrafast and highly efficient manner, whether the analysis is a single-read or paired-end sequencing experiment. TASE is a user-friendly and freely available application, allowing rapid analysis and annotation of any given Illumina Solexa sequencing dataset with ease. PMID:20598141
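
    The core tag-counting step can be illustrated with a toy interval-overlap counter (purely illustrative; TASE itself works from CASAVA builds with a Java/SQL Server implementation, and the names and data below are invented):

        from collections import defaultdict

        def tag_counts(read_positions, gene_ranges):
            """Count aligned reads whose start position falls inside each annotated gene range."""
            counts = defaultdict(int)
            for chrom, pos in read_positions:
                for gene, (g_chrom, start, end) in gene_ranges.items():
                    if chrom == g_chrom and start <= pos <= end:
                        counts[gene] += 1
            return dict(counts)

        print(tag_counts([("chr1", 120), ("chr1", 150), ("chr2", 30)],
                         {"geneA": ("chr1", 100, 200), "geneB": ("chr2", 1, 50)}))
        # -> {'geneA': 2, 'geneB': 1}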

  18. Does the covariance structure matter in longitudinal modelling for the prediction of future CD4 counts?

    PubMed

    Taylor, J M; Law, N

    1998-10-30

    We investigate the importance of the assumed covariance structure for longitudinal modelling of CD4 counts, and examine how individual predictions of future CD4 counts are affected by that structure. We consider four covariance structures: one based on an integrated Ornstein-Uhlenbeck stochastic process, one based on Brownian motion, and two derived from standard linear and quadratic random-effects models. Using data from the Multicenter AIDS Cohort Study and from a simulation study, we show that there is a noticeable deterioration in the coverage rate of confidence intervals if we assume the wrong covariance, together with a loss in efficiency. The quadratic random-effects model is found to be the best in terms of correctly calibrated prediction intervals, but is substantially less efficient than the others. Incorrectly specifying the covariance structure as linear random effects gives prediction intervals that are too narrow, with poor coverage rates. The model based on the integrated Ornstein-Uhlenbeck stochastic process is the preferred one of the four considered because of its efficiency and robustness properties. We also use the difference between the future predicted and observed CD4 counts to assess an appropriate transformation of CD4 counts; a fourth root, cube root and square root all appear reasonable choices.
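
    The contrast between these covariance assumptions is easy to make concrete. The sketch below (illustrative parameter values only, not estimates from the Multicenter AIDS Cohort Study) builds the marginal covariance matrix implied by a linear random-effects model and by a Brownian-motion component for one subject's observation times.

```python
import numpy as np

# Hedged sketch of two covariance structures of the kind compared in the paper,
# written for a single subject observed at times t. Parameter values are illustrative.

t = np.array([0.0, 0.5, 1.0, 2.0, 3.0])   # years since baseline
sigma2_e = 1.0                             # measurement-error variance

# Linear random-effects model: random intercept + random slope, Cov = Z G Z' + sigma^2 I
Z = np.column_stack([np.ones_like(t), t])
G = np.array([[4.0, 0.5],
              [0.5, 1.0]])                 # covariance of (intercept, slope)
cov_linear = Z @ G @ Z.T + sigma2_e * np.eye(len(t))

# Brownian-motion component: Cov(B(s), B(t)) = kappa * min(s, t)
kappa = 2.0
cov_bm = kappa * np.minimum.outer(t, t) + sigma2_e * np.eye(len(t))

print(np.round(cov_linear, 2))
print(np.round(cov_bm, 2))
```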

  19. Influence of Survey Length and Radius Size on Grassland Bird Surveys by Point Counts at Williams Lake, British Columbia

    Treesearch

    Jean-Pierre L. Savard; Tracey D. Hooper

    1995-01-01

    We examine the effect of survey length and radius on the results of point count surveys for grassland birds at Williams Lake, British Columbia. Four- and 8-minute counts detected on average 68 percent and 85 percent of the number of birds detected during 12-minute counts. The most efficient sampling duration was 4 minutes, as long as travel time between points was...

  20. Examples of Mesh and NURBS modelling for in vivo lung counting studies.

    PubMed

    Farah, Jad; Broggio, David; Franck, Didier

    2011-03-01

    Realistic calibration coefficients for in vivo counting installations are assessed using voxel phantoms and Monte Carlo calculations. However, voxel phantom construction is time consuming and its flexibility extremely limited. This paper uses Mesh and non-uniform rational B-spline (NURBS) graphical formats, which offer greater flexibility, to optimise the calibration of in vivo counting installations. Two studies validating the use of such phantoms and involving geometry deformation and modelling were carried out to study the morphological effect on lung counting efficiency. The created 3D models agreed with the reference ones, with volumetric differences of <5 %. Moreover, it was found that counting efficiency varies with the inverse of lung volume, and that lung volume dominates over chest wall thickness. Finally, a series of thoracic female phantoms of various cup sizes, chest girths and internal organ volumes were created, starting from the International Commission on Radiological Protection (ICRP) adult female reference computational phantom, to give correction factors for the lung monitoring of female workers.

  1. Plutonium and uranium determination in environmental samples: combined solvent extraction-liquid scintillation method.

    PubMed

    McDowell, W J; Farrar, D T; Billings, M R

    1974-12-01

    A method for the determination of uranium and plutonium by a combined high-resolution liquid scintillation-solvent extraction method is presented. Assuming a sample count equal to background count to be the detection limit, the lower detection limit for these and other alpha-emitting nuclides is 1.0 dpm with a Pyrex sample tube, 0.3 dpm with a quartz sample tube using present detector shielding, or 0.02 dpm with pulse-shape discrimination. Alpha-counting efficiency is 100%. With the counting data presented as an alpha-energy spectrum, an energy resolution of 0.2-0.3 MeV peak half-width and an energy identification to +/-0.1 MeV are possible. Thus, within these limits, identification and quantitative determination of a specific alpha-emitter, independent of chemical separation, are possible. The separation procedure allows greater than 98% recovery of uranium and plutonium from solution containing large amounts of iron and other interfering substances. In most cases uranium, even when present in 10(8)-fold molar ratio, may be quantitatively separated from plutonium without loss of the plutonium. Potential applications of this general analytical concept to other alpha-counting problems are noted. Special problems associated with the determination of plutonium in soil and water samples are discussed. Results of tests to determine the pulse-height and energy-resolution characteristics of several scintillators are presented. Construction of the high-resolution liquid scintillation detector is described.

  2. Conditional analysis of mixed Poisson processes with baseline counts: implications for trial design and analysis.

    PubMed

    Cook, Richard J; Wei, Wei

    2003-07-01

    The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process, and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
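
    A hedged sketch of fitting a conditional count model of this general kind (a negative binomial regression of the on-treatment count on treatment arm and log baseline count) is shown below with simulated data; it illustrates the modelling idea, not the authors' exact specification.

```python
import numpy as np
import statsmodels.api as sm

# Hedged sketch: regress the on-treatment event count on treatment arm and the
# log of the baseline count with a negative binomial GLM. The simulated data
# and coefficient values are purely illustrative.

rng = np.random.default_rng(0)
n = 200
baseline = rng.poisson(5, size=n) + 1                  # pre-randomization counts
treat = rng.integers(0, 2, size=n)                     # 0 = control, 1 = treatment
mu = np.exp(0.2 + 0.8 * np.log(baseline) - 0.4 * treat)
y = rng.negative_binomial(n=2.0, p=2.0 / (2.0 + mu))   # response counts with mean mu

X = sm.add_constant(np.column_stack([np.log(baseline), treat]))
fit = sm.GLM(y, X, family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(fit.params)   # intercept, log-baseline effect, treatment effect
```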

  3. Counting in Lattices: Combinatorial Problems from Statistical Mechanics.

    NASA Astrophysics Data System (ADS)

    Randall, Dana Jill

    In this thesis we consider two classical combinatorial problems arising in statistical mechanics: counting matchings and self-avoiding walks in lattice graphs. The first problem arises in the study of the thermodynamical properties of monomers and dimers (diatomic molecules) in crystals. Fisher, Kasteleyn and Temperley discovered an elegant technique to exactly count the number of perfect matchings in two dimensional lattices, but it is not applicable for matchings of arbitrary size, or in higher dimensional lattices. We present the first efficient approximation algorithm for computing the number of matchings of any size in any periodic lattice in arbitrary dimension. The algorithm is based on Monte Carlo simulation of a suitable Markov chain and has rigorously derived performance guarantees that do not rely on any assumptions. In addition, we show that these results generalize to counting matchings in any graph which is the Cayley graph of a finite group. The second problem is counting self-avoiding walks in lattices. This problem arises in the study of the thermodynamics of long polymer chains in dilute solution. While there are a number of Monte Carlo algorithms used to count self-avoiding walks in practice, these are heuristic and their correctness relies on unproven conjectures. In contrast, we present an efficient algorithm which relies on a single, widely-believed conjecture that is simpler than preceding assumptions and, more importantly, is one which the algorithm itself can test. Thus our algorithm is reliable, in the sense that it either outputs answers that are guaranteed, with high probability, to be correct, or finds a counterexample to the conjecture. In either case we know we can trust our results and the algorithm is guaranteed to run in polynomial time. This is the first algorithm for counting self-avoiding walks in which the error bounds are rigorously controlled. This work was supported in part by an AT&T graduate fellowship, a University of California dissertation year fellowship and Esprit working group "RAND". Part of this work was done while visiting ICSI and the University of Edinburgh.
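
    As a rough illustration of the Monte Carlo flavour of such algorithms, the sketch below implements one step of a simple monomer-dimer chain that toggles a random edge of a graph in or out of the current matching. It is a generic textbook-style chain under an assumed uniform edge selection, not the specific chain analysed in the thesis.

```python
import random

# Hedged sketch of a simple monomer-dimer Markov chain on matchings of a graph:
# pick a uniformly random edge and try to toggle it in the current matching.

def chain_step(edges, matching):
    """edges: list of (u, v) tuples; matching: set of edges currently in the matching."""
    u, v = random.choice(edges)
    if (u, v) in matching:
        matching.remove((u, v))                           # delete this dimer
    elif all(u not in e and v not in e for e in matching):
        matching.add((u, v))                              # both endpoints free: add the dimer
    return matching                                       # otherwise stay put

# 2x2 grid graph, run a few steps of the chain
edges = [(0, 1), (2, 3), (0, 2), (1, 3)]
m = set()
for _ in range(1000):
    m = chain_step(edges, m)
print(m)
```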

  4. On the single-photon-counting (SPC) modes of imaging using an XFEL source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Zhehui

    In this study, the requirements to achieve high detection efficiency (above 50%) and gigahertz (GHz) frame rate for the proposed 42-keV X-ray free-electron laser (XFEL) at Los Alamos are summarized. Direct detection scenarios using C (diamond), Si, Ge and GaAs semiconductor sensors are analyzed. Single-photon counting (SPC) mode and weak SPC mode using Si can potentially meet the efficiency and frame rate requirements and be useful to both photoelectric absorption and Compton physics as the photon energy increases. Multilayer three-dimensional (3D) detector architecture, as a possible means to realize SPC modes, is compared with the widely used two-dimensional (2D) hybrid planar electrode structure and 3D deeply entrenched electrode architecture. Demonstration of thin film cameras less than 100-μm thick with onboard thin ASICs could be an initial step to realize multilayer 3D detectors and SPC modes for XFELs.

  5. On the single-photon-counting (SPC) modes of imaging using an XFEL source

    DOE PAGES

    Wang, Zhehui

    2015-12-14

    In this study, the requirements to achieve high detection efficiency (above 50%) and gigahertz (GHz) frame rate for the proposed 42-keV X-ray free-electron laser (XFEL) at Los Alamos are summarized. Direct detection scenarios using C (diamond), Si, Ge and GaAs semiconductor sensors are analyzed. Single-photon counting (SPC) mode and weak SPC mode using Si can potentially meet the efficiency and frame rate requirements and be useful to both photoelectric absorption and Compton physics as the photon energy increases. Multilayer three-dimensional (3D) detector architecture, as a possible means to realize SPC modes, is compared with the widely used two-dimensional (2D) hybrid planar electrode structure and 3D deeply entrenched electrode architecture. Demonstration of thin film cameras less than 100-μm thick with onboard thin ASICs could be an initial step to realize multilayer 3D detectors and SPC modes for XFELs.

  6. Evaluation of a near-infrared photomultiplier

    NASA Technical Reports Server (NTRS)

    Evans, W. E.

    1978-01-01

    A high performance near infrared sensitive photomultiplier tube was procured and evaluated with emphasis on those characteristics affecting its use over the very large amplitude range of signals encountered by an airborne lidar intended for mapping the distribution of stratospheric aerosols. A cathode quantum efficiency of 4.3 percent at 1.06 micrometer wavelength and a background count of less than 10,000 per second were realized. It is recommended that the tube be stored and operated at a temperature near -20 C, or cooler. Performance was found acceptable for the application in both pulse counting and analog modes, but careful design, probably including dynamic gain control, will be required to effectively utilize both modes on the same lidar shot.

  7. Beyond core count: a look at new mainstream computing platforms for HEP workloads

    NASA Astrophysics Data System (ADS)

    Szostek, P.; Nowak, A.; Bitzes, G.; Valsan, L.; Jarp, S.; Dotti, A.

    2014-06-01

    As Moore's Law continues to deliver more and more transistors, the mainstream processor industry is preparing to expand its investments in areas other than simple core count. These new interests include deep integration of on-chip components, advanced vector units, memory, cache and interconnect technologies. We examine these moving trends with parallelized and vectorized High Energy Physics workloads in mind. In particular, we report on practical experience resulting from experiments with scalable HEP benchmarks on the Intel "Ivy Bridge-EP" and "Haswell" processor families. In addition, we examine the benefits of the new "Haswell" microarchitecture and its impact on multiple facets of HEP software. Finally, we report on the power efficiency of new systems.

  8. Radiation and Temperature Hard Multi-Pixel Avalanche Photodiodes

    NASA Technical Reports Server (NTRS)

    Bensaoula, Abdelhak (Inventor); Starikov, David (Inventor); Pillai, Rajeev (Inventor)

    2017-01-01

    The structure and method of fabricating a radiation and temperature hard avalanche photodiode with integrated radiation and temperature hard readout circuit, comprising a substrate, an avalanche region, an absorption region, and a plurality of Ohmic contacts are presented. The present disclosure provides for tuning of spectral sensitivity and high device efficiency, resulting in photon counting capability with decreased crosstalk and reduced dark current.

  9. Microfluidic assay of circulating endothelial cells in coronary artery disease patients with angina pectoris

    PubMed Central

    Chen, Shuiyu; Sun, Yukun; Neoh, Kuang Hong; Chen, Anqi; Li, Weiju; Yang, Xiaorui

    2017-01-01

    Background Circulating endothelial cells (CECs) are widely reported as a promising biomarker of endothelial damage/dysfunction in coronary artery disease (CAD). The two popular methods of CEC quantification include the use of immunomagnetic beads separation (IB) and flow cytometry analysis (FC); however, they suffer from two main shortcomings that affect their diagnostic and prognostic responses: non-specific bindings of magnetic beads to non-target cells and a high degree of variability in rare cell identification, respectively. We designed a microfluidic chip with spatially staggered micropillars for the efficient harvesting of CECs with intact cellular morphology in an attempt to revisit the diagnostic goal of CEC counts in CAD patients with angina pectoris. Methods A label-free microfluidic assay that involved an in-situ enumeration and immunofluorescent identification (DAPI+/CD146+/VEGFR1+/CD45-) of CECs was carried out to assess the CEC count in human peripheral blood samples. A total of 55 CAD patients with angina pectoris [16 with chronic stable angina (CSA) and 39 with unstable angina (UA)], together with 15 healthy controls (HCs) were enrolled in the study. Results CEC counts are significantly higher in both CSA and UA groups compared to the HC group [respective medians of 6.9, 10.0 and 1.5 cells/ml (p < 0.01)]. Further, a significant elevation of CEC count was observed in the three UA subgroups [low risk (5.3) vs. intermediate risk (10.8) vs. high risk (18.0) cells/ml, p < 0.001] classified in accordance with the TIMI NSTEMI/UA risk score system. From the receiver-operating characteristic curve analysis, the AUCs for distinguishing CSA and UA from HC were 0.867 and 0.938, respectively. The corresponding sensitivities were 87.5% and 84.6% and the specificities were 66.7% and 86.7%, respectively. Conclusions Our microfluidic assay system is efficient and stable for CEC capture and enumeration. The results showed that the CEC count has the potential to be a promising clinical biomarker for the assessment of endothelial damage/dysfunction in CAD patients with angina pectoris. PMID:28704506

  10. Microfluidic assay of circulating endothelial cells in coronary artery disease patients with angina pectoris.

    PubMed

    Chen, Shuiyu; Sun, Yukun; Neoh, Kuang Hong; Chen, Anqi; Li, Weiju; Yang, Xiaorui; Han, Ray P S

    2017-01-01

    Circulating endothelial cells (CECs) are widely reported as a promising biomarker of endothelial damage/dysfunction in coronary artery disease (CAD). The two popular methods of CEC quantification include the use of immunomagnetic beads separation (IB) and flow cytometry analysis (FC); however, they suffer from two main shortcomings that affect their diagnostic and prognostic responses: non-specific bindings of magnetic beads to non-target cells and a high degree of variability in rare cell identification, respectively. We designed a microfluidic chip with spatially staggered micropillars for the efficient harvesting of CECs with intact cellular morphology in an attempt to revisit the diagnostic goal of CEC counts in CAD patients with angina pectoris. A label-free microfluidic assay that involved an in-situ enumeration and immunofluorescent identification (DAPI+/CD146+/VEGFR1+/CD45-) of CECs was carried out to assess the CEC count in human peripheral blood samples. A total of 55 CAD patients with angina pectoris [16 with chronic stable angina (CSA) and 39 with unstable angina (UA)], together with 15 healthy controls (HCs) were enrolled in the study. CEC counts are significantly higher in both CSA and UA groups compared to the HC group [respective medians of 6.9, 10.0 and 1.5 cells/ml (p < 0.01)]. Further, a significant elevation of CEC count was observed in the three UA subgroups [low risk (5.3) vs. intermediate risk (10.8) vs. high risk (18.0) cells/ml, p < 0.001] classified in accordance with the TIMI NSTEMI/UA risk score system. From the receiver-operating characteristic curve analysis, the AUCs for distinguishing CSA and UA from HC were 0.867 and 0.938, respectively. The corresponding sensitivities were 87.5% and 84.6% and the specificities were 66.7% and 86.7%, respectively. Our microfluidic assay system is efficient and stable for CEC capture and enumeration. The results showed that the CEC count has the potential to be a promising clinical biomarker for the assessment of endothelial damage/dysfunction in CAD patients with angina pectoris.
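
    The receiver-operating-characteristic analysis reported above can be reproduced in outline as follows; the CEC counts and labels in this sketch are illustrative stand-ins, not the study data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score, roc_curve

# Hedged sketch of an ROC analysis on CEC counts (cells/ml): healthy controls
# are labelled 0 and angina patients 1. Values below are illustrative only.

cec_counts = np.array([1.0, 1.5, 2.0, 1.2, 6.5, 9.0, 10.5, 7.8, 12.0, 18.0])
is_patient = np.array([0,   0,   0,   0,   1,   1,   1,    1,   1,    1])

auc = roc_auc_score(is_patient, cec_counts)
fpr, tpr, thresholds = roc_curve(is_patient, cec_counts)

# Sensitivity/specificity at each candidate CEC cut-off
for thr, se, sp in zip(thresholds, tpr, 1 - fpr):
    print(f"cut-off {thr:5.1f} cells/ml: sensitivity {se:.2f}, specificity {sp:.2f}")
print(f"AUC = {auc:.3f}")
```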

  11. Sample to answer visualization pipeline for low-cost point-of-care blood cell counting

    NASA Astrophysics Data System (ADS)

    Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter

    2015-03-01

    We present a visualization pipeline from sample to answer for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first iteration microfluidic device [3] showed that the simplest, and thus lowest-cost, approach for microfluidic component implementation was not adequate compared with techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
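
    A hedged sketch of the colour-based segmentation step is given below; the hue band, saturation/value limits and the frame file name are assumptions for illustration, not the authors' calibrated values.

```python
import cv2
import numpy as np

# Hedged sketch: convert one frame of the high-speed video to HSV and threshold
# a single fluid's hue band to estimate how much of the field of view it fills.

frame = cv2.imread("cartridge_frame.png")       # hypothetical video frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

lower = np.array([100, 80, 50])                 # illustrative "blue dye" band
upper = np.array([130, 255, 255])
mask = cv2.inRange(hsv, lower, upper)

# Fraction of the frame occupied by this fluid, a crude proxy for fill volume
fill_fraction = float(np.count_nonzero(mask)) / mask.size
print(f"segmented fluid covers {fill_fraction:.1%} of the frame")
```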

  12. Counter-Based Broadcast Scheme Considering Reachability, Network Density, and Energy Efficiency for Wireless Sensor Networks.

    PubMed

    Jung, Ji-Young; Seo, Dong-Yoon; Lee, Jung-Ryun

    2018-01-04

    A wireless sensor network (WSN) is emerging as an innovative method for gathering information that will significantly improve the reliability and efficiency of infrastructure systems. Broadcast is a common method to disseminate information in WSNs. A variety of counter-based broadcast schemes have been proposed to mitigate broadcast-storm problems, using a count threshold value and a random access delay. However, because the propagation of the broadcast message is limited, there exists a trade-off: redundant retransmissions of the broadcast message are reduced and the energy efficiency of a node is enhanced, but reachability becomes low. Therefore, it is necessary to study an efficient counter-based broadcast scheme that can dynamically adjust the random access delay and count threshold value to ensure high reachability, low redundancy of broadcast messages, and low energy consumption of nodes. Thus, in this paper, we first measure the additional coverage provided by a node that receives the same broadcast message from two neighbor nodes, in order to achieve high reachability with low redundant retransmissions of broadcast messages. Second, we propose a new counter-based broadcast scheme considering the size of the additional coverage area, the distance between the node and the broadcasting node, the remaining battery of the node, and variations in node density. Finally, we evaluate the performance of the proposed scheme compared with existing counter-based broadcast schemes. Simulation results show that the proposed scheme outperforms the existing schemes in terms of saved rebroadcasts, reachability, and total energy consumption.
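
    For concreteness, the sketch below encodes the basic counter-based rule that such schemes extend: wait a random access delay, count duplicate receptions, and rebroadcast only if the count stays below a threshold. The threshold and slot parameters are illustrative assumptions, not values from the paper.

```python
import random

# Hedged sketch of the classic counter-based broadcast rule: on first reception,
# wait a random access delay while counting duplicate copies of the message, and
# rebroadcast only if fewer than `count_threshold` copies were overheard.

def random_access_delay(max_slots=31, slot_time_ms=1.0):
    """Pick a random backoff delay of up to max_slots slots."""
    return random.randint(0, max_slots) * slot_time_ms

def should_rebroadcast(duplicates_heard_during_delay, count_threshold=3):
    """Rebroadcast only when few duplicates were heard, i.e. coverage gain is likely."""
    return duplicates_heard_during_delay < count_threshold

# Example: a node overhears 2 duplicate copies while waiting out its delay
delay = random_access_delay()
duplicates = 2
print(f"wait {delay:.1f} ms, rebroadcast: {should_rebroadcast(duplicates)}")
```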

  13. Machine Learning Based Single-Frame Super-Resolution Processing for Lensless Blood Cell Counting

    PubMed Central

    Huang, Xiwei; Jiang, Yu; Liu, Xu; Xu, Hang; Han, Zhi; Rong, Hailong; Yang, Haiping; Yan, Mei; Yu, Hao

    2016-01-01

    A lensless blood cell counting system integrating a microfluidic channel and a complementary metal oxide semiconductor (CMOS) image sensor is a promising technique to miniaturize the conventional optical lens based imaging system for point-of-care testing (POCT). However, such a system has limited resolution, making it imperative to improve resolution at the system level using super-resolution (SR) processing. Yet, how to improve resolution towards better cell detection and recognition at low processing cost and without degrading system throughput is still a challenge. In this article, two machine learning based single-frame SR processing approaches are proposed and compared for lensless blood cell counting, namely the Extreme Learning Machine based SR (ELMSR) and Convolutional Neural Network based SR (CNNSR). Moreover, lensless blood cell counting prototypes using commercial CMOS image sensors and custom designed backside-illuminated CMOS image sensors are demonstrated with ELMSR and CNNSR. When one captured low-resolution lensless cell image is input, an improved high-resolution cell image will be output. The experimental results show that the cell resolution is improved by 4×, and CNNSR shows a 9.5% improvement over ELMSR in resolution enhancement. The cell counting results also match well with a commercial flow cytometer. Such ELMSR and CNNSR therefore have the potential for efficient resolution improvement in lensless blood cell counting systems towards POCT applications. PMID:27827837

  14. Monitoring Oilfield Operations and GHG Emissions Sources Using Object-based Image Analysis of High Resolution Spatial Imagery

    NASA Astrophysics Data System (ADS)

    Englander, J. G.; Brodrick, P. G.; Brandt, A. R.

    2015-12-01

    Fugitive emissions from oil and gas extraction have become a greater concern with the recent increases in development of shale hydrocarbon resources. There are significant gaps in the tools and research used to estimate fugitive emissions from oil and gas extraction. Two approaches exist for quantifying these emissions: atmospheric (or 'top down') studies, which measure methane fluxes remotely, or inventory-based ('bottom up') studies, which aggregate leakage rates on an equipment-specific basis. Bottom-up studies require counting or estimating how many devices might be leaking (called an 'activity count'), as well as how much each device might leak on average (an 'emissions factor'). In a real-world inventory, there is uncertainty in both activity counts and emissions factors. Even at the well level there are significant disagreements in data reporting. For example, some prior studies noted a ~5x difference in the number of reported well completions in the United States between EPA and private data sources. The purpose of this work is to address activity count uncertainty by using machine learning algorithms to classify oilfield surface facilities using high-resolution spatial imagery. This method can help estimate venting and fugitive emissions sources in regions where reporting of oilfield equipment is incomplete or non-existent. This work utilizes high resolution satellite imagery to count well pads in the Bakken oil field of North Dakota. This initial study examines an area of ~2,000 km^2 with ~1000 well pads. We compare different machine learning classification techniques, and explore the impact of training set size, input variables, and image segmentation settings to develop efficient and robust techniques for identifying well pads. We discuss the tradeoffs inherent to different classification algorithms, and determine the optimal algorithms for oilfield feature detection. In the future, the results of this work will be leveraged to provide activity counts of oilfield surface equipment including tanks, pumpjacks, and holding ponds.
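
    A minimal sketch of the supervised-classification step, assuming generic object-based features and synthetic labels rather than the actual imagery-derived features, might look like this:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Hedged sketch: classify image segments as well pad / not well pad from
# per-segment features (e.g. mean band reflectances, texture, segment area).
# Features and labels below are randomly generated stand-ins, not real data.

rng = np.random.default_rng(42)
X = rng.normal(size=(1000, 6))            # 6 illustrative object-based features
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(scale=0.5, size=1000)) > 1.0

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
print(f"hold-out accuracy: {clf.score(X_te, y_te):.2f}")

# A crude activity count for the scene: number of segments classified as pads
print(f"predicted well-pad count: {int(clf.predict(X_te).sum())}")
```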

  15. Fixed-Radius Point Counts in Forests: Factors Influencing Effectiveness and Efficiency

    Treesearch

    Daniel R. Petit; Lisa J. Petit; Victoria A. Saab; Thomas E. Martin

    1995-01-01

    The effectiveness of fixed-radius point counts in quantifying abundance and richness of bird species in oak-hickory, pine-hardwoods, mixed-mesophytic, beech-maple, and riparian cottonwood forests was evaluated in Arkansas, Ohio, Kentucky, and Idaho. Effects of count duration and numbers of stations and visits per stand were evaluated in May to July 1991 by conducting...

  16. Summing coincidence correction for γ-ray measurements using the HPGe detector with a low background shielding system

    NASA Astrophysics Data System (ADS)

    He, L.-C.; Diao, L.-J.; Sun, B.-H.; Zhu, L.-H.; Zhao, J.-W.; Wang, M.; Wang, K.

    2018-02-01

    A Monte Carlo method based on the GEANT4 toolkit has been developed to correct the full-energy peak (FEP) efficiencies of a high purity germanium (HPGe) detector equipped with a low background shielding system, and the corrections were evaluated numerically using summing peaks. It is found that the FEP efficiencies for 60Co, 133Ba and 152Eu can be improved by up to 18% by taking the calculated true summing coincidence factor (TSCF) corrections into account. Counts in the summing coincidence γ peaks of the 152Eu spectrum are well reproduced using the corrected efficiency curve, within an accuracy of 3%.
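
    A small worked example of applying such a correction factor is given below; the efficiency value, the TSCF value and the multiplicative convention are assumptions for illustration, not numbers from the paper.

```python
# Hedged sketch of applying a true summing coincidence factor (TSCF) to a
# measured full-energy-peak (FEP) efficiency. The convention assumed here is
# that the factor restores counts lost through coincidence summing; the
# numerical values are illustrative placeholders.

def corrected_fep_efficiency(measured_efficiency, tscf):
    """Apply a Monte Carlo-derived summing correction to a measured FEP efficiency."""
    return measured_efficiency * tscf

eff_measured = 0.021   # apparent FEP efficiency from a 60Co calibration peak (assumed)
tscf = 1.15            # simulated correction for summing-out losses (assumed)
print(f"corrected efficiency: {corrected_fep_efficiency(eff_measured, tscf):.4f}")
```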

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, T; Graham, C L; Sundsmo, T

    This procedure provides instructions for the calibration and use of the Canberra iSolo Low Background Alpha/Beta Counting System (iSolo) that is used for counting air filters and swipe samples. This detector is capable of providing radioisotope identification (e.g., it can discriminate between radon daughters and plutonium). This procedure includes step-by-step instructions for: (1) Performing periodic or daily 'Background' and 'Efficiency QC' checks; (2) Setting up the iSolo for counting swipes and air filters; (3) Counting swipes and air filters for alpha and beta activity; and (4) Annual calibration.

  18. Color quench correction for low level Cherenkov counting.

    PubMed

    Tsroya, S; Pelled, O; German, U; Marco, R; Katorza, E; Alfassi, Z B

    2009-05-01

    The Cherenkov counting efficiency varies strongly with color quenching, thus correction curves must be used to obtain correct results. The external (152)Eu source of a Quantulus 1220 liquid scintillation counting (LSC) system was used to obtain a quench indicative parameter based on spectra area ratio. A color quench correction curve for aqueous samples containing (90)Sr/(90)Y was prepared. The main advantage of this method over the common spectra indicators is its usefulness also for low level Cherenkov counting.

  19. The development of strategy use in elementary school children: working memory and individual differences.

    PubMed

    Imbo, Ineke; Vandierendonck, André

    2007-04-01

    The current study tested the development of working memory involvement in children's arithmetic strategy selection and strategy efficiency. To this end, an experiment in which the dual-task method and the choice/no-choice method were combined was administered to 10- to 12-year-olds. Working memory was needed in retrieval, transformation, and counting strategies, but the ratio between available working memory resources and arithmetic task demands changed across development. More frequent retrieval use, more efficient memory retrieval, and more efficient counting processes reduced the working memory requirements. Strategy efficiency and strategy selection were also modified by individual differences such as processing speed, arithmetic skill, gender, and math anxiety. Short-term memory capacity, in contrast, was not related to children's strategy selection or strategy efficiency.

  20. Validation of an automated colony counting system for group A Streptococcus.

    PubMed

    Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R

    2016-02-08

    The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggest the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria to enable a more efficient and accurate measurement of bacteria concentration in culture.
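
    The Bland-Altman comparison with a ±10% acceptance criterion can be sketched as follows; the plate counts are invented for illustration, not taken from the study.

```python
import numpy as np

# Hedged sketch of the agreement analysis: percentage difference between manual
# and automated colony counts per plate, summarised Bland-Altman style with a
# +/-10% acceptance band. Counts below are illustrative.

manual = np.array([152, 98, 240, 67, 310, 125])
automated = np.array([149, 101, 232, 70, 301, 129])

pct_diff = 100.0 * (automated - manual) / ((automated + manual) / 2.0)
mean_diff = pct_diff.mean()
limits = (mean_diff - 1.96 * pct_diff.std(ddof=1),
          mean_diff + 1.96 * pct_diff.std(ddof=1))

print(f"mean % difference: {mean_diff:+.1f}%")
print(f"95% limits of agreement: {limits[0]:+.1f}% to {limits[1]:+.1f}%")
print("within +/-10% criterion:", abs(mean_diff) < 10.0)
```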

  1. Feasibility of a high-speed gamma-camera design using the high-yield-pileup-event-recovery method.

    PubMed

    Wong, W H; Li, H; Uribe, J; Baghaei, H; Wang, Y; Yokoyama, S

    2001-04-01

    Higher count-rate gamma cameras than are currently used are needed if the technology is to fulfill its promise in positron coincidence imaging, radionuclide therapy dosimetry imaging, and cardiac first-pass imaging. The present single-crystal design coupled with conventional detector electronics and the traditional Anger-positioning algorithm hinders higher count-rate imaging because of the pileup of gamma-ray signals in the detector and electronics. At an interaction rate of 2 million events per second, the fraction of nonpileup events is < 20% of the total incident events. Hence, the recovery of pileup events can significantly increase the count-rate capability, increase the yield of imaging photons, and minimize image artifacts associated with pileups. A new technology to significantly enhance the performance of gamma cameras in this area is introduced. We introduce a new electronic design called high-yield-pileup-event-recovery (HYPER) electronics for processing the detector signal in gamma cameras so that the individual gamma energies and positions of pileup events, including multiple pileups, can be resolved and recovered despite the mixing of signals. To illustrate the feasibility of the design concept, we have developed a small gamma-camera prototype with the HYPER-Anger electronics. The camera has a 10 x 10 x 1 cm NaI(Tl) crystal with four photomultipliers. Hot-spot and line sources with very high 99mTc activities were imaged. The phantoms were imaged continuously from 60,000 to 3,500,000 counts per second to illustrate the efficacy of the method as a function of counting rates. At 2-3 million events per second, all phantoms were imaged with little distortion, pileup, and dead-time loss. At these counting rates, multiple pileup events (> or = 3 events piling together) were the predominant occurrences, and the HYPER circuit functioned well to resolve and recover these events. The full width at half maximum of the line-spread function at 3,000,000 counts per second was 1.6 times that at 60,000 counts per second. This feasibility study showed that the HYPER electronic concept works; it can significantly increase the count-rate capability and dose efficiency of gamma cameras. In a larger clinical camera, multiple HYPER-Anger circuits may be implemented to improve imaging count rates by several times beyond those demonstrated here. This technology would facilitate the use of gamma cameras for radionuclide therapy dosimetry imaging, cardiac first-pass imaging, and positron coincidence imaging and the simultaneous acquisition of transmission and emission data using different isotopes with less cross-contamination between transmission and emission data.

  2. Energy-resolved CT imaging with a photon-counting silicon-strip detector

    NASA Astrophysics Data System (ADS)

    Persson, Mats; Huber, Ben; Karlsson, Staffan; Liu, Xuejin; Chen, Han; Xu, Cheng; Yveborg, Moa; Bornefalk, Hans; Danielsson, Mats

    2014-03-01

    Photon-counting detectors are promising candidates for use in the next generation of x-ray CT scanners. Among the foreseen benefits are higher spatial resolution, a better trade-off between noise and dose, and energy discriminating capabilities. Silicon is an attractive detector material because of its low cost, mature manufacturing process and high hole mobility. However, it is sometimes claimed to be unsuitable for use in computed tomography because of its low absorption efficiency and high fraction of Compton scatter. The purpose of this work is to demonstrate that high-quality energy-resolved CT images can nonetheless be acquired with clinically realistic exposure parameters using a photon-counting silicon-strip detector with eight energy thresholds developed in our group. We use a single detector module, consisting of a linear array of 50 0.5 × 0.4 mm detector elements, to image a phantom in a table-top lab setup. The phantom consists of a plastic cylinder with circular inserts containing water, fat and aqueous solutions of calcium, iodine and gadolinium, in different concentrations. We use basis material decomposition to obtain water, calcium, iodine and gadolinium basis images and demonstrate that these basis images can be used to separate the different materials in the inserts. We also present results showing that the detector has potential for quantitative measurements of substance concentrations.
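
    In outline, basis material decomposition amounts to solving a small inverse problem per detector reading. The sketch below uses a linearised least-squares formulation with made-up attenuation coefficients; in practice the decomposition is usually a calibrated maximum-likelihood fit, so this is only a conceptual illustration.

```python
import numpy as np

# Hedged sketch of basis material decomposition for a multi-bin photon-counting
# reading: map measured effective line integrals per energy bin to basis-material
# path lengths. The attenuation matrix entries are illustrative placeholders.

# Rows: 4 energy bins; columns: basis materials (water, calcium, iodine)
A = np.array([[0.25, 1.8, 3.5],
              [0.22, 1.2, 2.6],
              [0.20, 0.8, 1.9],
              [0.18, 0.5, 1.2]])

# Measured effective attenuation line integrals in each bin for one ray
m = np.array([1.10, 0.86, 0.68, 0.52])

thicknesses, *_ = np.linalg.lstsq(A, m, rcond=None)
for name, x in zip(["water", "calcium", "iodine"], thicknesses):
    print(f"{name}: {x:.3f} (basis-material path length, arbitrary units)")
```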

  3. Measurement of tritium with high efficiency by using liquid scintillation counter with plastic scintillator.

    PubMed

    Furuta, Etsuko; Ohyama, Ryu-ichiro; Yokota, Shigeaki; Nakajo, Toshiya; Yamada, Yuka; Kawano, Takao; Uda, Tatsuhiko; Watanabe, Yasuo

    2014-11-01

    The detection efficiency for tritium samples measured using a liquid scintillation counter with a hydrophilic plastic scintillator (PS) was approximately 48% when a 20 μL sample was held between two plasma-treated PS sheets. The activity and count rates showed a good relationship from 400 Bq to 410 kBq mL(-1). The calculated detection limit for a 2 min measurement with the PS was 13 Bq mL(-1) at a 95% confidence level. The plasma treatment method for PS produces no radioactive waste. Copyright © 2014 Elsevier Ltd. All rights reserved.
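
    A detection limit of this kind is commonly computed with a Currie-style formula; the sketch below shows the arithmetic under an assumed background rate and sample volume, so the output is illustrative rather than a reproduction of the quoted 13 Bq mL(-1).

```python
import math

# Hedged sketch of a Currie-style detection limit for a counting measurement.
# The background rate and sample volume are assumptions; only the counting time
# (2 min) and efficiency (48%) are taken from the abstract above.

def currie_detection_limit(background_counts):
    """Detection limit L_D in counts for ~95% confidence (Currie, 1968)."""
    return 2.71 + 4.65 * math.sqrt(background_counts)

t = 120.0          # counting time, s
b_rate = 0.5       # background count rate, counts per second (assumed)
eff = 0.48         # counting efficiency from the paper
volume_ml = 0.020  # 20 uL sample

ld_counts = currie_detection_limit(b_rate * t)
mda_bq_per_ml = ld_counts / (eff * t * volume_ml)
print(f"L_D = {ld_counts:.1f} counts -> MDA ~ {mda_bq_per_ml:.1f} Bq/mL")
```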

  4. Femtosecond Laser--Pumped Source of Entangled Photons for Quantum Cryptography Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, D.; Donaldson, W.; Sobolewski, R.

    2007-07-31

    We present an experimental setup for generation of entangled-photon pairs via spontaneous parametric down-conversion, based on the femtosecond-pulsed laser. Our entangled-photon source utilizes a 76-MHz-repetition-rate, 100-fs-pulse-width, mode-locked, ultrafast femtosecond laser, which can produce, on average, more photon pairs than a cw laser of an equal pump power. The resulting entangled pairs are counted by a pair of high-quantum-efficiency, single-photon, silicon avalanche photodiodes. Our apparatus is intended as an efficient source/receiver system for the quantum communications and quantum cryptography applications.

  5. Designing primers and evaluation of the efficiency of propidium monoazide - Quantitative polymerase chain reaction for counting the viable cells of Lactobacillus gasseri and Lactobacillus salivarius.

    PubMed

    Lai, Chieh-Hsien; Wu, Sih-Rong; Pang, Jen-Chieh; Ramireddy, Latha; Chiang, Yu-Cheng; Lin, Chien-Ku; Tsen, Hau-Yang

    2017-07-01

    The purpose of this study is to evaluate the efficiency of using propidium monoazide (PMA) real-time quantitative polymerase chain reaction (qPCR) to count the viable cells of Lactobacillus gasseri and Lactobacillus salivarius in probiotic products. Based on the internal transcribed spacer and 23S rRNA genes, two primer sets specific for these two Lactobacillus species were designed. For a probiotic product, the total deMan Rogosa Sharpe plate count was 8.65±0.69 log CFU/g, while for qPCR, the cell counts of L. gasseri and L. salivarius were 8.39±0.14 log CFU/g and 8.57±0.24 log CFU/g, respectively. Under the same conditions, for its heat-killed product, qPCR counts for L. gasseri and L. salivarius were 6.70±0.16 log cells/g and 7.67±0.20 log cells/g, while PMA-qPCR counts were 5.33±0.18 log cells/g and 5.05±0.23 log cells/g, respectively. For cell dilutions with a viable cell count of 8.5 log CFU/mL for L. gasseri and L. salivarius, after heat killing, the PMA-qPCR count for both Lactobacillus species was near 5.5 log cells/mL. When the PMA-qPCR counts of these cell dilutions were compared before and after heat killing, although some DNA might be lost during the heat killing, significant qPCR signals from dead cells, i.e., about 4-5 log cells/mL, could not be reduced by PMA treatment. Increasing PMA concentrations from 100 μM to 200 μM or light exposure time from 5 minutes to 15 minutes had no effect or, at most, only a minor effect on the reduction of qPCR signals from dead cells. Thus, to differentiate viable lactic acid bacterial cells from dead cells using the PMA-qPCR method, the efficiency of PMA in reducing the qPCR signals from dead cells should be notable. Copyright © 2016. Published by Elsevier B.V.

  6. Single-particle detection of products from atomic and molecular reactions in a cryogenic ion storage ring

    NASA Astrophysics Data System (ADS)

    Krantz, C.; Novotný, O.; Becker, A.; George, S.; Grieser, M.; Hahn, R. von; Meyer, C.; Schippers, S.; Spruck, K.; Vogel, S.; Wolf, A.

    2017-04-01

    We have used a single-particle detector system, based on secondary electron emission, for counting low-energy (∼keV/u) massive products originating from atomic and molecular ion reactions in the electrostatic Cryogenic Storage Ring (CSR). The detector is movable within the cryogenic vacuum chamber of CSR, and was used to measure production rates of a variety of charged and neutral daughter particles. In operation at a temperature of ∼6 K, the detector is characterised by a high dynamic range, combining a low dark event rate with good high-rate particle counting capability. On-line measurement of the pulse height distributions proved to be an important monitor of the detector response at low temperature. Statistical pulse-height analysis allows the particle detection efficiency of the detector to be inferred; it was found to be close to unity even in cryogenic operation at 6 K.

  7. Strategies and limitations for fluorescence detection of XAFS at high flux beamlines

    DOE PAGES

    Heald, Steve M.

    2015-02-17

    The issue of detecting the XAFS signal from dilute samples is discussed in detail with the aim of making best use of high flux beamlines that provide up to 10^13 photons s^-1. Various detection methods are compared, including filters with slits, solid state detectors, crystal analyzers and combinations of these. These comparisons rely on simulations that use experimentally determined parameters. It is found that inelastic scattering places a fundamental limit on detection, and that it is important to take proper account of the polarization dependence of the signals. The combination of a filter–slit system with a solid state detector is a promising approach. With an optimized system good performance can be obtained even if the total count rate is limited to 10^7 Hz. Detection schemes with better energy resolution can help at the largest dilutions if their collection efficiency and count rate limits can be improved.

  8. Strategies and limitations for fluorescence detection of XAFS at high flux beamlines

    PubMed Central

    Heald, Steve M.

    2015-01-01

    The issue of detecting the XAFS signal from dilute samples is discussed in detail with the aim of making best use of high flux beamlines that provide up to 10^13 photons s^-1. Various detection methods are compared, including filters with slits, solid state detectors, crystal analyzers and combinations of these. These comparisons rely on simulations that use experimentally determined parameters. It is found that inelastic scattering places a fundamental limit on detection, and that it is important to take proper account of the polarization dependence of the signals. The combination of a filter–slit system with a solid state detector is a promising approach. With an optimized system good performance can be obtained even if the total count rate is limited to 10^7 Hz. Detection schemes with better energy resolution can help at the largest dilutions if their collection efficiency and count rate limits can be improved. PMID:25723945

  9. Performance of a Micro-Strip Gas Chamber for event wise, high rate thermal neutron detection with accurate 2D position determination

    NASA Astrophysics Data System (ADS)

    Mindur, B.; Alimov, S.; Fiutowski, T.; Schulz, C.; Wilpert, T.

    2014-12-01

    A two-dimensional (2D) position sensitive detector for neutron scattering applications based on low-pressure gas amplification and micro-strip technology was built and tested with an innovative readout electronics and data acquisition system. This detector contains a thin solid neutron converter and was developed for time- and thus wavelength-resolved neutron detection in single-event counting mode, which improves the image contrast in comparison with integrating detectors. The prototype detector of a Micro-Strip Gas Chamber (MSGC) was built with a solid natGd/CsI thermal neutron converter for spatial resolutions of about 100 μm and counting rates up to 10^7 neutrons/s. For attaining very high spatial resolutions and counting rates via micro-strip readout with centre-of-gravity evaluation of the signal amplitude distributions, a fast, channel-wise, self-triggering ASIC was developed. The front-end chips (MSGCROCs), which are the very first signal processing components, are read out into powerful ADC-FPGA boards for on-line data processing and thereafter via Gigabit Ethernet link into the data receiving PC. The workstation PC is controlled by a modular, high performance dedicated software suite. Such a fast and accurate system is crucial for efficient radiography/tomography, diffraction or imaging applications based on a high-flux thermal neutron beam. In this paper a brief description of the detector concept with its operation principles, readout electronics requirements and design, together with the signal processing stages performed in hardware and software, is presented. In more detail the neutron test beam conditions and measurement results are reported. The focus of this paper is on the system integration, two dimensional spatial resolution, the time resolution of the readout system and the imaging capabilities of the overall setup. The detection efficiency of the detector prototype is estimated as well.
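
    The centre-of-gravity position estimate mentioned above reduces to an amplitude-weighted mean over neighbouring strips, as in the sketch below; the strip pitch and amplitudes are illustrative.

```python
import numpy as np

# Hedged sketch of a centre-of-gravity hit position: the coordinate is the
# amplitude-weighted mean of the strip positions in a cluster. The 100 um pitch
# and the amplitudes below are assumptions for illustration.

def centre_of_gravity(strip_indices, amplitudes, pitch_um=100.0):
    """Return the interpolated hit position in micrometres along the strip axis."""
    a = np.asarray(amplitudes, dtype=float)
    x = np.asarray(strip_indices, dtype=float) * pitch_um
    return float(np.sum(a * x) / np.sum(a))

# A cluster spread over three neighbouring strips
print(centre_of_gravity([42, 43, 44], [120.0, 480.0, 200.0]))  # lands between strips 43 and 44
```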

  10. Spent Fuel Assay with an Ultra-High Rate HPGe Spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fast, James; Fulsom, Bryan; Pitts, Karl

    2015-07-01

    Traditional verification of spent nuclear fuel (SNF) includes determination of initial enrichment, burnup and cool down time (IE, BU, CT). Along with neutron measurements, passive gamma assay provides important information for determining BU and CT. Other gamma-ray-based assay methods such as passive tomography and active delayed gamma offer the potential to measure the spatial distribution of fission products and the fissile isotopic concentration of the fuel, respectively. All fuel verification methods involving gamma-ray spectroscopy require that the spectrometers manage very high count rates while extracting the signatures of interest. PNNL has developed new digital filtering and analysis techniques to produce an ultra-high rate gamma-ray spectrometer from a standard coaxial high-purity germanium (HPGe) crystal. This 37% relative efficiency detector has been operated for SNF measurements at input count rates of 500-1300 kcps and throughput in excess of 150 kcps. Optimized filtering algorithms preserve the spectroscopic capability of the system even at these high rates. This paper will present the results of both passive and active SNF measurement performed with this system at PNNL. (authors)

  11. Highly efficient router-based readout algorithm for single-photon-avalanche-diode imagers for time-correlated experiments

    NASA Astrophysics Data System (ADS)

    Cominelli, A.; Acconcia, G.; Caldi, F.; Peronio, P.; Ghioni, M.; Rech, I.

    2018-02-01

    Time-Correlated Single Photon Counting (TCSPC) is a powerful tool that permits recording extremely fast optical signals with a precision down to a few picoseconds. On the other hand, it is recognized as a relatively slow technique, especially when a large time-resolved image is acquired exploiting a single acquisition channel and a scanning system. In recent years, much effort has been made towards the parallelization of many acquisition and conversion chains. In particular, the exploitation of Single-Photon Avalanche Diodes in standard CMOS technology has paved the way to the integration of thousands of independent channels on the same chip. Unfortunately, the presence of a large number of detectors can give rise to a huge rate of events, which can easily lead to the saturation of the transfer rate toward the processing unit. As a result, a smart readout approach is needed to guarantee an efficient exploitation of the limited transfer bandwidth. We recently introduced a novel readout architecture, aimed at maximizing the counting efficiency of the system in typical TCSPC measurements. It features a limited number of high-performance converters, which are shared by a much larger array, while a smart routing logic provides dynamic multiplexing between the two parts. Here we propose a novel routing algorithm, which exploits standard digital gates distributed among a large 32x32 array to ensure a dynamic connection between detectors and external time-measurement circuits.

  12. Expanding the detection efficiency of silicon drift detectors

    NASA Astrophysics Data System (ADS)

    Schlosser, D. M.; Lechner, P.; Lutz, G.; Niculae, A.; Soltau, H.; Strüder, L.; Eckhardt, R.; Hermenau, K.; Schaller, G.; Schopper, F.; Jaritschin, O.; Liebel, A.; Simsek, A.; Fiorini, C.; Longoni, A.

    2010-12-01

    To expand the detection efficiency, Silicon Drift Detectors (SDDs) with various customized radiation entrance windows and with optimized detector areas and geometries have been developed. Optimum values for energy resolution, peak-to-background ratio (P/B) and high count-rate capability support the development. Detailed results on sensors optimized for light element detection down to boron or even lighter elements will be reported. New developments for detecting medium and high X-ray energies by increasing the effective detector thickness will be presented. Gamma-ray detectors consisting of an SDD coupled to scintillators like CsI(Tl) and LaBr3(Ce) have been examined. Results of the energy resolution for the 137Cs 662 keV line and the light yield (LY) of such detector systems will be reported.

  13. Robust interferon-α and IL-12 responses by dendritic cells are related to efficient CD4+ T-cell recovery in HIV patients on ART.

    PubMed

    Tan, Dino Bee Aik; Yong, Yean Kong; Lim, Andrew; Tan, Hong Yien; Kamarulzaman, Adeeba; French, Martyn; Price, Patricia

    2011-05-01

    Amongst HIV patients with successful virological responses to antiretroviral therapy (ART), poor CD4(+) T-cell recovery is associated with low nadir CD4(+) T-cell counts and persistent immune activation. These factors might be influenced by dendritic cell (DC) function. Interferon-α-producing plasmacytoid DC and IL-12-producing myeloid DC were quantified by flow cytometry after stimulation with agonists to TLR7/8 (CL075) or TLR9 (CpG-ODN). These were compared between patients who achieved CD4(+) T-cell counts above or below 200 cells/μL after 6 months on ART (High vs. Low groups). High Group patients had more DC producing interferon-α or IL-12 at Weeks 6 and 12 on ART than Low Group patients. The frequencies of cytokine-producing DC at Week 12 were directly correlated with CD4(+) T-cell counts at baseline and at Week 12. Patients with good recovery of CD4(+) T-cells had robust TLR-mediated interferon-α responses by plasmacytoid DC and IL-12 responses by myeloid DC during early ART (1-3 months). Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Integrated four-channel all-fiber up-conversion single-photon-detector with adjustable efficiency and dark count.

    PubMed

    Zheng, Ming-Yang; Shentu, Guo-Liang; Ma, Fei; Zhou, Fei; Zhang, Hai-Ting; Dai, Yun-Qi; Xie, Xiuping; Zhang, Qiang; Pan, Jian-Wei

    2016-09-01

    The up-conversion single photon detector (UCSPD) has been widely used in many research fields including quantum key distribution, lidar, optical time domain reflectometry, and deep space communication. For the first time in the laboratory, we have developed an integrated four-channel all-fiber UCSPD which can work in both free-running and gated modes. This compact module can satisfy different experimental demands with adjustable detection efficiency and dark count. We have characterized the key parameters of the UCSPD system.

  15. Aerosol Generation by Modern Flush Toilets.

    PubMed

    Johnson, David; Lynch, Robert; Marshall, Charles; Mead, Kenneth; Hirst, Deborah

    A microbe-contaminated toilet will produce bioaerosols when flushed. We assessed toilet plume aerosol from high efficiency (HET), pressure-assisted high efficiency (PAT), and flushometer (FOM) toilets with similar bowl water and flush volumes. Total and droplet nuclei "bioaerosols" were assessed. Monodisperse 0.25-1.9-μm fluorescent microspheres served as microbe surrogates in separate trials in a mockup 5 m^3 water closet (WC). Bowl water seeding was approximately 10^12 particles/mL. Droplet nuclei were sampled onto 0.2-μm pore size mixed cellulose ester filters beginning 15 min after the flush using open-face cassettes mounted on the WC walls. Pre- and postflush bowl water concentrations were measured. Filter particle counts were analyzed via fluorescent microscopy. Bowl headspace droplet count size distributions were bimodal and similar for all toilet types and flush conditions, with 95% of droplets < 2 μm in diameter and > 99% < 5 μm. Up to 145,000 droplets were produced per flush, with the high-energy flushometer producing over three times as many as the lower energy PAT and over 12 times as many as the lowest energy HET despite similar flush volumes. The mean numbers of fluorescent droplet nuclei particles aerosolized and remaining airborne also increased with flush energy. Fluorescent droplet nuclei per flush decreased with increasing particle size. These findings suggest two concurrent aerosolization mechanisms: splashing for large droplets and bubble bursting for the fine droplets that form droplet nuclei.

  16. Longitudinal monitoring of whole body counter NaI(TI) detector efficiency

    USDA-ARS?s Scientific Manuscript database

    Assessing accuracy of radiation counting systems over time is critical. We examined long-term WBC performance in detail. Efficiency factors for 54 detectors were updated annually over several years. Newer efficiency values were compared with baseline and with annual values. Overall system efficiency...

  17. High-Efficiency Photovoltaic Energy Conversion using Surface Acoustic Waves in Piezoelectric Semiconductors

    NASA Astrophysics Data System (ADS)

    Yakovenko, Victor

    2010-03-01

    We propose a radically new design for photovoltaic energy conversion using surface acoustic waves (SAWs) in piezoelectric semiconductors. The periodically modulated electric field from SAW spatially separates photogenerated electrons and holes to the maxima and minima of SAW, thus preventing their recombination. The segregated electrons and holes are transported by the moving SAW to the collecting electrodes of two types, which produce dc electric output. Recent experiments [1] using SAWs in GaAs have demonstrated the photon to current conversion efficiency of 85%. These experiments were designed for photon counting, but we propose to adapt these techniques for highly efficient photovoltaic energy conversion. The advantages are that the electron-hole segregation takes place in the whole volume where SAW is present, and the electrons and holes are transported in the organized, collective manner at high speed, as opposed to random diffusion in conventional devices.[4pt] [1] S. J. Jiao, P. D. Batista, K. Biermann, R. Hey, and P. V. Santos, J. Appl. Phys. 106, 053708 (2009).

  18. Early endothelial damage detected by circulating particles in baboons fed a diet high in simple carbohydrates in conjunction with saturated or unsaturated fat.

    PubMed

    Shi, Qiang; Hodara, Vida; Meng, Qinghe; Voruganti, V Saroja; Rice, Karen; Michalek, Joel E; Comuzzie, Anthony G; VandeBerg, John L

    2014-01-01

    Studies have shown that high-fat diets cause blood vessel damage; however, assessing pathological effects accurately and efficiently is difficult. In this study, we measured particle levels of static endothelium (CD31+ and CD105+) and activated endothelium (CD62E+, CD54+ and CD106+) in plasma. We determined individual responses to two dietary regimens in two groups of baboons. One group (n = 10) was fed a diet high in simple carbohydrates and saturated fats (the HSF diet) and the other (n = 8) received a diet high in simple carbohydrates and unsaturated fats (the HUF diet). Plasma samples were collected at 0, 3, and 7 weeks. The percentages of CD31+ and CD62E+ particles were elevated at 3 weeks in animals fed either diet, but these elevations were statistically significant only in animals fed the HUF diet. Surprisingly, both percentages and counts of CD31+ particles were significantly lower at week 7 compared to weeks 0 and 3 in the HSF group. The median absolute counts of CD105+ particles were progressively elevated over time in the HSF group, with a significant increase from week 0 to 7; the pattern was somewhat different for the HUF group, with a significant increase from week 3 to 7. The counts of CD54+ particles exhibited wide variation in both groups during the dietary challenge, while the median counts of CD106+ particles were significantly lower at week 3 than at week 0 and week 7. Endothelial particles exhibited time-dependent changes, suggesting they were behaving as quantifiable surrogates for the early detection of vascular damage caused by dietary factors.

  19. The research of data acquisition system for Raman spectrometer

    NASA Astrophysics Data System (ADS)

    Cui, Xiao; Guo, Pan; Zhang, Yinchao; Chen, Siying; Chen, He; Chen, Wenbo

    2011-11-01

    Raman spectrometers have been widely used as an identification tool for analyzing material structure and composition in many fields. However, the Raman scattering echo signal is very weak, amounting to at most a few dozen photons per laser pulse. Therefore, it is a great challenge to design a Raman spectrum data acquisition system that can accurately receive this weak echo signal. The system designed in this paper receives optical signals on the photon-counting principle and can detect single photons. The whole system consists of a photoelectric conversion module (H7421-40) and a photon-counting card built around a field programmable gate array (FPGA) chip and a PCI9054 chip. The H7421-40 module, which includes a PMT, an amplifier and a discriminator, has high sensitivity over the 300 nm to 720 nm wavelength range. Its center wavelength is 580 nm, close to the excitation wavelength (532 nm), with a quantum efficiency of 40% at the peak wavelength, a count sensitivity of 7.8×10^5 s^-1·pW^-1, and a count linearity of 1.5 MHz. Within the FPGA chip, the functions are divided into three parts: a parameter-setting module, a control module, and a data collection and storage module. All commands, parameters and data are transmitted between the FPGA and the computer by the PCI9054 chip through the PCI interface. Experimental results show that the Raman spectrum data acquisition system is reasonable and efficient. The data acquisition system has three primary advantages: first, high sensitivity with single-photon detection capability; second, a high level of integration, meaning all operations can be performed by the photon-counting card; and third, high expandability owing to the reconfigurability of the FPGA chip.
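
    The acquisition chain described above (discriminated PMT pulses accumulated by an FPGA counting card over fixed gate intervals) can be illustrated with a brief sketch. The gate width, count rate and simulated pulse timestamps below are illustrative assumptions, not the authors' implementation.

    ```python
    """Minimal sketch of gated photon counting, assuming discriminated
    pulse arrival times are available as timestamps (illustrative only)."""
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated arrival times (s) for a weak return: a Poisson process at
    # 50 kcounts/s observed for 10 ms (both values are assumptions).
    rate_cps = 5e4
    duration_s = 10e-3
    n_events = rng.poisson(rate_cps * duration_s)
    arrival_times = np.sort(rng.uniform(0.0, duration_s, n_events))

    # Accumulate counts in fixed gate intervals, as a counting card would.
    gate_s = 100e-6                                   # 100 us gate width
    edges = np.arange(0.0, duration_s + gate_s, gate_s)
    counts_per_gate, _ = np.histogram(arrival_times, bins=edges)

    # Estimated mean rate and its Poisson uncertainty.
    total = counts_per_gate.sum()
    rate_est = total / duration_s
    rate_err = np.sqrt(total) / duration_s
    print(f"{total} counts in {len(counts_per_gate)} gates")
    print(f"rate = {rate_est:.3g} +/- {rate_err:.3g} counts/s")
    ```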

  20. Compensated gadolinium-loaded plastic scintillators for thermal neutron detection (and counting)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumazert, Jonathan; Coulon, Romain; Bertrand, Guillaume H. V.

    2015-07-01

    Plastic scintillator loading with gadolinium-rich organometallic complexes shows a high potential for the deployment of efficient and cost-effective neutron detectors. Due to the low-energy photon and electron signature of thermal neutron capture by gadolinium-155 and gadolinium-157, an alternative treatment to Pulse Shape Discrimination has to be proposed in order to deliver a reliable count rate. This paper discloses the principle of a compensation method applied to a two-scintillator system: a detection scintillator interacts with photon radiation and is loaded with a gadolinium organometallic compound to become a thermal neutron absorber, while a non-gadolinium-loaded compensation scintillator interacts solely with the photon part of the incident radiation. Following nonlinear smoothing of the counting signals, a hypothesis test determines whether the count rate remaining after photon response compensation falls within statistical fluctuations or provides a robust indication of neutron activity. A laboratory prototype is tested under both photon and neutron irradiations, allowing us to investigate the performance of the overall compensation system in terms of neutron detection, especially with regard to a commercial helium-3 counter. The study reveals satisfactory results in terms of sensitivity and orients future investigation toward promising directions. (authors)
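
    The compensation principle lends itself to a small numerical sketch: the compensation channel's count rate is subtracted from the detection channel's rate, and a simple test decides whether the residual exceeds statistical fluctuation. The channel counts, the k-sigma criterion and the Gaussian approximation of Poisson errors are assumptions for illustration, not the published processing chain (which also includes nonlinear smoothing).

    ```python
    """Sketch of two-channel compensation with a significance test
    (illustrative assumptions; not the published processing chain)."""
    import math

    def compensated_excess(n_det, n_comp, t_s, k_sigma=3.0):
        """Return (net_rate, significant) for counts n_det and n_comp
        accumulated over t_s seconds in the detection and compensation
        scintillators, assuming Poisson statistics in both channels."""
        net = (n_det - n_comp) / t_s
        # Standard deviation of the difference of two independent Poisson rates.
        sigma = math.sqrt(n_det + n_comp) / t_s
        return net, net > k_sigma * sigma

    # Example: photon-only field (channels similar) vs added neutron activity.
    print(compensated_excess(n_det=10500, n_comp=10390, t_s=60.0))  # not significant
    print(compensated_excess(n_det=13200, n_comp=10410, t_s=60.0))  # significant
    ```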

  1. A new multivariate zero-adjusted Poisson model with applications to biomedicine.

    PubMed

    Liu, Yin; Tian, Guo-Liang; Tang, Man-Lai; Yuen, Kam Chuen

    2018-05-25

    Recently, although advances have been made in modeling multivariate count data, existing models still have several limitations: (i) the multivariate Poisson log-normal model (Aitchison and Ho) cannot be used to fit multivariate count data with excess zero-vectors; (ii) the multivariate zero-inflated Poisson (ZIP) distribution (Li et al., 1999) cannot be used to model zero-truncated/deflated count data and is difficult to apply to high-dimensional cases; (iii) the Type I multivariate zero-adjusted Poisson (ZAP) distribution (Tian et al., 2017) can only model multivariate count data with a special correlation structure in which the correlations between components are all positive or all negative. In this paper, we first introduce a new multivariate ZAP distribution, based on a multivariate Poisson distribution, which allows the correlations between components to follow a more flexible dependency structure; that is, some of the correlation coefficients can be positive while others are negative. We then derive its important distributional properties and provide efficient statistical inference methods for the multivariate ZAP model with or without covariates. Two real data examples in biomedicine are used to illustrate the proposed methods. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
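
    As a toy illustration of the kind of count model discussed here, the sketch below simulates and fits a univariate zero-adjusted (hurdle) Poisson distribution: an extra probability mass at zero plus a zero-truncated Poisson for the positive counts. It does not reproduce the multivariate construction of the paper, and all parameter values are made up.

    ```python
    """Univariate zero-adjusted (hurdle) Poisson simulation and MLE fit;
    a toy stand-in, not the paper's multivariate model."""
    import numpy as np
    from scipy.optimize import minimize_scalar
    from scipy.special import gammaln

    rng = np.random.default_rng(1)

    def rtruncpois(lam, size):
        """Zero-truncated Poisson draws by rejection sampling."""
        out = np.zeros(size, dtype=int)
        pending = np.arange(size)
        while pending.size:
            d = rng.poisson(lam, pending.size)
            keep = d > 0
            out[pending[keep]] = d[keep]
            pending = pending[~keep]
        return out

    # Simulate: extra mass at zero (phi), positives from truncated Poisson(lam).
    phi_true, lam_true, n = 0.35, 2.4, 5000
    is_zero = rng.random(n) < phi_true
    y = np.where(is_zero, 0, rtruncpois(lam_true, n))

    # MLE: phi is the observed zero fraction; lam maximizes the
    # zero-truncated Poisson likelihood of the positive counts.
    pos = y[y > 0]
    phi_hat = np.mean(y == 0)

    def nll(lam):
        return np.sum(lam - pos * np.log(lam) + gammaln(pos + 1)
                      + np.log1p(-np.exp(-lam)))

    lam_hat = minimize_scalar(nll, bounds=(1e-6, 50), method="bounded").x
    print(f"phi_hat = {phi_hat:.3f} (true {phi_true}), "
          f"lam_hat = {lam_hat:.3f} (true {lam_true})")
    ```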

  2. Homicides by Police: Comparing Counts From the National Violent Death Reporting System, Vital Statistics, and Supplementary Homicide Reports.

    PubMed

    Barber, Catherine; Azrael, Deborah; Cohen, Amy; Miller, Matthew; Thymes, Deonza; Wang, David Enze; Hemenway, David

    2016-05-01

    To evaluate the National Violent Death Reporting System (NVDRS) as a surveillance system for homicides by law enforcement officers. We assessed sensitivity and positive predictive value of the NVDRS "type of death" variable against our study count of homicides by police, which we derived from NVDRS coded and narrative data for states participating in NVDRS 2005 to 2012. We compared state counts of police homicides from NVDRS, Vital Statistics, and Federal Bureau of Investigation Supplementary Homicide Reports. We identified 1552 police homicides in the 16 states. Positive predictive value and sensitivity of the NVDRS "type of death" variable for police homicides were high (98% and 90%, respectively). Counts from Vital Statistics and Supplementary Homicide Reports were 58% and 48%, respectively, of our study total; gaps varied widely by state. The annual rate of police homicide (0.24/100,000) varied 5-fold by state and 8-fold by race/ethnicity. NVDRS provides more complete data on police homicides than do existing systems. Expanding NVDRS to all 50 states and making 2 improvements we identify will be an efficient way to provide the nation with more accurate, detailed data on homicides by law enforcement.
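
    The two evaluation metrics quoted here reduce to simple ratios from a comparison against the reference count. The sketch below shows that arithmetic; the input counts are hypothetical values chosen to be roughly consistent with the figures quoted above, not the study data.

    ```python
    """Sensitivity and positive predictive value of a surveillance variable
    against a reference case count (hypothetical counts for illustration)."""

    def sensitivity_ppv(true_pos, false_neg, false_pos):
        sens = true_pos / (true_pos + false_neg)   # found / all reference cases
        ppv = true_pos / (true_pos + false_pos)    # correct / all flagged cases
        return sens, ppv

    # Hypothetical: 1400 of 1552 reference cases flagged, with 30 false positives.
    sens, ppv = sensitivity_ppv(true_pos=1400, false_neg=152, false_pos=30)
    print(f"sensitivity = {sens:.1%}, PPV = {ppv:.1%}")
    ```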

  3. Energy harvesting using AC machines with high effective pole count

    NASA Astrophysics Data System (ADS)

    Geiger, Richard Theodore

    In this thesis, ways to improve the power conversion of rotating generators at low rotor speeds in energy harvesting applications were investigated. One method is to increase the pole count, which increases the generator back-emf without also increasing the I²R losses, thereby increasing both torque density and conversion efficiency. One machine topology that has a high effective pole count is a hybrid "stepper" machine. However, the large self inductance of these machines decreases their power factor and hence the maximum power that can be delivered to a load. This effect can be cancelled by the addition of capacitors in series with the stepper windings. A circuit was designed and implemented to automatically vary the series capacitance over the entire speed range investigated. The addition of the series capacitors improved the power output of the stepper machine by up to 700%. At low rotor speeds, with the addition of series capacitance, the power output of the hybrid "stepper" was more than 200% that of a similarly sized PMDC brushed motor. Finally, in this thesis a hybrid lumped parameter / finite element model was used to investigate the impact of number, shape and size of the rotor and stator teeth on machine performance. A typical off-the-shelf hybrid stepper machine has significant cogging torque by design. This cogging torque is a major problem in most small energy harvesting applications. In this thesis it was shown that the cogging and ripple torque can be dramatically reduced. These findings confirm that high-pole-count topologies, and specifically the hybrid stepper configuration, are an attractive choice for energy harvesting applications.
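
    The series-capacitor compensation amounts to choosing C so that 1/(omega*C) cancels omega*L at the electrical frequency set by the rotor speed and the effective pole count. The sketch below computes that value; the winding inductance, pole-pair count and speed are illustrative assumptions, not measurements from the thesis.

    ```python
    """Series capacitance to cancel winding inductance at the electrical
    frequency (resonance condition 1/(w*C) = w*L); parameter values are
    illustrative assumptions, not measurements from the thesis."""
    import math

    def series_cap(L_winding_H, pole_pairs, mech_speed_rpm):
        f_elec = pole_pairs * mech_speed_rpm / 60.0      # electrical frequency, Hz
        w = 2.0 * math.pi * f_elec                       # rad/s
        return 1.0 / (w ** 2 * L_winding_H), f_elec

    # Hypothetical hybrid-stepper numbers: 40 mH winding, 50 pole pairs, 120 rpm.
    C, f_e = series_cap(L_winding_H=0.040, pole_pairs=50, mech_speed_rpm=120)
    print(f"electrical frequency = {f_e:.0f} Hz, series C = {C * 1e6:.1f} uF")
    ```

    Because the electrical frequency tracks the rotor speed, the required capacitance changes as the speed changes, which is why the thesis uses a circuit that automatically varies the series capacitance over the speed range.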

  4. Microbiology of cooked and dried edible Mediterranean field crickets (Gryllus bimaculatus) and superworms (Zophobas atratus) submitted to four different heating treatments.

    PubMed

    Grabowski, Nils Th; Klein, Günter

    2017-01-01

    To increase the shelf life of edible insects, modern techniques (e.g. freeze-drying) add to the traditional methods (degutting, boiling, sun-drying or roasting). However, microorganisms become inactivated rather than being killed, and when rehydrated, many return to vegetative stadia. Crickets (Gryllus bimaculatus) and superworms (Zophobas atratus) were submitted to four different drying techniques (T1 = 10' cooking, 24 h drying at 60℃; T2 = 10' cooking, 24 h drying at 80℃; T3 = 30' cooking, 12 h drying at 80℃, and 12 h drying at 100℃; T4 = boiling T3-treated insects after five days) and analysed for total bacteria counts, Enterobacteriaceae, staphylococci, bacilli, yeasts and moulds counts, E. coli, salmonellae, and Listeria monocytogenes (the latter three being negative throughout). The microbial counts varied strongly, displaying species- and treatment-specific patterns. T3 was the most effective of the drying treatments tested at decreasing all counts but bacilli, for which T2 was more efficient. Still, total bacteria counts remained high (G. bimaculatus > Z. atratus). Other opportunistically pathogenic microorganisms (Bacillus thuringiensis, B. licheniformis, B. pumilis, Pseudomonas aeruginosa, and Cryptococcus neoformans) were also encountered. The tyndallisation-like T4 reduced all counts to below the detection limit, but nutrient leakage should be considered with regard to food quality. In conclusion, species-specific drying procedures should be devised to ensure food safety. © The Author(s) 2016.

  5. Single Photon Counting Detectors for Low Light Level Imaging Applications

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly

    2015-10-01

    This dissertation presents the current state-of-the-art of semiconductor-based photon counting detector technologies. HgCdTe linear-mode avalanche photodiodes (LM-APDs), silicon Geiger-mode avalanche photodiodes (GM-APDs), and electron-multiplying CCDs (EMCCDs) are compared via their present and future performance in various astronomy applications. LM-APDs are studied in theory, based on work done at the University of Hawaii. EMCCDs are studied in theory and experimentally, with a device at NASA's Jet Propulsion Lab. The emphasis of the research is on GM-APD imaging arrays, developed at MIT Lincoln Laboratory and tested at the RIT Center for Detectors. The GM-APD research includes a theoretical analysis of SNR and various performance metrics, including dark count rate, afterpulsing, photon detection efficiency, and intrapixel sensitivity. The effects of radiation damage on the GM-APD were also characterized by introducing a cumulative dose of 50 krad(Si) via 60 MeV protons. Extensive development of Monte Carlo simulations and practical observation simulations was completed, including simulated astronomical imaging and adaptive optics wavefront sensing. Based on theoretical models and experimental testing, both the current state-of-the-art performance and the projected future performance of each detector are compared for various applications. LM-APD performance is currently not competitive with other photon counting technologies, and LM-APDs are therefore left out of the application-based comparisons. In the current state of the art, EMCCDs in photon counting mode outperform GM-APDs for long exposure scenarios, though GM-APDs are better for short exposure scenarios (fast readout) due to clock-induced charge (CIC) in EMCCDs. In the long term, small improvements in GM-APD dark current will make them superior in both long and short exposure scenarios for extremely low flux. The efficiency of GM-APDs will likely always be less than that of EMCCDs, however, which is particularly disadvantageous for moderate to high flux rates where dark noise and CIC are insignificant noise sources. Research into decreasing the dark count rate of GM-APDs will lead to the development of imaging arrays that are competitive for low light level imaging and spectroscopy applications in the near future.

  6. Fundamental performance differences between CMOS and CCD imagers: Part II

    NASA Astrophysics Data System (ADS)

    Janesick, James; Andrews, James; Tower, John; Grygon, Mark; Elliott, Tom; Cheng, John; Lesser, Michael; Pinter, Jeff

    2007-09-01

    A new class of CMOS imagers that compete with scientific CCDs is presented. The sensors are based on deep depletion backside illuminated technology to achieve high near infrared quantum efficiency and low pixel cross-talk. The imagers deliver very low read noise suitable for single photon counting - Fano-noise limited soft x-ray applications. Digital correlated double sampling signal processing necessary to achieve low read noise performance is analyzed and demonstrated for CMOS use. Detailed experimental data products generated by different pixel architectures (notably 3TPPD, 5TPPD and 6TPG designs) are presented including read noise, charge capacity, dynamic range, quantum efficiency, charge collection and transfer efficiency and dark current generation. Radiation damage data taken for the imagers is also reported.
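
    Digital correlated double sampling is, at its core, a per-pixel subtraction of a reset-level sample from a signal-level sample, optionally with several samples averaged on each level. The sketch below shows that arithmetic on synthetic data; the noise and offset values are assumptions, and the snippet is a schematic of the principle rather than the readout chain of the sensors described.

    ```python
    """Digital correlated double sampling (CDS) on synthetic pixel samples:
    average N reset samples and N signal samples per pixel, then subtract.
    Synthetic noise levels are illustrative assumptions."""
    import numpy as np

    rng = np.random.default_rng(2)
    n_pixels, n_samples = 1000, 4
    signal_e = 25.0          # true signal, electrons
    read_noise_e = 3.0       # per-sample read noise, electrons
    offset_e = 500.0         # pixel reset (offset) level, electrons

    reset = offset_e + read_noise_e * rng.standard_normal((n_pixels, n_samples))
    video = (offset_e + signal_e
             + read_noise_e * rng.standard_normal((n_pixels, n_samples)))

    # CDS output: signal-level average minus reset-level average removes the
    # offset (and its slow drift); averaging reduces the white read noise.
    cds = video.mean(axis=1) - reset.mean(axis=1)
    print(f"mean = {cds.mean():.2f} e-, rms noise = {cds.std():.2f} e- "
          f"(single-sample CDS would give ~{read_noise_e * np.sqrt(2):.2f} e-)")
    ```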

  7. Detection of sub-kilometer craters in high resolution planetary images using shape and texture features

    NASA Astrophysics Data System (ADS)

    Bandeira, Lourenço; Ding, Wei; Stepinski, Tomasz F.

    2012-01-01

    Counting craters is a paramount tool of planetary analysis because it provides relative dating of planetary surfaces. Dating surfaces with high spatial resolution requires counting a very large number of small, sub-kilometer size craters. Exhaustive manual surveys of such craters over extensive regions are impractical, sparking interest in designing crater detection algorithms (CDAs). As a part of our effort to design a CDA, which is robust and practical for planetary research analysis, we propose a crater detection approach that utilizes both shape and texture features to identify efficiently sub-kilometer craters in high resolution panchromatic images. First, a mathematical morphology-based shape analysis is used to identify regions in an image that may contain craters; only those regions - crater candidates - are the subject of further processing. Second, image texture features in combination with the boosting ensemble supervised learning algorithm are used to accurately classify previously identified candidates into craters and non-craters. The design of the proposed CDA is described and its performance is evaluated using a high resolution image of Mars for which sub-kilometer craters have been manually identified. The overall detection rate of the proposed CDA is 81%, the branching factor is 0.14, and the overall quality factor is 72%. This performance is a significant improvement over the previous CDA based exclusively on the shape features. The combination of performance level and computational efficiency offered by this CDA makes it attractive for practical application.
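
    The second stage of such a pipeline (supervised classification of texture features from pre-selected candidate regions with a boosting ensemble) can be mimicked with off-the-shelf tools. The snippet below uses scikit-learn's AdaBoostClassifier on synthetic feature vectors; the features, sample sizes and scores are placeholders and the snippet is only a schematic stand-in for the authors' CDA, not its Mars results.

    ```python
    """Second stage of a two-stage crater detection pipeline: classify
    candidate regions (already found by shape analysis) into crater /
    non-crater from texture features using a boosting ensemble.
    Feature vectors here are synthetic placeholders."""
    import numpy as np
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(3)

    # Synthetic texture features for 2000 candidate regions; crater
    # candidates get a shifted feature distribution.
    n, d = 2000, 12
    labels = rng.integers(0, 2, n)                     # 1 = crater
    features = rng.standard_normal((n, d)) + 0.8 * labels[:, None]

    x_tr, x_te, y_tr, y_te = train_test_split(features, labels,
                                              test_size=0.3, random_state=0)
    clf = AdaBoostClassifier(n_estimators=200, random_state=0).fit(x_tr, y_tr)

    detection_rate = clf.score(x_te[y_te == 1], y_te[y_te == 1])
    false_alarm = 1.0 - clf.score(x_te[y_te == 0], y_te[y_te == 0])
    print(f"detection rate = {detection_rate:.2f}, "
          f"false-alarm rate = {false_alarm:.2f}")
    ```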

  8. Division of methods for counting helminths' eggs and the problem of efficiency of these methods.

    PubMed

    Jaromin-Gleń, Katarzyna; Kłapeć, Teresa; Łagód, Grzegorz; Karamon, Jacek; Malicki, Jacek; Skowrońska, Agata; Bieganowski, Andrzej

    2017-03-21

    From the sanitary and epidemiological standpoint, information concerning the developmental forms of intestinal parasites, especially the eggs of helminths present in our environment in water, soil, sandpits, sewage sludge, and crops watered with wastewater, is very important. The methods described in the relevant literature may be classified in various ways, primarily according to the methodology for preparing samples from environmental matrices for analysis, and according to the actual counting methods and the chambers/instruments used for this purpose. In addition, the methods can be classified by how and when the counted individuals are identified, or by whether they need to be stained. Standard methods for identification of helminths' eggs from environmental matrices are usually characterized by low efficiency, i.e. from 30% to approximately 80%. The efficiency of the method applied may be measured in two ways, either by using an internal standard or by the 'Split/Spike' method. When the efficiency of the method and the number of eggs are measured simultaneously in the examined object, the 'actual' number of eggs may be calculated by multiplying the number of helminth eggs found by the inverse of the efficiency.
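
    The closing sentence describes a standard recovery correction: if the method's efficiency is estimated from a spiked internal standard, the raw sample count is scaled by the inverse of that efficiency. The sketch below shows the arithmetic with made-up numbers.

    ```python
    """Recovery-corrected helminth egg count using an internal standard
    ('Split/Spike'-style correction); all numbers are illustrative."""

    def corrected_count(eggs_found, spiked_added, spiked_recovered):
        efficiency = spiked_recovered / spiked_added    # typically ~0.3-0.8
        return eggs_found / efficiency, efficiency

    # Example: 42 eggs counted in the sample, while only 60 of 100 spiked
    # eggs were recovered, i.e. 60% method efficiency.
    est, eff = corrected_count(eggs_found=42, spiked_added=100, spiked_recovered=60)
    print(f"efficiency = {eff:.0%}, corrected egg count ~= {est:.0f}")
    ```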

  9. Characterisation of Geiger-mode avalanche photodiodes for medical imaging applications

    NASA Astrophysics Data System (ADS)

    Britvitch, I.; Johnson, I.; Renker, D.; Stoykov, A.; Lorenz, E.

    2007-02-01

    Recently developed multipixel Geiger-mode avalanche photodiodes (G-APDs) are very promising candidates for the detection of light in medical imaging instruments (e.g. positron emission tomography) as well as in high-energy physics experiments and astrophysical applications. G-APDs are especially well suited for morpho-functional imaging (multimodality PET/CT, SPECT/CT, PET/MRI, SPECT/MRI). G-APDs have many advantages compared to conventional photosensors such as photomultiplier tubes because of their compact size, low power consumption, high quantum efficiency and insensitivity to magnetic fields. Compared to avalanche photodiodes and PIN diodes, they are advantageous because of their high gain, reduced sensitivity to pickup and to the so-called nuclear counter effect, and lower noise. We present measurements of the basic G-APD characteristics: photon detection efficiency, gain, inter-cell crosstalk, dynamic range, recovery time and dark count rate.

  10. A Method Based on Wavelet Transforms for Source Detection in Photon-counting Detector Images. II. Application to ROSAT PSPC Images

    NASA Astrophysics Data System (ADS)

    Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.

    1997-07-01

    We apply to the specific case of images taken with the ROSAT PSPC detector our wavelet-based X-ray source detection algorithm presented in a companion paper. Such images are characterized by the presence of detector "ribs," strongly varying point-spread function, and vignetting, so that their analysis provides a challenge for any detection algorithm. First, we apply the algorithm to simulated images of a flat background, as seen with the PSPC, in order to calibrate the number of spurious detections as a function of significance threshold and to ascertain that the spatial distribution of spurious detections is uniform, i.e., unaffected by the ribs; this goal was achieved using the exposure map in the detection procedure. Then, we analyze simulations of PSPC images with a realistic number of point sources; the results are used to determine the efficiency of source detection and the accuracy of output quantities such as source count rate, size, and position, upon a comparison with input source data. It turns out that sources with 10 photons or less may be confidently detected near the image center in medium-length (~10^4 s), background-limited PSPC exposures. The positions of sources detected near the image center (off-axis angles < 15') are accurate to within a few arcseconds. Output count rates and sizes are in agreement with the input quantities, within a factor of 2 in 90% of the cases. The errors on position, count rate, and size increase with off-axis angle and for detections of lower significance. We have also checked that the upper limits computed with our method are consistent with the count rates of undetected input sources. Finally, we have tested the algorithm by applying it on various actual PSPC images, among the most challenging for automated detection procedures (crowded fields, extended sources, and nonuniform diffuse emission). The performance of our method in these images is satisfactory and outperforms those of other current X-ray detection techniques, such as those employed to produce the MPE and WGA catalogs of PSPC sources, in terms of both detection reliability and efficiency. We have also investigated the theoretical limit for point-source detection, with the result that even sources with only 2-3 photons may be reliably detected using an efficient method in images with sufficiently high resolution and low background.
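
    A heavily reduced version of wavelet-based point-source detection is to correlate the counts image with a zero-mean "Mexican hat" kernel and flag pixels whose coefficient exceeds a significance threshold. The snippet below does this on a synthetic Poisson image as a schematic of the idea only; it ignores the PSF variation, vignetting and exposure-map handling that the actual algorithm addresses, and all numbers are assumptions.

    ```python
    """Toy wavelet (Mexican-hat) detection on a synthetic Poisson counts image;
    a schematic of the idea only, ignoring PSF variation and exposure maps."""
    import numpy as np
    from scipy.signal import fftconvolve

    rng = np.random.default_rng(4)

    def mexican_hat(scale, size):
        y, x = np.mgrid[-size:size + 1, -size:size + 1]
        r2 = (x * x + y * y) / (scale * scale)
        w = (2.0 - r2) * np.exp(-r2 / 2.0)
        return w - w.mean()                     # enforce zero mean

    # Flat background of 0.2 counts/pixel plus one faint source (~12 counts).
    img = rng.poisson(0.2, (256, 256)).astype(float)
    img[127:130, 127:130] += rng.poisson(12.0 / 9.0, (3, 3))

    kernel = mexican_hat(scale=2.0, size=8)
    coeff = fftconvolve(img, kernel, mode="same")

    # Threshold from the coefficient fluctuations of this image; in practice
    # it would be calibrated on source-free simulations.
    thresh = 5.0 * coeff.std()
    ys, xs = np.where(coeff > thresh)
    print(f"{ys.size} pixels above the 5-sigma threshold")
    if ys.size:
        k = coeff[ys, xs].argmax()
        print(f"strongest coefficient at (y, x) = ({ys[k]}, {xs[k]})")
    ```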

  11. Direct Detection Electron Energy-Loss Spectroscopy: A Method to Push the Limits of Resolution and Sensitivity.

    PubMed

    Hart, James L; Lang, Andrew C; Leff, Asher C; Longo, Paolo; Trevor, Colin; Twesten, Ray D; Taheri, Mitra L

    2017-08-15

    In many cases, electron counting with direct detection sensors offers improved resolution, lower noise, and higher pixel density compared to conventional, indirect detection sensors for electron microscopy applications. Direct detection technology has previously been utilized, with great success, for imaging and diffraction, but potential advantages for spectroscopy remain unexplored. Here we compare the performance of a direct detection sensor operated in counting mode and an indirect detection sensor (scintillator/fiber-optic/CCD) for electron energy-loss spectroscopy. Clear improvements in measured detective quantum efficiency and combined energy resolution/energy field-of-view are offered by counting mode direct detection, showing promise for efficient spectrum imaging, low-dose mapping of beam-sensitive specimens, trace element analysis, and time-resolved spectroscopy. Despite the limited counting rate imposed by the readout electronics, we show that both core-loss and low-loss spectral acquisition are practical. These developments will benefit biologists, chemists, physicists, and materials scientists alike.

  12. Single photon detection in a waveguide-coupled Ge-on-Si lateral avalanche photodiode.

    PubMed

    Martinez, Nicholas J D; Gehl, Michael; Derose, Christopher T; Starbuck, Andrew L; Pomerene, Andrew T; Lentine, Anthony L; Trotter, Douglas C; Davids, Paul S

    2017-07-10

    We examine gated Geiger-mode operation of an integrated waveguide-coupled Ge-on-Si lateral avalanche photodiode (APD) and demonstrate single photon detection at low dark count for this mode of operation. Our integrated waveguide-coupled APD is fabricated using a selective epitaxial Ge-on-Si growth process, resulting in a separate absorption and charge multiplication (SACM) design compatible with our silicon photonics platform. Single photon detection efficiency and dark count rate are measured as a function of temperature in order to understand and optimize performance characteristics in this device. We report a single photon detection efficiency of 5.27% at 1310 nm and a dark count rate of 534 kHz at 80 K for a Ge-on-Si single photon avalanche diode. This dark count rate is the lowest for a Ge-on-Si single photon detector in this range of temperatures, while competitive detection efficiency is maintained. A jitter of 105 ps was measured for this device.

  13. Modeling the frequency-dependent detective quantum efficiency of photon-counting x-ray detectors.

    PubMed

    Stierstorfer, Karl

    2018-01-01

    To find a simple model for the frequency-dependent detective quantum efficiency (DQE) of photon-counting detectors in the low flux limit. Formulas for the spatial cross-talk, the noise power spectrum and the DQE of a photon-counting detector working at a given threshold are derived. The parameters are probabilities for types of events such as single counts in the central pixel, double counts in the central pixel and a neighboring pixel, or a single count in a neighboring pixel only. These probabilities can be derived in a simple model by extensive use of Monte Carlo techniques: the Monte Carlo x-ray propagation program MOCASSIM is used to simulate the energy deposition from the x-rays in the detector material. A simple charge cloud model using Gaussian clouds of fixed width is used for the propagation of the electric charge generated by the primary interactions. Both stages are combined in a Monte Carlo simulation randomizing the location of impact, which finally produces the required probabilities. The parameters of the charge cloud model are fitted to the spectral response to a polychromatic spectrum measured with our prototype detector. Based on the Monte Carlo model, the DQE of photon-counting detectors as a function of spatial frequency is calculated for various pixel sizes, photon energies, and thresholds. The frequency-dependent DQE of a photon-counting detector in the low flux limit can be described with an equation containing only a small set of probabilities as input. Estimates for the probabilities can be derived from a simple model of the detector physics. © 2017 American Association of Physicists in Medicine.
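
    The event-type probabilities that feed such a model (single count in the central pixel, double count shared with a neighbour, count only in a neighbour) can be estimated with a very small Monte Carlo of a Gaussian charge cloud landing at a random position in a pixel. The one-dimensional geometry, pitch, cloud width, photon energy and threshold below are illustrative assumptions, not the fitted values from the paper.

    ```python
    """Monte Carlo estimate of event-type probabilities for a photon-counting
    pixel with a Gaussian charge cloud (1-D simplification; pitch, cloud
    width, photon energy and threshold are illustrative assumptions)."""
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)

    pitch, sigma = 0.2, 0.05          # mm: pixel pitch and charge-cloud rms width
    e_photon, threshold = 60.0, 25.0  # keV: deposited energy and counting threshold
    n = 200_000

    # Impact position uniform across the central pixel, measured from its centre.
    x = rng.uniform(-pitch / 2, pitch / 2, n)

    # Charge fractions collected by the central pixel and its two neighbours.
    frac_centre = (norm.cdf((pitch / 2 - x) / sigma)
                   - norm.cdf((-pitch / 2 - x) / sigma))
    frac_left = norm.cdf((-pitch / 2 - x) / sigma)
    frac_right = 1.0 - norm.cdf((pitch / 2 - x) / sigma)

    hit_c = frac_centre * e_photon > threshold
    hit_n = (frac_left * e_photon > threshold) | (frac_right * e_photon > threshold)

    print(f"P(single count, centre only)   = {np.mean(hit_c & ~hit_n):.3f}")
    print(f"P(double count, centre+neigh.) = {np.mean(hit_c & hit_n):.3f}")
    print(f"P(count in neighbour only)     = {np.mean(~hit_c & hit_n):.3f}")
    print(f"P(no count at this threshold)  = {np.mean(~hit_c & ~hit_n):.3f}")
    ```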

  14. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications

    NASA Astrophysics Data System (ADS)

    Barber, W. C.; Wessel, J. C.; Nygard, E.; Iwanczyk, J. S.

    2015-06-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half-maximum (FWHM) across the entire dynamic range, and a noise floor about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications.

  15. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications

    PubMed Central

    Barber, W. C.; Wessel, J. C.; Nygard, E.; Iwanczyk, J. S.

    2014-01-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half-maximum (FWHM) across the entire dynamic range, and a noise floor of about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications. PMID:25937684

  16. Energy dispersive CdTe and CdZnTe detectors for spectral clinical CT and NDT applications.

    PubMed

    Barber, W C; Wessel, J C; Nygard, E; Iwanczyk, J S

    2015-06-01

    We are developing room temperature compound semiconductor detectors for applications in energy-resolved high-flux single x-ray photon-counting spectral computed tomography (CT), including functional imaging with nanoparticle contrast agents for medical applications and non-destructive testing (NDT) for security applications. Energy-resolved photon-counting can provide reduced patient dose through optimal energy weighting for a particular imaging task in CT, functional contrast enhancement through spectroscopic imaging of metal nanoparticles in CT, and compositional analysis through multiple basis function material decomposition in CT and NDT. These applications produce high input count rates from an x-ray generator delivered to the detector. Therefore, in order to achieve energy-resolved single photon counting in these applications, a high output count rate (OCR) for an energy-dispersive detector must be achieved at the required spatial resolution and across the required dynamic range for the application. The required performance in terms of the OCR, spatial resolution, and dynamic range must be obtained with sufficient field of view (FOV) for the application thus requiring the tiling of pixel arrays and scanning techniques. Room temperature cadmium telluride (CdTe) and cadmium zinc telluride (CdZnTe) compound semiconductors, operating as direct conversion x-ray sensors, can provide the required speed when connected to application specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel provided the sensors are designed for rapid signal formation across the x-ray energy ranges of the application at the required energy and spatial resolutions, and at a sufficiently high detective quantum efficiency (DQE). We have developed high-flux energy-resolved photon-counting x-ray imaging array sensors using pixellated CdTe and CdZnTe semiconductors optimized for clinical CT and security NDT. We have also fabricated high-flux ASICs with a two dimensional (2D) array of inputs for readout from the sensors. The sensors are guard ring free and have a 2D array of pixels and can be tiled in 2D while preserving pixel pitch. The 2D ASICs have four energy bins with a linear energy response across sufficient dynamic range for clinical CT and some NDT applications. The ASICs can also be tiled in 2D and are designed to fit within the active area of the sensors. We have measured several important performance parameters including: the output count rate (OCR) in excess of 20 million counts per second per square mm with a minimum loss of counts due to pulse pile-up, an energy resolution of 7 keV full width at half-maximum (FWHM) across the entire dynamic range, and a noise floor of about 20 keV. This is achieved by directly interconnecting the ASIC inputs to the pixels of the CdZnTe sensors incurring very little input capacitance to the ASICs. We present measurements of the performance of the CdTe and CdZnTe sensors including the OCR, FWHM energy resolution, noise floor, as well as the temporal stability and uniformity under the rapidly varying high flux expected in CT and NDT applications.

  17. Calibration of the JET neutron activation system for DT operation

    NASA Astrophysics Data System (ADS)

    Bertalot, L.; Roquemore, A. L.; Loughlin, M.; Esposito, B.

    1999-01-01

    The neutron activation system at JET is a pneumatic transfer system capable of positioning activation samples close to the plasma. Its primary purpose is to provide a calibration for the time-dependent neutron yield monitors (fission chambers and solid state detectors). Various activation reactions with different high energy thresholds were used including 56Fe(n,p) 56Mn, 27Al(n,α) 24Na, 93Nb(n,2n) 92mNb, and 28Si(n,p) 28Al reactions. The silicon reaction, with its short half life (2.25 min), provides a prompt determination of the 14 MeV DT yield. The neutron induced γ-ray activity of the Si samples was measured using three sodium iodide scintillators, while two high purity germanium detectors were used for other foils. It was necessary to use a range of sample masses and different counting geometries in order to cover the wide range of neutron yields (10^15-10^19 neutrons) while avoiding excessive count rates in the detectors. The absolute full energy peak efficiency calibration of the detectors was measured taking into account the source-detector geometry, the self-attenuation of the samples and cross-talk effects. An error analysis of the neutron yield measurement was performed including uncertainties in efficiency calibration, neutron transport calculations, cross sections, and counting statistics. Cross calibrations between the different irradiation ends were carried out in DD and DT (with 1% and 10% tritium content) discharges. The effect of the plasma vertical displacement was also experimentally studied. An agreement within 10% was found between the 14 MeV neutron yields measured from Si, Fe, Al, Nb samples in DT discharges.
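
    The activation-foil principle reduces to relating the measured gamma counts in a foil back to the neutron fluence through the full-energy-peak efficiency, the gamma branching ratio, and build-up/decay factors for the irradiation, cooling and counting intervals. The sketch below implements that standard activation chain for a single foil under a constant-rate assumption; every numerical value is a placeholder rather than a JET calibration figure.

    ```python
    """Single-foil activation analysis: from net full-energy-peak counts to
    the neutron fluence rate at the foil, assuming a constant reaction rate
    during the irradiation. All numbers are placeholders."""
    import math

    def fluence_rate(net_counts, eff_peak, branching, half_life_s,
                     t_irr_s, t_cool_s, t_count_s, n_target_atoms, sigma_cm2):
        lam = math.log(2.0) / half_life_s
        buildup = 1.0 - math.exp(-lam * t_irr_s)     # activity build-up while irradiating
        cooling = math.exp(-lam * t_cool_s)          # decay before counting starts
        counting = 1.0 - math.exp(-lam * t_count_s)  # fraction decaying while counting
        reactions_per_s = net_counts * lam / (eff_peak * branching
                                              * buildup * cooling * counting)
        return reactions_per_s / (n_target_atoms * sigma_cm2)   # n / (cm^2 s)

    # Placeholder numbers loosely shaped like a 28Si(n,p)28Al foil measurement.
    phi = fluence_rate(net_counts=5.0e4, eff_peak=0.04, branching=1.0,
                       half_life_s=2.25 * 60.0, t_irr_s=10.0, t_cool_s=60.0,
                       t_count_s=300.0, n_target_atoms=2.0e22, sigma_cm2=2.5e-25)
    print(f"fluence rate at the foil ~ {phi:.2e} n cm^-2 s^-1")
    ```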

  18. Comparison of planar, PET and well-counter measurements of total tumor radioactivity in a mouse xenograft model.

    PubMed

    Green, Michael V; Seidel, Jurgen; Williams, Mark R; Wong, Karen J; Ton, Anita; Basuli, Falguni; Choyke, Peter L; Jagoda, Elaine M

    2017-10-01

    Quantitative small animal radionuclide imaging studies are often carried out with the intention of estimating the total radioactivity content of various tissues such as the radioactivity content of mouse xenograft tumors exposed to putative diagnostic or therapeutic agents. We show that for at least one specific application, positron projection imaging (PPI) and PET yield comparable estimates of absolute total tumor activity and that both of these estimates are highly correlated with direct well-counting of these same tumors. These findings further suggest that in this particular application, PPI is a far more efficient data acquisition and processing methodology than PET. Forty-one athymic mice were implanted with PC3 human prostate cancer cells transfected with prostate-specific membrane antigen (PSMA (+)) and one additional animal (for a total of 42) with a control blank vector (PSMA (-)). All animals were injected with [18F]DCFPyL, a ligand for PSMA, and imaged for total tumor radioactivity with PET and PPI. The tumors were then removed, assayed by well counting for total radioactivity and the values between these methods intercompared. PET, PPI and well-counter estimates of total tumor radioactivity were highly correlated (R^2 > 0.98) with regression line slopes near unity (0.95
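
    The cross-method agreement reported here is ordinary least-squares regression of one activity estimate against another, summarized by the slope and R^2. The sketch below does this with numpy on synthetic paired measurements standing in for the real tumor data.

    ```python
    """Slope and R^2 for paired activity estimates (e.g. PET or projection
    imaging vs well counter); synthetic data stand in for the real tumors."""
    import numpy as np

    rng = np.random.default_rng(6)

    truth = rng.uniform(0.05, 2.0, 42)                 # well-counter activity (MBq)
    imaging = 0.97 * truth * (1 + 0.05 * rng.standard_normal(42))  # imaging estimate

    slope, intercept = np.polyfit(truth, imaging, 1)
    pred = slope * truth + intercept
    r2 = 1.0 - np.sum((imaging - pred) ** 2) / np.sum((imaging - imaging.mean()) ** 2)
    print(f"slope = {slope:.3f}, intercept = {intercept:.3f} MBq, R^2 = {r2:.3f}")
    ```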

  19. The prevalence of abnormal leukocyte count, and its predisposing factors, in patients with sickle cell disease in Saudi Arabia.

    PubMed

    Ahmed, Anwar E; Ali, Yosra Z; Al-Suliman, Ahmad M; Albagshi, Jafar M; Al Salamah, Majid; Elsayid, Mohieldin; Alanazi, Wala R; Ahmed, Rayan A; McClish, Donna K; Al-Jahdali, Hamdan

    2017-01-01

    A high white blood cell (WBC) count is an indicator of sickle cell disease (SCD) severity; however, there are limited studies on WBC counts in Saudi Arabian patients with SCD. The aim of this study was to estimate the prevalence of abnormal leukocyte count (either low or high) and identify factors associated with high WBC counts in a sample of Saudi patients with SCD. A cross-sectional and retrospective chart review study was carried out on 290 SCD patients who were routinely treated at King Fahad Hospital in Hofuf, Saudi Arabia. An interview was conducted to assess clinical presentations, and we reviewed patient charts to collect data on blood test parameters for the previous 6 months. Almost half (131 [45.2%]) of the sample had abnormal leukocyte counts: low WBC counts in 15 (5.2%) and high in 116 (40%). High WBC counts were associated with shortness of breath (P = 0.022), tiredness (P = 0.039), swelling in hands/feet (P = 0.020), and back pain (P = 0.007). The mean hemoglobin was higher in patients with normal WBC counts (P = 0.024), while the mean hemoglobin S was higher in patients with high WBC counts (P = 0.003). After adjustment for potential confounders, predictors of high WBC counts were male gender (adjusted odds ratio [aOR] = 3.63), cough (aOR = 2.18), low hemoglobin (aOR = 0.76), and low heart rate (aOR = 0.97). Abnormal leukocyte count was common, affecting approximately five in ten of the Saudi SCD patients assessed in this sample. Male gender, cough, low hemoglobin, and low heart rate were associated with high WBC count. Strategies targeting high WBC count could prevent disease complications and thus could be beneficial for SCD patients.

  20. Sensitive and transportable gadolinium-core plastic scintillator sphere for neutron detection and counting

    NASA Astrophysics Data System (ADS)

    Dumazert, Jonathan; Coulon, Romain; Carrel, Frédérick; Corre, Gwenolé; Normand, Stéphane; Méchin, Laurence; Hamel, Matthieu

    2016-08-01

    Neutron detection forms a critical branch of nuclear-related issues, currently driven by the search for competitive alternative technologies to neutron counters based on the helium-3 isotope. The deployment of plastic scintillators shows a high potential for efficient detectors that are safer and more reliable than liquids, and more easily scalable and cost-effective than inorganic scintillators. In the meantime, natural gadolinium, through its 155 and mostly 157 isotopes, presents an exceptionally high interaction probability with thermal neutrons. This paper introduces a dual system including a metal gadolinium core inserted at the center of a large plastic scintillator sphere. Incident fast neutrons are thermalized by the scintillator shell and then may be captured with a significant probability by gadolinium-155 and gadolinium-157 nuclei in the core. The deposition of a sufficient fraction of the high-energy prompt gamma signature of the capture inside the scintillator shell then allows discrimination from background radiation by an energy threshold, and therefore neutron detection. The scaling of the system with the Monte Carlo MCNPX2.7 code was carried out according to a tradeoff between the moderation of incident fast neutrons and the probability of slow neutron capture by a moderate-cost metal gadolinium core. Based on the parameters extracted from simulation, a first laboratory prototype for the assessment of the detection method principle has been synthesized. The robustness and sensitivity of the neutron detection principle are then assessed by counting measurement experiments. Experimental results confirm the potential for a stable, highly sensitive, transportable and cost-efficient neutron detector and orient future investigation toward promising directions.

  1. Fiber optic light collection system for scanning-tunneling-microscope-induced light emission.

    PubMed

    Watkins, Neil J; Long, James P; Kafafi, Zakya H; Mäkinen, Antti J

    2007-05-01

    We report a compact light collection scheme suitable for retrofitting a scanning tunneling microscope (STM) for STM-induced light emission experiments. The approach uses a pair of optical fibers with large core diameters and high numerical apertures to maximize light collection efficiency and to moderate the mechanical precision required for alignment. Bench tests indicate that efficiency reduction is almost entirely due to reflective losses at the fiber ends, while losses due to fiber misalignment have virtually been eliminated. Photon-map imaging with nanometer features is demonstrated on a stepped Au(111) surface with signal rates exceeding 10^4 counts/s.

  2. New Developments in Nickel-Hydrogen Dependent Pressure Vessel (DPV) Cell and Battery Design

    NASA Technical Reports Server (NTRS)

    Caldwell, Dwight B.; Fox, Chris L.; Miller, Lee E.

    1997-01-01

    The Dependent Pressure Vessel (DPV) Nickel-Hydrogen (NiH2) design is being developed as an advanced battery for military and commercial, aerospace and terrestrial applications. The DPV cell design offers high specific energy and energy density as well as reduced cost, while retaining the established Individual Pressure Vessel (IPV) technology flight heritage and database. This advanced DPV design also offers a more efficient mechanical, electrical and thermal cell and battery configuration and a reduced part count. The DPV battery design promotes compact, minimum volume packaging and weight efficiency, and delivers cost and weight savings with minimal design risk.

  3. Negative Avalanche Feedback Detectors for Photon-Counting Optical Communications

    NASA Technical Reports Server (NTRS)

    Farr, William H.

    2009-01-01

    Negative Avalanche Feedback photon counting detectors with near-infrared spectral sensitivity offer an alternative to conventional Geiger-mode avalanche photodiode or phototube detectors for free space communications links at 1 and 1.55 microns. These devices demonstrate linear-mode photon counting without requiring any external reset circuitry and may even be operated at room temperature. We have now characterized the detection efficiency, dark count rate, after-pulsing, and single photon jitter for three variants of this new detector class, and have operated these uniquely simple-to-use devices in actual photon-starved free-space optical communications links.

  4. Laboratory productivity and the rate of manual peripheral blood smear review: a College of American Pathologists Q-Probes study of 95,141 complete blood count determinations performed in 263 institutions.

    PubMed

    Novis, David A; Walsh, Molly; Wilkinson, David; St Louis, Mary; Ben-Ezra, Jonathon

    2006-05-01

    Automated laboratory hematology analyzers are capable of performing differential counts on peripheral blood smears with greater precision and more accurate detection of distributional and morphologic abnormalities than those performed by manual examinations of blood smears. Manual determinations of blood morphology and leukocyte differential counts are time-consuming, expensive, and may not always be necessary. The frequency with which hematology laboratory workers perform manual screens despite the availability of labor-saving features of automated analyzers is unknown. To determine the normative rates with which manual peripheral blood smears were performed in clinical laboratories, to examine laboratory practices associated with higher or lower manual review rates, and to measure the effects of manual smear review on the efficiency of generating complete blood count (CBC) determinations. From each of 3 traditional shifts per day, participants were asked to serially select 10 automated CBC specimens and to indicate whether manual scans and/or reviews with complete differential counts were performed on blood smears prepared from those specimens. Sampling continued until a total of 60 peripheral smears were reviewed manually. For each specimen on which a manual review was performed, participants indicated the patient's age, hemoglobin value, white blood cell count, platelet count, and the primary reason why the manual review was performed. Participants also submitted data concerning their institutions' demographic profiles and their laboratories' staffing, work volume, and practices regarding CBC determinations. The rates of manual reviews and estimations of efficiency in performing CBC determinations were obtained from the data. A total of 263 hospitals and independent laboratories, predominantly located in the United States, participated in the College of American Pathologists Q-Probes Program. There were 95,141 CBC determinations examined in this study; participants reviewed 15,423 (16.2%) peripheral blood smears manually. In the median institution (50th percentile), manual reviews of peripheral smears were performed on 26.7% of specimens. Manual differential count review rates were inversely associated with the magnitude of platelet counts that were required by laboratory policy to trigger smear reviews and with the efficiency of generating CBC reports. Lower manual differential count review rates were associated with laboratory policies that allowed manual reviews solely on the basis of abnormal automated red cell parameters and that precluded performing repeat manual reviews within designated time intervals. The manual scan rate increased with the number of hospital beds. In more than one third (35.7%) of the peripheral smears reviewed manually, participants claimed to have learned additional information beyond what was available on automated hematology analyzer printouts alone. By adopting certain laboratory practices, it may be possible to reduce the rates of manual reviews of peripheral blood smears and increase the efficiency of generating CBC results.

  5. Experimental Ten-Photon Entanglement.

    PubMed

    Wang, Xi-Lin; Chen, Luo-Kan; Li, W; Huang, H-L; Liu, C; Chen, C; Luo, Y-H; Su, Z-E; Wu, D; Li, Z-D; Lu, H; Hu, Y; Jiang, X; Peng, C-Z; Li, L; Liu, N-L; Chen, Yu-Ao; Lu, Chao-Yang; Pan, Jian-Wei

    2016-11-18

    We report the first experimental demonstration of quantum entanglement among ten spatially separated single photons. A near-optimal entangled photon-pair source was developed with simultaneously a source brightness of ∼12  MHz/W, a collection efficiency of ∼70%, and an indistinguishability of ∼91% between independent photons, which was used for a step-by-step engineering of multiphoton entanglement. Under a pump power of 0.57 W, the ten-photon count rate was increased by about 2 orders of magnitude compared to previous experiments, while maintaining a state fidelity sufficiently high for proving the genuine ten-particle entanglement. Our work created a state-of-the-art platform for multiphoton experiments, and enabled technologies for challenging optical quantum information tasks, such as the realization of Shor's error correction code and high-efficiency scattershot boson sampling.

  6. Measurement of tritium in natural water

    NASA Astrophysics Data System (ADS)

    Li, Meifen

    1985-06-01

    A detergent-scintillation liquid mixture applied to measure the low specific activity of tritium in natural water was studied. The DYS-1 low level liquid scintillation counter designed and manufactured by our institute was employed. By comparing the Triton X-100 scintillation liquid mixture with the dioxane-based scintillation liquid, an improved formula for the Triton X-100 scintillation liquid mixture was determined; the mixture accepts a high water content and provides high efficiency and low background in measuring tritium in water. Chemiluminescence of the Triton X-100 scintillation liquid mixture can be totally de-excited in a short time. It can be employed at ambient temperatures of 11-28°C. For a 20 ml sample in quartz vials, the counting efficiency is 15% with a background of 2.17 cpm and Y = 31 TU (t = 30 min).
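
    The quoted figures translate into the usual liquid-scintillation arithmetic: net count rate above background divided by efficiency, then converted to tritium units. The sketch below works through that calculation for a 20 ml aliquot; the gross count rate and the conversion factor of roughly 0.118 Bq per litre of water per TU are assumptions added for illustration.

    ```python
    """Tritium activity from liquid-scintillation counting: net rate above
    background divided by efficiency, converted to tritium units (TU).
    Efficiency and background follow the abstract; the gross rate and the
    TU conversion (~0.118 Bq/L per TU) are assumptions."""

    efficiency = 0.15            # counting efficiency from the abstract
    background_cpm = 2.17        # background count rate from the abstract
    sample_ml = 20.0
    bq_per_litre_per_tu = 0.118  # assumed conversion: 1 TU ~= 0.118 Bq/L

    gross_cpm = 3.10             # hypothetical measured gross count rate
    net_cps = (gross_cpm - background_cpm) / 60.0
    activity_bq = net_cps / efficiency                     # Bq in the 20 ml aliquot
    activity_bq_per_l = activity_bq / (sample_ml / 1000.0)
    print(f"net rate = {gross_cpm - background_cpm:.2f} cpm, "
          f"activity = {activity_bq_per_l:.2f} Bq/L "
          f"= {activity_bq_per_l / bq_per_litre_per_tu:.0f} TU")
    ```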

  7. Influence of raw milk quality on processed dairy products: How do raw milk quality test results relate to product quality and yield?

    PubMed

    Murphy, Steven C; Martin, Nicole H; Barbano, David M; Wiedmann, Martin

    2016-12-01

    This article provides an overview of the influence of raw milk quality on the quality of processed dairy products and offers a perspective on the merits of investing in quality. Dairy farmers are frequently offered monetary premium incentives to provide high-quality milk to processors. These incentives are most often based on raw milk somatic cell and bacteria count levels well below the regulatory public health-based limits. Justification for these incentive payments can be based on improved processed product quality and manufacturing efficiencies that provide the processor with a return on their investment for high-quality raw milk. In some cases, this return on investment is difficult to measure. Raw milks with high levels of somatic cells and bacteria are associated with increased enzyme activity that can result in product defects. Use of raw milk with somatic cell counts >100,000 cells/mL has been shown to reduce cheese yields, and higher levels, generally >400,000 cells/mL, have been associated with textural and flavor defects in cheese and other products. Although most research indicates that fairly high total bacteria counts (>1,000,000 cfu/mL) in raw milk are needed to cause defects in most processed dairy products, receiving high-quality milk from the farm allows some flexibility for handling raw milk, which can increase efficiencies and reduce the risk of raw milk reaching bacterial levels of concern. Monitoring total bacterial numbers in regard to raw milk quality is imperative, but determining levels of specific types of bacteria present has gained increasing importance. For example, spores of certain spore-forming bacteria present in raw milk at very low levels (e.g., <1/mL) can survive pasteurization and grow in milk and cheese products to levels that result in defects. With the exception of meeting product specifications often required for milk powders, testing for specific spore-forming groups is currently not used in quality incentive programs in the United States but is used in other countries (e.g., the Netherlands). Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  8. Energy-correction photon counting pixel for photon energy extraction under pulse pile-up

    NASA Astrophysics Data System (ADS)

    Lee, Daehee; Park, Kyungjin; Lim, Kyung Taek; Cho, Gyuseong

    2017-06-01

    A photon counting detector (PCD) has been proposed as an alternative solution to the energy-integrating detector (EID) in the medical imaging field due to its high resolution, high efficiency, and low noise. The PCD has expanded to a variety of fields such as spectral CT, k-edge imaging, and material decomposition owing to its capability to count the number and measure the energy of incident photons. Nonetheless, pulse pile-up, which is a superimposition of pulses at the output of a charge sensitive amplifier (CSA) in each PC pixel, occurs frequently as the X-ray flux increases due to the finite pulse processing time (PPT) of the CSAs. Pulse pile-up induces not only a count loss but also distortion in the measured X-ray spectrum from each PC pixel, and thus it is a main constraint on the use of PCDs in high-flux X-ray applications. To minimize these effects, an energy-correction PC (ECPC) pixel is proposed to resolve pulse pile-up without shortening the PPT, by adding an energy correction logic (ECL) via a cross detection method (CDM). The ECPC pixel, with a size of 200×200 μm^2, was fabricated using a 6-metal 1-poly 0.18 μm CMOS process with a static power consumption of 7.2 μW/pixel. The maximum count rate of the ECPC pixel was extended to approximately three times that of a conventional PC pixel with a PPT of 500 ns. The X-ray spectrum at 90 kVp, filtered by a 3 mm Al filter, was measured as the X-ray tube current was increased, using a CdTe sensor and the ECPC pixel. As a result, the ECPC pixel dramatically reduced the energy spectrum distortion at 2 Mphotons/pixel/s when compared to a conventional PC pixel with the same 500 ns PPT.
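
    The count-loss behaviour produced by pile-up is commonly approximated with a dead-time model in which the pulse processing time plays the role of the dead time. The sketch below compares the standard paralyzable and non-paralyzable expressions for a 500 ns PPT; this generic textbook model is shown for illustration only and is not the ECPC pixel's correction logic.

    ```python
    """Recorded vs true count rate for a photon-counting pixel with a finite
    pulse processing time (dead-time approximation, 500 ns). Generic model
    for illustration; not the ECPC pixel's correction scheme."""
    import numpy as np

    ppt_s = 500e-9                                   # pulse processing time
    true_rate = np.array([1e5, 5e5, 1e6, 2e6, 5e6])  # photons/pixel/s

    non_paralyzable = true_rate / (1.0 + true_rate * ppt_s)
    paralyzable = true_rate * np.exp(-true_rate * ppt_s)

    for n, m1, m2 in zip(true_rate, non_paralyzable, paralyzable):
        print(f"true {n:9.0f} /s -> recorded {m1:9.0f} (non-par.) / {m2:9.0f} (par.)")
    ```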

  9. Assessing avian richness in remnant wetlands: Towards an improved methodology

    USGS Publications Warehouse

    Krzys, Greg; Waite, Thomas A.; Stapanian, Martin; Vucetich, John A.

    2002-01-01

    Because the North American Breeding Bird Survey provides inadequate coverage of wetland habitat, the Wetland Breeding Bird Survey was recently established in Ohio, USA. This program relies on volunteers to conduct 3 counts at each monitored wetland. Currently, all counts are conducted during the morning. Under the premise that volunteer participation could be increased by allowing evening counts, we evaluated the potential for modifying the methodology. We evaluated the sampling efficiency of all 3-count combinations of morning and evening counts using data collected at 14 wetlands. Estimates of overall species richness decreased with increasing numbers of evening counts. However, this pattern did not hold when analyses were restricted to wetland-dependent species or those of conservation concern. Our findings suggest that it would be reasonable to permit evening counts, particularly if the data are to be used to monitor wetland dependent species and those of concern.

  10. Visual versus mechanised leucocyte differential counts: costing and evaluation of traditional and Hemalog D methods.

    PubMed

    Hudson, M J; Green, A E

    1980-11-01

    Visual differential counts were examined for efficiency, cost effectiveness, and staff acceptability within our laboratory. A comparison with the Hemalog D system was attempted. The advantages and disadvantages of each system are enumerated and discussed in the context of a large general hospital.

  11. Advanced electrical power, distribution and control for the Space Transportation System

    NASA Astrophysics Data System (ADS)

    Hansen, Irving G.; Brandhorst, Henry W., Jr.

    1990-08-01

    High-frequency power distribution and management is at a technology-ready state of development. Such a system employs the fewest power conversion steps and uses zero-current switching for those steps. It yields the highest efficiency and the lowest total system parts count when equivalent systems are compared. The operating voltage and frequency are application-specific trade-off parameters; however, a 20 kHz system is suitable for a wide range of systems.

  12. Advanced electrical power, distribution and control for the Space Transportation System

    NASA Technical Reports Server (NTRS)

    Hansen, Irving G.; Brandhorst, Henry W., Jr.

    1990-01-01

    High-frequency power distribution and management is at a technology-ready state of development. Such a system employs the fewest power conversion steps and uses zero-current switching for those steps. It yields the highest efficiency and the lowest total system parts count when equivalent systems are compared. The operating voltage and frequency are application-specific trade-off parameters; however, a 20 kHz system is suitable for a wide range of systems.

  13. Measurement of effective detective quantum efficiency for a photon counting scanning mammography system and comparison with two flat panel full-field digital mammography systems

    NASA Astrophysics Data System (ADS)

    Wood, Tim J.; Moore, Craig S.; Saunderson, John R.; Beavis, Andrew W.

    2018-01-01

    Effective detective quantum efficiency (eDQE) describes the resolution and noise properties of an imaging system along with scatter and primary transmission, all measured under clinically appropriate conditions. Effective dose efficiency (eDE) is the eDQE normalised to mean glandular dose and has been proposed as a useful metric for the optimisation of clinical imaging systems. The aim of this study was to develop a methodology for measuring eDQE and eDE on a Philips microdose mammography (MDM) L30 photon counting scanning system, and to compare performance with two conventional flat panel systems. A custom-made lead blocker was manufactured to enable accurate dose measurements, and modulation transfer functions were determined free-in-air at heights of 2, 4 and 6 cm above the breast support platform. eDQE values were calculated for the Philips MDM L30, Hologic Dimensions and Siemens Inspiration digital mammography systems for 2, 4 and 6 cm thick poly(methyl methacrylate) (PMMA). The beam qualities (target/filter and kilovoltage) assessed were those selected by the automatic exposure control, and anti-scatter grids were used where available. Measurements of eDQE demonstrate significant differences in performance between the slit- and scan-directions for the photon counting imaging system. MTF has been shown to be the limiting factor in the scan-direction, which results in a rapid fall in eDQE at mid-to-high spatial frequencies. A comparison with two flat panel mammography systems demonstrates that this may limit image quality for small details, such as micro-calcifications, which correlates with a more conventional image quality assessment with the CDMAM phantom. eDE results show that the scanning photon counting system offers superior performance for low spatial frequencies, which will be important for the detection of large low contrast masses. Both eDQE and eDE are proposed as useful metrics that should enable optimisation of the Philips MDM L30.

  14. A study of reconstruction accuracy for a cardiac SPECT system with multi-segmental collimation

    NASA Astrophysics Data System (ADS)

    Yu, D.-C.; Chang, W.; Pan, T.-S.

    1997-06-01

    To improve the geometric efficiency of cardiac SPECT imaging, the authors previously proposed using multi-segmental collimation with a cylindrical geometry. The proposed collimator consists of multiple parallel-hole collimators with most of the segments directed toward a small central region, where the patient's heart should be positioned. This technique provides a significantly increased detection efficiency for the central region, but at the expense of reduced efficiency for the surrounding region. The authors have used computer simulations to evaluate the implications of this technique for the accuracy of the reconstructed cardiac images. Two imaging situations were simulated: 1) the heart well placed inside the central region, and 2) the heart shifted and partially outside the central region. A neighboring high-uptake liver was simulated for both imaging situations. The images were reconstructed and corrected for attenuation with ML-EM and OS-EM methods using a complete attenuation map. The results indicate that errors caused by projection truncation are not significant and are not strongly dependent on the activity of the liver when the heart is well positioned within the central region. When the heart is partially outside the central region, hybrid emission data (a combination of high-count projections from the central region and low-count projections from the background region) can be used to restore the activity of the truncated section of the myocardium. However, the variance of the image in the section of the myocardium outside the central region is increased by 2-3 times when 10% of the collimator segments are used to image the background region.

  15. Geiger-mode avalanche photodiode focal plane arrays for three-dimensional imaging LADAR

    NASA Astrophysics Data System (ADS)

    Itzler, Mark A.; Entwistle, Mark; Owens, Mark; Patel, Ketan; Jiang, Xudong; Slomkowski, Krystyna; Rangwala, Sabbir; Zalud, Peter F.; Senko, Tom; Tower, John; Ferraro, Joseph

    2010-09-01

    We report on the development of focal plane arrays (FPAs) employing two-dimensional arrays of InGaAsP-based Geiger-mode avalanche photodiodes (GmAPDs). These FPAs incorporate InP/InGaAs(P) Geiger-mode avalanche photodiodes (GmAPDs) to create pixels that detect single photons at shortwave infrared wavelengths with high efficiency and low dark count rates. GmAPD arrays are hybridized to CMOS read-out integrated circuits (ROICs) that enable independent laser radar (LADAR) time-of-flight measurements for each pixel, providing three-dimensional image data at frame rates approaching 200 kHz. Microlens arrays are used to maintain a high fill factor of greater than 70%. We present full-array performance maps for two different types of sensors optimized for operation at 1.06 μm and 1.55 μm, respectively. For the 1.06 μm FPAs, overall photon detection efficiency of >40% is achieved at <20 kHz dark count rates with modest cooling to ~250 K using integrated thermoelectric coolers. We also describe the first evaluation of these FPAs when multi-photon pulses are incident on single pixels. The effective detection efficiency for multi-photon pulses shows excellent agreement with predictions based on Poisson statistics. We also characterize the crosstalk as a function of pulse mean photon number. Relative to the intrinsic crosstalk contribution from hot carrier luminescence that occurs during avalanche current flows resulting from single incident photons, we find a modest rise in crosstalk for multi-photon incident pulses that can be accurately explained by direct optical scattering.

  16. Time-resolved single-photon detection module based on silicon photomultiplier: A novel building block for time-correlated measurement systems

    NASA Astrophysics Data System (ADS)

    Martinenghi, E.; Di Sieno, L.; Contini, D.; Sanzaro, M.; Pifferi, A.; Dalla Mora, A.

    2016-07-01

    We present the design and preliminary characterization of the first detection module based on Silicon Photomultiplier (SiPM) tailored for single-photon timing applications. The aim of this work is to demonstrate, thanks to the design of a suitable module, the possibility of easily exploiting SiPMs in many applications as an interesting detector featuring large active area, similarly to photomultiplier tubes, but keeping the advantages of solid state detectors (high quantum efficiency, low cost, compactness, robustness, low bias voltage, and insensitivity to magnetic fields). The module integrates a cooled SiPM with a total photosensitive area of 1 mm2 together with the suitable avalanche signal read-out circuit, the signal conditioning, the biasing electronics, and a Peltier cooler driver for thermal stabilization. It is able to extract the single-photon timing information with resolution better than 100 ps full-width at half maximum. We verified the effective stabilization in response to external thermal perturbations, thus proving the complete insensitivity of the module to environment temperature variations, which represents a fundamental parameter to profitably use the instrument for real-field applications. We also characterized the single-photon timing resolution, the background noise due to both primary dark count generation and afterpulsing, the single-photon detection efficiency, and the instrument response function shape. The proposed module can become a reliable and cost-effective building block for time-correlated single-photon counting instruments in applications requiring high collection capability of isotropic light and detection efficiency (e.g., fluorescence decay measurements or time-domain diffuse optics systems).

  17. Efficient Detection of Copy Number Mutations in PMS2 Exons with a Close Homolog.

    PubMed

    Herman, Daniel S; Smith, Christina; Liu, Chang; Vaughn, Cecily P; Palaniappan, Selvi; Pritchard, Colin C; Shirts, Brian H

    2018-07-01

    Detection of 3' PMS2 copy-number mutations that cause Lynch syndrome is difficult because of highly homologous pseudogenes. To improve the accuracy and efficiency of clinical screening for these mutations, we developed a new method to analyze standard capture-based, next-generation sequencing data to identify deletions and duplications in PMS2 exons 9 to 15. The approach captures sequences using PMS2 targets, maps sequences randomly among regions with equal mapping quality, counts reads aligned to homologous exons and introns, and flags read count ratios outside of empirically derived reference ranges. The method was trained on 1352 samples, including 8 known positives, and tested on 719 samples, including 17 known positives. Clinical implementation of the first version of this method detected new mutations in the training (N = 7) and test (N = 2) sets that had not been identified by our initial clinical testing pipeline. The described final method showed complete sensitivity in both sample sets and false-positive rates of 5% (training) and 7% (test), dramatically decreasing the number of cases needing additional mutation evaluation. This approach leveraged the differences between gene and pseudogene to distinguish between PMS2 and PMS2CL copy-number mutations. These methods enable efficient and sensitive Lynch syndrome screening for 3' PMS2 copy-number mutations and may be applied similarly to other genomic regions with highly homologous pseudogenes. Copyright © 2018 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
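
    The core of the screening step described above is a comparison of per-exon read-count ratios against empirically derived reference ranges. The sketch below illustrates that idea only; the function, variable names, the depth normalisation, and the z-score threshold form are assumptions for illustration, not the authors' pipeline.

```python
import numpy as np

def flag_copy_number(sample_ratios, reference_ratios, z_cutoff=3.0):
    """Flag exons whose depth-normalized read-count ratio falls outside an
    empirical reference range (approximated here as mean +/- z_cutoff * SD).

    sample_ratios    : dict exon -> (reads in exon / total reads) for the test sample
    reference_ratios : dict exon -> list of the same ratio across known-normal samples
    """
    flags = {}
    for exon, ratio in sample_ratios.items():
        ref = np.asarray(reference_ratios[exon], dtype=float)
        z = (ratio - ref.mean()) / ref.std(ddof=1)
        if z <= -z_cutoff:
            flags[exon] = ("possible deletion", round(float(z), 2))
        elif z >= z_cutoff:
            flags[exon] = ("possible duplication", round(float(z), 2))
    return flags

# Hypothetical example: "exon 11" shows roughly half the expected relative depth.
sample = {"PMS2_ex11": 0.0006, "PMS2_ex12": 0.0013}
reference = {"PMS2_ex11": [0.0012, 0.0011, 0.0013, 0.0012],
             "PMS2_ex12": [0.0013, 0.0012, 0.0014, 0.0013]}
print(flag_copy_number(sample, reference))  # flags only PMS2_ex11
```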

  18. Radiation detectors and sources enhanced with micro/nanotechnology

    NASA Astrophysics Data System (ADS)

    Whitney, Chad Michael

    The ongoing threat of nuclear terrorism presents major challenges to maintaining national security. Currently, only a small percentage of the cargo containers that enter America are searched for fissionable bomb-making materials. This work reports on a multi-channel radiation detection platform enabled with nanoparticles that is capable of detecting and discriminating all types of radiation emitted from fissionable bomb-making materials. Typical Geiger counters are limited to detecting only beta and gamma radiation. The micro-Geiger counter reported here detects all species of radiation including beta particles, gamma/X-rays, alpha particles, and neutrons. The multi-species detecting micro-Geiger counter contains a hermetically sealed and electrically biased fill gas. Impinging radiation interacts with tailored nanoparticles to release secondary charged particles that ionize the fill gas. The ionized particles collect on respectively biased electrodes, resulting in a characteristic electrical pulse. Pulse height spectroscopy and radiation energy binning techniques can then be used to analyze the pulses to determine the specific radiation isotope. The ideal voltage range of operation for energy discrimination was found to be in the proportional region at 1000 VDC. In this region, different radiation species produced distinct pulse heights. The amplification region strength, which determines the device's sensitivity to radiation energy, can be tuned with the electrode separation distance. Considerable improvements in count rates were achieved by using the charge conversion nanoparticles with the highest cross sections for particular radiation species. The addition of tungsten nanoparticles to the microGeiger counter enabled the device to be four times more efficient at detecting low-level beta particles with a dose rate of 3.2 uR/hr (micro-Roentgen per hour) and just under three times more efficient than an off-the-shelf Geiger counter. The addition of lead nanoparticles enabled the gamma/X-ray microGeiger counter channel to be 28 times more efficient at detecting low-level gamma rays with a dose rate of 10 uR/hr when compared to a device without nanoparticles. The addition of 10B nanoparticles enabled the neutron microGeiger counter channel to be 17 times more efficient at detecting neutrons. The device achieved a neutron count rate of 9,866 counts per minute when compared to a BF3 tube, which resulted in a count rate of 9,000 counts per minute. By using a novel micro-injection ceramic molding and low temperature (950°C) silver paste metallizing process, the batch fabrication of essentially disposable micro-devices can be achieved. This novel fabrication technique was then applied to a MEMS neutron gun and water spectroscopy device that also utilizes the high voltage/temperature insulating packaging.

  19. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    NASA Astrophysics Data System (ADS)

    Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.

    2018-01-01

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have likewise only recently been formulated. The current paper presents the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. In order to assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. The DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.
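
    For orientation, the simplest singles-rate dead-time relations, which the multiplicity-order DCF formalism generalizes, are the textbook non-paralyzable and paralyzable forms relating the measured rate m to the true rate n through the dead time τ. These are standard expressions for context only, not the DCF correction itself.

```latex
% Singles-rate dead-time corrections (textbook forms, not the DCF formalism):
% non-paralyzable model                paralyzable model
n = \frac{m}{1 - m\,\tau}, \qquad\quad m = n\, e^{-n\,\tau}.
```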

  20. Measurement of the Solar Electron Neutrino Flux with the Homestake Chlorine Detector

    NASA Astrophysics Data System (ADS)

    Cleveland, Bruce T.; Daily, Timothy; Davis, Raymond, Jr.; Distel, James R.; Lande, Kenneth; Lee, C. K.; Wildenhain, Paul S.; Ullman, Jack

    1998-03-01

    The Homestake Solar Neutrino Detector, based on the inverse beta-decay reaction νe + 37Cl --> 37Ar + e-, has been measuring the flux of solar neutrinos since 1970. The experiment has operated in a stable manner throughout this time period. All aspects of this detector are reviewed, with particular emphasis on the determination of the extraction and counting efficiencies, the key experimental parameters that are necessary to convert the measured 37Ar count rate to the solar neutrino production rate. A thorough consideration is also given to the systematics of the detector, including the measurement of the extraction and counting efficiencies and the nonsolar production of 37Ar. The combined result of 108 extractions is a solar neutrino-induced 37Ar production rate of 2.56 +/- 0.16 (statistical) +/- 0.16 (systematic) SNU.

  1. Seamless Insert-Plasmid Assembly at High Efficiency and Low Cost

    PubMed Central

    Benoit, Roger M.; Ostermeier, Christian; Geiser, Martin; Li, Julia Su Zhou; Widmer, Hans; Auer, Manfred

    2016-01-01

    Seamless cloning methods, such as co-transformation cloning, sequence- and ligation-independent cloning (SLIC) or the Gibson assembly, are essential tools for the precise construction of plasmids. The efficiency of co-transformation cloning is, however, low and the Gibson assembly reagents are expensive. With the aim of improving the robustness of seamless cloning experiments while keeping costs low, we examined the importance of complementary single-stranded DNA ends for co-transformation cloning and the influence of single-stranded gaps in circular plasmids on SLIC cloning efficiency. Most importantly, our data show that single-stranded gaps in double-stranded plasmids, which occur in typical SLIC protocols, can drastically decrease the efficiency at which the DNA transforms competent E. coli bacteria. Accordingly, filling-in of single-stranded gaps using DNA polymerase resulted in increased transformation efficiency. Ligation of the remaining nicks did not lead to a further increase in transformation efficiency. These findings demonstrate that highly efficient insert-plasmid assembly can be achieved by using only T5 exonuclease and Phusion DNA polymerase, without Taq DNA ligase from the original Gibson protocol, which significantly reduces the cost of the reactions. We successfully used this modified Gibson assembly protocol with two short insert-plasmid overlap regions, each counting only 15 nucleotides. PMID:27073895

  2. The detective quantum efficiency of photon-counting x-ray detectors using cascaded-systems analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tanguay, Jesse; Yun, Seungman; School of Mechanical Engineering, Pusan National University, Jangjeon-dong, Geumjeong-gu, Busan 609-735

    Purpose: Single-photon counting (SPC) x-ray imaging has the potential to improve image quality and enable new advanced energy-dependent methods. The purpose of this study is to extend cascaded-systems analyses (CSA) to the description of image quality and the detective quantum efficiency (DQE) of SPC systems. Methods: Point-process theory is used to develop a method of propagating the mean signal and Wiener noise-power spectrum through a thresholding stage (required to identify x-ray interaction events). The new transfer relationships are used to describe the zero-frequency DQE of a hypothetical SPC detector including the effects of stochastic conversion of incident photons to secondary quanta, secondary quantum sinks, additive noise, and threshold level. Theoretical results are compared with Monte Carlo calculations assuming the same detector model. Results: Under certain conditions, the CSA approach can be applied to SPC systems with the additional requirement of propagating the probability density function describing the total number of image-forming quanta through each stage of a cascaded model. Theoretical results including DQE show excellent agreement with Monte Carlo calculations under all conditions considered. Conclusions: Application of the CSA method shows that false counts due to additive electronic noise result in both a nonlinear image signal and increased image noise. There is a window of allowable threshold values to achieve a high DQE that depends on conversion gain, secondary quantum sinks, and additive noise.
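
    The threshold trade-off described in the conclusions can be reproduced with a toy single-pixel Monte Carlo: too low a threshold registers false counts from the electronic-noise tail, while too high a threshold misses true photons. The gain, noise width, and threshold values below are illustrative assumptions, not the detector model used in the study.

```python
import numpy as np

rng = np.random.default_rng(0)

def counts_vs_threshold(n_frames, photon_prob, gain, gain_sigma, noise_sigma, threshold):
    """Return the number of counted events for one pixel read out n_frames times.

    Each frame contains a photon with probability `photon_prob`; a photon converts
    to roughly `gain` secondary quanta (spread `gain_sigma`), and every frame also
    carries additive electronic noise of width `noise_sigma`.
    """
    photons = rng.random(n_frames) < photon_prob
    signal = np.where(photons, rng.normal(gain, gain_sigma, n_frames), 0.0)
    signal += rng.normal(0.0, noise_sigma, n_frames)
    return int(np.count_nonzero(signal > threshold))

true_photons = 100_000 * 0.05
for thr in (2, 10, 50, 120):
    c = counts_vs_threshold(100_000, 0.05, gain=100, gain_sigma=15, noise_sigma=5, threshold=thr)
    print(f"threshold {thr:>3}: counted {c}  (true photons ~ {true_photons:.0f})")
```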

  3. Comparative investigation of the detective quantum efficiency of direct and indirect conversion detector technologies in dedicated breast CT.

    PubMed

    Kuttig, Jan D; Steiding, Christian; Kolditz, Daniel; Hupfer, Martin; Karolczak, Marek; Kalender, Willi A

    2015-06-01

    To investigate the dose saving potential of direct-converting CdTe photon-counting detector technology for dedicated breast CT. We analyzed the modulation transfer function (MTF), the noise power spectrum (NPS) and the detective quantum efficiency (DQE) of two detector technologies suitable for breast CT (BCT): a flat-panel energy-integrating detector with 70 μm and 208 μm thick gadolinium oxysulfide (GOS) scintillators or a 150 μm thick cesium iodide (CsI) scintillator, and a photon-counting detector with a 1000 μm thick CdTe sensor. The measurements for GOS scintillator thicknesses of 70 μm and 208 μm delivered 10% pre-sampled MTF values of 6.6 mm(-1) and 3.2 mm(-1), and DQE(0) values of 23% and 61%. The 10% pre-sampled MTF value for the 150 μm thick CsI scintillator was 6.9 mm(-1), and the DQE(0) value was 49%. The CdTe sensor reached a 10% pre-sampled MTF value of 8.5 mm(-1) and a DQE(0) value of 85%. The photon-counting CdTe detector technology allows for significant dose reduction compared to the energy-integrating scintillation detector technology used in BCT today. Our comparative evaluation indicates that a high potential dose saving may be possible for BCT by using CdTe detectors, without loss of spatial resolution. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.

  4. Aerosol Generation by Modern Flush Toilets

    PubMed Central

    Johnson, David; Lynch, Robert; Marshall, Charles; Mead, Kenneth; Hirst, Deborah

    2015-01-01

    A microbe-contaminated toilet will produce bioaerosols when flushed. We assessed toilet plume aerosol from high efficiency (HET), pressure-assisted high efficiency (PAT), and flushometer (FOM) toilets with similar bowl water and flush volumes. Total and droplet nuclei “bioaerosols” were assessed. Monodisperse 0.25–1.9-μm fluorescent microspheres served as microbe surrogates in separate trials in a mockup 5 m3 water closet (WC). Bowl water seeding was approximately 1012 particles/mL. Droplet nuclei were sampled onto 0.2-μm pore size mixed cellulose ester filters beginning 15 min after the flush using open-face cassettes mounted on the WC walls. Pre- and postflush bowl water concentrations were measured. Filter particle counts were analyzed via fluorescent microscopy. Bowl headspace droplet count size distributions were bimodal and similar for all toilet types and flush conditions, with 95% of droplets <2 μm diameter and >99% <5 μm. Up to 145,000 droplets were produced per flush, with the high-energy flushometer producing over three times as many as the lower energy PAT and over 12 times as many as the lowest energy HET despite similar flush volumes. The mean numbers of fluorescent droplet nuclei particles aerosolized and remaining airborne also increased with flush energy. Fluorescent droplet nuclei per flush decreased with increasing particle size. These findings suggest two concurrent aerosolization mechanisms—splashing for large droplets and bubble bursting for the fine droplets that form droplet nuclei. PMID:26635429

  5. Gallium nitride photocathodes for imaging photon counters

    NASA Astrophysics Data System (ADS)

    Siegmund, Oswald H. W.; Hull, Jeffrey S.; Tremsin, Anton S.; McPhate, Jason B.; Dabiran, Amir M.

    2010-07-01

    Gallium nitride opaque and semitransparent photocathodes provide high ultraviolet quantum efficiencies from 100 nm to a long wavelength cutoff at ~380 nm. P (Mg) doped GaN photocathode layers ~100 nm thick with a barrier layer of AlN (22 nm) on sapphire substrates also have low out of band response, and are highly robust. Opaque GaN photocathodes are relatively easy to optimize, and consistently provide high quantum efficiency (70% at 120 nm) provided the surface cleaning and activation (Cs) processes are well established. We have used two dimensional photon counting imaging microchannel plate detectors, with an active area of 25 mm diameter, to investigate the imaging characteristics of semitransparent GaN photocathodes. These can be produced with high (20%) efficiency, but the thickness and conductivity of the GaN must be carefully optimized. High spatial resolution of ~50 μm with low intrinsic background (~7 events sec-1 cm-2) and good image uniformity have been achieved. Selectively patterned deposited GaN photocathodes have also been used to allow quick diagnostics of optimization parameters. GaN photocathodes of both types show great promise for future detector applications in ultraviolet Astrophysical instruments.

  6. Evaluation of the charge transfer efficiency of organic thin-film photovoltaic devices fabricated using a photoprecursor approach.

    PubMed

    Masuo, Sadahiro; Sato, Wataru; Yamaguchi, Yuji; Suzuki, Mitsuharu; Nakayama, Ken-ichi; Yamada, Hiroko

    2015-05-01

    Recently, a unique 'photoprecursor approach' was reported as a new option to fabricate a p-i-n triple-layer organic photovoltaic device (OPV) through solution processes. By fabricating the p-i-n architecture using two kinds of photoprecursors and a [6,6]-phenyl C71 butyric acid methyl ester (PC71BM) as the donor and the acceptor, the p-i-n OPVs afforded a higher photovoltaic efficiency than the corresponding p-n devices and i-devices, while the photovoltaic efficiency of p-i-n OPVs depended on the photoprecursors. In this work, the charge transfer efficiency of the i-devices composed of the photoprecursors and PC71BM was investigated using high-sensitivity fluorescence microspectroscopy combined with a time-correlated single photon counting technique to elucidate the photovoltaic efficiency depending on the photoprecursors and the effects of the p-i-n architecture. The spatially resolved fluorescence images and fluorescence lifetime measurements clearly indicated that the compatibility of the photoprecursors with PC71BM influences the charge transfer and the photovoltaic efficiencies. Although the charge transfer efficiency of the i-device was quite high, the photovoltaic efficiency of the i-device was much lower than that of the p-i-n device. These results imply that the carrier generation and carrier transportation efficiencies can be increased by fabricating the p-i-n architecture.

  7. Evaluation of absolute measurement using a 4π plastic scintillator for the 4πβ-γ coincidence counting method.

    PubMed

    Unno, Y; Sanami, T; Sasaki, S; Hagiwara, M; Yunoki, A

    2018-04-01

    Absolute measurement by the 4πβ-γ coincidence counting method was conducted using two photomultiplier tubes facing each other across a plastic scintillator, with a focus on β-ray counting efficiency. The detector was combined with a through-hole-type NaI(Tl) detector. The results include the absolutely determined activity and its uncertainty, particularly the component arising from extrapolation. A comparison between the obtained and known activities showed agreement within their uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.

  8. Designing the X-Ray Microcalorimeter Spectrometer for Optimal Science Return

    NASA Technical Reports Server (NTRS)

    Ptak, Andrew; Bandler, Simon R.; Bookbinder, Jay; Kelley, Richard L.; Petre, Robert; Smith, Randall K.; Smith, Stephen

    2013-01-01

    Recent advances in X-ray microcalorimeters enable a wide range of possible focal plane designs for the X-ray Microcalorimeter Spectrometer (XMS) instrument on the future Advanced X-ray Spectroscopic Imaging Observatory (AXSIO) or X-ray Astrophysics Probe (XAP). Small pixel designs (75 microns) oversample a 5-10" PSF by a factor of 3-6 for a 10 m focal length, enabling observations at both high count rates and high energy resolution. Pixel designs utilizing multiple absorbers attached to single transition-edge sensors can extend the focal plane to cover a significantly larger field of view, albeit at a cost in maximum count rate and energy resolution. Optimizing the science return for a given cost and/or complexity is therefore a non-trivial calculation that includes consideration of issues such as the mission science drivers, likely targets, mirror size, and observing efficiency. We present a range of possible designs taking these factors into account and their impacts on the science return of future large effective-area X-ray spectroscopic missions.

  9. Characterization of an ultraviolet imaging detector with high event rate ROIC (HEROIC) readout

    NASA Astrophysics Data System (ADS)

    Nell, Nicholas; France, Kevin; Harwit, Alex; Bradley, Scott; Franka, Steve; Freymiller, Ed; Ebbets, Dennis

    2016-07-01

    We present characterization results from a photon counting imaging detector consisting of one microchannel plate (MCP) and an array of two readout integrated circuits (ROIC) that record photon position. The ROICs used in the position readout are the high event rate ROIC (HEROIC) devices designed to handle event rates up to 1 MHz per pixel, recently developed by the Ball Aerospace and Technologies Corporation in collaboration with the University of Colorado. An opaque cesium iodide (CsI) photocathode, sensitive in the far-ultraviolet (FUV; 122-200 nm), is deposited on the upper surface of the MCP. The detector is characterized in a chamber developed by CU Boulder that is capable of illumination with vacuum-ultraviolet (VUV) monochromatic light and measurement of absolute flux with a calibrated photodiode. Testing includes investigation of the effects of adjustment of internal settings of the HEROIC devices including charge threshold, gain, and amplifier bias. The detector response to high count rates is tested. We report initial results including background, uniformity, and quantum detection efficiency (QDE) as a function of wavelength.

  10. High-throughput measurement of rice tillers using a conveyor equipped with x-ray computed tomography

    NASA Astrophysics Data System (ADS)

    Yang, Wanneng; Xu, Xiaochun; Duan, Lingfeng; Luo, Qingming; Chen, Shangbin; Zeng, Shaoqun; Liu, Qian

    2011-02-01

    Tillering is one of the most important agronomic traits because the number of shoots per plant determines panicle number, a key component of grain yield. The conventional method of counting tillers is still manual. Under mass-measurement conditions, accuracy and efficiency gradually degrade as experienced staff become fatigued. Thus, manual measurement, including counting and recording, is not only time consuming but also lacks objectivity. To automate this process, we developed a high-throughput facility, dubbed high-throughput system for measuring automatically rice tillers (H-SMART), for measuring rice tillers based on a conventional x-ray computed tomography (CT) system and industrial conveyor. Each pot-grown rice plant was delivered into the CT system for scanning via the conveyor equipment. A filtered back-projection algorithm was used to reconstruct the transverse section image of the rice culms. The number of tillers was then automatically extracted by image segmentation. To evaluate the accuracy of this system, three batches of rice at different growth stages (tillering, heading, or filling) were tested, yielding mean absolute errors of 0.22, 0.36, and 0.36, respectively. Subsequently, the complete machine was used under industry conditions to estimate its efficiency, which was 4320 pots per continuous 24 h workday. Thus, the H-SMART could determine the number of tillers of pot-grown rice plants, providing three advantages over the manual tillering method: absence of human disturbance, automation, and high throughput. This facility expands the application of agricultural photonics in plant phenomics.

  11. High-throughput measurement of rice tillers using a conveyor equipped with x-ray computed tomography.

    PubMed

    Yang, Wanneng; Xu, Xiaochun; Duan, Lingfeng; Luo, Qingming; Chen, Shangbin; Zeng, Shaoqun; Liu, Qian

    2011-02-01

    Tillering is one of the most important agronomic traits because the number of shoots per plant determines panicle number, a key component of grain yield. The conventional method of counting tillers is still manual. Under mass-measurement conditions, accuracy and efficiency gradually degrade as experienced staff become fatigued. Thus, manual measurement, including counting and recording, is not only time consuming but also lacks objectivity. To automate this process, we developed a high-throughput facility, dubbed high-throughput system for measuring automatically rice tillers (H-SMART), for measuring rice tillers based on a conventional x-ray computed tomography (CT) system and industrial conveyor. Each pot-grown rice plant was delivered into the CT system for scanning via the conveyor equipment. A filtered back-projection algorithm was used to reconstruct the transverse section image of the rice culms. The number of tillers was then automatically extracted by image segmentation. To evaluate the accuracy of this system, three batches of rice at different growth stages (tillering, heading, or filling) were tested, yielding mean absolute errors of 0.22, 0.36, and 0.36, respectively. Subsequently, the complete machine was used under industry conditions to estimate its efficiency, which was 4320 pots per continuous 24 h workday. Thus, the H-SMART could determine the number of tillers of pot-grown rice plants, providing three advantages over the manual tillering method: absence of human disturbance, automation, and high throughput. This facility expands the application of agricultural photonics in plant phenomics.
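
    The segmentation-and-count step can be sketched in a few lines: threshold the reconstructed cross-section and count the separated culm regions. The threshold, the morphological cleanup, and the library calls below are illustrative assumptions, not the H-SMART implementation.

```python
import numpy as np
from scipy import ndimage

def count_tillers(slice_image, threshold):
    """Count connected high-attenuation regions (culms) in one CT slice.

    slice_image : 2-D array of reconstructed attenuation values
    threshold   : attenuation value separating culm tissue from background
    """
    mask = slice_image > threshold
    # Remove isolated noise pixels before labelling.
    mask = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
    labels, n_regions = ndimage.label(mask)
    return n_regions

# Toy example: three bright discs (culm cross-sections) on a dark background.
yy, xx = np.mgrid[0:200, 0:200]
img = np.zeros((200, 200))
for cx, cy in [(50, 50), (120, 80), (160, 150)]:
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 100] = 1.0
print(count_tillers(img, threshold=0.5))  # -> 3
```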

  12. Homicides by Police: Comparing Counts From the National Violent Death Reporting System, Vital Statistics, and Supplementary Homicide Reports

    PubMed Central

    Azrael, Deborah; Cohen, Amy; Miller, Matthew; Thymes, Deonza; Wang, David Enze; Hemenway, David

    2016-01-01

    Objective. To evaluate the National Violent Death Reporting System (NVDRS) as a surveillance system for homicides by law enforcement officers. Methods. We assessed sensitivity and positive predictive value of the NVDRS “type of death” variable against our study count of homicides by police, which we derived from NVDRS coded and narrative data for states participating in NVDRS 2005 to 2012. We compared state counts of police homicides from NVDRS, Vital Statistics, and Federal Bureau of Investigation Supplementary Homicide Reports. Results. We identified 1552 police homicides in the 16 states. Positive predictive value and sensitivity of the NVDRS “type of death” variable for police homicides were high (98% and 90%, respectively). Counts from Vital Statistics and Supplementary Homicide Reports were 58% and 48%, respectively, of our study total; gaps varied widely by state. The annual rate of police homicide (0.24/100 000) varied 5-fold by state and 8-fold by race/ethnicity. Conclusions. NVDRS provides more complete data on police homicides than do existing systems. Policy Implications. Expanding NVDRS to all 50 states and making 2 improvements we identify will be an efficient way to provide the nation with more accurate, detailed data on homicides by law enforcement. PMID:26985611

  13. Adaptive Bloom Filter: A Space-Efficient Counting Algorithm for Unpredictable Network Traffic

    NASA Astrophysics Data System (ADS)

    Matsumoto, Yoshihide; Hazeyama, Hiroaki; Kadobayashi, Youki

    The Bloom Filter (BF), a space-and-time-efficient hash-coding method, is used as one of the fundamental modules in several network processing algorithms and applications such as route lookups, cache hits, packet classification, per-flow state management or network monitoring. BF is a simple space-efficient randomized data structure used to represent a data set in order to support membership queries. However, BF generates false positives, and cannot count the number of distinct elements. A counting Bloom Filter (CBF) can count the number of distinct elements, but CBF needs more space than BF. We propose an alternative data structure to CBF, which we call the Adaptive Bloom Filter (ABF). Although ABF uses the same-sized bit-vector used in BF, the number of hash functions employed by ABF is dynamically changed to record the number of appearances of each key element. Taking hash collisions into account, the multiplicity of each key element in the ABF can be estimated from the number of hash functions used to decode the membership of that element. Although ABF realizes the same functionality as CBF, ABF requires only the same memory size as BF. We describe the construction of ABF and IABF (Improved ABF), and provide a mathematical analysis and simulation using Zipf's distribution. Finally, we show that ABF can be used for an unpredictable data set such as real network traffic.
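
    A minimal sketch of the described idea, under assumed parameters: membership is recorded with a fixed set of base hash functions, one additional "counting" hash function is enabled per insertion, and the multiplicity is read back as the length of the run of set counting bits (hash collisions bias the estimate upward). The hash construction below is illustrative, not the authors' exact ABF or the IABF refinement.

```python
import hashlib

class AdaptiveBloomFilter:
    """Sketch of an Adaptive Bloom Filter: a plain BF-sized bit vector in which
    the number of hash functions used for a key grows with its insertion count.
    (One byte per bit position here, purely for readability.)"""

    def __init__(self, m_bits=1 << 16, k_base=4, k_extra=32):
        self.m = m_bits
        self.k_base = k_base      # hash functions used for membership
        self.k_extra = k_extra    # cap on extra hash functions (max countable value)
        self.bits = bytearray(m_bits)

    def _h(self, key, i):
        digest = hashlib.sha256(f"{i}:{key}".encode()).digest()
        return int.from_bytes(digest[:8], "big") % self.m

    def insert(self, key):
        for i in range(self.k_base):
            self.bits[self._h(key, i)] = 1
        # Enable one more "counting" hash function per insertion.
        c = self.estimate(key)
        if c < self.k_extra:
            self.bits[self._h(key, self.k_base + c)] = 1

    def estimate(self, key):
        """Estimated multiplicity: length of the run of set counting bits."""
        if not all(self.bits[self._h(key, i)] for i in range(self.k_base)):
            return 0
        c = 0
        while c < self.k_extra and self.bits[self._h(key, self.k_base + c)]:
            c += 1
        return c

abf = AdaptiveBloomFilter()
for _ in range(3):
    abf.insert("10.0.0.1")
print(abf.estimate("10.0.0.1"))  # -> 3 (upper-biased if collisions occur)
```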

  14. Visual versus mechanised leucocyte differential counts: costing and evaluation of traditional and Hemalog D methods.

    PubMed Central

    Hudson, M J; Green, A E

    1980-01-01

    Visual differential counts were examined for efficiency, cost effectiveness, and staff acceptability within our laboratory. A comparison with the Hemalog D system was attempted. The advantages and disadvantages of each system are enumerated and discussed in the context of a large general hospital. PMID:7440760

  15. Architecture and data processing alternatives for the TSE computer. Volume 3: Execution of a parallel counting algorithm using array logic (Tse) devices

    NASA Technical Reports Server (NTRS)

    Metcalfe, A. G.; Bodenheimer, R. E.

    1976-01-01

    A parallel algorithm for counting the number of logic-1 elements in a binary array or image developed during preliminary investigation of the Tse concept is described. The counting algorithm is implemented using a basic combinational structure. Modifications which improve the efficiency of the basic structure are also presented. A programmable Tse computer structure is proposed, along with a hardware control unit, Tse instruction set, and software program for execution of the counting algorithm. Finally, a comparison is made between the different structures in terms of their more important characteristics.

  16. The 124Sb activity standardization by gamma spectrometry for medical applications

    NASA Astrophysics Data System (ADS)

    de Almeida, M. C. M.; Iwahara, A.; Delgado, J. U.; Poledna, R.; da Silva, R. L.

    2010-07-01

    This work describes a metrological activity determination of 124Sb, which can be used as a radiotracer, applying gamma spectrometry methods with a hyperpure germanium detector and efficiency curves. This isotope, with good activity and high radionuclidic purity, is employed in the form of meglumine antimoniate (Glucantime) or sodium stibogluconate (Pentostam) to treat leishmaniasis. 124Sb is also applied in animal organ distribution studies to solve some questions in pharmacology. 124Sb decays by β-emission and produces several photons (X and gamma rays) with energies varying from 27 to 2700 keV. Efficiency curves to measure point 124Sb solid sources were obtained from a 166mHo standard, which is a multi-gamma reference source. These curves depend on radiation energy, sample geometry, photon attenuation, dead time and sample-detector position. Results for activity determination of 124Sb samples using efficiency curves and a high purity coaxial germanium detector were consistent in different counting geometries. Uncertainties of about 2% (k = 2) were also obtained.

  17. Photon-counting CT with silicon detectors: feasibility for pediatric imaging

    NASA Astrophysics Data System (ADS)

    Yveborg, Moa; Xu, Cheng; Fredenberg, Erik; Danielsson, Mats

    2009-02-01

    X-ray detectors made of crystalline silicon have several advantages including low dark currents, fast charge collection and high energy resolution. For high-energy x-rays, however, silicon suffers from its low atomic number, which might result in low detection efficiency, as well as low energy and spatial resolution due to Compton scattering. We have used a Monte Carlo model to investigate the feasibility of a detector for pediatric CT with 30 to 40 mm of silicon using x-ray spectra ranging from 80 to 140 kVp. A detection efficiency of 0.74 was found at 80 kVp, provided the noise threshold could be set low. Scattered photons were efficiently blocked by a thin metal shielding between the detector units, and Compton scattering in the detector could be well separated from photo absorption at 80 kVp. Hence, the detector is feasible at low acceleration voltages, which is also suitable for pediatric imaging. We conclude that silicon detectors may be an alternative to other designs for this special case.

  18. A Fast parallel tridiagonal algorithm for a class of CFD applications

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Sun, Xian-He

    1996-01-01

    The parallel diagonal dominant (PDD) algorithm is an efficient tridiagonal solver. This paper presents a variation of the PDD algorithm, the reduced PDD algorithm, for study. The new algorithm maintains the minimum communication provided by the PDD algorithm, but has a reduced operation count. The PDD algorithm also has a smaller operation count than the conventional sequential algorithm for many applications. Accuracy analysis is provided for the reduced PDD algorithm for symmetric Toeplitz tridiagonal (STT) systems. Implementation results on Langley's Intel Paragon and IBM SP2 show that both the PDD and reduced PDD algorithms are efficient and scalable.
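
    For reference, the conventional sequential baseline that these operation counts are compared against is the O(n) Thomas algorithm; a hedged sketch follows (this is the standard sequential solver, not the PDD or reduced PDD partitioning itself).

```python
import numpy as np

def thomas_solve(a, b, c, d):
    """Solve a tridiagonal system with sub-diagonal a, diagonal b, super-diagonal c
    and right-hand side d (all length n; a[0] and c[-1] are unused).
    This is the O(n) sequential baseline against which PDD variants are measured.
    """
    n = len(b)
    cp, dp = np.empty(n), np.empty(n)
    cp[0] = c[0] / b[0]
    dp[0] = d[0] / b[0]
    for i in range(1, n):                       # forward elimination
        denom = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / denom
        dp[i] = (d[i] - a[i] * dp[i - 1]) / denom
    x = np.empty(n)
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):              # back substitution
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

# Symmetric Toeplitz tridiagonal example: diagonal 4, off-diagonals 1.
n = 8
a = np.full(n, 1.0); b = np.full(n, 4.0); c = np.full(n, 1.0); d = np.arange(1.0, n + 1)
A = np.diag(b) + np.diag(a[1:], -1) + np.diag(c[:-1], 1)
print(np.allclose(thomas_solve(a, b, c, d), np.linalg.solve(A, d)))  # True
```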

  19. How many fish in a tank? Constructing an automated fish counting system by using PTV analysis

    NASA Astrophysics Data System (ADS)

    Abe, S.; Takagi, T.; Takehara, K.; Kimura, N.; Hiraishi, T.; Komeyama, K.; Torisawa, S.; Asaumi, S.

    2017-02-01

    Because escape from a net cage and mortality are constant problems in fish farming, health control and management of facilities are important in aquaculture. In particular, the development of an accurate fish counting system has been strongly desired for the Pacific Bluefin tuna farming industry owing to the high market value of these fish. The current fish counting method, which involves human counting, results in poor accuracy; moreover, the method is cumbersome because the aquaculture net cage is so large that fish can only be counted when they move to another net cage. Therefore, we have developed an automated fish counting system by applying particle tracking velocimetry (PTV) analysis to a shoal of swimming fish inside a net cage. In essence, we treated the swimming fish as tracer particles and estimated the number of fish by analyzing the corresponding motion vectors. The proposed fish counting system comprises two main components: image processing and motion analysis, where the image-processing component extracts the foreground and the motion-analysis component traces each individual's motion. In this study, we developed a Region Extraction and Centroid Computation (RECC) method and a Kalman filter and Chi-square (KC) test for the two main components. To evaluate the efficiency of our method, we constructed a closed system, placed an underwater video camera with a spherical curved lens at the bottom of the tank, and recorded a 360° view of a swimming school of Japanese rice fish (Oryzias latipes). Our study showed that almost all fish could be extracted by the RECC method and the motion vectors could be calculated by the KC test. The recognition rate was approximately 90% when more than 180 individuals were observed within the frame of the video camera. These results suggest that the presented method has potential application as a fish counting system for industrial aquaculture.
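
    A hedged sketch of the region-extraction-and-centroid step only (the KC tracking stage is not shown): threshold the foreground, label connected regions, and return their centroids as the "particles" handed to the motion-analysis component. The threshold, minimum area, and library calls are illustrative assumptions.

```python
import numpy as np
from scipy import ndimage

def extract_centroids(frame, threshold, min_area=20):
    """RECC-style region extraction and centroid computation (sketch).

    frame     : 2-D grayscale image with bright fish on a dark background
    threshold : intensity separating fish pixels from background
    min_area  : discard blobs smaller than this many pixels (noise)
    """
    mask = frame > threshold
    labels, n = ndimage.label(mask)
    centroids = []
    for region in range(1, n + 1):
        area = np.count_nonzero(labels == region)
        if area >= min_area:
            centroids.append(ndimage.center_of_mass(mask, labels, region))
    return centroids  # list of (row, col) positions, one per detected fish

# The per-frame centroid lists would then be linked frame to frame
# (e.g. with a Kalman filter, as in the paper) to form motion vectors.
```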

  20. Radiation-hard ceramic Resistive Plate Chambers for forward TOF and T0 systems

    NASA Astrophysics Data System (ADS)

    Akindinov, A.; Dreyer, J.; Fan, X.; Kämpfer, B.; Kiselev, S.; Kotte, R.; Garcia, A. Laso; Malkevich, D.; Naumann, L.; Nedosekin, A.; Plotnikov, V.; Stach, D.; Sultanov, R.; Voloshin, K.

    2017-02-01

    Resistive Plate Chambers with ceramic electrodes are the main candidates for use in precise multi-channel timing systems operating in high-radiation conditions. We report the latest R&D results on these detectors aimed at meeting the requirements of the forward T0 counter at the CBM experiment. RPC design, gas mixture, limits on the bulk resistivity of ceramic electrodes, efficiency, time resolution, counting rate capabilities and ageing test results are presented.

  1. Parametric Amplification For Detecting Weak Optical Signals

    NASA Technical Reports Server (NTRS)

    Hemmati, Hamid; Chen, Chien; Chakravarthi, Prakash

    1996-01-01

    Optical-communication receivers of proposed type implement high-sensitivity scheme of optical parametric amplification followed by direct detection for reception of extremely weak signals. Incorporates both optical parametric amplification and direct detection into optimized design enhancing effective signal-to-noise ratios during reception in photon-starved (photon-counting) regime. Eliminates need for complexity of heterodyne detection scheme and partly overcomes limitations imposed on older direct-detection schemes by noise generated in receivers and by limits on quantum efficiencies of photodetectors.

  2. Mutacins and bacteriocins like genes in Streptococcus mutans isolated from participants with high, moderate, and low salivary count.

    PubMed

    Soto, Carolina; Padilla, Carlos; Lobos, Olga

    2017-02-01

    To detect S. mutans producers of mutacins and bacteriocin-like substances (BLIS) from the saliva of participants with low, moderate, and high salivary counts. 123 strains of S. mutans were obtained from participants with low, moderate, and high salivary counts (aged 18 to 20 years) and their antibacterial capacity was analyzed. Using PCR amplification, the expression levels of mutacin and BLIS genes were studied (expressed in arbitrary units/ml) in all three groups. S. mutans strains from participants with low salivary counts showed high production of mutacins (63%). In contrast, participants with moderate and high salivary counts showed relatively low levels of mutacins (22 and 15%, respectively). Moreover, participants with low salivary counts showed high expression levels of genes encoding mutacins, a result that correlates with the strong antimicrobial activity of the group. Participants with moderate and high salivary counts, however, showed low expression levels of mutacin-related genes, and little antimicrobial activity. No BLIS were detected in any of the groups studied. S. mutans isolated from the saliva of participants with low bacterial counts have significant antibacterial capacity compared to that of participants with moderate and high salivary counts. The superior lethality of S. mutans in participants with low salivary counts is likely due to the augmented expression of mutacin-related genes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  3. Energy Efficient Engine: High-pressure compressor test hardware detailed design report

    NASA Technical Reports Server (NTRS)

    Howe, David C.; Marchant, R. D.

    1988-01-01

    The objective of the NASA Energy Efficient Engine program is to identify and verify the technology required to achieve significant reductions in fuel consumption and operating cost for future commercial gas turbine engines. This report documents the design and analysis of the high-pressure compressor that was tested as part of the Pratt and Whitney effort under the Energy Efficient Engine program. This compressor was designed to produce a 14:1 pressure ratio in ten stages with an adiabatic efficiency of 88.2 percent in the flight propulsion system. The corresponding expected efficiency for the compressor component test rig is 86.5 percent. Other performance goals are a surge margin of 20 percent, a corrected flow rate of 35.2 kg/sec (77.5 lb/sec), and a life of 20,000 missions and 30,000 hours. Low-loss, highly loaded airfoils are used to increase efficiency while reducing the parts count. Active clearance control and case trenches in abradable strips over the blade tips are included in the compressor component design to further increase the efficiency potential. The test rig incorporates variable geometry stator vanes in all stages to permit maximum flexibility in developing stage-to-stage matching. This provision precluded active clearance control on the rear case of the test rig. Both the component and rig designs meet or exceed design requirements with the exception of life goals, which will be achievable with planned advances in materials technology.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fisk, William J.; Sullivan, Douglas

    This pilot scale study evaluated the counting accuracy of two people counting systems that could be used in demand controlled ventilation systems to provide control signals for modulating outdoor air ventilation rates. The evaluations included controlled challenges of the people counting systems using pre-planned movements of occupants through doorways and evaluations of counting accuracies when naive occupants (i.e., occupants unaware of the counting systems) passed through the entrance doors of the building or room. The two people counting systems had high counting accuracies, with errors typically less than 10 percent, for typical non-demanding counting events. However, counting errors were high in some highly challenging situations, such as multiple people passing simultaneously through a door. Counting errors, for at least one system, can be very high if people stand in the field of view of the sensor. Both counting systems have limitations and would need to be used only at appropriate sites and where the demanding situations that led to counting errors were rare.

  5. Experimental Study for Automatic Colony Counting System Based Onimage Processing

    NASA Astrophysics Data System (ADS)

    Fang, Junlong; Li, Wenzhe; Wang, Guoxin

    Colony counting in many experiments is currently performed manually, which makes it difficult to execute quickly and accurately. A new automatic colony counting system was developed. Making use of image-processing technology, a study was made of the feasibility of objectively distinguishing white bacterial colonies on clear plates according to RGB color theory. An optimal chromatic value was obtained from extensive experiments on the distribution of chromatic values. It was shown that the method greatly improves the accuracy and efficiency of colony counting and that the counting result is not affected by the inoculation method or by the shape or size of the colonies. The results indicate that automatic detection of colony quantity using image-processing technology can be an effective approach.
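
    A hedged sketch of the counting idea as described: classify pixels by how close their RGB value is to an empirically chosen chromatic value for white colonies, then count the resulting connected regions. The reference color, distance tolerance, and size filter below are illustrative assumptions, not the published calibration.

```python
import numpy as np
from scipy import ndimage

def count_colonies(rgb_image, chromatic_value=(220, 220, 215), tol=40, min_pixels=15):
    """Count white colonies on a clear plate from an RGB image.

    rgb_image       : H x W x 3 uint8 array
    chromatic_value : reference RGB of a white colony (assumed; to be calibrated)
    tol             : maximum Euclidean RGB distance classified as "colony"
    min_pixels      : discard specks smaller than this
    """
    dist = np.linalg.norm(rgb_image.astype(float) - np.asarray(chromatic_value, float),
                          axis=-1)
    mask = dist < tol
    labels, n = ndimage.label(mask)
    if n == 0:
        return 0
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.count_nonzero(sizes >= min_pixels))
```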

  6. Evaluation of heterotrophic plate and chromogenic agar colony counting in water quality laboratories.

    PubMed

    Hallas, Gary; Monis, Paul

    2015-01-01

    The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time and labour intensive, can vary between operators and also requires manual entry of results into laboratory information management systems, which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages: improved measurement of uncertainty by using a standard and consistent counting methodology with less operator error; efficiency for labour and time (reduced cost); elimination of manual entry of data onto LIMS; and faster result reporting to customers.

  7. Influence of chlorothalonil on the removal of organic matter in horizontal subsurface flow constructed wetlands.

    PubMed

    Casas-Zapata, Juan C; Ríos, Karina; Florville-Alejandre, Tomás R; Morató, Jordi; Peñuela, Gustavo

    2013-01-01

    This study investigates the effects of chlorothalonil (CLT) on chemical oxygen demand (COD) and dissolved organic carbon (DOC) in pilot-scale horizontal subsurface flow constructed wetlands (HSSFCW) planted with Phragmites australis. Physicochemical parameters of influent and effluent water samples, microbial population counting methods and statistical analysis were used to evaluate the influence of CLT on organic matter removal efficiency. The experiments were conducted on four planted replicate wetlands (HSSFCW-Pa) and one unplanted control wetland (HSSFCW-NPa). The wetlands exhibited high average organic matter removal efficiencies (HSSFCW-Pa: 80.6% DOC, 98.0% COD; HSSFCW-NPa: 93.2% DOC, 98.4% COD). The addition of CLT did not influence organic removal parameters. In all cases CLT concentrations in the effluent occurred in concentrations lower than the detection limit of the analytical method. Microbial population counts from HSSFCW-Pa showed significant correlations among different microbial groups and with different physicochemical variables. The apparent independence of organic matter removal and CLT inputs, along with the CLT depletion observed in effluent samples demonstrated that HSSFCW are a viable technology for the treatment of agricultural effluents contaminated with organo-chloride pesticides like CLT.

  8. A GEM-TPC in twin configuration for the Super-FRS tracking of heavy ions at FAIR

    NASA Astrophysics Data System (ADS)

    García, F.; Grahn, T.; Hoffmann, J.; Jokinen, A.; Kaya, C.; Kunkel, J.; Rinta-Antila, S.; Risch, H.; Rusanov, I.; Schmidt, C. J.; Simon, H.; Simons, C.; Turpeinen, R.; Voss, B.; Äystö, J.; Winkler, M.

    2018-03-01

    The GEM-TPC described herein will be part of the standard beam-diagnostics equipment of the Super-FRS. This chamber will provide tracking information for particle identification at rates up to 1 MHz on an event-by-event basis. The key operational requirements for these chambers are close to 100% tracking efficiency under high counting rates, spatial resolution below 1 mm, and a very large dynamic range covering projectiles from Z = 1 up to Z = 92. The current prototype consists of two GEM-TPCs inside a single vessel, which operate independently and have electric drift fields in opposite directions. The twin configuration is obtained by mirroring one of the GEM-TPCs about the middle plane with respect to the other. In order to put this development in context, the evolution of previous prototypes will be described and their performance discussed. Finally, this chamber was tested at the University of Jyväskylä accelerator with proton projectiles and at GSI with uranium, xenon, fragment and carbon beams. The results obtained show a position resolution between 120 and 300 μm at moderate counting rates under conditions of full tracking efficiency.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Funsten, Herbert O.; Harper, Ronnie W.; Dors, Eric E.

    Channel electron multiplier (CEM) and microchannel plate (MCP) detectors are routinely used in space instrumentation for measurement of space plasmas. Here, our goal is to understand the relative sensitivities of these detectors to penetrating radiation in space, which can generate background counts and shorten detector lifetime. We use 662 keV γ-rays as a proxy for penetrating radiation such as γ-rays, cosmic rays, and high-energy electrons and protons that are ubiquitous in the space environment. We find that MCP detectors are ~20 times more sensitive to 662 keV γ-rays than CEM detectors. This is attributed to the larger total area of multiplication channels in an MCP detector that is sensitive to electronic excitation and ionization resulting from the interaction of penetrating radiation with the detector material. In contrast to the CEM detector, whose quantum efficiency ε_γ for 662 keV γ-rays is found to be 0.00175 and largely independent of detector bias, the quantum efficiency of the MCP detector is strongly dependent on the detector bias, with a power law index of 5.5. Lastly, background counts in MCP detectors from penetrating radiation can be reduced using MCP geometries with higher pitch and smaller channel diameter.

  10. Track counts as indices to abundances of arboreal rodents.

    Treesearch

    A.B. Carey; J.W. Witt

    1991-01-01

    Counting tracks to obtain an index of abundance for species difficult to capture offers a promise of efficiency and effectiveness when broad surveys of populations are necessary. Sand plots, smoked kymograph paper, and, recently, smoked aluminum plates have been used to record tracks (Raphael et al., 1986; Taylor and Raphael, 1988). Findings of studies of carnivores...

  11. When Practice Doesn't Lead to Retrieval: An Analysis of Children's Errors with Simple Addition

    ERIC Educational Resources Information Center

    de Villiers, Celéste; Hopkins, Sarah

    2013-01-01

    Counting strategies initially used by young children to perform simple addition are often replaced by more efficient counting strategies, decomposition strategies and rule-based strategies until most answers are encoded in memory and can be directly retrieved. Practice is thought to be the key to developing fluent retrieval of addition facts. This…

  12. A high-throughput AO/PI-based cell concentration and viability detection method using the Celigo image cytometry.

    PubMed

    Chan, Leo Li-Ying; Smith, Tim; Kumph, Kendra A; Kuksin, Dmitry; Kessel, Sarah; Déry, Olivier; Cribbes, Scott; Lai, Ning; Qiu, Jean

    2016-10-01

    To ensure cell-based assays are performed properly, both cell concentration and viability have to be determined so that the data can be normalized to generate meaningful and comparable results. Cell-based assays performed in immuno-oncology, toxicology, or bioprocessing research often require measurement of multiple samples and conditions, so current automated cell counters that use single disposable counting slides are not practical for high-throughput screening assays. In recent years, a plate-based image cytometry system has been developed for high-throughput biomolecular screening assays. In this work, we demonstrate a high-throughput AO/PI-based cell concentration and viability method using the Celigo image cytometer. First, we validate the method by comparing it directly to the Cellometer automated cell counter. Next, cell concentration dynamic range, viability dynamic range, and consistency are determined. The high-throughput AO/PI method described here allows 96-well to 384-well plate samples to be analyzed in less than 7 min, which greatly reduces the time required compared with single-sample automated cell counters. In addition, this method can improve the efficiency of high-throughput screening assays, where multiple cell counts and viability measurements are needed prior to performing assays such as flow cytometry, ELISA, or simply plating cells for cell culture.
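
    The arithmetic behind such an AO/PI readout is simple; the sketch below (not the Celigo or Cellometer software, and with placeholder counts and volume) shows how per-well live/dead counts would typically be turned into a concentration and a viability percentage.

```python
# Minimal sketch, assuming AO marks live cells and PI marks dead cells, and that
# the imaged volume per well is known; none of this reflects the Celigo API.

def concentration_and_viability(ao_count, pi_count, imaged_volume_ml):
    """Return (cells/mL, % viability) for one well."""
    total = ao_count + pi_count
    concentration = total / imaged_volume_ml          # cells per mL
    viability = 100.0 * ao_count / total if total else 0.0
    return concentration, viability

# Example: 4,500 live and 500 dead cells counted in 0.005 mL of imaged volume
conc, viab = concentration_and_viability(4500, 500, 0.005)
print(f"{conc:.2e} cells/mL, {viab:.1f}% viable")     # 1.00e+06 cells/mL, 90.0% viable
```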

  13. Spatio-energetic cross talk in photon counting detectors: Detector model and correlated Poisson data generator.

    PubMed

    Taguchi, Katsuyuki; Polster, Christoph; Lee, Okkyun; Stierstorfer, Karl; Kappler, Steffen

    2016-12-01

    An x-ray photon interacts with photon counting detectors (PCDs) and generates an electron charge cloud or multiple clouds. The clouds (thus, the photon energy) may be split between two adjacent PCD pixels when the interaction occurs near pixel boundaries, producing a count at both of the pixels. This is called double-counting with charge sharing. (A photoelectric effect with K-shell fluorescence x-ray emission would result in double-counting as well.) As a result, PCD data are spatially and energetically correlated, although the output of individual PCD pixels is Poisson distributed. Major problems include the lack of a detector noise model for the spatio-energetic cross talk and the lack of a computationally efficient simulation tool for generating correlated Poisson data. A Monte Carlo (MC) simulation can accurately simulate these phenomena and produce noisy data; however, it is not computationally efficient. In this study, the authors developed a new detector model and implemented it in an efficient software simulator that uses a Poisson random number generator to produce correlated noisy integer counts. The detector model takes the following effects into account: (1) detection efficiency; (2) incomplete charge collection and ballistic effect; (3) interaction with PCDs via the photoelectric effect (with or without K-shell fluorescence x-ray emission, which may escape from the PCDs or be reabsorbed); and (4) electronic noise. The correlation was modeled using two simplifying assumptions: energy conservation and mutual exclusiveness, where mutual exclusiveness means that no more than two pixels measure energy from one photon. The effect of model parameters was studied and results were compared with MC simulations. The agreement with respect to the spectrum was evaluated using a reduced χ² statistic, χ²_red (≥ 1), computed as a weighted sum of squared errors, where χ²_red = 1 indicates a perfect fit. The model produced spectra with flat field irradiation that qualitatively agree with previous studies. The spectra generated with different model and geometry parameters allowed for understanding the effect of the parameters on the spectrum and the correlation of data. The agreement between the model and MC data was very strong. The mean spectra at 90 keV and 140 kVp agreed exceptionally well: χ²_red values were 1.049 with the 90 keV data and 1.007 with the 140 kVp data. The degrees of cross talk (in terms of the relative increase from single pixel irradiation to flat field irradiation) were 22% with 90 keV and 19% with 140 kVp for MC simulations, while they were 21% and 17%, respectively, for the model. The covariance was in strong agreement qualitatively, although it was overestimated. The noisy data generation was very efficient, taking less than a CPU minute as opposed to CPU hours for MC simulators. The authors have developed a novel, computationally efficient PCD model that takes into account double-counting and the resulting spatio-energetic correlation between PCD pixels. The MC simulation validated its accuracy.
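
    The core trick the abstract describes, producing correlated counts from independent Poisson draws, can be illustrated with a toy two-pixel model: photons absorbed near the shared boundary are registered by both pixels, photons absorbed well inside a pixel only by that pixel. The sketch below is our own simplification with made-up rates, not the authors' simulator or detector model.

```python
# Toy correlated-Poisson generator for two adjacent photon-counting pixels.
# lam_single and lam_shared are illustrative rates, not parameters of the paper.
import numpy as np

rng = np.random.default_rng(0)

def correlated_pixel_counts(lam_single, lam_shared, n_frames):
    """Return (counts in pixel A, counts in pixel B) for n_frames readouts."""
    shared = rng.poisson(lam_shared, n_frames)   # boundary events, double-counted
    a_only = rng.poisson(lam_single, n_frames)   # events fully inside pixel A
    b_only = rng.poisson(lam_single, n_frames)   # events fully inside pixel B
    return a_only + shared, b_only + shared      # shared events appear in both

a, b = correlated_pixel_counts(lam_single=50.0, lam_shared=10.0, n_frames=100_000)
print(a.mean(), b.mean())            # each ~60: marginals stay Poisson-like
print(np.cov(a, b)[0, 1])            # ~10: positive covariance from double-counting
```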

  14. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE PAGES

    Lockhart, M.; Henzlova, D.; Croft, S.; ...

    2017-09-20

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher order count rates (i.e., quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead-time-corrected results can then be used to assay SNM by inverting a set of extended point model equations which have also only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.
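
    For orientation only, the sketch below shows the textbook non-paralyzable dead time correction for a singles rate; it is not the Dytlewski-Croft-Favalli multiplicity formalism evaluated in the paper, just the simplest example of what "correcting a count rate for dead time" means.

```python
# Non-paralyzable model: true_rate = measured_rate / (1 - measured_rate * tau).
# This is a generic illustration, unrelated to the DCF higher-order corrections.

def nonparalyzable_true_rate(measured_cps, dead_time_s):
    return measured_cps / (1.0 - measured_cps * dead_time_s)

# Example: 50,000 counts/s measured with a 1 microsecond dead time
print(round(nonparalyzable_true_rate(50_000, 1e-6)))   # ~52,632 counts/s
```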

  16. Feasibility of ultra low-dose thallium stress-redistribution protocol including prone imaging in obese patients using CZT camera.

    PubMed

    Kincl, Vladimír; Kamínek, Milan; Vašina, Jiří; Panovský, Roman; Havel, Martin

    2016-09-01

    High efficiency cadmium-zinc-telluride (CZT) cameras provide an opportunity to lower the injected activities of radiopharmaceuticals for single photon emission tomography (SPECT) myocardial perfusion imaging (MPI). The limits for reducing thallium activities have not been determined, particularly in obese patients. After an injection of 0.7 megabecquerel (MBq) of thallium per kg, we collected an average of 1.5 million counts for the 10-min acquisition in a pilot cohort of ten patients. By extrapolation, we reduced the administered activity to 0.5 MBq/kg to obtain the expected 1 million counts. We studied the image quality in 124 patients (86 men, 43 obese with body mass index over 30 kg/m²) referred for MPI. Image quality was assessed by the number of recorded counts and visually on a four-grade scale (1 = poor quality, 4 = excellent quality). In non-obese and obese patients, the average number of recorded counts was 1.1 vs. 1.07 million counts for the 10-min stress acquisition, 1.04 vs. 1.06 million counts for the 13-min rest acquisition, and the average quality score was 3.97 vs. 3.90, respectively (p = NS). The mean administered activity was 39.2 ± 7 MBq for non-obese and 48.7 ± 6 MBq for obese patients (p < 0.0001), and the calculated effective dose was 4.0 ± 0.7 and 4.9 ± 0.6 mSv, respectively (p < 0.0001). The ultra-low-dose thallium stress-redistribution protocol, including post-stress prone imaging, provides good image quality with a low radiation burden, even in obese patients.
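
    The dose reduction quoted above follows from assuming recorded counts scale linearly with administered activity; the short check below reproduces the 0.5 MBq/kg figure from the 0.7 MBq/kg pilot data (our reading of the extrapolation, not the authors' exact calculation).

```python
# Counts are assumed proportional to administered activity per kilogram.
pilot_activity = 0.7      # MBq/kg, pilot cohort
pilot_counts = 1.5e6      # average counts in the 10-min acquisition
target_counts = 1.0e6     # desired counts

scaled_activity = pilot_activity * target_counts / pilot_counts
print(f"{scaled_activity:.2f} MBq/kg")   # ~0.47 MBq/kg, rounded to 0.5 in the protocol
```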

  17. Gamma-gamma coincidence performance of LaBr 3:Ce scintillation detectors vs HPGe detectors in high count-rate scenarios

    DOE PAGES

    Drescher, A.; Yoho, M.; Landsberger, S.; ...

    2017-01-15

    In this study, a radiation detection system consisting of two cerium-doped lanthanum bromide (LaBr3:Ce) scintillation detectors in a gamma-gamma coincidence configuration has been used to demonstrate the advantages that coincident detection provides relative to a single detector, and the advantages that LaBr3:Ce detectors provide relative to high purity germanium (HPGe) detectors. Signal to noise ratios of select photopeak pairs for these detectors have been compared to HPGe detectors in both single and coincident detector configurations in order to quantify the performance of each detector configuration. The efficiency and energy resolution of LaBr3:Ce detectors have been determined and compared to HPGe detectors. Coincident gamma-ray pairs from the radionuclides 152Eu and 133Ba have been identified in a sample that is dominated by 137Cs. Gamma-gamma coincidence successfully reduced the Compton continuum from the large 137Cs peak, revealed several coincident gamma energies characteristic of these nuclides, and improved the signal-to-noise ratio relative to single detector measurements. LaBr3:Ce detectors performed at count rates multiple times higher than can be achieved with HPGe detectors. The standard background spectrum consisting of peaks associated with transitions within the LaBr3:Ce crystal has also been significantly reduced. Finally, it is shown that LaBr3:Ce detectors have the unique capability to perform gamma-gamma coincidence measurements in very high count rate scenarios, which can potentially benefit nuclear safeguards in situ measurements of spent nuclear fuel.

  19. Low-noise low-jitter 32-pixels CMOS single-photon avalanche diodes array for single-photon counting from 300 nm to 900 nm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scarcella, Carmelo; Tosi, Alberto, E-mail: alberto.tosi@polimi.it; Villa, Federica

    2013-12-15

    We developed a single-photon counting multichannel detection system, based on a monolithic linear array of 32 CMOS SPADs (Complementary Metal-Oxide-Semiconductor Single-Photon Avalanche Diodes). All channels achieve a timing resolution of 100 ps (full-width at half maximum) and a photon detection efficiency of 50% at 400 nm. Dark count rate is very low even at room temperature, being about 125 counts/s for 50 μm active area diameter SPADs. Detection performance and microelectronic compactness of this CMOS SPAD array make it the best candidate for ultra-compact time-resolved spectrometers with single-photon sensitivity from 300 nm to 900 nm.

  20. A miniaturized 4 K platform for superconducting infrared photon counting detectors

    NASA Astrophysics Data System (ADS)

    Gemmell, Nathan R.; Hills, Matthew; Bradshaw, Tom; Rawlings, Tom; Green, Ben; Heath, Robert M.; Tsimvrakidis, Konstantinos; Dobrovolskiy, Sergiy; Zwiller, Val; Dorenbos, Sander N.; Crook, Martin; Hadfield, Robert H.

    2017-11-01

    We report on a miniaturized platform for superconducting infrared photon counting detectors. We have implemented a fibre-coupled superconducting nanowire single photon detector in a Stirling/Joule-Thomson platform with a base temperature of 4.2 K. We have verified a cooling power of 4 mW at 4.7 K. We report 20% system detection efficiency at 1310 nm wavelength at a dark count rate of 1 kHz. We have carried out compelling application demonstrations in single photon depth metrology and singlet oxygen luminescence detection.

  1. Time-resolved single-photon detection module based on silicon photomultiplier: A novel building block for time-correlated measurement systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martinenghi, E., E-mail: edoardo.martinenghi@polimi.it; Di Sieno, L.; Contini, D.

    2016-07-15

    We present the design and preliminary characterization of the first detection module based on a Silicon Photomultiplier (SiPM) tailored for single-photon timing applications. The aim of this work is to demonstrate, thanks to the design of a suitable module, the possibility to easily exploit SiPMs in many applications as an interesting detector featuring a large active area, similar to photomultiplier tubes, while keeping the advantages of solid state detectors (high quantum efficiency, low cost, compactness, robustness, low bias voltage, and insensitivity to magnetic fields). The module integrates a cooled SiPM with a total photosensitive area of 1 mm² together with the suitable avalanche signal read-out circuit, the signal conditioning, the biasing electronics, and a Peltier cooler driver for thermal stabilization. It is able to extract the single-photon timing information with a resolution better than 100 ps full-width at half maximum. We verified the effective stabilization in response to external thermal perturbations, thus proving the complete insensitivity of the module to environmental temperature variations, which is a fundamental requirement for using the instrument profitably in real-field applications. We also characterized the single-photon timing resolution, the background noise due to both primary dark count generation and afterpulsing, the single-photon detection efficiency, and the instrument response function shape. The proposed module can become a reliable and cost-effective building block for time-correlated single-photon counting instruments in applications requiring high collection capability of isotropic light and detection efficiency (e.g., fluorescence decay measurements or time-domain diffuse optics systems).

  2. Process to evaluate hematological parameters that reflex to manual differential cell counts in a pediatric institution.

    PubMed

    Guarner, Jeannette; Atuan, Maria Ana; Nix, Barbara; Mishak, Christopher; Vejjajiva, Connie; Curtis, Cheri; Park, Sunita; Mullins, Richard

    2010-01-01

    Each institution sets specific parameters obtained by automated hematology analyzers to trigger manual counts. We designed a process to decrease the number of manual differential cell counts without impacting patient care. We selected new criteria that prompt manual counts and studied the impact of these changes over 2 days of work and in samples from patients with newly diagnosed leukemia, sickle cell disease, and the presence of left shift. By using fewer parameters and expanding our ranges, we decreased the number of manual counts by 20%. The parameters that prompted manual counts most frequently were the presence of blast flags and nucleated red blood cells, 2 parameters that were not changed. The parameters that accounted for a decrease in the number of manual counts were the white blood cell count and large unstained cells. Eight of 32 patients with newly diagnosed leukemia did not show blast flags; however, other parameters triggered manual counts. In 47 patients with sickle cell disease, nucleated red cells and red cell variability prompted manual review. Bands were observed in 18% of the specimens, and 4% would not have been counted manually with the new criteria; for the latter, the mean band count was 2.6%. The process we followed to evaluate hematological parameters that reflex to manual differential cell counts increased efficiency without compromising patient care in our hospital system.

  3. DMRfinder: efficiently identifying differentially methylated regions from MethylC-seq data.

    PubMed

    Gaspar, John M; Hart, Ronald P

    2017-11-29

    DNA methylation is an epigenetic modification that is studied at a single-base resolution with bisulfite treatment followed by high-throughput sequencing. After alignment of the sequence reads to a reference genome, methylation counts are analyzed to determine genomic regions that are differentially methylated between two or more biological conditions. Even though a variety of software packages is available for different aspects of the bioinformatics analysis, they often produce results that are biased or demand excessive computational resources. DMRfinder is a novel computational pipeline that identifies differentially methylated regions efficiently. Following alignment, DMRfinder extracts methylation counts and performs a modified single-linkage clustering of methylation sites into genomic regions. It then compares methylation levels using beta-binomial hierarchical modeling and Wald tests. Among its innovative attributes are the analyses of novel methylation sites and methylation linkage, as well as the simultaneous statistical analysis of multiple sample groups. To demonstrate its efficiency, DMRfinder is benchmarked against other computational approaches using a large published dataset. Contrasting two replicates of the same sample yielded minimal genomic regions with DMRfinder, whereas two alternative software packages reported a substantial number of false positives. Further analyses of biological samples revealed fundamental differences between DMRfinder and another software package, despite the fact that they utilize the same underlying statistical basis. For each step, DMRfinder completed the analysis in a fraction of the time required by other software. Among the computational approaches for identifying differentially methylated regions from high-throughput bisulfite sequencing datasets, DMRfinder is the first that integrates all the post-alignment steps in a single package. Compared to other software, DMRfinder is extremely efficient and unbiased in this process. DMRfinder is free and open-source software, available on GitHub ( github.com/jsh58/DMRfinder ); it is written in Python and R, and is supported on Linux.
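
    The region-building step described above (merging nearby methylation sites before testing) can be pictured as a simple gap-based, single-linkage pass over sorted site positions. The sketch below is our own illustration of that idea, not DMRfinder's implementation, and the gap and minimum-site thresholds are placeholders.

```python
# Group sorted CpG positions (one chromosome) into candidate regions: sites are
# chained while consecutive gaps stay within max_gap; short chains are dropped.

def cluster_sites(positions, max_gap=100, min_sites=3):
    regions, current = [], [positions[0]]
    for pos in positions[1:]:
        if pos - current[-1] <= max_gap:
            current.append(pos)                       # extend the current region
        else:
            if len(current) >= min_sites:
                regions.append((current[0], current[-1], len(current)))
            current = [pos]                           # start a new region
    if len(current) >= min_sites:
        regions.append((current[0], current[-1], len(current)))
    return regions

sites = [1000, 1040, 1090, 1500, 5000, 5020, 5090, 5180]
print(cluster_sites(sites))   # [(1000, 1090, 3), (5000, 5180, 4)]
```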

  4. Optimizing Imaging Instruments for Emission Mammography

    NASA Astrophysics Data System (ADS)

    Weinberg, Irving N.

    1996-05-01

    Clinical studies have demonstrated that radiotracer methods can noninvasively detect breast cancers in vivo (L.P. Adler, J.P. Crowe, N.K. Al-Kaisis, et al., Radiology 187, 743-750 (1993)) (I. Khalkhali, I. Mena, E. Jouanne, et al., J. Am. Coll. Surg. 178, 491-497 (1994)). Due to spatial resolution and count efficiency considerations, users of conventional nuclear medicine instruments have had difficulty in detecting subcentimeter cancers. This limitation is unfortunate, since cancer therapy is generally most efficacious when tumor diameter at detection is less than a centimeter. A more subtle limitation of conventional nuclear medicine imaging instruments is that they are poorly suited to guiding interventions. With the assistance of C.J. Thompson from McGill University, and the CEBAF Detector Physics Group, we have explored the possibility of configuring detectors for nuclear medicine imaging devices into geometries that resemble conventional x-ray mammography cameras (I.N. Weinberg, U.S. Patent 5,252,830 (1993)). Phantom and pilot clinical studies suggest that applying breast compression within such geometries may offer several advantages (C.J. Thompson, K. Murthy, I.N. Weinberg, et al., Med. Physics 21, 259-538 (1994)): For coincident detection of positron emitters, efficiency and spatial resolution are improved by bringing the detectors very close to the source (the breast tumor). For single-photon detection, attenuation due to overlying tissue is reduced. Since, for a high-efficiency collimator, spatial resolution worsens with increasing source to collimator distance, adoption of compression allows more efficient collimators to be employed. Economics are favorable in that detectors can be deployed in the region of interest, rather than around the entire body, and that such detectors can be mounted in conventional mammographic gantries. The application of conventional mammographic geometry promises to assist physicians in conducting radiotracer-guided biopsies, and in correlating biochemical with x-ray data. The primary challenge of conducting studies with dedicated emission mammography devices has been dealing with high count rates due to cardiac activity.

  5. High Accuracy Verification of a Correlated-Photon-Based Method for Determining Photon-Counting Detection Efficiency

    DTIC Science & Technology

    2007-01-01

  6. Performance of a high-sensitivity dedicated cardiac SPECT scanner for striatal uptake quantification in the brain based on analysis of projection data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Mi-Ae; Moore, Stephen C.; McQuaid, Sarah J.

    Purpose: The authors have previously reported the advantages of high-sensitivity single-photon emission computed tomography (SPECT) systems for imaging structures located deep inside the brain. DaTscan (Ioflupane I-123) is a dopamine transporter (DaT) imaging agent that has shown potential for early detection of Parkinson disease (PD), as well as for monitoring progression of the disease. Realizing the full potential of DaTscan requires efficient estimation of striatal uptake from SPECT images. They have evaluated two SPECT systems, a conventional dual-head gamma camera with low-energy high-resolution collimators (conventional) and a dedicated high-sensitivity multidetector cardiac imaging system (dedicated), for imaging tasks related to PD. Methods: Cramer-Rao bounds (CRB) on the precision of estimates of striatal and background activity concentrations were calculated from high-count, separate acquisitions of the compartments (right striata, left striata, background) of a striatal phantom. CRB on striatal and background activity concentration were calculated from essentially noise-free projection datasets, synthesized by scaling and summing the compartment projection datasets, for a range of total detected counts. They also calculated variances of estimates of specific-to-nonspecific binding ratios (BR) and asymmetry indices from these values using propagation of error analysis, as well as the precision of measuring changes in BR on the order of the average annual decline in early PD. Results: Under typical clinical conditions, the conventional camera detected 2 M counts while the dedicated camera detected 12 M counts. Assuming a normal BR of 5, the standard deviation of BR estimates was 0.042 and 0.021 for the conventional and dedicated system, respectively. For an 8% decrease to BR = 4.6, the signal-to-noise ratios were 6.8 (conventional) and 13.3 (dedicated); for a 5% decrease, they were 4.2 (conventional) and 8.3 (dedicated). Conclusions: This implies that PD can be detected earlier with the dedicated system than with the conventional system; therefore, earlier identification of PD progression should be possible with the high-sensitivity dedicated SPECT camera.
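
    The quoted signal-to-noise ratios are consistent with treating the detectable change as the BR difference divided by the standard deviation of the difference of two independent BR estimates (σ·√2); that reading is our assumption, not a definition stated in the record, but it reproduces the numbers closely.

```python
# Reconstruct the SNR figures from the reported standard deviations of BR.
import math

def snr(delta_br, sigma_single):
    return delta_br / (sigma_single * math.sqrt(2))   # two independent estimates

for label, sigma in [("conventional", 0.042), ("dedicated", 0.021)]:
    print(label, round(snr(0.40, sigma), 1), round(snr(0.25, sigma), 1))
# conventional: ~6.7 and ~4.2   dedicated: ~13.5 and ~8.4
# close to the reported 6.8 / 4.2 and 13.3 / 8.3
```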

  7. Effect of age on the diagnostic efficiency of HbA1c for diabetes in a Chinese middle-aged and elderly population: The Shanghai Changfeng Study.

    PubMed

    Wu, Li; Lin, Huandong; Gao, Jian; Li, Xiaoming; Xia, Mingfeng; Wang, Dan; Aleteng, Qiqige; Ma, Hui; Pan, Baishen; Gao, Xin

    2017-01-01

    Glycated hemoglobin A1c (HbA1c) ≥6.5% (or 48 mmol/mol) has been recommended as a new diagnostic criterion for diabetes; however, limited literature is available regarding the effect of age on the use of HbA1c for diagnosing diabetes, and the causes of this age effect remain unknown. In this study, we investigated whether and why age affects the diagnostic efficiency of HbA1c for diabetes in a community-based Chinese population. In total, 4325 participants without previously known diabetes were enrolled in this study. Participants were stratified by age. A receiver operating characteristic (ROC) curve was plotted for each age group, and the area under the curve (AUC) represented the diagnostic efficiency of HbA1c for diabetes defined by the plasma glucose criteria. The area under the ROC curve in each one-year age group was defined as AUCage. Multiple regression analyses were performed to identify factors underlying the association between age and AUCage based on the changes in the β and P values of age. The current threshold of HbA1c (≥6.5% or 48 mmol/mol) showed low sensitivity (35.6%) and high specificity (98.9%) in diagnosing diabetes. ROC curve analyses showed that the diagnostic efficiency of HbA1c in the ≥75 years age group was significantly lower than that in the 45-54 years age group (AUC: 0.755 vs. 0.878; P < 0.001). Pearson correlation analysis showed that the AUCage of HbA1c was negatively correlated with age (r = -0.557, P = 0.001). When the red blood cell (RBC) count was adjusted for in the multiple regression model, the negative association between age and AUCage disappeared, with the regression coefficient of age reversed to 0.001 and the P value increased to 0.856. The diagnostic efficiency of HbA1c for diabetes decreased with aging, and this age effect was driven by the decreasing RBC count with age. HbA1c is unsuitable for diagnosing diabetes in elderly individuals because of their physiologically decreased RBC count.
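
    The stratified ROC analysis described above amounts to computing, within each age band, the AUC of HbA1c against glucose-defined diabetes status. The sketch below shows that step with scikit-learn; the column names and age bins are placeholders, not the study's variables.

```python
# Per-age-group ROC AUC of HbA1c for predicting glucose-defined diabetes.
import pandas as pd
from sklearn.metrics import roc_auc_score

def auc_by_age_group(df, bins=(35, 45, 55, 65, 75, 120)):
    df = df.copy()
    df["age_group"] = pd.cut(df["age"], bins)
    aucs = {}
    for group, sub in df.groupby("age_group", observed=True):
        if sub["diabetes"].nunique() == 2:        # AUC needs both classes present
            aucs[str(group)] = roc_auc_score(sub["diabetes"], sub["hba1c"])
    return aucs

# usage: auc_by_age_group(pd.DataFrame({"age": [...], "hba1c": [...], "diabetes": [...]}))
```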

  9. A low-latency high-port count optical switch with optical delay line buffering for disaggregated data centers

    NASA Astrophysics Data System (ADS)

    Moralis-Pegios, M.; Terzenidis, N.; Mourgias-Alexandris, G.; Vyrsokinos, K.; Pleros, N.

    2018-02-01

    Disaggregated Data Centers (DCs) have emerged as a powerful architectural framework towards increasing resource utilization and system power efficiency, requiring, however, a networking infrastructure that can ensure low-latency and high-bandwidth connectivity between a high number of interconnected nodes. This reality has been the driving force towards high-port count and low-latency optical switching platforms, with recent efforts concluding that the use of distributed control architectures as offered by Broadcast-and-Select (BS) layouts can lead to sub-μsec latencies. However, almost all high-port count optical switch designs proposed so far rely either on electronic buffering and associated SerDes circuitry for resolving contention or on buffer-less designs with packet drop and re-transmit procedures, unavoidably increasing latency or limiting throughput. In this article, we demonstrate a 256x256 optical switch architecture for disaggregated DCs that employs small-size optical delay line buffering in a distributed control scheme, exploiting FPGA-based header processing over a hybrid BS/Wavelength routing topology that is implemented by a 16x16 BS design and a 16x16 AWGR. Simulation-based performance analysis reveals that even the use of a 2-packet optical buffer can yield <620 ns latency with >85% throughput for up to 100% loads. The switch has been experimentally validated with 10 Gb/s optical data packets using 1:16 optical splitting and a SOA-MZI wavelength converter (WC) along with fiber delay lines for the 2-packet buffer implementation at every BS outgoing port, followed by an additional SOA-MZI tunable WC and the 16x16 AWGR. Error-free performance in all different switch input/output combinations has been obtained with a power penalty of <2.5 dB.

  10. Solar XUV Imaging and Non-dispersive Spectroscopy for Solar-C Enabled by Scientific CMOS APS Arrays

    NASA Astrophysics Data System (ADS)

    Stern, Robert A.; Lemen, J. R.; Shing, L.; Janesick, J.; Tower, J.

    2009-05-01

    Monolithic CMOS Advanced Pixel Sensor (APS) arrays are showing great promise as eventual replacements for the current workhorse of solar physics focal planes, the scientific CCD. CMOS APS devices have individually addressable pixels, increased radiation tolerance compared to CCDs, and require lower clock voltages, and thus lower power. However, commercially available CMOS chips, while suitable for use with intensifiers or fluorescent coatings, are generally not optimized for direct detection of EUV and X-ray photons. A high performance scientific CMOS array designed for these wavelengths will have significant new capabilities compared to CCDs, including the ability to read out small regions of the solar disk at high (sub-second) cadence, count single X-ray photons with Fano-limited energy resolution, and even operate at room temperature with good noise performance. Such capabilities will be crucial for future solar X-ray and EUV missions such as Solar-C. Sarnoff Corporation has developed scientific grade, monolithic CMOS arrays for X-ray imaging and photon counting. One prototype device, the "minimal" array, has 8 μm pixels, is 15 to 25 μm thick, is fabricated on high-resistivity (10 to 20 kohm-cm) Si wafers, and can be back-illuminated. These characteristics yield high quantum efficiency and high spatial resolution with minimal charge sharing among pixels, making it ideal for the detection of keV X-rays. When used with digital correlated double sampling, the array has demonstrated noise performance as low as 2 e-, allowing single photon counting of X-rays over a range of temperatures. We report test results for this device in X-rays, and discuss the implications for future solar space missions.

  11. Multiparameter linear least-squares fitting to Poisson data one count at a time

    NASA Technical Reports Server (NTRS)

    Wheaton, Wm. A.; Dunklee, Alfred L.; Jacobsen, Allan S.; Ling, James C.; Mahoney, William A.; Radocinski, Robert G.

    1995-01-01

    A standard problem in gamma-ray astronomy data analysis is the decomposition of a set of observed counts, described by Poisson statistics, according to a given multicomponent linear model, with underlying physical count rates or fluxes which are to be estimated from the data. Despite its conceptual simplicity, the linear least-squares (LLSQ) method for solving this problem has generally been limited to situations in which the number n_i of counts in each bin i is not too small, conventionally more than 5-30. It seems to be widely believed that the failure of the LLSQ method for small counts is due to the failure of the Poisson distribution to be even approximately normal for small numbers. The cause is more accurately the strong anticorrelation between the data and the weights w_i in the weighted LLSQ method when sqrt(n_i) instead of sqrt(n̄_i) is used to approximate the uncertainties σ_i in the data, where n̄_i = E(n_i), the expected value of n_i. We show in an appendix that, avoiding this approximation, the correct equations for the Poisson LLSQ (PLLSQ) problem are actually identical to those for the maximum likelihood estimate using the exact Poisson distribution. We apply the method to solve a problem in high-resolution gamma-ray spectroscopy for the JPL High-Resolution Gamma-Ray Spectrometer flown on HEAO 3. Systematic error in subtracting the strong, highly variable background encountered in the low-energy gamma-ray region can be significantly reduced by closely pairing source and background data in short segments. Significant results can be built up by weighted averaging of the net fluxes obtained from the subtraction of many individual source/background pairs. Extension of the approach to complex situations, with multiple cosmic sources and realistic background parameterizations, requires a means of efficiently fitting data from single scans in the narrow (approximately 1.2 keV, HEAO 3) energy channels of a Ge spectrometer, where the expected number of counts obtained per scan may be very low. Such an analysis system is discussed and compared to the method previously used.
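
    Written out, the point about the weights is the following (our notation, with a model matrix A_ij and component fluxes x_j, is not necessarily the paper's):

```latex
% Weighted LLSQ objective for Poisson-distributed counts n_i with linear model
% \bar{n}_i = E(n_i) = \sum_j A_{ij} x_j :
\chi^2 \;=\; \sum_i \frac{\bigl(n_i - \sum_j A_{ij} x_j\bigr)^2}{\sigma_i^2},
\qquad
\sigma_i^2 \approx n_i \;\;\text{(data-based, biased at small counts)}
\quad\text{vs.}\quad
\sigma_i^2 = \bar{n}_i \;\;\text{(model-based)} .
% For comparison, maximizing the exact Poisson log-likelihood
% \ln L = \sum_i \bigl( n_i \ln \bar{n}_i - \bar{n}_i \bigr) + \text{const}
% gives the estimating equations
\sum_i A_{ik}\!\left(\frac{n_i}{\bar{n}_i} - 1\right) = 0 \qquad \text{for each component } k,
% which, per the abstract, coincide with the PLLSQ equations obtained when the
% model-based weights \bar{n}_i are used instead of the raw counts n_i.
```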

  12. Spatial variability in the pollen count in Sydney, Australia: can one sampling site accurately reflect the pollen count for a region?

    PubMed

    Katelaris, Constance H; Burke, Therese V; Byth, Karen

    2004-08-01

    There is increasing interest in the daily pollen count, with pollen-sensitive individuals using it to determine medication use and researchers relying on it for commencing clinical drug trials and assessing drug efficacy according to allergen exposure. Counts are often expressed qualitatively as low, medium, and high, and often only 1 pollen trap is used for an entire region. To examine the spatial variability in the pollen count in Sydney, Australia, and to compare discrepancies among low-, medium-, and high-count days at 3 sites separated by a maximum of 30 km. Three sites in western Sydney were sampled using Burkard traps. Data from the 3 sites were used to compare vegetation differences, possible effects of some meteorological parameters, and discrepancies among sites in low-, medium-, and high-count days. Total pollen counts during the spring months were 14,382 grains/m3 at Homebush, 11,584 grains/m3 at Eastern Creek, and 9,269 grains/m3 at Nepean. The only significant correlation between differences in meteorological parameters and differences in pollen counts was the Homebush-Nepean differences in rainfall and pollen counts. Comparison between low- and high-count days among the 3 sites revealed a discordance rate of 8% to 17%. For informing the public about pollen counts, the count from 1 trap is a reasonable estimation in a 30-km region; however, the discrepancies among 3 trap sites would have a significant impact on the performance of a clinical trial where enrollment was determined by a low or high count. Therefore, for clinical studies, data collection must be local and applicable to the study population.

  13. Improving distillation method and device of tritiated water analysis for ultra high decontamination efficiency.

    PubMed

    Fang, Hsin-Fa; Wang, Chu-Fang; Lin, Chien-Kung

    2015-12-01

    Monitoring environmental tritiated water is important for understanding the contamination dispersion of nuclear facilities. Tritium is a pure beta emitter which is usually measured by Liquid Scintillation Counting (LSC). The average energy of the tritium beta is only 5.658 keV, which makes LSC counting of tritium easily interfered with by betas emitted by other radionuclides. Environmental tritiated water samples therefore usually need to be decontaminated by distillation to reduce the interference. After the Fukushima Nuclear Accident, the highest gross beta concentration of groundwater samples obtained around the Fukushima Daiichi Nuclear Power Station was over 1,000,000 Bq/l. There is a need for a distillation with ultra-high decontamination efficiency for environmental tritiated water analysis. This study is intended to improve the heating temperature control for better sub-boiling distillation control and to modify the height of the container of the air-cooling distillation device for a better fractional distillation effect. The decontamination factor (DF) for Cs-137 of the distillation may reach 450,000, which is far better than in the prior study. The average loss rate of the improved method and device is about 2.6%, which is better than the bias value listed in ASTM D4107-08. It is proven that the modified air-cooling distillation device provides an easy-handling, water-saving, low-cost and effective way of purifying water samples with higher beta-radionuclide contamination that need ultra-high decontamination treatment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Generalized linear models and point count data: statistical considerations for the design and analysis of monitoring studies

    Treesearch

    Nathaniel E. Seavy; Suhel Quader; John D. Alexander; C. John Ralph

    2005-01-01

    The success of avian monitoring programs to effectively guide management decisions requires that studies be efficiently designed and data be properly analyzed. A complicating factor is that point count surveys often generate data with non-normal distributional properties. In this paper we review methods of dealing with deviations from normal assumptions, and we focus...

  15. Protozoa and metazoa relations to technological conditions of non-woven textile filters for wastewater treatment.

    PubMed

    Spychała, Marcin; Sowińska, Aleksandra; Starzyk, Justyna; Masłowski, Adam

    2015-01-01

    The objective of this study was a preliminary identification of basic groups of micro-organisms in the cross-sectional profile of geotextile filters for septic tank effluent (STE) treatment and their relations to technological conditions. Reactors with textile filters treating wastewater were investigated on a semi-technical scale. Filters were vertically situated and STE was filtered through them under hydrostatic pressure at a wastewater surface height of 7-20 cm. Filters were made of four layers of non-woven TS 20 geotextile of 0.9 mm thickness. Various groups of organisms were observed; the most abundant group comprised free-swimming and crawling ciliates, less abundant were stalked ciliates and the least numerous were nematodes. The individual counts of all groups of micro-organisms investigated during the study were variable according to time and space. The high abundance of Opercularia, a commonly observed genus of stalked ciliates, was related to the high efficiency of wastewater treatment and dissolved oxygen concentration of about 1.0 g/m3. Numbers of free-swimming and crawling ciliates had a tendency to decrease in relation to the depth of filter cross-sectional profile. The variability in counts of particular groups of organisms could be related to the local stress conditions. No correlation between identified organism count and total mass concentration in the cross-sectional filter profile was found.

  16. Kmerind: A Flexible Parallel Library for K-mer Indexing of Biological Sequences on Distributed Memory Systems.

    PubMed

    Pan, Tony; Flick, Patrick; Jain, Chirag; Liu, Yongchao; Aluru, Srinivas

    2017-10-09

    Counting and indexing fixed length substrings, or k-mers, in biological sequences is a key step in many bioinformatics tasks including genome alignment and mapping, genome assembly, and error correction. While advances in next generation sequencing technologies have dramatically reduced the cost and improved latency and throughput, few bioinformatics tools can efficiently process the datasets at the current generation rate of 1.8 terabases every 3 days. We present Kmerind, a high performance parallel k-mer indexing library for distributed memory environments. The Kmerind library provides a set of simple and consistent APIs with sequential semantics and parallel implementations that are designed to be flexible and extensible. Kmerind's k-mer counter performs similarly or better than the best existing k-mer counting tools even on shared memory systems. In a distributed memory environment, Kmerind counts k-mers in a 120 GB sequence read dataset in less than 13 seconds on 1024 Xeon CPU cores, and fully indexes their positions in approximately 17 seconds. Querying for 1% of the k-mers in these indices can be completed in 0.23 seconds and 28 seconds, respectively. Kmerind is the first k-mer indexing library for distributed memory environments, and the first extensible library for general k-mer indexing and counting. Kmerind is available at https://github.com/ParBLiSS/kmerind.
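
    Kmerind itself is a distributed-memory C++ library, but the underlying operation it parallelizes, counting k-mers and recording their positions, is easy to show on a single node. The sketch below is only that baseline illustration, not Kmerind's API.

```python
# Count k-mers and index their start positions in one sequence.
from collections import Counter, defaultdict

def count_and_index_kmers(sequence, k):
    counts = Counter()
    positions = defaultdict(list)
    for i in range(len(sequence) - k + 1):
        kmer = sequence[i:i + k]
        counts[kmer] += 1
        positions[kmer].append(i)
    return counts, positions

counts, positions = count_and_index_kmers("ACGTACGTGACG", k=3)
print(counts["ACG"], positions["ACG"])   # 3 [0, 4, 9]
```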

  17. Linear Mode HgCdTe Avalanche Photodiodes for Photon Counting Applications

    NASA Technical Reports Server (NTRS)

    Sullivan, William, III; Beck, Jeffrey; Scritchfield, Richard; Skokan, Mark; Mitra, Pradip; Sun, Xiaoli; Abshire, James; Carpenter, Darren; Lane, Barry

    2015-01-01

    An overview of recent improvements in the understanding and maturity of linear mode photon counting with HgCdTe electron-initiated avalanche photodiodes is presented. The first HgCdTe LMPC 2x8 format array, fabricated in 2011 with 64 micron pitch, was a remarkable success in terms of demonstrating a high single-photon signal-to-noise ratio of 13.7 with an excess noise factor of 1.3-1.4, a 7 ns minimum time between events, and a broad spectral response extending from 0.4 micron to 4.2 micron. The main limitations were a false event rate more than 10x higher than expected (greater than 1 MHz), a 5-7x lower than expected APD gain, and a photon detection efficiency of only 50% when greater than 60% was expected. This paper discusses the reasons behind these limitations and the implementation of their mitigations, with new results.

  18. ProMotE: an efficient algorithm for counting independent motifs in uncertain network topologies.

    PubMed

    Ren, Yuanfang; Sarkar, Aisharjya; Kahveci, Tamer

    2018-06-26

    Identifying motifs in biological networks is essential in uncovering key functions served by these networks. Finding non-overlapping motif instances is however a computationally challenging task. The fact that biological interactions are uncertain events further complicates the problem, as it makes the existence of an embedding of a given motif an uncertain event as well. In this paper, we develop a novel method, ProMotE (Probabilistic Motif Embedding), to count non-overlapping embeddings of a given motif in probabilistic networks. We utilize a polynomial model to capture the uncertainty. We develop three strategies to scale our algorithm to large networks. Our experiments demonstrate that our method scales to large networks in practical time with high accuracy where existing methods fail. Moreover, our experiments on cancer and degenerative disease networks show that our method helps in uncovering key functional characteristics of biological networks.
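
    As a point of reference for what "counting motifs in an uncertain network" involves, the toy example below computes the expected number of triangle embeddings under independent edge probabilities; it deliberately ignores the non-overlap constraint and the polynomial bookkeeping that make ProMotE's problem hard, and is not the paper's algorithm.

```python
# Expected number of triangle embeddings in a probabilistic graph, assuming
# independent edge existence; illustrative only.
from itertools import combinations

def expected_triangles(nodes, edge_prob):
    """edge_prob maps frozenset({u, v}) -> probability that edge (u, v) exists."""
    total = 0.0
    for a, b, c in combinations(nodes, 3):
        total += (edge_prob.get(frozenset((a, b)), 0.0)
                  * edge_prob.get(frozenset((b, c)), 0.0)
                  * edge_prob.get(frozenset((c, a)), 0.0))
    return total

probs = {frozenset(("A", "B")): 0.9, frozenset(("B", "C")): 0.8,
         frozenset(("C", "A")): 0.5, frozenset(("C", "D")): 0.7}
print(expected_triangles(["A", "B", "C", "D"], probs))   # 0.9 * 0.8 * 0.5 = 0.36
```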

  19. Lung counting: comparison of detector performance with a four detector array that has either metal or carbon fibre end caps, and the effect on mda calculation.

    PubMed

    Ahmed, Asm Sabbir; Hauck, Barry; Kramer, Gary H

    2012-08-01

    This study described the performance of an array of high-purity germanium detectors designed with two different end cap materials: steel and carbon fibre. The advantages and disadvantages of using this detector type in the estimation of the minimum detectable activity (MDA) for different energy peaks of the isotope 152Eu were illustrated. A Monte Carlo model was developed to study the detection efficiency of the detector array. A voxelised Lawrence Livermore torso phantom, equipped with lung, chest plates and overlay plates, was used to mimic a typical lung counting protocol with the array of detectors. The lung of the phantom simulated the volumetric source organ. A significantly low MDA was estimated for the 40 keV energy peak at a chest wall thickness of 6.64 cm.
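
    For readers unfamiliar with the MDA quoted above, a common Currie-style estimate divides a detection-limit count by efficiency, emission probability and counting time; whether the study used exactly this form is an assumption on our part, and the numbers below are purely illustrative.

```python
# Currie-style MDA for a gamma line: L_D = 2.71 + 4.65*sqrt(B) counts,
# MDA = L_D / (efficiency * gamma_yield * count_time).
import math

def mda_bq(background_counts, efficiency, gamma_yield, count_time_s):
    detection_limit_counts = 2.71 + 4.65 * math.sqrt(background_counts)
    return detection_limit_counts / (efficiency * gamma_yield * count_time_s)

# Example: 400 background counts in 3600 s, 2% absolute efficiency, 30% gamma yield
print(round(mda_bq(400, 0.02, 0.30, 3600), 2), "Bq")   # ~4.43 Bq
```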

  20. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Surface Micromachined Adjustable Micro-Concave Mirror for Bio-Detection Applications

    NASA Astrophysics Data System (ADS)

    Kuo, Ju-Nan; Chen, Wei-Lun; Jywe, Wen-Yuh

    2009-08-01

    We present a bio-detection system integrated with an adjustable micro-concave mirror. The bio-detection system consists of an adjustable micro-concave mirror, micro flow cytometer chip and optical detection module. The adjustable micro-concave mirror can be fabricated with ease using commercially available MEMS foundry services (such as multiuser MEMS processes, MUMPs) and its curvature can be controlled utilizing thermal or electrical effects. Experimental results show that focal lengths of the micro-concave mirror ranging from 313.5 to 2275.0 μm are achieved. The adjustable micro-concave mirror can be used to increase the efficiency of optical detection and provide a high signal-to-noise ratio. The developed micro-concave mirror is integrated with a micro flow cytometer for cell counting applications. Successful counting of fluorescent-labeled beads is demonstrated using the developed method.

  1. Visible and Ultraviolet Detectors for High Earth Orbit and Lunar Observatories

    NASA Technical Reports Server (NTRS)

    Woodgate, Bruce E.

    1989-01-01

    The current status of detectors for the visible and UV for future large observatories in earth orbit and on the moon is briefly reviewed. For the visible, CCDs have the highest quantum efficiency, but are subject to contamination of the data by cosmic-ray hits. On the moon, the level of hits can be brought down to that at the earth's surface by shielding beneath about 20 meters of rock. For high earth orbits above the geomagnetic shield, CCDs might be usable by combining many short exposures and vetoing the cosmic-ray hits; otherwise photoemissive detectors will be necessary. For the UV, photoemissive detectors will be necessary to reject the visible; to use CCDs would require the development of UV-efficient filters which reject the visible by many orders of magnitude. Development of higher count rate capability would be desirable for photoemissive detectors.

  2. SNSPD with parallel nanowires (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Ejrnaes, Mikkel; Parlato, Loredana; Gaggero, Alessandro; Mattioli, Francesco; Leoni, Roberto; Pepe, Giampiero; Cristiano, Roberto

    2017-05-01

    Superconducting nanowire single-photon detectors (SNSPDs) have been shown to be promising in applications such as quantum communication and computation, quantum optics, imaging, metrology and sensing. They offer the advantages of a low dark count rate, high efficiency, a broadband response, short time jitter, a high repetition rate, and no need for gated-mode operation. Several SNSPD designs have been proposed in the literature. Here, we discuss the so-called parallel nanowire configurations. They were introduced with the aim of improving SNSPD properties such as detection efficiency, speed, signal-to-noise ratio, or photon number resolution. Although apparently similar, the various parallel designs are not the same. There is no single design that improves all of these properties together; each design presents its own characteristics with specific advantages and drawbacks. In this work, we discuss the various designs, outlining their peculiarities and possible improvements.

  3. Cryogenic, high-resolution x-ray detector with high count rate capability

    DOEpatents

    Frank, Matthias; Mears, Carl A.; Labov, Simon E.; Hiller, Larry J.; Barfknecht, Andrew T.

    2003-03-04

    A cryogenic, high-resolution X-ray detector with high count rate capability has been invented. The new X-ray detector is based on superconducting tunnel junctions (STJs), and operates without thermal stabilization at or below 500 mK. The X-ray detector exhibits good resolution (~5-20 eV FWHM) for soft X-rays in the keV region, and is capable of counting at count rates of more than 20,000 counts per second (cps). Simple, FET-based charge amplifiers, current amplifiers, or conventional spectroscopy shaping amplifiers can provide the electronic readout of this X-ray detector.

  4. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection

    PubMed Central

    Choudhry, Priya

    2016-01-01

    Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays. PMID:26848849
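
    The edge-based counting idea can be prototyped outside ImageJ as well; the sketch below uses scikit-image to outline colonies, fill the outlines, discard small specks and count the labelled blobs. It is a generic illustration with placeholder thresholds, not the Cell Colony Edge macro or the CellProfiler pipeline.

```python
# Edge-based colony counting: Canny edges -> filled blobs -> size filter -> label.
from scipy.ndimage import binary_fill_holes
from skimage.feature import canny
from skimage.measure import label, regionprops
from skimage.morphology import remove_small_objects

def count_colonies(gray_image, sigma=2.0, min_area=50):
    edges = canny(gray_image, sigma=sigma)             # outline each colony
    filled = binary_fill_holes(edges)                  # outlines -> solid blobs
    cleaned = remove_small_objects(filled, min_area)   # drop noise specks
    labels = label(cleaned)
    return labels.max(), [region.area for region in regionprops(labels)]

# usage: n_colonies, areas = count_colonies(skimage.io.imread("plate.png", as_gray=True))
```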

  5. High-voltage spark carbon-fiber sticky-tape data analyzer

    NASA Technical Reports Server (NTRS)

    Yang, L. C.; Hull, G. G.

    1980-01-01

    An efficient method for detecting carbon fibers collected on a sticky-tape monitor was developed. The fibers were released from a simulated crash fire situation containing carbon fiber composite material. The method utilized the ability of the fiber to initiate a spark across a set of alternately biased high voltage electrodes to electronically count the number of fiber fragments collected on the tape. It was found that the spark, which contains high energy and is of very short duration, is capable of partially damaging or consuming the fiber fragments. It also creates a mechanical disturbance which ejects the fiber from the grid. Both characteristics were helpful in establishing a single discharge pulse for each fiber segment.

  6. High-efficiency and low-background multi-segmented proportional gas counter for β-decay spectroscopy

    NASA Astrophysics Data System (ADS)

    Mukai, M.; Hirayama, Y.; Watanabe, Y. X.; Schury, P.; Jung, H. S.; Ahmed, M.; Haba, H.; Ishiyama, H.; Jeong, S. C.; Kakiguchi, Y.; Kimura, S.; Moon, J. Y.; Oyaizu, M.; Ozawa, A.; Park, J. H.; Ueno, H.; Wada, M.; Miyatake, H.

    2018-03-01

    A multi-segmented proportional gas counter (MSPGC) with high detection efficiency and low-background event rate has been developed for β-decay spectroscopy. The MSPGC consists of two cylindrically aligned layers of 16 counters (32 counters in total). Each counter has a long active length and small trapezoidal cross-section, and the total solid angle of the 32 counters is 80% of 4π. β-rays are distinguished from the background events including cosmic-rays by analyzing the hit patterns of independent counters. The deduced intrinsic detection efficiency of each counter was almost 100%. The measured background event rate was 0.11 counts per second using the combination of veto counters for cosmic-rays and lead block shields for background γ-rays. The MSPGC was applied to measure the β-decay half-lives of 198Ir and 199mPt. The evaluated half-lives of T1/2 = 9.8(7) s and 12.4(7) s for 198Ir and 199mPt, respectively, were in agreement with previously reported values. The estimated absolute detection efficiency of the MSPGC from GEANT4 simulations was consistent with the evaluated efficiency from the analysis of the β-γ spectroscopy of 199Pt, saturating at approximately 60% for Qβ > 4 MeV.

  7. High-efficiency scintillation detector for combined detection of thermal and fast neutrons and gamma radiation

    DOEpatents

    Chiles, M.M.; Mihalczo, J.T.; Blakeman, E.D.

    1987-02-27

    A scintillation based radiation detector for the combined detection of thermal neutrons, high-energy neutrons and gamma rays in a single detecting unit. The detector consists of a pair of scintillators sandwiched together and optically coupled to the light sensitive face of a photomultiplier tube. A light tight radiation pervious housing is disposed about the scintillators and a portion of the photomultiplier tube to hold the arrangement in assembly and provides a radiation window adjacent the outer scintillator through which the radiation to be detected enters the detector. The outer scintillator is formed of a material in which scintillations are produced by thermal-neutrons and the inner scintillator is formed of a material in which scintillations are produced by high-energy neutrons and gamma rays. The light pulses produced by events detected in both scintillators are coupled to the photomultiplier tube which produces a current pulse in response to each detected event. These current pulses may be processed in a conventional manner to produce a count rate output indicative of the total detected radiation event count rate. Pulse discrimination techniques may be used to distinguish the different radiations and their energy distribution.

  8. High-efficiency scintillation detector for combined detection of thermal and fast neutrons and gamma radiation

    DOEpatents

    Chiles, Marion M.; Mihalczo, John T.; Blakeman, Edward D.

    1989-02-07

    A scintillation based radiation detector for the combined detection of thermal neutrons, high-energy neutrons and gamma rays in a single detecting unit. The detector consists of a pair of scintillators sandwiched together and optically coupled to the light sensitive face of a photomultiplier tube. A light tight radiation pervious housing is disposed about the scintillators and a portion of the photomultiplier tube to hold the arrangement in assembly and provides a radiation window adjacent the outer scintillator through which the radiation to be detected enters the detector. The outer scintillator is formed of a material in which scintillations are produced by thermal-neutrons and the inner scintillator is formed of a material in which scintillations are produced by high-energy neutrons and gamma rays. The light pulses produced by events detected in both scintillators are coupled to the photomultiplier tube which produces a current pulse in response to each detected event. These current pulses may be processed in a conventional manner to produce a count rate output indicative of the total detected radiation event count rate. Pulse discrimination techniques may be used to distinguish the different radiations and their energy distribution.

  9. High-efficiency scintillation detector for combined detection of thermal and fast neutrons and gamma radiation

    DOEpatents

    Chiles, Marion M.; Mihalczo, John T.; Blakeman, Edward D.

    1989-01-01

    A scintillation based radiation detector for the combined detection of thermal neutrons, high-energy neutrons and gamma rays in a single detecting unit. The detector consists of a pair of scintillators sandwiched together and optically coupled to the light sensitive face of a photomultiplier tube. A light tight radiation pervious housing is disposed about the scintillators and a portion of the photomultiplier tube to hold the arrangement in assembly and provides a radiation window adjacent the outer scintillator through which the radiation to be detected enters the detector. The outer scintillator is formed of a material in which scintillations are produced by thermal-neutrons and the inner scintillator is formed of a material in which scintillations are produced by high-energy neutrons and gamma rays. The light pulses produced by events detected in both scintillators are coupled to the photomultiplier tube which produces a current pulse in response to each detected event. These current pulses may be processed in a conventional manner to produce a count rate output indicative of the total detected radiation event count rate. Pulse discrimination techniques may be used to distinguish the different radiations and their energy distribution.

  10. Standardisation of the (129)I, (151)Sm and (166m)Ho activity concentration using the CIEMAT/NIST efficiency tracing method.

    PubMed

    Altzitzoglou, Timotheos; Rožkov, Andrej

    2016-03-01

    The standardisations of (129)I, (151)Sm and (166m)Ho using the CIEMAT/NIST efficiency tracing method, carried out in the frame of the European Metrology Research Program project "Metrology for Radioactive Waste Management", are described. The radionuclide beta counting efficiencies were calculated using two computer codes, CN2005 and MICELLE2. The sensitivity analysis of the code input parameters (ionization quenching factor, beta shape factor) on the calculated efficiencies was performed, and the results are discussed. The combined relative standard uncertainties of the standardisations of the (129)I, (151)Sm and (166m)Ho solutions were 0.4%, 0.5% and 0.4%, respectively. The stated precision obtained using the CIEMAT/NIST method is better than that previously reported in the literature obtained by the TDCR ((129)I), the 4πγ-NaI ((166m)Ho) counting or the CIEMAT/NIST method ((151)Sm). Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. Within-site variability in surveys of wildlife populations

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.; Sauer, John R.; Droege, Sam

    1994-01-01

    Most large-scale surveys of animal populations are based on counts of individuals observed during a sampling period, which are used as indexes to the population. The variability in these indexes not only reflects variability in population sizes among sites but also variability due to the inexactness of the counts. Repeated counts at survey sites can be used to document this additional source of variability and, in some applications, to mitigate its effects. We present models for evaluating the proportion of total variability in counts that is attributable to this within-site variability and apply them in the analysis of data from repeated counts on routes from the North American Breeding Bird Survey. We analyzed data on 98 species, obtaining estimates of these percentages, which ranged from 3.5 to 100% with a mean of 36.25%. For at least 14 of the species, more than half of the variation in counts was attributable to within-site sources. Counts for species with lower average counts had a higher percentage of within-site variability. We discuss the relative cost efficiency of replicating sites or initiating new sites for several objectives, concluding that it is frequently better to initiate new sites than to attempt to replicate existing sites.
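
    As a toy illustration of how replicated counts can be split into within-site and among-site components (a balanced one-way variance-component decomposition, not the models used in the paper), consider the following Python sketch with made-up data:

```python
# Toy sketch (not the models of the paper): a balanced one-way ANOVA
# decomposition of replicated counts into within-site and among-site
# variance components. The count values below are invented for illustration.
import numpy as np

# rows = survey sites, columns = repeated counts at the same site
counts = np.array([
    [12, 15, 11, 14],
    [30, 28, 35, 31],
    [ 5,  9,  7,  6],
    [22, 18, 25, 21],
], dtype=float)

n_sites, n_reps = counts.shape
site_means = counts.mean(axis=1)

# Mean squares from the usual one-way ANOVA table.
ms_within = counts.var(axis=1, ddof=1).mean()
ms_among = n_reps * site_means.var(ddof=1)

# Method-of-moments estimates of the variance components.
var_within = ms_within
var_among = max((ms_among - ms_within) / n_reps, 0.0)

share_within = var_within / (var_within + var_among)
print(f"within-site share of total variance: {share_within:.1%}")
```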

  12. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture.

    PubMed

    Zhong, Yuanhong; Gao, Junyuan; Lei, Qilun; Zhou, Yao

    2018-05-09

    Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects is labor intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: firstly, a yellow sticky trap is installed in the surveillance area to trap flying insects and a camera is set up to collect real-time images. Then the detection and coarse-counting method based on You Only Look Once (YOLO) object detection, and the classification and fine-counting method based on Support Vector Machines (SVM) using global features, are designed. Finally, the insect counting and recognition system is implemented on a Raspberry Pi. Six species of flying insects, including bee, fly, mosquito, moth, chafer and fruit fly, are selected to assess the effectiveness of the system. Compared with conventional methods, the test results show promising performance. The average counting accuracy is 92.50% and the average classifying accuracy is 90.18% on the Raspberry Pi. The proposed system is easy to use and provides efficient and accurate recognition data; therefore, it can be used for intelligent agriculture applications.
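
    The YOLO detection stage requires a trained network and is not reproduced here; the second, SVM-based fine-counting stage can be sketched generically with scikit-learn, assuming global feature vectors have already been extracted for each detected insect. The features and labels below are synthetic stand-ins, not the authors' data or pipeline.

```python
# Sketch of the classification/fine-counting stage only, under the assumption
# that global feature vectors have already been extracted for each detected
# insect crop. Feature values and labels are synthetic; this is a generic
# scikit-learn SVM, not the authors' implementation.
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_per_class, n_features = 50, 16
classes = ["bee", "fly", "mosquito", "moth", "chafer", "fruit fly"]

# Synthetic global features: one cluster per insect class.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(n_per_class, n_features))
               for i in range(len(classes))])
y = np.repeat(classes, n_per_class)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(X_tr, y_tr)

pred = clf.predict(X_te)
print("classification accuracy:", accuracy_score(y_te, pred))

# Fine counts per species = number of detections assigned to each class.
species, fine_counts = np.unique(pred, return_counts=True)
print(dict(zip(species, fine_counts)))
```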

  13. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture

    PubMed Central

    Zhong, Yuanhong; Gao, Junyuan; Lei, Qilun; Zhou, Yao

    2018-01-01

    Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects is labor intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: firstly, a yellow sticky trap is installed in the surveillance area to trap flying insects and a camera is set up to collect real-time images. Then the detection and coarse-counting method based on You Only Look Once (YOLO) object detection, and the classification and fine-counting method based on Support Vector Machines (SVM) using global features, are designed. Finally, the insect counting and recognition system is implemented on a Raspberry Pi. Six species of flying insects, including bee, fly, mosquito, moth, chafer and fruit fly, are selected to assess the effectiveness of the system. Compared with conventional methods, the test results show promising performance. The average counting accuracy is 92.50% and the average classifying accuracy is 90.18% on the Raspberry Pi. The proposed system is easy to use and provides efficient and accurate recognition data; therefore, it can be used for intelligent agriculture applications. PMID:29747429

  14. Comparison of viable plate count, turbidity measurement and real-time PCR for quantification of Porphyromonas gingivalis.

    PubMed

    Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P

    2015-01-01

    The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.

  15. Bifidobacterium breve IPLA20005 affects in vitro the expression of hly and luxS genes, related to the virulence of Listeria monocytogenes Lm23.

    PubMed

    Rios-Covian, David; Nogacka, Alicja; Salazar, Nuria; Hernández-Barranco, A M; Cuesta, Isabel; Gueimonde, Miguel; de Los Reyes Gavilán, Clara G

    2018-03-01

    Mechanistic features that characterize the interaction and inhibition of the food-borne pathogen Listeria monocytogenes by members of the genus Bifidobacterium still remain unclear. In the present work, we tried to shed light on the influence that co-cultivation of L. monocytogenes with Bifidobacterium breve may exert on both microorganisms and on virulence of the pathogen. Production of acetate and lactate was measured by gas chromatography and high-performance liquid chromatography, respectively; bacterial counts were obtained by plate count; gene expression was determined by RT-qPCR; and haemolytic activity was analyzed against goat erythrocytes. We found slightly but significantly lower final counts of Listeria and Bifidobacterium (p < 0.05) and lower haemolytic efficiency in L. monocytogenes cells from cocultures than in those from monocultures. In contrast, the hly and luxS genes, which code for the cytolysin listeriolysin O and participate in biofilm formation, respectively, were overexpressed when L. monocytogenes was grown in coculture. This indicates that the presence of Bifidobacterium is able to modify the gene expression and haemolytic activity of L. monocytogenes when both microorganisms grow together.

  16. Counting particles emitted by stratospheric aircraft and measuring size of particles emitted by stratospheric aircraft

    NASA Technical Reports Server (NTRS)

    Wilson, James Charles

    1994-01-01

    The ER-2 condensation nuclei counter (CNC) has been modified to reduce the diffusive losses of particles within the instrument. These changes have been successful in improving the counting efficiency of small particles at low pressures. Two techniques for measuring the size distributions of particles with diameters less than 0.17 micrometers have been evaluated. Both of these methods, the differential mobility analyzer (DMA) and the diffusion battery, have fundamental problems that limit their usefulness for stratospheric applications. We cannot recommend either for this application. Newly developed, alternative methods for measuring small particles include inertial separation with a low-loss critical orifice and thin-plate impactor device. This technique is now used to collect particles in the multisample aerosol collector housed in the ER-2 CNC-2, and shows some promise for particle size measurements when coupled with a CNC as a counting device. The modified focused-cavity aerosol spectrometer (FCAS) can determine the size distribution of particles with ambient diameters as small as about 0.07 micrometers. Data from this instrument indicates the presence of a nuclei mode when CNC-2 indicates high concentrations of particles, but cannot resolve important parameters of the distribution.

  17. Identification of Lactobacillus delbrueckii and Streptococcus thermophilus Strains Present in Artisanal Raw Cow Milk Cheese Using Real-time PCR and Classic Plate Count Methods.

    PubMed

    Stachelska, Milena A

    2017-12-04

    The aim of this paper was to detect Lactobacillus delbrueckii and Streptococcus thermophilus using a real-time quantitative PCR assay in 7-day ripening cheese produced from unpasteurised milk. Real-time quantitative PCR assays were designed to identify and enumerate the chosen species of lactic acid bacteria (LAB) in ripened cheese. The results of molecular quantification and classic bacterial enumeration showed a high level of similarity, proving that DNA extraction was carried out properly and that the genomic DNA solutions were free of PCR inhibitors. These methods revealed the presence of L. delbrueckii and S. thermophilus. The real-time PCR enabled quantification with a detection limit of 10¹-10³ CFU/g of product. qPCR standard curves were linear over seven log units down to 10¹ copies per reaction; efficiencies ranged from 77.9% to 93.6%. Cheese samples were analysed with the plate count method and qPCR in parallel. Compared with the classic plate count method, the newly developed qPCR method provided faster and species-specific identification of the two dairy LAB and yielded comparable quantitative results.
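
    The efficiencies quoted above are conventionally derived from the slope of the quantification-cycle versus log10(copy number) standard curve as E = 10^(-1/slope) - 1. The short Python check below back-calculates the slopes corresponding to the reported efficiency range for illustration; the slopes themselves are not taken from the paper.

```python
# Standard qPCR relations between amplification efficiency and standard-curve
# slope; the slopes printed here are back-calculated for illustration only.
import math

def efficiency_from_slope(slope):
    """Per-cycle amplification efficiency from a qPCR standard-curve slope."""
    return 10 ** (-1.0 / slope) - 1.0

def slope_from_efficiency(eff):
    """Inverse relation: standard-curve slope for a given efficiency."""
    return -1.0 / math.log10(1.0 + eff)

for eff in (0.779, 0.936, 1.00):
    s = slope_from_efficiency(eff)
    print(f"E = {eff:.1%}  <->  slope ≈ {s:.2f} cycles per log10(copies)")

# E = 77.9% corresponds to a slope of about -4.0, and E = 100% to the
# textbook value of about -3.32 cycles per decade.
```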

  18. Detection and Counting of Orchard Trees from Vhr Images Using a Geometrical-Optical Model and Marked Template Matching

    NASA Astrophysics Data System (ADS)

    Maillard, Philippe; Gomes, Marília F.

    2016-06-01

    This article presents an original algorithm created to detect and count trees in orchards using very high resolution images. The algorithm is based on an adaptation of the "template matching" image processing approach, in which the template is based on a "geometrical-optical" model created from a series of parameters, such as illumination angles, maximum and ambient radiance, and tree size specifications. The algorithm is tested on four images from different regions of the world and different crop types. These images all have < 1 meter spatial resolution and were downloaded from the GoogleEarth application. Results show that the algorithm is very efficient at detecting and counting trees as long as their spectral and spatial characteristics are relatively constant. For walnut, mango and orange trees, the overall accuracy was clearly above 90%. However, the overall success rate for apple trees fell under 75%. It appears that the openness of the apple tree crown is most probably responsible for this poorer result. The algorithm is fully explained with a step-by-step description. At this stage, the algorithm still requires quite a bit of user interaction. The automatic determination of most of the required parameters is under development.
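
    As a generic sketch of the matching-and-counting step only (the authors' geometrical-optical template construction is replaced by a plain image patch), the following Python snippet correlates a tree template against an orchard scene and counts well-separated correlation peaks. File names and thresholds are hypothetical.

```python
# Generic template-matching-and-counting sketch, in the spirit of (but not
# identical to) the marked-template-matching algorithm described above.
# Assumes RGB input images; file names and thresholds are placeholders.
import numpy as np
from skimage import io, color
from skimage.feature import match_template, peak_local_max

def count_trees(image_path, template_path, score_threshold=0.6, min_spacing=10):
    image = color.rgb2gray(io.imread(image_path))
    template = color.rgb2gray(io.imread(template_path))

    # Normalised cross-correlation between the scene and the tree template.
    score = match_template(image, template, pad_input=True)

    # Each sufficiently strong, well-separated correlation peak counts as one tree.
    peaks = peak_local_max(score, min_distance=min_spacing,
                           threshold_abs=score_threshold)
    return len(peaks), peaks   # count and (row, col) crown positions

if __name__ == "__main__":
    n, positions = count_trees("orchard.png", "tree_template.png")
    print(f"detected {n} tree crowns")
```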

  19. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    NASA Astrophysics Data System (ADS)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.; Porter, D.; O'Neill, B. J.; Nolting, C.; Edmon, P.; Donnert, J. M. F.; Jones, T. W.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  20. Fiber-coupled superconducting nanowire single-photon detectors integrated with a bandpass filter on the fiber end-face

    NASA Astrophysics Data System (ADS)

    Zhang, W. J.; Yang, X. Y.; Li, H.; You, L. X.; Lv, C. L.; Zhang, L.; Zhang, C. J.; Liu, X. Y.; Wang, Z.; Xie, X. M.

    2018-07-01

    Superconducting nanowire single-photon detectors (SNSPDs) with both high system detection efficiency (SDE) and low dark count rate (DCR) play significant roles in quantum information processing and various applications. The background dark counts of SNSPDs originate from room-temperature blackbody radiation coupled to the device via a fiber. Therefore, a bandpass filter (BPF) operated at low temperature with minimal insertion loss is necessary to suppress the background DCR. Herein, a low-loss BPF integrated on a single-mode fiber end-face was designed, fabricated and verified for low-temperature operation. The fiber end-face BPF featured a typical passband width of about 40 nm in the 1550 nm telecom band and a peak transmittance of over 0.98. An SNSPD with high SDE, fabricated on a distributed Bragg reflector, was coupled to the BPF. The device with such a BPF showed an SDE of 80% at a DCR of 0.5 Hz, measured at 2.1 K. Compared with the same device without a BPF, the DCR was reduced by over 13 dB with an SDE decrease of <3%.

  1. Comparison of microbiological loads and physicochemical properties of raw milk treated with single-/multiple-cycle high hydrostatic pressure and ultraviolet-C light

    NASA Astrophysics Data System (ADS)

    Hu, Guanglan; Zheng, Yuanrong; Wang, Danfeng; Zha, Baoping; Liu, Zhenmin; Deng, Yun

    2015-07-01

    The effects of ultraviolet-C radiation (UV-C, 11.8 W/m2) and single-cycle and multiple-cycle high hydrostatic pressure (HHP at 200, 400 or 600 MPa) on the microbial load and physicochemical quality of raw milk were evaluated. Reductions of the aerobic plate count (APC) and coliform count (CC) by HHP were more than 99.9% and 98.7%, respectively. Inactivation efficiency for microorganisms increased with pressure level. At the same pressure level, two-cycle treatments caused a lower APC, but did not show CC differences compared with single-cycle treatments. Reductions of APC and CC by UV-C fell between those achieved by HHP at 200 MPa and at 400/600 MPa. Both HHP and UV-C significantly decreased lightness and increased pH, but did not change soluble solids content or thiobarbituric acid-reactive substances values. Two 2.5 min cycles of HHP at 600 MPa caused the minimum APC and CC, and the maximum conductivity. Compared with HHP, UV-C markedly increased protein oxidation and reduced darkening.

  2. Pre-treatment with oral hydroxyurea prior to intensive chemotherapy improves early survival of patients with high hyperleukocytosis in acute myeloid leukemia.

    PubMed

    Mamez, Anne-Claire; Raffoux, Emmanuel; Chevret, Sylvie; Lemiale, Virginie; Boissel, Nicolas; Canet, Emmanuel; Schlemmer, Benoît; Dombret, Hervé; Azoulay, Elie; Lengliné, Etienne

    2016-10-01

    Acute myeloid leukemia with high white blood cell count (WBC) is a medical emergency. A reduction of tumor burden with hydroxyurea may prevent life-threatening complications induced by straight chemotherapy. To evaluate this strategy, we reviewed medical charts of adult patients admitted to our institution from 1997 to 2011 with non-promyelocytic AML and WBC over 50 G/L. One hundred and sixty patients were included with a median WBC of 120 G/L (range 50-450), 107 patients received hydroxyurea prior to chemotherapy, and 53 received emergency induction chemotherapy (CT). Hospital mortality was lower for patients treated with hydroxyurea (34% versus 19%, p = 0.047) even after adjusting for age (p < 0.01) and initial WBC count (p = 0.02). No evidence of any difference between treatment groups in terms of WBC decline kinetics and disease free survival (p = 0.87) was found. Oral hydroxyurea prior to chemotherapy seems a safe and efficient strategy to reduce early death of hyperleukocytic AML patients.

  3. Orthogonal bases of invariants in tensor models

    NASA Astrophysics Data System (ADS)

    Diaz, Pablo; Rey, Soo-Jong

    2018-02-01

    Representation theory provides an efficient framework to count and classify invariants in tensor models of (gauge) symmetry G_d = U(N_1) ⊗ ⋯ ⊗ U(N_d). We show that there are two natural ways of counting invariants, one for arbitrary G_d and another valid for large rank of G_d. We construct bases of invariant operators based on the counting, and compute correlators of their elements. The basis associated with finite rank of G_d diagonalizes the two-point function. It is analogous to the restricted Schur basis used in matrix models. We comment on future directions for investigation.

  4. EUV observation from the Earth-orbiting satellite, EXCEED

    NASA Astrophysics Data System (ADS)

    Yoshioka, K.; Murakami, G.; Yoshikawa, I.; Ueno, M.; Uemizu, K.; Yamazaki, A.

    2010-01-01

    An Earth-orbiting small satellite, the "EXtreme ultraviolet spectrosCope for ExosphEric Dynamics" (EXCEED), which will be launched in 2012, is under development. The mission will carry out spectroscopic and imaging observations of EUV (Extreme Ultraviolet: 60-145 nm) emissions from tenuous plasmas around the planets (Venus, Mars, Mercury, and Jupiter). EUV observations must be made from an observing site outside the Earth's atmosphere to avoid absorption, and the detection efficiency must be very high in order to catch the faint signals from these targets. In this mission, we employ a cesium iodide coated microchannel plate as a two-dimensional photon-counting device, which shows 1.5-50 times higher quantum detection efficiency compared with a bare one. We coat the surfaces of the grating and entrance mirror with silicon carbide by the chemical vapor deposition method in order to achieve high diffraction efficiency and reflectivity. The whole spectrometer is shielded by 2 mm thick stainless steel to prevent contamination caused by high-energy electrons from the inner radiation belt. In this paper, we introduce the mission overview, its instrument, and their performance.

  5. Utility and validation of day and night snorkel counts for estimating bull trout abundance in first-to-third order streams

    Treesearch

    Russell F. Thurow; James T. Peterson; John W. Guzevich

    2006-01-01

    Despite the widespread use of underwater observation to census stream-dwelling fishes, the accuracy of snorkeling methods has rarely been validated. We evaluated the efficiency of day and night snorkel counts for estimating the abundance of bull trout Salvelinus confluentus in 215 sites within first- to third-order streams. We used a dual-gear...

  6. Space and power efficient hybrid counters array

    DOEpatents

    Gara, Alan G [Mount Kisco, NY; Salapura, Valentina [Chappaqua, NY

    2009-05-12

    A hybrid counter array device for counting events. The hybrid counter array includes a first counter portion comprising N counter devices, each counter device for receiving signals representing occurrences of events from an event source and providing a first count value corresponding to the lower-order bits of the hybrid counter array. The hybrid counter array includes a second counter portion comprising a memory array device having N addressable memory locations in correspondence with the N counter devices, each addressable memory location for storing a second count value representing the higher-order bits of the hybrid counter array. A control device monitors each of the N counter devices of the first counter portion and initiates updating a value of the corresponding second count value stored at the corresponding addressable memory location in the second counter portion. Thus, the combination of the first and second count values provides an instantaneous measure of the number of events received.
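
    A minimal software analogue of this hybrid scheme, with arbitrary widths chosen for illustration rather than taken from the patent, might look as follows:

```python
# Conceptual sketch of the hybrid counter idea described above: narrow
# low-order counters are bumped on every event, and a memory array holds the
# high-order bits, updated only when a low-order counter overflows.
# Counter widths and sizes are illustrative, not from the patent.
class HybridCounterArray:
    def __init__(self, n_counters, low_bits=8):
        self.low_bits = low_bits
        self.low_mask = (1 << low_bits) - 1
        self.low = [0] * n_counters          # fast, narrow counters (low-order bits)
        self.high = [0] * n_counters         # memory array (high-order bits)

    def count_event(self, i):
        """Register one event on counter i, spilling into memory on overflow."""
        self.low[i] = (self.low[i] + 1) & self.low_mask
        if self.low[i] == 0:                 # low-order counter wrapped around
            self.high[i] += 1                # control logic updates the memory word

    def value(self, i):
        """Instantaneous total = high-order word concatenated with low-order bits."""
        return (self.high[i] << self.low_bits) | self.low[i]

counters = HybridCounterArray(n_counters=4, low_bits=8)
for _ in range(1000):
    counters.count_event(2)
print(counters.value(2))   # -> 1000
```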

  7. Space and power efficient hybrid counters array

    DOEpatents

    Gara, Alan G.; Salapura, Valentina

    2010-03-30

    A hybrid counter array device for counting events. The hybrid counter array includes a first counter portion comprising N counter devices, each counter device for receiving signals representing occurrences of events from an event source and providing a first count value corresponding to the lower-order bits of the hybrid counter array. The hybrid counter array includes a second counter portion comprising a memory array device having N addressable memory locations in correspondence with the N counter devices, each addressable memory location for storing a second count value representing the higher-order bits of the hybrid counter array. A control device monitors each of the N counter devices of the first counter portion and initiates updating a value of the corresponding second count value stored at the corresponding addressable memory location in the second counter portion. Thus, the combination of the first and second count values provides an instantaneous measure of the number of events received.

  8. Gross beta determination in drinking water using scintillating fiber array detector.

    PubMed

    Lv, Wen-Hui; Yi, Hong-Chang; Liu, Tong-Qing; Zeng, Zhi; Li, Jun-Li; Zhang, Hui; Ma, Hao

    2018-04-04

    A scintillating fiber array detector for gross beta counting has been developed to monitor real-time radioactivity in drinking water. The detector, placed in a stainless-steel tank, consists of 1096 scintillating fibers, both ends of which are connected to a photomultiplier tube. The detector parameters, including working voltage, background counting rate and stability, are tested, and the detection efficiency is calibrated using a standard potassium chloride solution. Water samples are measured with the detector and the results show consistency with those obtained by the evaporation method. The background counting rate of the detector is 38.131 ± 0.005 cps, and the detection efficiency for β particles is 0.37 ± 0.01 cps/(Bq/l). The MDAC of this system can be less than 1.0 Bq/l for β particles in 120 min without pre-concentration. Copyright © 2018 Elsevier Ltd. All rights reserved.
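
    As a rough consistency check (not a calculation reported in the paper), applying the widely used Currie formula for the minimum detectable activity to the background rate and efficiency quoted above reproduces an MDAC just below 1.0 Bq/l for a 120-min count:

```python
# Back-of-the-envelope check that the quoted MDAC is consistent with the
# widely used Currie formula
#   MDA = (2.71 + 4.65 * sqrt(background counts)) / (efficiency * counting time),
# using the background rate and efficiency reported in the abstract above.
import math

background_rate = 38.131          # cps
efficiency = 0.37                 # cps per (Bq/L)
t = 120 * 60                      # counting time: 120 min in seconds

background_counts = background_rate * t
mdac = (2.71 + 4.65 * math.sqrt(background_counts)) / (efficiency * t)
print(f"estimated MDAC ≈ {mdac:.2f} Bq/L")   # ≈ 0.9 Bq/L, i.e. below 1.0 Bq/L
```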

  9. BayMeth: improved DNA methylation quantification for affinity capture sequencing data using a flexible Bayesian approach

    PubMed Central

    2014-01-01

    Affinity capture of DNA methylation combined with high-throughput sequencing strikes a good balance between the high cost of whole genome bisulfite sequencing and the low coverage of methylation arrays. We present BayMeth, an empirical Bayes approach that uses a fully methylated control sample to transform observed read counts into regional methylation levels. In our model, inefficient capture can readily be distinguished from low methylation levels. BayMeth improves on existing methods, allows explicit modeling of copy number variation, and offers computationally efficient analytical mean and variance estimators. BayMeth is available in the Repitools Bioconductor package. PMID:24517713

  10. Low-cost optical interconnect module for parallel optical data links

    NASA Astrophysics Data System (ADS)

    Noddings, Chad; Hirsch, Tom J.; Olla, M.; Spooner, C.; Yu, Jason J.

    1995-04-01

    We have designed, fabricated, and tested a prototype parallel ten-channel unidirectional optical data link. When scaled to production, we project that this technology will satisfy the following market penetration requirements: (1) up to 70 meters transmission distance, (2) at least 1 gigabyte/second data rate, and (3) 0.35 to 0.50 MByte/second volume selling price. These goals can be achieved by means of the assembly innovations described in this paper: a novel alignment method that is integrated with low-cost, few-chip module packaging techniques, yielding high coupling efficiency and a reduced component count. Furthermore, the high coupling efficiency increases projected reliability by reducing the driver's power requirements.

  11. Photon-Counting Multikilohertz Microlaser Altimeters for Airborne and Spaceborne Topographic Measurements

    NASA Technical Reports Server (NTRS)

    Degnan, John J.; Smith, David E. (Technical Monitor)

    2000-01-01

    We consider the optimum design of photon-counting microlaser altimeters operating from airborne and spaceborne platforms under both day and night conditions. Extremely compact Q-switched microlaser transmitters produce trains of low energy pulses at multi-kHz rates and can easily generate subnanosecond pulse-widths for precise ranging. To guide the design, we have modeled the solar noise background and developed simple algorithms, based on Post-Detection Poisson Filtering (PDPF), to optimally extract the weak altimeter signal from a high noise background during daytime operations. Practical technology issues, such as detector and/or receiver dead times, have also been considered in the analysis. We describe an airborne prototype, being developed under NASA's Instrument Incubator Program, which is designed to operate at a 10 kHz rate from aircraft cruise altitudes up to 12 km with laser pulse energies on the order of a few microjoules. We also analyze a compact and power efficient system designed to operate from Mars orbit at an altitude of 300 km and sample the Martian surface at rates up to 4.3 kHz using a 1 watt laser transmitter and an 18 cm telescope. This yields a Power-Aperture Product of 0.24 W-square meter, corresponding to a value almost 4 times smaller than that of the Mars Orbiting Laser Altimeter (0.88 W-square meter), yet the sampling rate is roughly 400 times greater (4 kHz vs 10 Hz). Relative to conventional high power laser altimeters, advantages of photon-counting laser altimeters include: (1) a more efficient use of available laser photons, providing up to two orders of magnitude greater surface sampling rates for a given laser power-telescope aperture product; (2) a simultaneous two order of magnitude reduction in the volume, cost and weight of the telescope system; (3) the unique ability to spatially resolve the source of the surface return in a photon counting mode through the use of pixellated or imaging detectors; and (4) improved vertical and transverse spatial resolution resulting from both (1) and (3). Furthermore, because of significantly lower laser pulse energies, the microaltimeter is inherently more eyesafe to observers on the ground and less prone to internal optical damage, which can terminate a space mission prematurely.

  12. PRESAGE: Protecting Structured Address Generation against Soft Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and lower computational overhead. Unfortunately, efficient detectors to detect faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that flows an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.

  13. PRESAGE: Protecting Structured Address Generation against Soft Errors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram

    Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and lower computational overhead. Unfortunately, efficient detectors to detect faults during address generation have not been widely researched (especially in the context of indexing large arrays). We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that propagates an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Ensuring the propagation of errors allows one to place detectors at loop exit points and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.

  14. Efficacy and safety of a herbo-mineral ayurvedic formulation ‘Afrodet Plus®’ in male rats

    PubMed Central

    Dhumal, Rohit; Vijaykumar, Tushara; Dighe, Vikas; Selkar, Nilakash; Chawda, Mukesh; Vahlia, Mahesh; Vanage, Geeta

    2013-01-01

    Background: Reverse pharmacology for drug development has been highly productive and cost-effective in recent past as it is based on the documented therapeutic effects of plants in ancient texts. Afrodet Plus® is formulated for the treatment of male infertility, which contains ancient herbo-minerals. Its efficacy and safety are validated through this animal study in reverse pharmacology mode. Objectives: This study was undertaken to evaluate efficacy and safety of an Ayurvedic formulation Afrodet Plus® in adult male rats. Materials and Methods: Twelve male rats (Holtzman) between 8 and 10 weeks of age were randomly selected and animals were assigned to a control and two treatment groups. Dosing was performed daily. Various parameters such as weekly body weight, hematology, serum testosterone levels, epididymal sperm count, and efficiency of Daily Sperm Production (DSP) were evaluated. Results: It was found that epididymal sperm count had significantly increased in both low-dose (+27.39%) and high-dose (+40.5%) groups as compared to control group. The DSP also showed an increase of 43.7% at high dose of 180 mg/kg body weight as compared to the control group. An increase in sperm motility and especially progressive motility was observed when evaluated by Computer Assisted Semen Analyzer. Histological evaluation of testicular tissue for spermatogenic index revealed that the index had increased in treatment group as compared to control group. Conclusion: This study revealed that oral administration of Afrodet Plus® resulted in significant increase in DSP in the testis along with increase in epididymal sperm count and progressive motility as compared to control group without producing any treatment-related adverse effects. These findings provide the documentary evidence that the use of Afrodet Plus® at 90 and 180 mg/kg body weight is effective and safe for the treatment of male infertility especially to improve sperm count and progressive motility. PMID:24250145

  15. Efficiency of energy recovery from waste incineration, in the light of the new Waste Framework Directive.

    PubMed

    Grosso, Mario; Motta, Astrid; Rigamonti, Lucia

    2010-07-01

    This paper deals with a key issue related to municipal waste incineration, which is the efficiency of energy recovery. A strong driver for improving the energy performance of waste-to-energy plants is the recent Waste Framework Directive (Directive 2008/98/EC of the European Parliament and of the Council of 19 November 2008 on waste and repealing certain Directives), which allows high-efficiency installations to benefit from a status of "recovery" rather than "disposal". The change in designation means a step up in the waste hierarchy, where the lowest level of priority is now restricted to landfilling and low-efficiency waste incineration. The so-called "R1 formula" reported in the Directive, which accounts for both the production of power and heat, is critically analyzed and correlated with the more scientifically grounded approach of exergy efficiency. The results obtained for waste-to-energy plants currently operating in Europe reveal some significant differences in their performance, mainly related to the average size and to the availability of a heat market (district heating). Copyright (c) 2010 Elsevier Ltd. All rights reserved.
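
    For reference, the Annex II (R1) energy-efficiency criterion of Directive 2008/98/EC is commonly quoted in the form below; the expression and variable definitions are reproduced from general knowledge of the Directive rather than from the paper itself.

```latex
% Annex II (R1) energy-efficiency criterion of Directive 2008/98/EC, quoted
% from general knowledge of the Directive, not from the abstract above.
\[
  \text{Energy efficiency (R1)} \;=\; \frac{E_p - (E_f + E_i)}{0.97\,(E_w + E_f)}
\]
% E_p : annual energy produced as heat or electricity (electricity weighted
%       by 2.6, heat produced for commercial use by 1.1), in GJ/year
% E_f : annual energy input from fuels contributing to steam production, GJ/year
% E_w : annual energy contained in the treated waste (net calorific value), GJ/year
% E_i : annual imported energy, excluding E_w and E_f, GJ/year
% 0.97: factor accounting for energy losses through bottom ash and radiation
```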

  16. Evaluation of the automated hematology analyzer ADVIA® 120 for cerebrospinal fluid analysis and usage of unique hemolysis reagent.

    PubMed

    Tanada, H; Ikemoto, T; Masutani, R; Tanaka, H; Takubo, T

    2014-02-01

    In this study, we evaluated the performance of the ADVIA 120 hematology system for cerebrospinal fluid (CSF) assay. Cell counts and leukocyte differentials in CSF were examined with the ADVIA 120 hematology system, while simultaneously confirming an effective hemolysis agent for automated CSF cell counts. The detection limits of both the white blood cell (WBC) count and the red blood cell (RBC) count in CSF measured by the ADVIA 120 hematology system were as low as 2 cells/μL (10⁻⁶ L). The WBC count was linear up to 9850 cells/μL, and the RBC count was linear up to approximately 20 000 cells/μL. The intrarun reproducibility indicated good precision. The leukocyte differential of CSF cells, performed by the ADVIA 120 hematology system, showed good correlation with the microscopic procedure. The VersaLyse hemolysis solution efficiently lysed the samples without interfering with cell counts and leukocyte differentials, even in a sample that included approximately 50 000/μL RBC. These data show that the ADVIA 120 hematology system correctly measured the WBC count and leukocyte differential in CSF. The VersaLyse hemolysis solution is considered to be optimal for hemolysis treatment of CSF when measuring cell counts and differentials by the ADVIA 120 hematology system. © 2013 John Wiley & Sons Ltd.

  17. Free-space-coupled superconducting nanowire single-photon detectors for infrared optical communications.

    PubMed

    Bellei, Francesco; Cartwright, Alyssa P; McCaughan, Adam N; Dane, Andrew E; Najafi, Faraz; Zhao, Qingyuan; Berggren, Karl K

    2016-02-22

    This paper describes the construction of a cryostat and an optical system with a free-space coupling efficiency of 56.5% ± 3.4% to a superconducting nanowire single-photon detector (SNSPD) for infrared quantum communication and spectrum analysis. A 1K pot decreases the base temperature to T = 1.7 K from the 2.9 K reached by the cold head cooled by a pulse-tube cryocooler. The minimum spot size coupled to the detector chip was 6.6 ± 0.11 µm, starting from a fiber source at a wavelength of λ = 1.55 µm. We demonstrated photon counting on a detector with an 8 × 7.3 µm2 area. We measured a dark count rate of 95 ± 3.35 kcps and a system detection efficiency of 1.64% ± 0.13%. We explain the key steps required to further improve the coupling efficiency.

  18. Disk-based k-mer counting on a PC

    PubMed Central

    2013-01-01

    Background The k-mer counting problem, which is to build the histogram of occurrences of every k-symbol long substring in a given text, is important for many bioinformatics applications. They include developing de Bruijn graph genome assemblers, fast multiple sequence alignment and repeat detection. Results We propose a simple, yet efficient, parallel disk-based algorithm for counting k-mers. Experiments show that it usually offers the fastest solution to the considered problem, while demanding a relatively small amount of memory. In particular, it is capable of counting the statistics for short-read human genome data, in input gzipped FASTQ file, in less than 40 minutes on a PC with 16 GB of RAM and 6 CPU cores, and for long-read human genome data in less than 70 minutes. On a more powerful machine, using 32 GB of RAM and 32 CPU cores, the tasks are accomplished in less than half the time. No other algorithm for most tested settings of this problem and mammalian-size data can accomplish this task in comparable time. Our solution also belongs to memory-frugal ones; most competitive algorithms cannot efficiently work on a PC with 16 GB of memory for such massive data. Conclusions By making use of cheap disk space and exploiting CPU and I/O parallelism we propose a very competitive k-mer counting procedure, called KMC. Our results suggest that judicious resource management may allow to solve at least some bioinformatics problems with massive data on a commodity personal computer. PMID:23679007
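
    KMC itself is a disk-based, parallel algorithm; purely to make the problem statement concrete, the following in-memory Python sketch builds the k-mer histogram for a tiny synthetic read set (real short-read data would not fit in memory this way, which is precisely the motivation for KMC):

```python
# Minimal in-memory illustration of the k-mer counting problem itself; this is
# NOT the disk-based, parallel KMC algorithm, only the histogram it computes,
# shown on a tiny synthetic read set.
from collections import Counter

def count_kmers(reads, k):
    """Histogram of occurrences of every k-long substring across all reads."""
    counts = Counter()
    for read in reads:
        for i in range(len(read) - k + 1):
            counts[read[i:i + k]] += 1
    return counts

reads = ["ACGTACGTGG", "TACGTACGTA"]       # synthetic reads
kmer_counts = count_kmers(reads, k=4)
print(kmer_counts.most_common(3))           # e.g. [('ACGT', 4), ('CGTA', 3), ...]
```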

  19. Comparison of two apheresis systems for the collection of CD14+ cells intended to be used in dendritic cell culture.

    PubMed

    Strasser, Erwin F; Berger, Thomas G; Weisbach, Volker; Zimmermann, Robert; Ringwald, Jürgen; Schuler-Thurner, Beatrice; Zingsem, Jürgen; Eckstein, Reinhold

    2003-09-01

    Monocytes collected by leukapheresis are increasingly used for dendritic cell (DC) culture in cell factories suitable for DC vaccination in cancer. Using modified MNC programs on two apheresis systems (Cobe Spectra and Fresenius AS.TEC204), leukapheresis components collected from 84 patients with metastatic malignant melanoma and from 31 healthy male donors were investigated. MNCs, monocytes, RBCs, and platelets (PLTs) in donors and components were analyzed by cell counters, WBC differential counts, and flow cytometry. In 5-L collections, Astec showed better results regarding monocyte collection rates (11.0 vs. 7.4 x 10(6)/min, p = 0.04) and efficiencies (collection efficiency, 51.9 vs. 31.9%; p < 0.001). Both devices resulted in monocyte yields at an average of 1 x 10(9) (donors) and 2.5 x 10(9) (patients), whereas Astec components contained high residual RBCs. Compared to components with low residual PLTs, high PLT concentration resulted in higher monocyte loss (48 vs. 20%, p < 0.0001) before DC culture. The Astec is more efficient in 5-L MNC collections compared to the Spectra. Components with high residual PLTs result in high MNC loss by purification procedures. Thus, optimizing MNC programs is essential to obtain components with high MNC yields and low residual cells as prerequisite for high DC yields.

  20. Comparative Response of Microchannel Plate and Channel Electron Multiplier Detectors to Penetrating Radiation in Space

    DOE PAGES

    Funsten, Herbert O.; Harper, Ronnie W.; Dors, Eric E.; ...

    2015-10-02

    Channel electron multiplier (CEM) and microchannel plate (MCP) detectors are routinely used in space instrumentation for measurement of space plasmas. Here, our goal is to understand the relative sensitivities of these detectors to penetrating radiation in space, which can generate background counts and shorten detector lifetime. We use 662 keV γ-rays as a proxy for penetrating radiation such as γ-rays, cosmic rays, and high-energy electrons and protons that are ubiquitous in the space environment. We find that MCP detectors are ~20 times more sensitive to 662 keV γ-rays than CEM detectors. This is attributed to the larger total area of multiplication channels in an MCP detector that is sensitive to electronic excitation and ionization resulting from the interaction of penetrating radiation with the detector material. In contrast to the CEM detector, whose quantum efficiency ε_γ for 662 keV γ-rays is found to be 0.00175 and largely independent of detector bias, the quantum efficiency of the MCP detector is strongly dependent on the detector bias, with a power law index of 5.5. Lastly, background counts in MCP detectors from penetrating radiation can be reduced using MCP geometries with higher pitch and smaller channel diameter.

  1. The Electronic McPhail Trap

    PubMed Central

    Potamitis, Ilyas; Rigakis, Iraklis; Fysarakis, Konstantinos

    2014-01-01

    Certain insects affect cultivations in a detrimental way. A notable case is the olive fruit fly (Bactrocera oleae (Rossi)), that in Europe alone causes billions of euros in crop-loss/per year. Pests can be controlled with aerial and ground bait pesticide sprays, the efficiency of which depends on knowing the time and location of insect infestations as early as possible. The inspection of traps is currently carried out manually. Automatic monitoring traps can enhance efficient monitoring of flying pests by identifying and counting targeted pests as they enter the trap. This work deals with the hardware setup of an insect trap with an embedded optoelectronic sensor that automatically records insects as they fly in the trap. The sensor responsible for detecting the insect is an array of phototransistors receiving light from an infrared LED. The wing-beat recording is based on the interruption of the emitted light due to the partial occlusion from insect's wings as they fly in the trap. We show that the recordings are of high quality paving the way for automatic recognition and transmission of insect detections from the field to a smartphone. This work emphasizes the hardware implementation of the sensor and the detection/counting module giving all necessary implementation details needed to construct it. PMID:25429412

  2. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected was related to relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  3. Lymphocyte apheresis for chimeric antigen receptor T-cell manufacturing in children and young adults with leukemia and neuroblastoma.

    PubMed

    Ceppi, Francesco; Rivers, Julie; Annesley, Colleen; Pinto, Navin; Park, Julie R; Lindgren, Catherine; Mgebroff, Stephanie; Linn, Naomi; Delaney, Meghan; Gardner, Rebecca A

    2018-06-01

    The first step in the production of chimeric antigen receptor T cells is the collection of autologous T cells using apheresis technology. The procedure is technically challenging, because patients often have low leukocyte counts and are heavily pretreated with multiple lines of chemotherapy, marrow transplantation, and/or radiotherapy. Here, we report our experience of collecting T lymphocytes for chimeric antigen receptor T-cell manufacturing in pediatric and young adult patients with leukemia, non-Hodgkin lymphoma, or neuroblastoma. Apheresis procedures were performed on a COBE Spectra machine using the mononuclear cell program, with a collection target of 1 × 10⁹ total mononuclear cells per kilogram. Data were collected regarding preapheresis and postapheresis blood counts, apheresis parameters, products, and adverse events. Ninety-nine patients (ages 1.3-25.7 years) and 102 apheresis events were available for analysis. Patients underwent apheresis at a variety of absolute lymphocyte cell counts, with a median absolute lymphocyte count of 944 cells/μL (range, 142-6944 cells/μL). Twenty-two patients (21.6%) had absolute lymphocyte counts less than 500 cells/μL. The mononuclear cell target was obtained in 100% of all apheresis harvests, and chimeric antigen receptor T-cell production was possible from the majority of collections (94%). Mononuclear cell collection efficiency was 65.4%, and T-lymphocyte collection efficiency was 83.4%. Ten patients (9.8%) presented with minor adverse events during the 102 apheresis procedures, with the exception of one severe allergic reaction. Mononuclear cell apheresis for chimeric antigen receptor T-cell therapy is well tolerated and safe, and it is possible to obtain an adequate quantity of CD3+ lymphocytes for chimeric antigen receptor T-cell manufacturing in heavily pretreated patients who have low lymphocyte counts. © 2018 AABB.

  4. In Orbit Performance of Si Avalanche Photodiode Single Photon Counting Modules in the Geoscience Laser Altimeter System on ICESat

    NASA Technical Reports Server (NTRS)

    Sun, X.; Jester, P. L.; Palm, S. P.; Abshire, J. B.; Spinhime, J. D.; Krainak, M. A.

    2006-01-01

    Si avalanche photodiode (APD) single photon counting modules (SPCMs) are used in the Geoscience Laser Altimeter System (GLAS) on the Ice, Cloud, and land Elevation Satellite (ICESat), currently in orbit measuring Earth surface elevation and atmospheric backscattering. These SPCMs are used to measure cloud and aerosol backscattering of the GLAS laser light at 532-nm wavelength with 60-70% quantum efficiencies and maximum count rates of up to 15 million counts/s. The performance of the SPCMs has been closely monitored since the ICESat launch on January 12, 2003. There has been no measurable change in the quantum efficiency, as indicated by the average photon count rates in response to the background light from the sunlit earth. The linearity and the afterpulsing seen from the cloud and surface backscattering profiles have been the same as those during ground testing. The detector dark count rates, monitored while the spacecraft was on the dark side of the globe, have increased almost linearly at about 60 counts/s per day due to space radiation damage. The radiation damage appeared to be independent of the device temperature and power states. There was also an abrupt increase in radiation damage during the solar storm of 28-30 October 2003. The observed radiation damage is a factor of two to three lower than expected and is sufficiently low to provide useful atmosphere backscattering measurements through the end of the ICESat mission. To date, these SPCMs have been in orbit for more than three years. The accumulated operating time to date has reached 290 days (7000 hours). These SPCMs have provided unprecedented receiver sensitivity and dynamic range in ICESat atmosphere backscattering measurements.

  5. Deep-UV-sensitive high-frame-rate backside-illuminated CCD camera developments

    NASA Astrophysics Data System (ADS)

    Dawson, Robin M.; Andreas, Robert; Andrews, James T.; Bhaskaran, Mahalingham; Farkas, Robert; Furst, David; Gershstein, Sergey; Grygon, Mark S.; Levine, Peter A.; Meray, Grazyna M.; O'Neal, Michael; Perna, Steve N.; Proefrock, Donald; Reale, Michael; Soydan, Ramazan; Sudol, Thomas M.; Swain, Pradyumna K.; Tower, John R.; Zanzucchi, Pete

    2002-04-01

    New applications for ultraviolet imaging are emerging in the fields of drug discovery and industrial inspection. High throughput is critical for these applications, where millions of drug combinations are analyzed in secondary screenings or high-rate inspection of small feature sizes over large areas is required. In 1990, Sarnoff demonstrated a back-illuminated, 1024 × 1024, 18 μm pixel, split-frame-transfer device running at >150 frames per second with high sensitivity in the visible spectrum. Sarnoff designed, fabricated, and delivered cameras based on these CCDs and is now extending this technology to devices with higher pixel counts and higher frame rates through CCD architectural enhancements. The high sensitivities obtained in the visible spectrum are being pushed into the deep UV to support these new medical and industrial inspection applications. Sarnoff has achieved measured quantum efficiencies >55% at 193 nm, rising to 65% at 300 nm, and remaining almost constant out to 750 nm. Optimization of the sensitivity is being pursued to tailor the quantum efficiency for particular wavelengths. Characteristics of these high-frame-rate CCDs and cameras will be described, and results will be presented demonstrating high UV sensitivity down to 150 nm.

  6. High-speed optical switch fabrics with large port count.

    PubMed

    Yeo, Yong-Kee; Xu, Zhaowen; Wang, Dawei; Liu, Jianguo; Wang, Yixin; Cheng, Tee-Hiang

    2009-06-22

    We report a novel architecture that can be used to construct optical switch fabrics with very high port count and nanosecond switching speed. It is well known that optical switch fabrics with very fast switching times and high port counts are challenging to realize. Currently, one of the most promising solutions is based on a combination of wavelength-tunable lasers and the arrayed waveguide grating router (AWGR). To scale up the number of ports in such switches, a direct method is to use AWGRs with a high channel count. However, such AWGRs introduce very large crosstalk noise due to the close wavelength channel spacing. In this paper, we propose an architecture for realizing a high-port-count optical switch fabric using a combination of low-port-count AWGRs, optical ON-OFF gates, and WDM couplers. Using this new methodology, we constructed a proof-of-concept experiment to demonstrate the feasibility of a 256 × 256 optical switch fabric. To our knowledge, this port count is the highest ever reported for switch fabrics of this type.

  7. Box-Counting Dimension Revisited: Presenting an Efficient Method of Minimizing Quantization Error and an Assessment of the Self-Similarity of Structural Root Systems

    PubMed Central

    Bouda, Martin; Caplan, Joshua S.; Saiers, James E.

    2016-01-01

    Fractal dimension (FD), estimated by box-counting, is a metric used to characterize plant anatomical complexity or space-filling characteristic for a variety of purposes. The vast majority of published studies fail to evaluate the assumption of statistical self-similarity, which underpins the validity of the procedure. The box-counting procedure is also subject to error arising from arbitrary grid placement, known as quantization error (QE), which is strictly positive and varies as a function of scale, making it problematic for the procedure's slope estimation step. Previous studies either ignore QE or employ inefficient brute-force grid translations to reduce it. The goals of this study were to characterize the effect of QE due to translation and rotation on FD estimates, to provide an efficient method of reducing QE, and to evaluate the assumption of statistical self-similarity of coarse root datasets typical of those used in recent trait studies. Coarse root systems of 36 shrubs were digitized in 3D and subjected to box-counts. A pattern search algorithm was used to minimize QE by optimizing grid placement and its efficiency was compared to the brute force method. The degree of statistical self-similarity was evaluated using linear regression residuals and local slope estimates. QE, due to both grid position and orientation, was a significant source of error in FD estimates, but pattern search provided an efficient means of minimizing it. Pattern search had higher initial computational cost but converged on lower error values more efficiently than the commonly employed brute force method. Our representations of coarse root system digitizations did not exhibit details over a sufficient range of scales to be considered statistically self-similar and informatively approximated as fractals, suggesting a lack of sufficient ramification of the coarse root systems for reiteration to be thought of as a dominant force in their development. FD estimates did not characterize the scaling of our digitizations well: the scaling exponent was a function of scale. Our findings serve as a caution against applying FD under the assumption of statistical self-similarity without rigorously evaluating it first. PMID:26925073
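
    As a point of reference, the basic box-counting estimate and a brute-force reduction of quantization error by grid translation can be sketched in a few lines of Python. This is only the generic procedure, using random translations rather than the pattern-search optimization of grid position and orientation described above; the function names and the toy line-segment data are illustrative.

      import numpy as np

      def box_count(points, box_size, offset):
          # number of occupied boxes for one grid placement (offset has one entry per axis)
          idx = np.floor((points - offset) / box_size).astype(int)
          return len(set(map(tuple, idx)))

      def min_count(points, box_size, n_offsets=20, seed=0):
          # reduce quantization error by taking the minimum count over random grid translations
          rng = np.random.default_rng(seed)
          return min(box_count(points, box_size, rng.uniform(0, box_size, points.shape[1]))
                     for _ in range(n_offsets))

      def box_dimension(points, box_sizes):
          # FD estimate: negative slope of log(count) against log(box size)
          counts = [min_count(points, s) for s in box_sizes]
          slope, _ = np.polyfit(np.log(box_sizes), np.log(counts), 1)
          return -slope

      # toy check: points densely sampled along a 3D line segment should give FD close to 1
      pts = np.column_stack([np.linspace(0, 1, 2000), np.zeros(2000), np.zeros(2000)])
      print(box_dimension(pts, box_sizes=[0.2, 0.1, 0.05, 0.025]))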

  8. Primary Standardization of 152Eu by 4πβ(LS) – γ (Nal) coincidence counting and CIEMAT-NIST method

    NASA Astrophysics Data System (ADS)

    Ruzzarin, A.; da Cruz, P. A. L.; Ferreira Filho, A. L.; Iwahara, A.

    2018-03-01

    The 4πβ-γ coincidence counting and CIEMAT/NIST liquid scintillation methods were used in the standardization of a solution of 152Eu. In the CIEMAT/NIST method, measurements were performed in a Wallac 1414 liquid scintillation counter. In the 4πβ-γ coincidence counting, the solution was standardized using a coincidence method with beta-efficiency extrapolation. A simple 4πβ-γ coincidence system was used, with an acrylic scintillation cell coupled to two coincident photomultipliers at 180° to each other and a NaI(Tl) detector. The activity concentrations obtained were 156.934 ± 0.722 and 157.403 ± 0.113 kBq/g for the CIEMAT/NIST and 4πβ-γ coincidence counting methods, respectively.
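
    For orientation, the idealized relations behind coincidence counting for a simple β-γ decay scheme are sketched below in LaTeX. The 152Eu decay scheme is considerably more complex, which is why the published procedure relies on the beta-efficiency extrapolation rather than these raw formulas directly; the proportionality constant k below is a generic fit parameter, not a value from the paper.

      % Observed rates for source activity N_0 with beta and gamma channel efficiencies:
      N_\beta = N_0\,\varepsilon_\beta, \qquad
      N_\gamma = N_0\,\varepsilon_\gamma, \qquad
      N_c = N_0\,\varepsilon_\beta\,\varepsilon_\gamma
      % which give the activity and the beta efficiency without knowing either efficiency a priori:
      N_0 = \frac{N_\beta\,N_\gamma}{N_c}, \qquad
      \varepsilon_\beta = \frac{N_c}{N_\gamma}
      % Beta-efficiency extrapolation: the measured ratio is fitted against the
      % inefficiency parameter and extrapolated to \varepsilon_\beta \to 1:
      \frac{N_\beta\,N_\gamma}{N_c} \approx N_0\left[1 + k\,\frac{1-\varepsilon_\beta}{\varepsilon_\beta}\right]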

  9. A new NIST primary standardization of 18F.

    PubMed

    Fitzgerald, R; Zimmerman, B E; Bergeron, D E; Cessna, J C; Pibida, L; Moreira, D S

    2014-02-01

    A new primary standardization of (18)F by NIST is reported. The standard is based on live-timed beta-gamma anticoincidence counting with confirmatory measurements by three other methods: (i) liquid scintillation (LS) counting using CIEMAT/NIST (3)H efficiency tracing; (ii) triple-to-double coincidence ratio (TDCR) counting; and (iii) NaI integral counting and HPGe γ-ray spectrometry. The results are reported as calibration factors for NIST-maintained ionization chambers (including some "dose calibrators"). The LS-based methods reveal evidence for cocktail instability for one LS cocktail. Using an ionization chamber to link this work with previous NIST results, the new value differs from the previous reports by about 4%, but appears to be in good agreement with the key comparison reference value (KCRV) of 2005. © 2013 Published by Elsevier Ltd.

  10. A Computer Program for the Prediction of Solid Propellant Rocket Motor Performance. Volume 3

    DTIC Science & Technology

    1975-07-01

    The program accounts for the following losses: two-dimensional/two-phase flow (coupled), nozzle erosion, finite-rate kinetics, boundary layer, combustion efficiency, and submergence, with care taken that no loss is counted twice. The submergence efficiency is based on an empirical correlation.

  11. Calibration of the Accuscan II In Vivo System for I-125 Thyroid Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ovard R. Perry; David L. Georgeson

    2011-07-01

    This report describes the March 2011 calibration of the Accuscan II HPGe in vivo system for I-125 thyroid counting. The source used for the calibration was a DOE-manufactured Am-241/Eu-152 source (BEA Am-241/Eu-152 RMC II-1) contained in a 22 ml vial, with energies from 26 keV to 344 keV. The center of the detector housing was positioned 64 inches from the vault floor. This position places the approximate center line of the detector housing at the center line of the source in the phantom thyroid tube. The energy and efficiency calibrations were performed using an RMC II phantom (Appendix J). Performance testing was conducted using source BEA Am-241/Eu-152 RMC II-1, and validation testing was performed using an I-125 source in a 30 ml vial (I-125 BEA Thyroid 002) and an ANSI N44.3 phantom (Appendix I). This report includes an overview introduction and records for the energy/FWHM and efficiency calibration, including performance verification and validation counting. The Accuscan II system was successfully calibrated for counting the thyroid for I-125 and verified in accordance with ANSI/HPS N13.30-1996 criteria.

  12. A Statistical Treatment of Bioassay Pour Fractions

    NASA Technical Reports Server (NTRS)

    Barengoltz, Jack; Hughes, David W.

    2014-01-01

    The binomial probability distribution is used to treat the statistics of a microbiological sample that is split into two parts, with only one part evaluated for spore count. One wishes to estimate the total number of spores in the sample based on the counts obtained from the part that is evaluated (pour fraction). Formally, the binomial distribution is recharacterized as a function of the observed counts (successes), with the total number (trials) an unknown. The pour fraction is the probability of success per spore (trial). This distribution must be renormalized in terms of the total number. Finally, the new renormalized distribution is integrated and mathematically inverted to yield the maximum estimate of the total number as a function of a desired level of confidence ( P(
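
    As a rough illustration of the idea (not the paper's exact renormalization and inversion), the sketch below treats the observed count k as a binomial outcome with success probability equal to the pour fraction and searches for the largest total spore count that remains consistent with the observation at a chosen confidence level. The function names and the example numbers are hypothetical.

      from math import comb

      def binom_cdf(k, n, p):
          # P(X <= k) for X ~ Binomial(n, p), computed directly
          return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

      def upper_total(k, pour_fraction, confidence=0.95):
          # largest total N for which observing <= k counts still has
          # probability at least (1 - confidence): a conservative upper estimate
          n = k
          while binom_cdf(k, n + 1, pour_fraction) >= 1 - confidence:
              n += 1
          return n

      # hypothetical example: 5 spores counted in a 25% pour fraction
      print(upper_total(5, 0.25, confidence=0.95))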

  13. Estimation of Confidence Intervals for Multiplication and Efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verbeke, J

    2009-07-17

    Helium-3 tubes are used to detect thermal neutrons by charge collection using the 3He(n,p) reaction. By analyzing the time sequence of neutrons detected by these tubes, one can determine important features about the constitution of a measured object: some materials such as Cf-252 emit several neutrons simultaneously, while others such as uranium and plutonium isotopes multiply the number of neutrons to form bursts. This translates into unmistakable signatures. To determine the type of materials measured, one compares the measured count distribution with the one generated by a theoretical fission chain model. When the neutron background is negligible, the theoretical count distributions can be completely characterized by a pair of parameters, the multiplication M and the detection efficiency ε. While the optimal pair of M and ε can be determined by existing codes such as BigFit, the uncertainty on these parameters has not yet been fully studied. The purpose of this work is to precisely compute the uncertainties on the parameters M and ε, given the uncertainties in the count distribution. By considering different lengths of time-tagged data, we will determine how the uncertainties on M and ε vary with the different count distributions.

  14. Exploiting Thread Parallelism for Ocean Modeling on Cray XC Supercomputers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarje, Abhinav; Jacobsen, Douglas W.; Williams, Samuel W.

    The incorporation of increasing core counts in modern processors used to build state-of-the-art supercomputers is driving application development towards the exploitation of thread parallelism, in addition to distributed memory parallelism, with the goal of delivering efficient high-performance codes. In this work we describe the exploitation of threading and our experience applying it to a real-world ocean modeling application code, MPAS-Ocean. We present detailed performance analysis and comparisons of various approaches and configurations for threading on the Cray XC series supercomputers.

  15. Prediction of noise field of a propfan at angle of attack

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    1991-01-01

    A method for predicting the noise field of a propfan operating at an angle of attack to the oncoming flow is presented. The method takes advantage of the high-blade-count of the advanced propeller designs to provide an accurate and efficient formula for predicting their noise field. The formula, which is written in terms of the Airy function and its derivative, provides a very attractive alternative to the use of numerical integration. A preliminary comparison shows rather favorable agreement between the predictions from the present method and the experimental data.

  16. On-chip integrated functional near infra-red spectroscopy (fNIRS) photoreceiver for portable brain imaging

    NASA Astrophysics Data System (ADS)

    Kamrani, Ehsan

    Optical brain imaging using functional near infra-red spectroscopy (fNIRS) offers a direct and noninvasive tool for monitoring of blood oxygenation. fNIRS is a noninvasive, safe, minimally intrusive, and high temporal-resolution technique for real-time and long-term brain imaging. It allows detecting both fast neuronal and slow hemodynamic signals. Besides the significant advantages of fNIRS systems, they still suffer from a few drawbacks, including low spatial resolution, moderately high noise levels, and high sensitivity to movement. In order to overcome the limitations of currently available non-portable fNIRS systems, we have introduced a new low-power, miniaturized on-chip photodetector front-end intended for portable fNIRS systems. It includes a silicon avalanche photodiode (SiAPD), transimpedance amplifier (TIA), and quench-reset circuitry implemented using standard CMOS technologies to operate in both linear and Geiger modes, so it can be applied both to continuous-wave fNIRS (CW-fNIRS) and to single-photon counting applications. Several SiAPDs have been implemented in novel structures and shapes (rectangular, octagonal, dual, nested, netted, quadratic, and hexadecagonal) using different premature edge breakdown prevention techniques. The main characteristics of the SiAPDs were validated, and the impact of each parameter was studied with device simulators (TCAD, COMSOL, etc.) based on the simulation and measurement results. The proposed techniques yield SiAPDs with high avalanche gain (up to 119), low breakdown voltage (around 12 V), and high photon detection efficiency (up to 72% in the NIR region), in addition to a low dark count rate (down to 30 Hz at 1 V excess bias voltage). Three new high gain-bandwidth product (GBW), low-noise TIAs are introduced and implemented based on a distributed-gain concept, logarithmic amplification, and automatic noise rejection, and have been applied in the linear mode of operation. The implemented TIAs offer a power consumption of around 0.4 mW, a transimpedance gain of 169 dBΩ, and input/output current and voltage noise in the fA/pV range, together with the ability to tune the gain, bandwidth, and power consumption over a wide range. The implemented mixed quench-reset circuit (MQC) and controllable MQC (CMQC) front-ends offer a quench time of 10 ns and a maximum power consumption of 0.4 mW, with controllable hold-off and reset times. The on-chip integration of SiAPDs with the TIA and photon-counting circuitry has been demonstrated, showing improvement of the photodetection efficiency, especially with regard to the sensitivity, power consumption, and signal-to-noise ratio (SNR) characteristics.

  17. A high dynamic range pulse counting detection system for mass spectrometry.

    PubMed

    Collings, Bruce A; Dima, Martian D; Ivosev, Gordana; Zhong, Feng

    2014-01-30

    A high dynamic range pulse counting system has been developed that demonstrates an ability to operate at up to 2e8 counts per second (cps) on a triple quadrupole mass spectrometer. Previous pulse counting detection systems have typically been limited to about 1e7 cps at the upper end of their dynamic range. Modifications to the detection electronics and dead time correction algorithm are described in this paper. A high gain transimpedance amplifier is employed that allows a multi-channel electron multiplier to be operated at a significantly lower bias potential than in previous pulse counting systems. The system utilises a high-energy conversion dynode, a multi-channel electron multiplier, a high gain transimpedance amplifier, non-paralysing detection electronics and a modified dead time correction algorithm. Modification of the dead time correction algorithm is necessary due to a characteristic of the pulse counting electronics. A pulse counting detection system with the capability to count at ion arrival rates of up to 2e8 cps is described. This is shown to provide a linear dynamic range of nearly five orders of magnitude for a sample of alprazolam with concentrations ranging from 0.0006970 ng/mL to 3333 ng/mL while monitoring the m/z 309.1 → m/z 205.2 transition. This represents an upward extension of the detector's linear dynamic range of about two orders of magnitude. A new high dynamic range pulse counting system has been developed demonstrating the ability to operate at up to 2e8 cps on a triple quadrupole mass spectrometer. This provides an upward extension of the detector's linear dynamic range by about two orders of magnitude over previous pulse counting systems. Copyright © 2013 John Wiley & Sons, Ltd.
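
    The abstract does not give the modified correction itself; as a point of reference, the standard non-paralysing dead-time correction that such algorithms build on can be sketched as follows (the parameter values are illustrative only, not those of the described instrument).

      def correct_nonparalyzable(measured_cps, dead_time_s):
          # textbook non-paralysable correction: true rate n = m / (1 - m * tau)
          loss = measured_cps * dead_time_s
          if loss >= 1.0:
              raise ValueError("measured rate saturates the counter for this dead time")
          return measured_cps / (1.0 - loss)

      # illustrative numbers: 5e7 cps measured with a 5 ns effective dead time
      print(f"{correct_nonparalyzable(5e7, 5e-9):.3e} cps true arrival rate")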

  18. Optimally achieving milk bulk tank somatic cell count thresholds.

    PubMed

    Troendle, Jason A; Tauer, Loren W; Gröhn, Yrjo T

    2017-01-01

    High somatic cell count in milk leads to reduced shelf life in fluid milk and lower processed yields in manufactured dairy products. As a result, farmers are often penalized for high bulk tank somatic cell count or paid a premium for low bulk tank somatic cell count. Many countries also require all milk from a farm to be lower than a specified regulated somatic cell count. Thus, farms often cull cows that have high somatic cell count to meet somatic cell count thresholds. Rather than naïvely cull the highest somatic cell count cows, a mathematical programming model was developed that determines the cows to be culled from the herd by maximizing the net present value of the herd, subject to meeting any specified bulk tank somatic cell count level. The model was applied to test-day cows on 2 New York State dairy farms. Results showed that the net present value of the herd was increased by using the model to meet the somatic cell count restriction compared with naïvely culling the highest somatic cell count cows. Implementation of the model would be straightforward in dairy management decision software. Copyright © 2017 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
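
    A toy version of the selection problem reads as follows; it is only a brute-force sketch with made-up cow data and a hypothetical milk-weighted bulk-tank constraint, not the published mathematical programming model.

      from itertools import combinations

      # hypothetical herd: (cow id, net present value in $, SCC in 1,000 cells/mL, daily milk in kg)
      herd = [("A", 1800, 900, 30), ("B", 2500, 150, 32), ("C", 1200, 1400, 25),
              ("D", 2100, 600, 28), ("E", 900, 2000, 22)]
      SCC_LIMIT = 400  # bulk-tank threshold on the milk-weighted average SCC

      def bulk_tank_scc(cows):
          total_milk = sum(milk for _, _, _, milk in cows)
          return sum(scc * milk for _, _, scc, milk in cows) / total_milk

      # keep the subset of cows that maximizes retained NPV while meeting the SCC limit
      best = None
      for r in range(1, len(herd) + 1):
          for keep in combinations(herd, r):
              if bulk_tank_scc(keep) <= SCC_LIMIT:
                  npv = sum(value for _, value, _, _ in keep)
                  if best is None or npv > best[0]:
                      best = (npv, keep)

      print("retain:", [cow_id for cow_id, *_ in best[1]], "- herd NPV:", best[0])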

  19. PubChem promiscuity: a web resource for gathering compound promiscuity data from PubChem.

    PubMed

    Canny, Stephanie A; Cruz, Yasel; Southern, Mark R; Griffin, Patrick R

    2012-01-01

    Promiscuity counts allow for a better understanding of a compound's assay activity profile and drug potential. Although PubChem contains a vast amount of compound and assay data, it currently does not have a convenient or efficient method to obtain in-depth promiscuity counts for compounds. PubChem promiscuity fills this gap. It is a Java servlet that uses NCBI Entrez (eUtils) web services to interact with PubChem and provide promiscuity counts in a variety of categories along with compound descriptors, including PAINS-based functional group detection. Availability: http://chemutils.florida.scripps.edu/pcpromiscuity Contact: southern@scripps.edu

  20. PubChem promiscuity: a web resource for gathering compound promiscuity data from PubChem

    PubMed Central

    Canny, Stephanie A.; Cruz, Yasel; Southern, Mark R.; Griffin, Patrick R.

    2012-01-01

    Summary: Promiscuity counts allow for a better understanding of a compound's assay activity profile and drug potential. Although PubChem contains a vast amount of compound and assay data, it currently does not have a convenient or efficient method to obtain in-depth promiscuity counts for compounds. PubChem promiscuity fills this gap. It is a Java servlet that uses NCBI Entrez (eUtils) web services to interact with PubChem and provide promiscuity counts in a variety of categories along with compound descriptors, including PAINS-based functional group detection. Availability: http://chemutils.florida.scripps.edu/pcpromiscuity Contact: southern@scripps.edu PMID:22084255

  1. Temporal binning of time-correlated single photon counting data improves exponential decay fits and imaging speed

    PubMed Central

    Walsh, Alex J.; Sharick, Joe T.; Skala, Melissa C.; Beier, Hope T.

    2016-01-01

    Time-correlated single photon counting (TCSPC) enables acquisition of fluorescence lifetime decays with high temporal resolution within the fluorescence decay. However, many thousands of photons per pixel are required for accurate lifetime decay curve representation, instrument response deconvolution, and lifetime estimation, particularly for two-component lifetimes. TCSPC imaging speed is inherently limited due to the single photon per laser pulse nature and low fluorescence event efficiencies (<10%) required to reduce bias towards short lifetimes. Here, simulated fluorescence lifetime decays are analyzed by SPCImage and SLIM Curve software to determine the limiting lifetime parameters and photon requirements of fluorescence lifetime decays that can be accurately fit. Data analysis techniques to improve fitting accuracy for low photon count data were evaluated. Temporal binning of the decays from 256 time bins to 42 time bins significantly (p<0.0001) improved fit accuracy in SPCImage and enabled accurate fits with low photon counts (as low as 700 photons/decay), a 6-fold reduction in required photons and therefore improvement in imaging speed. Additionally, reducing the number of free parameters in the fitting algorithm by fixing the lifetimes to known values significantly reduced the lifetime component error from 27.3% to 3.2% in SPCImage (p<0.0001) and from 50.6% to 4.2% in SLIM Curve (p<0.0001). Analysis of nicotinamide adenine dinucleotide–lactate dehydrogenase (NADH-LDH) solutions confirmed temporal binning of TCSPC data and a reduced number of free parameters improves exponential decay fit accuracy in SPCImage. Altogether, temporal binning (in SPCImage) and reduced free parameters are data analysis techniques that enable accurate lifetime estimation from low photon count data and enable TCSPC imaging speeds up to 6x and 300x faster, respectively, than traditional TCSPC analysis. PMID:27446663
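
    The binning step itself is simple to illustrate. The sketch below (illustrative parameters, no instrument response function, and not the SPCImage fitting pipeline) collapses a simulated 256-bin mono-exponential decay into 42 coarser bins by summing groups of six channels.

      import numpy as np

      def rebin_decay(counts, factor):
          # sum adjacent time bins; trailing bins that do not fill a group are dropped
          counts = np.asarray(counts)
          n = (len(counts) // factor) * factor
          return counts[:n].reshape(-1, factor).sum(axis=1)

      rng = np.random.default_rng(1)
      t = np.linspace(0, 10, 256)                # ns, hypothetical 10 ns measurement window
      expected = 40 * np.exp(-t / 2.5)           # single lifetime component, tau = 2.5 ns
      decay = rng.poisson(expected)              # a few thousand photons in total
      binned = rebin_decay(decay, 6)             # 256 bins -> 42 bins (last 4 channels dropped)
      print(len(binned), decay.sum(), binned.sum())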

  2. Demolition of a hospital building by controlled explosion: the impact on filamentous fungal load in internal and external air.

    PubMed

    Bouza, E; Peláez, T; Pérez-Molina, J; Marín, M; Alcalá, L; Padilla, B; Muñoz, P; Adán, P; Bové, B; Bueno, M J; Grande, F; Puente, D; Rodríguez, M P; Rodríguez-Créixems, M; Vigil, D; Cuevas, O

    2002-12-01

    The demolition of a maternity building at our institution provided us with the opportunity to study the load of filamentous fungi in the air. External (nearby streets) and internal (within the hospital buildings) air was sampled with an automatic volumetric machine (MAS-100 Air Samplair) at least daily during the week before the demolition, at 10, 30, 60, 90, 120, 180, 240, 420, 540 and 660 min post-demolition, daily during the week after the demolition and weekly during weeks 2, 3 and 4 after demolition. Samples were duplicated to analyse reproducibility. Three hundred and forty samples were obtained: 115 external air, 69 'non-protected' internal air and 156 protected internal air [high efficiency particulate air (HEPA) filtered air under positive pressure]. A significant increase in the colony count of filamentous fungi occurred after the demolition. Median colony counts of external air on demolition day were significantly higher than from internal air (70.2 cfu/m(3) vs 35.8 cfu/m(3)) (P < 0.001). Mechanical demolition on day +4 also produced a significant difference between external and internal air (74.5 cfu/m(3) vs 41.7 cfu/m(3)). The counts returned to baseline levels on day +11. Most areas with a protected air supply yielded no colonies before demolition day and remained negative on demolition day. The reproducibility of the count method was good (intra-assay variance: 2.4 cfu/m(3)). No episodes of invasive filamentous mycosis were detected during the three months following the demolition. Demolition work was associated with a significant increase in the fungal colony counts of hospital external and non-protected internal air. Effective protective measures may be taken to avoid the emergence of clinical infections. Copyright 2002 The Hospital Infection Society

  3. An efficient algorithm for accurate computation of the Dirichlet-multinomial log-likelihood function.

    PubMed

    Yu, Peng; Shaw, Chad A

    2014-06-01

    The Dirichlet-multinomial (DMN) distribution is a fundamental model for multicategory count data with overdispersion. This distribution has many uses in bioinformatics, including applications to metagenomics data, transcriptomics and alternative splicing. The DMN distribution reduces to the multinomial distribution when the overdispersion parameter ψ is 0. Unfortunately, numerical computation of the DMN log-likelihood function by conventional methods results in instability in the neighborhood of ψ = 0. An alternative formulation circumvents this instability, but it leads to long runtimes that make it impractical for large count data common in bioinformatics. We have developed a new method for computation of the DMN log-likelihood to solve the instability problem without incurring long runtimes. The new approach is composed of a novel formula and an algorithm to extend its applicability. Our numerical experiments show that this new method improves both the accuracy of log-likelihood evaluation and the runtime by several orders of magnitude, especially in high-count data situations that are common in deep sequencing data. Using real metagenomic data, our method achieves manyfold runtime improvement. Our method increases the feasibility of using the DMN distribution to model many high-throughput problems in bioinformatics. We have included in our work an R package giving access to this method and a vignette applying this approach to metagenomic data. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
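
    For reference, the direct log-gamma form of the DMN log-likelihood (the conventional evaluation whose stability and speed the paper improves on) can be written in a few lines; the function and parameter names below are generic, not those of the paper's R package.

      from math import lgamma

      def dmn_loglik(counts, alphas):
          # log P(x | alpha) for the Dirichlet-multinomial, via log-gamma functions:
          # lgamma(N+1) - sum lgamma(x_k+1) + lgamma(A) - lgamma(N+A)
          #   + sum [lgamma(x_k + a_k) - lgamma(a_k)],  with N = sum x_k, A = sum a_k
          N, A = sum(counts), sum(alphas)
          ll = lgamma(N + 1) + lgamma(A) - lgamma(N + A)
          for x, a in zip(counts, alphas):
              ll += lgamma(x + a) - lgamma(a) - lgamma(x + 1)
          return ll

      print(dmn_loglik([12, 3, 0, 7], [2.0, 1.0, 0.5, 1.5]))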

  4. Dynamic models for estimating the effect of HAART on CD4 in observational studies: Application to the Aquitaine Cohort and the Swiss HIV Cohort Study.

    PubMed

    Prague, Mélanie; Commenges, Daniel; Gran, Jon Michael; Ledergerber, Bruno; Young, Jim; Furrer, Hansjakob; Thiébaut, Rodolphe

    2017-03-01

    Highly active antiretroviral therapy (HAART) has proved efficient in increasing CD4 counts in many randomized clinical trials. Because randomized trials have some limitations (e.g., short duration, highly selected subjects), it is interesting to assess the effect of treatments using observational studies. This is challenging because treatment is started preferentially in subjects with severe conditions. This general problem has been treated using Marginal Structural Models (MSM) relying on the counterfactual formulation. Another approach to causality is based on dynamical models. We present three discrete-time dynamic models based on linear increments models (LIM): the first based on one difference equation for CD4 counts, the second with an equilibrium point, and the third based on a system of two difference equations, which allows jointly modeling CD4 counts and viral load. We also consider continuous-time models based on ordinary differential equations with non-linear mixed effects (ODE-NLME). These mechanistic models allow incorporating biological knowledge when available, which leads to increased statistical evidence for detecting treatment effects. Because inference in ODE-NLME is numerically challenging and requires specific methods and software, LIM are a valuable intermediate option in terms of consistency, precision, and complexity. We compare the different approaches in simulation and in an illustration on the ANRS CO3 Aquitaine Cohort and the Swiss HIV Cohort Study. © 2016, The International Biometric Society.

  5. Quantum dots-based double imaging combined with organic dye imaging to establish an automatic computerized method for cancer Ki67 measurement.

    PubMed

    Wang, Lin-Wei; Qu, Ai-Ping; Liu, Wen-Lou; Chen, Jia-Mei; Yuan, Jing-Ping; Wu, Han; Li, Yan; Liu, Juan

    2016-02-03

    As a widely used proliferative marker, Ki67 has important impacts on cancer prognosis, especially for breast cancer (BC). However, variations in analytical practice make it difficult for pathologists to manually measure the Ki67 index. This study aimed to establish quantum dots (QDs)-based double imaging of nuclear Ki67 as a red signal by QDs-655 and cytoplasmic cytokeratin (CK) as a yellow signal by QDs-585, combined with organic dye imaging of the cell nucleus as a blue signal by 4',6-diamidino-2-phenylindole (DAPI), and to develop a computer-aided automatic method for Ki67 index measurement. The newly developed automatic computerized Ki67 measurement could efficiently recognize and count Ki67-positive cancer cell nuclei with red signals and cancer cell nuclei with blue signals within cancer cell cytoplasm with yellow signals. Comparisons of the computerized Ki67 index, visual Ki67 index, and marked Ki67 index for 30 patients (90 images) with Ki67 ≤ 10% (low grade), 10% < Ki67 < 50% (moderate grade), and Ki67 ≥ 50% (high grade) showed that computerized Ki67 counting is better than visual Ki67 counting, especially for the Ki67 low and moderate grades. Based on QDs-based double imaging and organic dye imaging of BC tissues, this study successfully developed an automatic computerized Ki67 counting method to measure the Ki67 index.

  6. Dark-count-less photon-counting x-ray computed tomography system using a YAP-MPPC detector

    NASA Astrophysics Data System (ADS)

    Sato, Eiichi; Sato, Yuich; Abudurexiti, Abulajiang; Hagiwara, Osahiko; Matsukiyo, Hiroshi; Osawa, Akihiro; Enomoto, Toshiyuki; Watanabe, Manabu; Kusachi, Shinya; Sato, Shigehiro; Ogawa, Akira; Onagawa, Jun

    2012-10-01

    A highly sensitive X-ray computed tomography (CT) system is useful for decreasing the absorbed dose to patients, and a dark-count-less photon-counting CT system was developed. X-ray photons are detected using a YAP(Ce) [cerium-doped yttrium aluminum perovskite] single-crystal scintillator and an MPPC (multipixel photon counter). Photocurrents are amplified by a high-speed current-voltage amplifier, and smoothed event pulses from an integrator are sent to a high-speed comparator. Logical pulses produced by the comparator are then counted by a counter card. Tomography is accomplished by repeated linear scans and rotations of the object, and projection curves of the object are obtained by the linear scan. The image contrast of gadolinium medium fell slightly with increasing lower-level voltage (Vl) of the comparator. The dark count rate was 0 cps, and the count rate for the CT was approximately 250 kcps.

  7. Evaluation of common cleaning and disinfection programmes in battery cage and on-floor layer houses in France.

    PubMed

    Huneau-Salaün, A; Michel, V; Balaine, L; Petetin, I; Eono, F; Ecobichon, F; Bouquin, S Le

    2010-04-01

    1. The aim in this study was to evaluate cleaning and disinfection programmes in battery cage and on-floor layer houses in France. 2. Cleaning and disinfection efficiency was assessed by a visual evaluation of cleaning and a bacteriological monitoring of surface contamination from counts of thermotolerant streptococci on contact agar plates. 3. In battery cage houses, dropping belts, manure conveyors, and house floors remained highly contaminated due to poor cleaning in half of the buildings examined. 4. In on-floor houses, a high standard of cleaning was achieved but errors in the planning of cleaning and disinfection operations sometimes led to a high residual contamination of nest boxes and egg sorting tables.

  8. Compact Efficient Lidar Receiver for Measuring Atmospheric Aerosols

    NASA Technical Reports Server (NTRS)

    Gili, Christopher; De Young, Russell

    2006-01-01

    A small, lightweight, and efficient aerosol lidar receiver was constructed and tested. Weight and space savings were realized by using rigid optic tubes and mounting cubes to package the steering optics and detectors in a compact assembly. The receiver had a 1064 nm channel using an APD detector. The 532 nm channel was split (90/10) into an analog channel (90%) and a photon counting channel (10%). The efficiency of the 1064 nm channel with optical filter was 44.0%. The efficiency of the analog 532 nm channel was 61.4% with the optical filter, and the efficiency of the 532 nm photon counting channel was 7.6% with the optical filter. The results of the atmospheric tests show that the detectors were able to consistently return accurate results. The lidar receiver was able to detect distinct cloud layers, and the lidar returns also agreed across the different detectors. The use of a lightweight fiber-coupled telescope reduced weight and allowed great latitude in detector assembly positioning due to the flexibility enabled by the use of fiber optics. The receiver is now ready to be deployed for aircraft or ground-based aerosol lidar measurements.

  9. Free-running InGaAs single photon detector with 1 dark count per second at 10% efficiency

    NASA Astrophysics Data System (ADS)

    Korzh, B.; Walenta, N.; Lunghi, T.; Gisin, N.; Zbinden, H.

    2014-02-01

    We present a free-running single photon detector for telecom wavelengths based on a negative feedback avalanche photodiode (NFAD). A dark count rate as low as 1 cps was obtained at a detection efficiency of 10%, with an afterpulse probability of 2.2% for 20 μs of deadtime. This was achieved by using an active hold-off circuit and cooling the NFAD with a free-piston stirling cooler down to temperatures of -110 °C. We integrated two detectors into a practical, 625 MHz clocked quantum key distribution system. Stable, real-time key distribution in the presence of 30 dB channel loss was possible, yielding a secret key rate of 350 bps.

  10. Analysis of 161Tb by radiochemical separation and liquid scintillation counting

    DOE PAGES

    Jiang, J.; Davies, A.; Arrigo, L.; ...

    2015-12-05

    The determination of 161Tb activity is problematic due to its very low fission yield, short half-life, and the complexity of its gamma spectrum. At AWE, a radiochemically purified 161Tb solution was measured on a PerkinElmer 1220 Quantulus liquid scintillation spectrometer. Since there was no certified 161Tb standard solution available commercially, the counting efficiency was determined by the CIEMAT/NIST efficiency tracing method. The method was validated during a recent inter-laboratory comparison exercise involving the analysis of a uranium sample irradiated with thermal neutrons. The measured 161Tb result was in excellent agreement with the result obtained using gamma spectrometry and the result obtained by Pacific Northwest National Laboratory.

  11. High-Density Liquid-State Machine Circuitry for Time-Series Forecasting.

    PubMed

    Rosselló, Josep L; Alomar, Miquel L; Morro, Antoni; Oliver, Antoni; Canals, Vincent

    2016-08-01

    Spiking neural networks (SNN) are the latest generation of neural networks, which try to mimic the real behavior of biological neurons. Although most research in this area is done through software applications, it is in hardware implementations that the intrinsic parallelism of these computing systems is more efficiently exploited. Liquid state machines (LSM) have arisen as a strategic technique for implementing recurrent designs of SNN with a simple learning methodology. In this work, we show a new low-cost methodology to implement high-density LSM by using Boolean gates. The proposed method is based on the use of probabilistic computing concepts to reduce hardware requirements, thus considerably increasing the neuron count per chip. The result is a highly functional system that is applied to high-speed time series forecasting.

  12. WOMBAT: A Scalable and High-performance Astrophysical Magnetohydrodynamics Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendygral, P. J.; Radcliffe, N.; Kandalla, K.

    2017-02-01

    We present a new code for astrophysical magnetohydrodynamics specifically designed and optimized for high performance and scaling on modern and future supercomputers. We describe a novel hybrid OpenMP/MPI programming model that emerged from a collaboration between Cray, Inc. and the University of Minnesota. This design utilizes MPI-RMA optimized for thread scaling, which allows the code to run extremely efficiently at very high thread counts ideal for the latest generation of multi-core and many-core architectures. Such performance characteristics are needed in the era of “exascale” computing. We describe and demonstrate our high-performance design in detail with the intent that it may be used as a model for other, future astrophysical codes intended for applications demanding exceptional performance.

  13. Patient-dependent count-rate adaptive normalization for PET detector efficiency with delayed-window coincidence events

    NASA Astrophysics Data System (ADS)

    Niu, Xiaofeng; Ye, Hongwei; Xia, Ting; Asma, Evren; Winkler, Mark; Gagnon, Daniel; Wang, Wenli

    2015-07-01

    Quantitative PET imaging is widely used in clinical diagnosis in oncology and neuroimaging. Accurate normalization correction for the efficiency of each line-of-response is essential for accurate quantitative PET image reconstruction. In this paper, we propose a normalization calibration method that uses the delayed-window coincidence events from the scanned phantom or patient. The proposed method could dramatically reduce the ‘ring’ artifacts caused by mismatched system count-rates between the calibration and phantom/patient datasets. Moreover, a modified algorithm for mean detector efficiency estimation is proposed, which could generate crystal efficiency maps with more uniform variance. Both phantom and real patient datasets are used for evaluation. The results show that the proposed method could lead to better uniformity in reconstructed images by removing ring artifacts, and more uniform axial variance profiles, especially around the axial edge slices of the scanner. The proposed method also has the potential benefit of simplifying the normalization calibration procedure, since the calibration can be performed using the on-the-fly acquired delayed-window dataset.

  14. Mechanistic evaluation of the pros and cons of digital RT-LAMP for HIV-1 viral load quantification on a microfluidic device and improved efficiency via a two-step digital protocol.

    PubMed

    Sun, Bing; Shen, Feng; McCalla, Stephanie E; Kreutz, Jason E; Karymov, Mikhail A; Ismagilov, Rustem F

    2013-02-05

    Here we used a SlipChip microfluidic device to evaluate the performance of digital reverse transcription-loop-mediated isothermal amplification (dRT-LAMP) for quantification of HIV viral RNA. Tests are needed for monitoring HIV viral load to control the emergence of drug resistance and to diagnose acute HIV infections. In resource-limited settings, in vitro measurement of HIV viral load in a simple format is especially needed, and single-molecule counting using a digital format could provide a potential solution. We showed here that when one-step dRT-LAMP is used for quantification of HIV RNA, the digital count is lower than expected and is limited by the yield of desired cDNA. We were able to overcome the limitations by developing a microfluidic protocol to manipulate many single molecules in parallel through a two-step digital process. In the first step we compartmentalize the individual RNA molecules (based on Poisson statistics) and perform reverse transcription on each RNA molecule independently to produce DNA. In the second step, we perform the LAMP amplification on all individual DNA molecules in parallel. Using this new protocol, we increased the absolute efficiency (the ratio between the concentration calculated from the actual count and the expected concentration) of dRT-LAMP 10-fold, from ∼2% to ∼23%, by (i) using a more efficient reverse transcriptase, (ii) introducing RNase H to break up the DNA:RNA hybrid, and (iii) adding only the BIP primer during the RT step. We also used this two-step method to quantify HIV RNA purified from four patient samples and found that in some cases, the quantification results were highly sensitive to the sequence of the patient's HIV RNA. We learned the following three lessons from this work: (i) digital amplification technologies, including dLAMP and dPCR, may give adequate dilution curves and yet have low efficiency, thereby providing quantification values that underestimate the true concentration. Careful validation is essential before a method is considered to provide absolute quantification; (ii) the sensitivity of dLAMP to the sequence of the target nucleic acid necessitates additional validation with patient samples carrying the full spectrum of mutations; (iii) for multistep digital amplification chemistries, such as a combination of reverse transcription with amplification, microfluidic devices may be used to decouple these steps from one another and to perform them under different, individually optimized conditions for improved efficiency.
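
    The single-molecule counting itself rests on a standard Poisson correction: from the fraction of positive partitions one recovers the mean number of template molecules per partition and hence the concentration, which can then be compared with the expected input to give an absolute efficiency. A minimal sketch follows; the partition counts, volumes, and expected concentration are hypothetical, not values from the paper.

      from math import log

      def digital_concentration(positives, total_partitions, partition_volume_ul):
          # Poisson correction for digital amplification:
          # mean molecules per partition lambda = -ln(1 - p); concentration = lambda / volume
          p = positives / total_partitions
          if p >= 1.0:
              raise ValueError("all partitions positive; concentration cannot be resolved")
          return -log(1.0 - p) / partition_volume_ul   # molecules per microliter

      # hypothetical run: 230 of 1280 wells positive, 5 nL (0.005 uL) wells, 170 molecules/uL loaded
      measured = digital_concentration(230, 1280, 0.005)
      expected = 170.0
      print(f"measured {measured:.1f}/uL, absolute efficiency {100 * measured / expected:.1f}%")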

  15. Chemical vapor deposition of aminopropyl silanes in microfluidic channels for highly efficient microchip capillary electrophoresis-electrospray ionization-mass spectrometry.

    PubMed

    Batz, Nicholas G; Mellors, J Scott; Alarie, Jean Pierre; Ramsey, J Michael

    2014-04-01

    We describe a chemical vapor deposition (CVD) method for the surface modification of glass microfluidic devices designed to perform electrophoretic separations of cationic species. The microfluidic channel surfaces were modified using aminopropyl silane reagents. Coating homogeneity was inferred by precise measurement of the separation efficiency and electroosmotic mobility for multiple microfluidic devices. Devices coated with (3-aminopropyl)di-isopropylethoxysilane (APDIPES) yielded near diffusion-limited separations and exhibited little change in electroosmotic mobility between pH 2.8 and pH 7.5. We further evaluated the temporal stability of both APDIPES and (3-aminopropyl)triethoxysilane (APTES) coatings when stored for a total of 1 week under vacuum at 4 °C or filled with pH 2.8 background electrolyte at room temperature. Measurements of electroosmotic flow (EOF) and separation efficiency during this time confirmed that both coatings were stable under both conditions. Microfluidic devices with a 23 cm long, serpentine electrophoretic separation channel and integrated nanoelectrospray ionization emitter were CVD coated with APDIPES and used for capillary electrophoresis (CE)-electrospray ionization (ESI)-mass spectrometry (MS) of peptides and proteins. Peptide separations were fast and highly efficient, yielding theoretical plate counts over 600,000 and a peak capacity of 64 in less than 90 s. Intact protein separations using these devices yielded Gaussian peak profiles with separation efficiencies between 100,000 and 400,000 theoretical plates.
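
    The separation figures of merit quoted above follow from standard chromatographic relations; in LaTeX, for a Gaussian peak eluting at migration time t_R with half-height width w_{1/2} (or base width w_b), and a separation window from t_1 to t_2 with average peak width \bar{w}, the textbook definitions are shown below (the abstract does not state which exact variants the study used).

      N = 5.54\left(\frac{t_R}{w_{1/2}}\right)^{2} = 16\left(\frac{t_R}{w_b}\right)^{2},
      \qquad
      n_c \approx 1 + \frac{t_2 - t_1}{\bar{w}}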

  16. A loophole-free Bell's inequality experiment

    NASA Astrophysics Data System (ADS)

    Kwiat, Paul G.; Steinberg, Aephraim M.; Chiao, Raymond Y.; Eberhard, Philippe H.

    1994-05-01

    The proof of Nature's nonlocality through Bell-type experiments is a topic of longstanding interest. Nevertheless, no experiments performed thus far have avoided the so-called 'detection loophole,' arising from low detector efficiencies and angular-correlation difficulties. In fact, most, if not all, of the systems employed to date can never close this loophole, even with perfect detectors. In addition, another loophole involving the non-rapid, non-random switching of various parameter settings exists in all past experiments. We discuss a proposal for a potentially loophole-free Bell's inequality experiment. The source of the EPR-correlated pairs consists of two simultaneously-pumped type-2 phase-matched nonlinear crystals and a polarizing beam splitter. The feasibility of such a scheme with current detector technology seems high, and will be discussed. We also present a single-crystal version, motivated by other work presented at this conference. In a separate experiment, we have measured the absolute detection efficiency and time response of four single-photon detectors. The highest observed efficiencies were 70.7 plus or minus 1.9 percent (at 633 nm, with a device from Rockwell International) and 76.4 plus or minus 2.3 percent (at 702 nm, with an EG&G counting module). Possible efficiencies as high as 90 percent were implied. The EG&G devices displayed sub-nanosecond time resolution.

  17. Industrial metabolism of chlorine: a case study of a chlor-alkali industrial chain.

    PubMed

    Han, Feng; Li, Wenfeng; Yu, Fei; Cui, Zhaojie

    2014-05-01

    Substance flow analysis (SFA) is applied to a case study of chlorine metabolism in a chlor-alkali industrial chain. A chain-level SFA model is constructed, and eight indices are proposed to analyze and evaluate the metabolic status of elemental chlorine. The primary objectives of this study are to identify low-efficiency links in production processes and to find ways to improve the operational performance of the industrial chain. Five years of in-depth data collection and analysis revealed that system production efficiency and source efficiency have continued to increase since 2008, when the chain was first formed, at average annual growth rates of 21.01% and 1.01%, respectively. In 2011, 64.15% of the total chlorine input was transformed into final products; as much as 98.50% of the chlorine input was utilized when other by-products were counted. Chlorine loss occurred mostly in the form of chloride ions in wastewater, and the system loss rate was 0.54%. The metabolic efficiency of chlorine in this case was high, and the chain system had minimal impact on the environment. However, from the perspectives of processing depth and economic output, the chlor-alkali industrial chain in this case study still requires expansion.

  18. A loophole-free Bell's inequality experiment

    NASA Technical Reports Server (NTRS)

    Kwiat, Paul G.; Steinberg, Aephraim M.; Chiao, Raymond Y.; Eberhard, Philippe H.

    1994-01-01

    The proof of Nature's nonlocality through Bell-type experiments is a topic of longstanding interest. Nevertheless, no experiments performed thus far have avoided the so-called 'detection loophole,' arising from low detector efficiencies and angular-correlation difficulties. In fact, most, if not all, of the systems employed to date can never close this loophole, even with perfect detectors. In addition, another loophole involving the non-rapid, non-random switching of various parameter settings exists in all past experiments. We discuss a proposal for a potentially loophole-free Bell's inequality experiment. The source of the EPR-correlated pairs consists of two simultaneously-pumped type-2 phase-matched nonlinear crystals and a polarizing beam splitter. The feasibility of such a scheme with current detector technology seems high, and will be discussed. We also present a single-crystal version, motivated by other work presented at this conference. In a separate experiment, we have measured the absolute detection efficiency and time response of four single-photon detectors. The highest observed efficiencies were 70.7 plus or minus 1.9 percent (at 633 nm, with a device from Rockwell International) and 76.4 plus or minus 2.3 percent (at 702 nm, with an EG&G counting module). Possible efficiencies as high as 90 percent were implied. The EG&G devices displayed sub-nanosecond time resolution.

  19. Computing row and column counts for sparse QR and LU factorization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, John R.; Li, Xiaoye S.; Ng, Esmond G.

    2001-01-01

    We present algorithms to determine the number of nonzeros in each row and column of the factors of a sparse matrix, for both the QR factorization and the LU factorization with partial pivoting. The algorithms use only the nonzero structure of the input matrix, and run in time nearly linear in the number of nonzeros in that matrix. They may be used to set up data structures or schedule parallel operations in advance of the numerical factorization. The row and column counts we compute are upper bounds on the actual counts. If the input matrix is strong Hall and there is no coincidental numerical cancellation, the counts are exact for QR factorization and are the tightest bounds possible for LU factorization. These algorithms are based on our earlier work on computing row and column counts for sparse Cholesky factorization, plus an efficient method to compute the column elimination tree of a sparse matrix without explicitly forming the product of the matrix and its transpose.

  20. A compact and high efficiency GAGG well counter for radiocesium concentration measurements

    NASA Astrophysics Data System (ADS)

    Yamamoto, Seiichi; Ogata, Yoshimune

    2014-07-01

    After the Fukushima nuclear disaster, social concern about radiocesium (137Cs and 134Cs) contamination in food increased. However, highly efficient instruments that can measure low level radioactivity are quite expensive and heavy. A compact, lightweight, and reliable radiation detector that can inexpensively monitor low level radiocesium is highly desired. We developed a compact and highly efficient radiocesium detector to detect ~32 keV X-rays from radiocesium instead of high energy gamma photons. A 1-mm thick GAGG scintillator was selected to effectively detect ~32 keV X-rays from 137Cs to reduce the influence of ambient radiation. Four sets of 25 mm×25 mm×1 mm GAGG plates, each of which was optically coupled to a triangular-shaped light guide, were optically coupled to a photomultiplier tube (PMT) to form a square-shaped well counter. Another GAGG plate was directly optically coupled to the PMT to form its bottom detector. The energy resolution of the GAGG well counter was 22.3% FWHM for 122 keV gamma rays and 32% FWHM for ~32 keV X-rays. The counting efficiency for the X-rays from radiocesium (mixture of 137Cs and 134Cs) was 4.5%. In measurements of the low level radiocesium mixture, a photo-peak of ~32 keV X-rays can clearly be distinguished from the background. The minimum detectable activity (MDA) was estimated to be ~100 Bq/kg for 1000 s measurement. The results show that our developed GAGG well counter is promising for the detection of radiocesium in food.

  1. Optimizing microwave photodetection: input-output theory

    NASA Astrophysics Data System (ADS)

    Schöndorf, M.; Govia, L. C. G.; Vavilov, M. G.; McDermott, R.; Wilhelm, F. K.

    2018-04-01

    High fidelity microwave photon counting is an important tool for various areas from background radiation analysis in astronomy to the implementation of circuit quantum electrodynamic architectures for the realization of a scalable quantum information processor. In this work we describe a microwave photon counter coupled to a semi-infinite transmission line. We employ input-output theory to examine a continuously driven transmission line as well as traveling photon wave packets. Using analytic and numerical methods, we calculate the conditions on the system parameters necessary to optimize measurement and achieve high detection efficiency. With this we can derive a general matching condition depending on the different system rates, under which the measurement process is optimal.

  2. Relationship between salivary flow rates and Candida counts in subjects with xerostomia.

    PubMed

    Torres, Sandra R; Peixoto, Camila Bernardo; Caldas, Daniele Manhães; Silva, Eline Barboza; Akiti, Tiyomi; Nucci, Márcio; de Uzeda, Milton

    2002-02-01

    This study evaluated the relationship between salivary flow and Candida colony counts in the saliva of patients with xerostomia. Sialometry and Candida colony-forming unit (CFU) counts were obtained from 112 subjects who reported xerostomia in a questionnaire. Chewing-stimulated whole saliva was collected, streaked on Candida plates, and counted after 72 hours. Species identification was accomplished using standard methods. There was a significant inverse relationship between salivary flow and Candida CFU counts (P =.007) when subjects with high colony counts were analyzed (cutoff point of 400 or greater CFU/mL). In addition, the median sialometry of men was significantly greater than that of women (P =.003), even after controlling for confounding variables such as underlying disease and medications. Sjögren's syndrome was associated with a low salivary flow rate (P =.007). There was no relationship between the median Candida CFU counts and gender or age. There was a high frequency (28%) of mixed colonization. Candida albicans was the most frequent species, followed by C parapsilosis, C tropicalis, and C krusei. In subjects with high Candida CFU counts there was an inverse relationship between salivary flow and Candida CFU counts.

  3. Bias sputtered NbN and superconducting nanowire devices

    NASA Astrophysics Data System (ADS)

    Dane, Andrew E.; McCaughan, Adam N.; Zhu, Di; Zhao, Qingyuan; Kim, Chung-Soo; Calandri, Niccolo; Agarwal, Akshay; Bellei, Francesco; Berggren, Karl K.

    2017-09-01

    Superconducting nanowire single photon detectors (SNSPDs) promise to combine near-unity quantum efficiency with >100 megacounts per second rates, picosecond timing jitter, and sensitivity ranging from x-ray to mid-infrared wavelengths. However, this promise is not yet fulfilled, as superior performance in all metrics is yet to be combined into one device. The highest single-pixel detection efficiency and the widest bias windows for saturated quantum efficiency have been achieved in SNSPDs based on amorphous materials, while the lowest timing jitter and highest counting rates were demonstrated in devices made from polycrystalline materials. Broadly speaking, the amorphous superconductors that have been used to make SNSPDs have higher resistivities and lower critical temperature (Tc) values than typical polycrystalline materials. Here, we demonstrate a method of preparing niobium nitride (NbN) that has lower-than-typical superconducting transition temperature and higher-than-typical resistivity. As we will show, NbN deposited onto unheated SiO2 has a low Tc and high resistivity but is too rough for fabricating unconstricted nanowires, and Tc is too low to yield SNSPDs that can operate well at liquid helium temperatures. By adding a 50 W RF bias to the substrate holder during sputtering, the Tc of the unheated NbN films was increased by up to 73%, and the roughness was substantially reduced. After optimizing the deposition for nitrogen flow rates, we obtained 5 nm thick NbN films with a Tc of 7.8 K and a resistivity of 253 μΩ cm. We used this bias sputtered room temperature NbN to fabricate SNSPDs. Measurements were performed at 2.5 K using 1550 nm light. Photon count rates appeared to saturate at bias currents approaching the critical current, indicating that the device's quantum efficiency was approaching unity. We measured a single-ended timing jitter of 38 ps. The optical coupling to these devices was not optimized; however, integration with front-side optical structures to improve absorption should be straightforward. This material preparation was further used to fabricate nanocryotrons and a large-area imager device, reported elsewhere. The simplicity of the preparation and promising device performance should enable future high-performance devices.

  4. Production of Short-Lived 37K

    NASA Astrophysics Data System (ADS)

    Stephens, Heather; Melconian, Dan; Shidling, Praveen

    2011-03-01

    The purpose of our work during the summer months of 2010 was to produce a beam of 37 K with >= 99% purity and characterize in detail the remaining contaminants. A projectile beam of 38 Ar at 25 and 29 MeV/nucleon from the K500 cyclotron generated the 37 K by reacting with an H2 gas target. The MARS spectrometer was then used to separate the reaction products of interest from the primary beam and other unwanted reaction products. From analysis of our production experiment, we were able to successfully produce 807 counts/nC of 37 K with 99.19% purity at 25 MeV/u and 1756 counts/nC with 98.93% purity at 29 MeV/u. The purity and production rate of this beam are more than adequate for use in determining the half-life of 37 K, the next step to be done by the team in August 2010. This measurement will be accomplished by implanting the activity into a Mylar tape, placing it between two high-efficiency gas counters and counting the number of beta decays as a function of time. It is expected that the half-life will be measured using the 37 K produced from 38 Ar at 29 MeV/u. Funded by DOE and NSF-REU Program.

  5. Production of Short-Lived ^37K

    NASA Astrophysics Data System (ADS)

    Stephens, Heather; Melconian, Dan; Shidling, Praveen

    2010-11-01

    The purpose of our work during the summer months of 2010 was to produce a beam of ^37K with >= 99% purity and characterize in detail the remaining contaminants. A projectile beam of ^38Ar at 25 and 29 MeV/nucleon from the K500 cyclotron generated the ^37K by reacting with an H2 gas target. The MARS spectrometer was then used to separate the reaction products of interest from the primary beam and other unwanted reaction products. From analysis of our production experiment, we were able to successfully produce 807 counts/nC of ^37K with 99.19% purity at 25 MeV/u and 1756 counts/nC with 98.93% purity at 29 MeV/u. The purity and production rate of this beam are more than adequate for use in determining the half-life of ^37K, the next step to be done by the team in August 2010. This measurement will be accomplished by implanting the activity into a Mylar tape, placing it between two high-efficiency gas counters and counting the number of beta decays as a function of time. It is expected that the half-life will be measured using the ^37K produced from ^38Ar at 29 MeV/u.

  6. Production of Short-Lived 37 K

    NASA Astrophysics Data System (ADS)

    Stephens, Heather; Melconian, Dan; Shidling, Praveen

    2011-04-01

    The purpose of our work during the summer months of 2010 was to produce a beam of 37 K with >= 99% purity and characterize in detail the remaining contaminants. A projectile beam of 38Ar at 25 and 29 MeV/nucleon from the K500 cyclotron generated the 37 K by reacting with an H2 gas target. The MARS spectrometer was then used to separate the reaction products of interest from the primary beam and other unwanted reaction products. From analysis of our production experiment, we were able to successfully produce 807 counts/nC of 37 K with 99.19% purity at 25 MeV/u and 1756 counts/nC with 98.93% purity at 29 MeV/u. The purity and production rate of this beam are more than adequate for use in determining the half-life of 37 K, the next step to be done by the team in August 2010. This measurement will be accomplished by implanting the activity into a Mylar tape, placing it between two high-efficiency gas counters and counting the number of beta decays as a function of time. It is expected that the half-life will be measured using the 37 K produced from 38Ar at 29 MeV/u. Funded by DOE and NSF-REU Program.

  7. Allele-specific copy-number discovery from whole-genome and whole-exome sequencing

    PubMed Central

    Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.

    2015-01-01

    Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol is freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151
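
    The core idea, combining total read depth (which constrains the total copy number) with allele-specific read counts (which constrain how that total splits between alleles), can be illustrated with a toy calculation. The sketch below is only a conceptual illustration under simplified assumptions, not the AS-GENSENG algorithm, and all input values are hypothetical.

      # Illustrative sketch (not the AS-GENSENG algorithm): given the total read
      # depth of a region relative to a diploid baseline and the reference/alternate
      # allele read counts at heterozygous sites inside it, pick the allele-specific
      # copy-number pair (cA, cB) that best explains both signals.
      # All inputs below are hypothetical example values.

      def ascn_estimate(depth_ratio, ref_reads, alt_reads, max_total_cn=6):
          best, best_score = None, float("inf")
          baf = alt_reads / float(ref_reads + alt_reads)  # B-allele fraction
          for total in range(0, max_total_cn + 1):
              for c_b in range(0, total + 1):
                  c_a = total - c_b
                  expected_ratio = total / 2.0             # depth relative to CN = 2
                  expected_baf = c_b / total if total > 0 else 0.5
                  score = (depth_ratio - expected_ratio) ** 2 + (baf - expected_baf) ** 2
                  if score < best_score:
                      best, best_score = (c_a, c_b), score
          return best

      # Hypothetical region: ~1.5x diploid depth, allele reads 90 ref / 210 alt
      print(ascn_estimate(1.5, 90, 210))   # -> (1, 2): one copy of allele A, two of B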

  8. Gadolinium-loaded Plastic Scintillators for Thermal Neutron Detection using Compensation

    NASA Astrophysics Data System (ADS)

    Dumazert, Jonathan; Coulon, Romain; Hamel, Matthieu; Carrel, Frédérick; Sguerra, Fabien; Normand, Stéphane; Méchin, Laurence; Bertrand, Guillaume H. V.

    2016-06-01

    Plastic scintillator loading with gadolinium-rich organometallic complexes shows high potential for the deployment of efficient and cost-effective neutron detectors. Due to the low-energy photon and electron signature of thermal neutron capture by Gd-155 and Gd-157, an alternative treatment to pulse-shape discrimination has to be proposed in order to display a count rate. This paper discloses the principle of a compensation method applied to a two-scintillator system: a detection scintillator, loaded with a gadolinium organometallic compound so that it acts as a thermal neutron absorber, interacts with photon and fast neutron radiation, while a compensation scintillator without gadolinium loading interacts only with the fast neutron and photon part of the incident radiation. After nonlinear smoothing of the counting signals, a hypothesis test determines whether the count rate remaining after compensation of the background response falls within statistical fluctuations or provides a robust indication of neutron activity. Laboratory samples are tested under both photon and neutron irradiations, allowing the authors to investigate the performance of the overall detection system in terms of sensitivity and detection limits, especially with regard to a commercial He-3 based counter of similar active volume. The study reveals satisfactory figures of merit in terms of sensitivity and directs future investigation toward promising paths.

  9. Revision of the NIST Standard for (223)Ra: New Measurements and Review of 2008 Data.

    PubMed

    Zimmerman, B E; Bergeron, D E; Cessna, J T; Fitzgerald, R; Pibida, L

    2015-01-01

    After discovering a discrepancy in the transfer standard currently being disseminated by the National Institute of Standards and Technology (NIST), we have performed a new primary standardization of the alpha-emitter (223)Ra using Live-timed Anticoincidence Counting (LTAC) and the Triple-to-Double Coincidence Ratio Method (TDCR). Additional confirmatory measurements were made with the CIEMAT-NIST efficiency tracing method (CNET) of liquid scintillation counting, integral γ-ray counting using a NaI(Tl) well counter, and several High Purity Germanium (HPGe) detectors in an attempt to understand the origin of the discrepancy and to provide a correction. The results indicate that a -9.5 % difference exists between activity values obtained using the former transfer standard relative to the new primary standardization. During one of the experiments, a 2 % difference in activity was observed between dilutions of the (223)Ra master solution prepared using the composition used in the original standardization and those prepared using 1 mol·L(-1) HCl. This effect appeared to be dependent on the number of dilutions or the total dilution factor to the master solution, but the magnitude was not reproducible. A new calibration factor ("K-value") has been determined for the NIST Secondary Standard Ionization Chamber (IC "A"), thereby correcting the discrepancy between the primary and secondary standards.

  10. A method for quantitative analysis of standard and high-throughput qPCR expression data based on input sample quantity.

    PubMed

    Adamski, Mateusz G; Gumann, Patryk; Baird, Alison E

    2014-01-01

    Over the past decade, rapid advances have occurred in the understanding of RNA expression and its regulation. Quantitative polymerase chain reactions (qPCR) have become the gold standard for quantifying gene expression. Microfluidic, next-generation, high-throughput qPCR now permits the detection of transcript copy number in thousands of reactions simultaneously, dramatically increasing the sensitivity over standard qPCR. Here we present a gene expression analysis method applicable to both standard and high-throughput qPCR. This technique is adjusted to the input sample quantity (e.g., the number of cells) and is independent of control gene expression. It is efficiency-corrected and, with the use of a universal reference sample (commercial complementary DNA (cDNA)), permits the normalization of results between different batches and between different instruments, regardless of potential differences in transcript amplification efficiency. Modifications of the input quantity method include (1) the achievement of absolute quantification and (2) a non-efficiency-corrected analysis. When compared to other commonly used algorithms, the input quantity method proved to be valid. This method is of particular value for clinical studies of whole blood and circulating leukocytes where cell counts are readily available.
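
    A minimal sketch of what an efficiency-corrected, input-quantity-normalized calculation of this kind can look like is given below. The relation quantity proportional to (1 + E)^(-Cq) is the standard efficiency-corrected form; the authors' exact algorithm is not reproduced here, and all parameter values are hypothetical.

      # Minimal sketch of an efficiency-corrected, input-quantity-normalized qPCR
      # calculation (an assumed standard form; the authors' exact algorithm may differ).
      # quantity ~ (1 + E)^(-Cq); dividing by the cell count removes dependence on
      # the input amount, and dividing by a universal reference cDNA run on the same
      # instrument allows comparison across batches and platforms.

      def relative_expression(cq_sample, cq_reference, efficiency, cell_count):
          """Expression per input cell, normalized to a universal reference sample.

          cq_sample     -- quantification cycle of the sample
          cq_reference  -- quantification cycle of the universal reference cDNA
          efficiency    -- amplification efficiency E of the assay (1.0 = 100%)
          cell_count    -- number of cells (or other input quantity) in the sample
          """
          q_sample = (1.0 + efficiency) ** (-cq_sample)
          q_reference = (1.0 + efficiency) ** (-cq_reference)
          return (q_sample / cell_count) / q_reference

      # Hypothetical values: Cq 24.0 for the sample, 21.0 for the reference,
      # 95% efficiency, 10,000 leukocytes as input.
      print(relative_expression(24.0, 21.0, 0.95, 10_000))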

  11. N-(2-Ethylhexyl)carbazole: A New Fluorophore Highly Suitable as a Monomolecular Liquid Scintillator.

    PubMed

    Montbarbon, Eva; Sguerra, Fabien; Bertrand, Guillaume H V; Magnier, Élodie; Coulon, Romain; Pansu, Robert B; Hamel, Matthieu

    2016-08-16

    The synthesis, photophysical properties, and applications in scintillation counting of N-(2-ethylhexyl)carbazole (EHCz) are reported. This molecule displays all of the required characteristics for an efficient liquid scintillator (emission wavelength, scintillation yield), and can be used without any extra fluorophores. Thus, its scintillation properties are discussed, as well as its fast neutron/gamma discrimination. For the latter application, the material is compared with the traditional liquid scintillator BC-501 A, and other liquid fluorescent molecules classically used as scintillation solvents, such as xylene, pseudocumene (PC), linear alkylbenzenes (LAB), diisopropylnaphthalene (DIN), 1-methylnaphthalene (1-MeNapht), and 4-isopropylbiphenyl (iPrBiph). For the first time, an excimeric form of a molecule has been advantageously used in scintillation counting. A moderate discrimination between fast neutrons and gamma rays was observed in bulk EHCz, with an apparent neutron/gamma discrimination potential half of that of BC-501 A. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. The ONR-602 experiment and investigation of particle precipitation near the equator

    NASA Technical Reports Server (NTRS)

    Miah, M. A.

    1991-01-01

    The global precipitation of radiation belt particles at low altitude was investigated, using the ONR-602 experiment on board U.S. Air Force mission S81-1. A combination of a main telescope, beginning analysis at a few MeV/nucleon, and a monitor system, giving results below 1 MeV/nucleon, was designed for measuring particle phenomena characterized by almost any energy spectrum. The monitor provides an indication of the presence of the particles at low energy, while the main telescope gives detailed flux and composition data for the higher energy events. Results of the instrument performance analysis indicate that, at the equator, the monitor telescope has the peak efficiency for particles of about 90 deg pitch angles. The large opening angle of 75 deg makes it possible to detect omnidirectional flux of quasi-trapped particles. The high-energy cosmic-ray background count is found to be very insignificant. It is demonstrated that the particle counting rates for the low-energy threshold have been almost entirely due to protons.

  13. Space Station Freedom power supply commonality via modular design

    NASA Technical Reports Server (NTRS)

    Krauthamer, S.; Gangal, M. D.; Das, R.

    1990-01-01

    At mature operations, Space Station Freedom will need more than 2000 power supplies to feed housekeeping and user loads. Advanced technology power supplies from 20 to 250 W have been hybridized for terrestrial, aerospace, and industry applications in compact, efficient, reliable, lightweight packages compatible with electromagnetic interference requirements. The use of these hybridized packages as modules, either singly or in parallel, to satisfy the wide range of user power supply needs for all elements of the station is proposed. Proposed characteristics for the power supplies include common mechanical packaging, digital control, self-protection, high efficiency at full and partial loads, synchronization capability to reduce electromagnetic interference, redundancy, and soft-start capability. The inherent reliability is improved compared with conventional discrete component power supplies because the hybrid circuits use high-reliability components such as ceramic capacitors. Reliability is further improved over conventional supplies because the hybrid packages, which may be treated as a single part, reduce the parts count in the power supply.

  14. Ultrafine particle emissions by in-use diesel buses of various generations at low-load regimes

    NASA Astrophysics Data System (ADS)

    Tartakovsky, L.; Baibikov, V.; Comte, P.; Czerwinski, J.; Mayer, A.; Veinblat, M.; Zimmerli, Y.

    2015-04-01

    Ultrafine particles (UFP) are major contributors to air pollution due to their easy gas-like penetration into the human organism, causing adverse health effects. This study analyzes UFP emissions by buses of different technologies (from Euro II to Euro V EEV, Enhanced Environmentally-friendly Vehicle) at low-load regimes. Additionally, the emission-reduction potential of retrofitting with a diesel particle filter (DPF) is demonstrated. A comparison of the measured, engine-out, particle number concentrations (PNC) for buses of different technological generations shows that no substantial reduction of engine-out emissions at low-load operating modes is observed for newer bus generations. Retrofitting the in-use urban and interurban buses of Euro II to Euro IV technologies with the VERT-certified DPF confirmed its high efficiency in reducing UFP emissions. Particle-count filtration efficiency values of the retrofit DPF were found to be extremely high (greater than 99.8%), similar to that of the OEM filter in the Euro V bus.
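
    For reference, the particle-count filtration efficiency quoted above follows the usual definition, one minus the ratio of downstream to upstream particle number concentration; the short sketch below only illustrates that definition with hypothetical concentrations.

      # Particle-count filtration efficiency in the usual sense: the fraction of
      # particles removed between engine-out (upstream) and tailpipe (downstream).
      # The concentrations below are hypothetical illustration values.

      def filtration_efficiency(pnc_upstream, pnc_downstream):
          return 1.0 - pnc_downstream / pnc_upstream

      print(filtration_efficiency(5.0e7, 8.0e4))  # -> 0.9984, i.e. 99.84%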

  15. Atmospheric deposition of 7Be by rain events in central Argentina

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ayub, J. Juri; Velasco, H.; Rizzotto, M.

    2008-08-07

    Beryllium-7 is a natural radionuclide that enters ecosystems through wet and dry deposition and has numerous environmental applications in terrestrial and aquatic ecosystems. Atmospheric wet deposition of 7Be was measured in central Argentina. Rain traps were installed (1 m above ground) and individual rain events were collected. Rain samples were filtered and analyzed by gamma spectrometry. The gamma counting was undertaken using a 40%-efficient p-type coaxial intrinsic high-purity natural germanium crystal built by Princeton Gamma-Tech. The cryostat was made from electroformed high-purity copper using ultralow-background technology. The detector was surrounded by 50 cm of lead bricks to provide shielding against radioactive background. The detector gamma efficiency was determined using a water solution with known amounts of chemical compounds containing the long-lived naturally occurring radioisotopes 176Lu, 138La and 40K. Due to the geometry of the sample and its position close to the detector, the efficiency points from the 176Lu decay had to be corrected for summing effects. The measured samples were 400 ml in size and were counted during one day. The 7Be detection limit for the present measurements was as low as 0.2 Bq/l. Thirty-two rain events were sampled and analyzed (November 2006-May 2007). The measured values show that events corresponding to low rainfall (<20 mm) are characterized by significantly higher activity concentrations (Bq/l). The activity concentration of individual events varied from 0.8 to 3.5 Bq/l, while precipitation varied between 4 and 70 mm. The integrated activity per event of 7Be was fitted with a model that takes into account the precipitation amount and the elapsed time between two rain events. The integrated activities calculated with this model show good agreement with experimental values.

  16. Performance evaluation of the Abbott CELL-DYN Ruby and the Sysmex XT-2000i haematology analysers.

    PubMed

    Leers, M P G; Goertz, H; Feller, A; Hoffmann, J J M L

    2011-02-01

    Two mid-range haematology analysers (Abbott CELL-DYN Ruby and Sysmex XT-2000i) were evaluated to determine their analytical performance and workflow efficiency in the haematology laboratory. In total 418 samples were processed for determining equivalence of complete blood count (CBC) measurements, and 100 for reticulocyte comparison. Blood smears served for assessing the agreement of the differential counts. Inter-instrument agreement for most parameters was good although small numbers of discrepancies were observed. Systematic biases were found for mean cell volume, reticulocytes, platelets and mean platelet volume. CELL-DYN Ruby WBC differentials were obtained with all samples while the XT-2000i suppressed differentials partially or completely in 13 samples (3.1%). WBC subpopulation counts were otherwise in good agreement with no major outliers. Following first-pass CBC/differential analysis, 88 (21%) of XT-2000i samples required further analyser processing compared to 18 (4.3%) for the CELL-DYN Ruby. Smear referrals for suspected WBC/nucleated red blood cells and platelet abnormalities were indicated for 106 (25.4%) and 95 (22.7%) of the XT-2000i and CELL-DYN Ruby samples respectively. Flagging efficiencies for both analysers were found to be similar. The Sysmex XT-2000i and Abbott CELL-DYN Ruby analysers have broadly comparable analytical performance, but the CELL-DYN Ruby showed superior first-pass efficiency. © 2010 Blackwell Publishing Ltd.

  17. Photon-Counting Kinetic Inductance Detectors (KID) for Far/Mid-Infrared Space Spectroscopy with the Origins Space Telescope (OST)

    NASA Astrophysics Data System (ADS)

    Noroozian, Omid; Barrentine, Emily M.; Stevenson, Thomas R.; Brown, Ari D.; Moseley, Samuel Harvey; Wollack, Edward; Pontoppidan, Klaus Martin; U-Yen, Konpop; Mikula, Vilem

    2018-01-01

    Photon-counting detectors are highly desirable for reaching the ~10^-20 W/√Hz power sensitivity permitted by the Origins Space Telescope (OST). We are developing unique Kinetic Inductance Detectors (KIDs) with photon counting capability in the far/mid-IR. Combined with an on-chip far-IR spectrometer onboard OST, these detectors will enable a new data set for exploring galaxy evolution and the growth of structure in the Universe. Mid-IR spectroscopic surveys using these detectors will enable mapping the composition of key volatiles in planet-forming material around protoplanetary disks and their evolution into solar systems. While these OST science objectives represent a well-organized community agreement, they are impossible to reach without a significant leap forward in detector technology, and the OST is likely not to be recommended if a path to suitable detectors does not exist. To reach the required sensitivity we are experimenting with superconducting resonators made from thin aluminum films on single-crystal silicon substrates. Under the right conditions, small-volume inductors made from these films can become ultra-sensitive to single photons above 90 GHz. Understanding the physics of these superconductor-dielectric systems is critical to performance. We achieved a very high quality factor of 0.5 x 10^6 for a 10-nm Al resonator at n ~ 1 microwave photon drive power, by far the highest value for such thin films in the literature. We measured a residual electron density of < 5/µm^3 and an extremely long lifetime of ~6.0 ms, both within requirements for photon counting. To realize an optically coupled device, we are integrating these films with our on-chip spectrometer (μ-Spec) fabrication process. Using a detailed model we simulated the detector when illuminated with randomly arriving photon events. Our results show that photon counting with >95% efficiency at 0.5 - 1.0 THz is achievable. We report on these developments and discuss plans to test in our facility through funding from our recently awarded ROSES-APRA grant and Roman Technology Fellowship award.

  18. IRF4 Variants Have Age-Specific Effects on Nevus Count and Predispose to Melanoma

    PubMed Central

    Duffy, David L.; Iles, Mark M.; Glass, Dan; Zhu, Gu; Barrett, Jennifer H.; Höiom, Veronica; Zhao, Zhen Z.; Sturm, Richard A.; Soranzo, Nicole; Hammond, Chris; Kvaskoff, Marina; Whiteman, David C.; Mangino, Massimo; Hansson, Johan; Newton-Bishop, Julia A.; Bataille, Veronique; Hayward, Nicholas K.; Martin, Nicholas G.; Bishop, D. Timothy; Spector, Timothy D.; Montgomery, Grant W.

    2010-01-01

    High melanocytic nevus count is a strong predictor of melanoma risk. A GWAS of nevus count in Australian adolescent twins identified an association of nevus count with the interferon regulatory factor 4 gene (IRF4 [p = 6 × 10−9]). There was a strong genotype-by-age interaction, which was replicated in independent UK samples of adolescents and adults. The rs12203592∗T allele was associated with high nevus counts and high freckling scores in adolescents, but with low nevus counts and high freckling scores in adults. The rs12203592∗T increased counts of flat (compound and junctional) nevi in Australian adolescent twins, but decreased counts of raised (intradermal) nevi. In combined analysis of melanoma case-control data from Australia, the UK, and Sweden, the rs12203592∗C allele was associated with melanoma (odds ratio [OR] 1.15, p = 4 × 10−3), most significantly on the trunk (OR = 1.33, p = 2.5 × 10−5). The melanoma association was corroborated in a GWAS performed by the GenoMEL consortium for an adjacent SNP, rs872071 (rs872071∗T: OR 1.14, p = 0.0035; excluding Australian, the UK, and Swedish samples typed at rs12203592: OR 1.08, p = 0.08). PMID:20602913

  19. Methods for assessing long-term mean pathogen count in drinking water and risk management implications.

    PubMed

    Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y

    2012-06-01

    Recently pathogen counts in drinking and source waters were shown theoretically to have the discrete Weibull (DW) or closely related discrete growth distribution (DGD). The result was demonstrated versus nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed such that available datasets seldom represent the rare but important high-count events, making estimation of the long-term mean difficult. In the current work the methods, and data record length, required to assess long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. Also, microbial count data were analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first order model was shown to produce count series with 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely-monitored water quality indicators.
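
    As an illustration of why such long records are needed, the sketch below simulates counts from a discrete Weibull (Type I) distribution, P(X >= k) = q^(k^beta), and tracks the running sample mean. The parameterization is the standard Type I form and the parameter values are hypothetical; they are not taken from the paper.

      # Sketch: simulate counts from a discrete Weibull (Type I) distribution,
      # P(X >= k) = q**(k**beta), and watch how the sample mean converges.
      # Parameters below are hypothetical; highly skewed (small beta) settings
      # need far more samples for a stable mean, as the abstract reports.
      import numpy as np

      def discrete_weibull_sample(q, beta, size, rng):
          u = rng.uniform(size=size)
          # Inverse transform: X = floor((ln u / ln q)**(1/beta))
          return np.floor((np.log(u) / np.log(q)) ** (1.0 / beta)).astype(int)

      rng = np.random.default_rng(1)
      counts = discrete_weibull_sample(q=0.2, beta=0.3, size=5000, rng=rng)
      for n in (50, 100, 500, 1000, 5000):
          print(n, counts[:n].mean())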

  20. Evaluation of the Performance of Iodine-Treated Biocidal Filters Under the Influence of Environmental Parameters

    DTIC Science & Technology

    2013-02-01

    analysis for total virus count. To examine the effects of bioaerosol on the release of iodine from the triiodide resin medium, MS2 aerosol was treated with ... airborne pathogens. 2.2.2. Viral Aerosols. Bioaerosols are airborne particles with biological origins, such as nonviable pollen, and viable fungi ... performed: collection efficiency of the BioSampler, virus PSD by SMPS, plaque assay for virus infectivity, and PCR analysis for total virus count. PSL

  1. Precision determination of absolute neutron flux

    DOE PAGES

    Yue, A. T.; Anderson, E. S.; Dewey, M. S.; ...

    2018-06-08

    A technique for establishing the total neutron rate of a highly-collimated monochromatic cold neutron beam was demonstrated using an alpha–gamma counter. The method involves only the counting of measured rates and is independent of neutron cross sections, decay chain branching ratios, and neutron beam energy. For the measurement, a target of 10B-enriched boron carbide totally absorbed the neutrons in a monochromatic beam, and the rate of absorbed neutrons was determined by counting 478 keV gamma rays from neutron capture on 10B with calibrated high-purity germanium detectors. A second measurement based on Bragg diffraction from a perfect silicon crystal was performed to determine the mean de Broglie wavelength of the beam to a precision of 0.024%. With these measurements, the detection efficiency of a neutron monitor based on neutron absorption on 6Li was determined to an overall uncertainty of 0.058%. We discuss the principle of the alpha–gamma method and present details of how the measurement was performed including the systematic effects. We further describe how this method may be used for applications in neutron dosimetry and metrology, fundamental neutron physics, and neutron cross section measurements.

  2. Model and reconstruction of a K-edge contrast agent distribution with an X-ray photon-counting detector

    PubMed Central

    Meng, Bo; Cong, Wenxiang; Xi, Yan; De Man, Bruno; Yang, Jian; Wang, Ge

    2017-01-01

    Contrast-enhanced computed tomography (CECT) helps enhance visibility in tumor imaging. When a high-Z contrast agent interacts with X-rays across its K-edge, X-ray photoelectric absorption experiences a sudden increase, resulting in a significant difference in X-ray transmission intensity between the energy windows just below and above the K-edge. Using photon-counting detectors, the X-ray intensity data in the windows below and above the K-edge can be measured simultaneously. The differential information of the two kinds of intensity data reflects the contrast-agent concentration distribution. K-edge differences between various materials allow opportunities for the identification of contrast agents in biomedical applications. In this paper, a general Radon transform is established to link the contrast-agent concentration to X-ray intensity measurement data. An iterative algorithm is proposed to reconstruct a contrast-agent distribution and the tissue attenuation background simultaneously. Comprehensive numerical simulations are performed to demonstrate the merits of the proposed method over existing K-edge imaging methods. Our results show that the proposed method accurately quantifies the distribution of a contrast agent, optimizing the contrast-to-noise ratio at a high dose efficiency. PMID:28437900
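
    The reason the two energy windows carry enough information can be seen in a simplified per-ray two-material decomposition: because the agent's attenuation jumps across the K-edge while tissue attenuation varies smoothly, the small linear system below is well conditioned. This sketch only illustrates the principle, not the paper's general Radon transform or iterative reconstruction, and the attenuation coefficients and intensities are hypothetical.

      # Simplified per-ray two-material decomposition around a K-edge (illustration
      # only; the paper's method is an iterative reconstruction over a general
      # Radon transform). Attenuation coefficients below are hypothetical.
      import numpy as np

      # Columns: [tissue, contrast agent]; rows: [window below K-edge, window above].
      # The agent's attenuation jumps across its K-edge; the tissue's barely changes.
      mu = np.array([[0.20, 2.0],
                     [0.19, 5.0]])           # cm^-1

      def decompose(i0_low, i_low, i0_high, i_high):
          # Line integrals A_w = ln(I0_w / I_w) = mu[w, 0]*t_tissue + mu[w, 1]*t_agent
          a = np.array([np.log(i0_low / i_low), np.log(i0_high / i_high)])
          t_tissue, t_agent = np.linalg.solve(mu, a)
          return t_tissue, t_agent

      # Hypothetical photon-counting measurements in the two energy bins:
      print(decompose(i0_low=1.0e6, i_low=1.1e4, i0_high=8.0e5, i_high=2.3e3))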

  3. Precision determination of absolute neutron flux

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yue, A. T.; Anderson, E. S.; Dewey, M. S.

    A technique for establishing the total neutron rate of a highly-collimated monochromatic cold neutron beam was demonstrated using an alpha–gamma counter. The method involves only the counting of measured rates and is independent of neutron cross sections, decay chain branching ratios, and neutron beam energy. For the measurement, a target of 10B-enriched boron carbide totally absorbed the neutrons in a monochromatic beam, and the rate of absorbed neutrons was determined by counting 478 keV gamma rays from neutron capture on 10B with calibrated high-purity germanium detectors. A second measurement based on Bragg diffraction from a perfect silicon crystal was performed to determine the mean de Broglie wavelength of the beam to a precision of 0.024%. With these measurements, the detection efficiency of a neutron monitor based on neutron absorption on 6Li was determined to an overall uncertainty of 0.058%. We discuss the principle of the alpha–gamma method and present details of how the measurement was performed including the systematic effects. We further describe how this method may be used for applications in neutron dosimetry and metrology, fundamental neutron physics, and neutron cross section measurements.

  4. The effect of dietary garlic supplementation on body weight gain, feed intake, feed conversion efficiency, faecal score, faecal coliform count and feeding cost in crossbred dairy calves.

    PubMed

    Ghosh, Sudipta; Mehla, Ram K; Sirohi, S K; Roy, Biswajit

    2010-06-01

    Thirty-six crossbred calves (Holstein cross) of 5 days of age were used to study the effect of garlic extract feeding on their performance up to the age of 2 months (pre-ruminant stage). They were randomly allotted into treatment and control groups (18 calves in each group). Performance was evaluated by measuring average body weight (BW) gain, feed intake (dry matter (DM), total digestible nutrient (TDN) and crude protein (CP)), feed conversion efficiency (FCE; DM, TDN and CP), faecal score, faecal coliform count and feeding cost. Diets were the same for both groups. In addition, the treatment group received garlic extract supplementation at 250 mg/kg BW per day per calf. Body weight was measured weekly, feed intake was measured twice daily, proximate analysis of feeds and fodders was carried out weekly, faecal scores were monitored daily and faecal coliform counts were done weekly. There was a significant increase in average body weight gain, feed intake and FCE, and a significant decrease in the severity of scours as measured by faecal score and faecal coliform count, in the treatment group compared to the control group (P < 0.01). Feed cost per kilogramme of BW gain was significantly lower in the treatment group compared to the control group (P < 0.01). The results suggest that garlic extract can be supplemented to calves for better performance.

  5. Evaluation of petrifilm series 2000 as a possible rapid method to count coliforms in foods.

    PubMed

    Priego, R; Medina, L M; Jordano, R

    2000-08-01

    This research note is a preliminary comparison between the Petrifilm 2000 method and a widely used traditional enumeration method (on violet red bile agar); six batches of different foods (egg, frozen green beans, fresh sausage, a bakery product, raw minced meat, and raw milk) were studied. The reliability of the presumptive counts taken at 10, 12, and 14 h of incubation using this method was also verified by comparing the counts with the total confirmed counts at 24 h. In all the batches studied, results obtained with Petrifilm 2000 correlated closely with those obtained using violet red bile agar (r = 0.860) and showed greater sensitivity (93.33% of the samples displayed higher counts on Petrifilm 2000), showing that this method is a reliable and efficient alternative. The count taken at 10 h of incubation is of clear interest as an early indicator of results in microbiological food control, since it accounted for 90% of the final count in all the batches analyzed. Counts taken at 12 and 14 h bore a greater similarity to those taken at 24 h. The Petrifilm 2000 method provides results in less than 12 h of incubation, making it a possible rapid method that adapts perfectly to the hazard analysis critical control point (HACCP) system by enabling microbiological quality control of processing.

  6. Vane clocking effects in an embedded compressor stage

    NASA Astrophysics Data System (ADS)

    Key, Nicole Leanne

    The objective of this research was to experimentally investigate the effects of vane clocking, the circumferential indexing of adjacent vane rows with similar vane counts, in an embedded compressor stage. Experiments were performed in the Purdue 3-Stage Compressor, which consists of an IGV followed by three stages. The IGV, Stator 1, and Stator 2 have identical vane counts of 44, and the effects of clocking were studied on Stage 2. The clocking configuration that located the upstream vane wake on the Stator 2 leading edge was identified with total pressure measurements at the inlet to Stator 2 and confirmed with measurements at the exit of Stator 2. For both loading conditions, the total temperature results showed that there was no measurable change associated with vane clocking in the amount of work done on the flow. At design loading, the change in stage efficiency with vane clocking was 0.27 points between the maximum and minimum efficiency clocking configurations. The maximum efficiency configuration was the case where the Stator 1 wake impinged on the Stator 2 leading edge. This condition produced a shallower and thinner Stator 2 wake compared to the clocking configuration that located the wake in the middle of the Stator 2 passage. By locating the Stator 1 wake at the leading edge, it dampened the Stator 2 boundary layer response to inlet fluctuations associated with the Rotor 2 wakes. At high loading, the change in Stage 2 efficiency increased to 1.07 points; however, the maximum efficiency clocking configuration was the case where the Stator 1 wake passed through the middle of the downstream vane passage. At high loading, the flow physics associated with vane clocking were different than at design loading because the location of the Stator 1 wake fluid on the Stator 2 leading edge triggered a boundary layer separation on the suction side of Stator 2 producing a wider and deeper wake. Vane clocking essentially affects the amount of interaction between the upstream vane wake and the boundary layer of the downstream vane. Whether this dampens the adverse effects of the rotor wakes or triggers boundary layer separation will depend on the flow conditions such as Reynolds number, turbulence intensity, and pressure gradient (vane loading), to name a few.

  7. Duffy-Null–Associated Low Neutrophil Counts Influence HIV-1 Susceptibility in High-Risk South African Black Women

    PubMed Central

    Ramsuran, Veron; Kulkarni, Hemant; He, Weijing; Mlisana, Koleka; Wright, Edwina J.; Werner, Lise; Castiblanco, John; Dhanda, Rahul; Le, Tuan; Dolan, Matthew J.; Guan, Weihua; Weiss, Robin A.; Clark, Robert A.; Abdool Karim, Salim S.; Ndung'u, Thumbi

    2011-01-01

    Background. The Duffy-null trait and ethnic neutropenia are both highly prevalent in Africa. The influence of pre-seroconversion levels of peripheral blood cell counts (PBCs) on the risk of acquiring human immunodeficiency virus (HIV)–1 infection among Africans is unknown. Methods. The triangular relationship among pre-seroconversion PBC counts, host genotypes, and risk of HIV acquisition was determined in a prospective cohort of black South African high-risk female sex workers. Twenty-seven women had seroconversion during follow-up, and 115 remained HIV negative for 2 years, despite engaging in high-risk activity. Results. Pre-seroconversion neutrophil counts in women who subsequently had seroconversion were significantly lower, whereas platelet counts were higher, compared with those who remained HIV negative. Comprising 27% of the cohort, subjects with pre-seroconversion neutrophil counts of <2500 cells/mm3 had a ∼3-fold greater risk of acquiring HIV infection. In a genome-wide association analysis, an African-specific polymorphism (rs2814778) in the promoter of Duffy Antigen Receptor for Chemokines (DARC −46T > C) was significantly associated with neutrophil counts (P = 7.9 × 10−11). DARC −46C/C results in loss of DARC expression on erythrocytes (Duffy-null) and resistance to Plasmodium vivax malaria, and in our cohort, only subjects with this genotype had pre-seroconversion neutrophil counts of <2500 cells/mm3. The risk of acquiring HIV infection was ∼3-fold greater in those with the trait of Duffy-null–associated low neutrophil counts, compared with all other study participants. Conclusions. Pre-seroconversion neutrophil and platelet counts influence risk of HIV infection. The trait of Duffy-null–associated low neutrophil counts influences HIV susceptibility. Because of the high prevalence of this trait among persons of African ancestry, it may contribute to the dynamics of the HIV epidemic in Africa. PMID:21507922

  8. High-efficiency integrated readout circuit for single photon avalanche diode arrays in fluorescence lifetime imaging.

    PubMed

    Acconcia, G; Cominelli, A; Rech, I; Ghioni, M

    2016-11-01

    In recent years, lifetime measurements by means of the Time Correlated Single Photon Counting (TCSPC) technique have led to a significant breakthrough in medical and biological fields. Unfortunately, the many advantages of TCSPC-based approaches come along with the major drawback of a relatively long acquisition time. The exploitation of multiple channels in parallel could in principle mitigate this issue, and at the same time it opens the way to a multi-parameter analysis of the optical signals, e.g., as a function of wavelength or spatial coordinates. The TCSPC multichannel solutions proposed so far, though, suffer from a tradeoff between number of channels and performance, and the overall measurement speed has not been increased according to the number of channels, thus reducing the advantages of having a multichannel system. In this paper, we present a novel readout architecture for bi-dimensional, high-density Single Photon Avalanche Diode (SPAD) arrays, specifically designed to maximize the throughput of the whole system and able to guarantee an efficient use of resources. The core of the system is a routing logic that can provide a dynamic connection between a large number of SPAD detectors and a much lower number of high-performance acquisition channels. A key feature of our smart router is its ability to guarantee high efficiency under any operating condition.

  9. Large Scale Frequent Pattern Mining using MPI One-Sided Model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vishnu, Abhinav; Agarwal, Khushbu

    In this paper, we propose a work-stealing runtime, Library for Work Stealing (LibWS), using the MPI one-sided model for designing scalable FP-Growth, the de facto frequent pattern mining algorithm, on large scale systems. LibWS provides locality-efficient and highly scalable work-stealing techniques for load balancing on a variety of data distributions. We also propose a novel communication algorithm for the FP-Growth data exchange phase, which reduces the communication complexity from the state-of-the-art O(p) to O(f + p/f) for p processes and f frequent attribute-ids. FP-Growth is implemented using LibWS and evaluated on several work distributions and support counts. An experimental evaluation of FP-Growth on LibWS using 4096 processes on an InfiniBand cluster demonstrates excellent efficiency for several work distributions (87% efficiency for Power-law and 91% for Poisson). The proposed distributed FP-Tree merging algorithm provides a 38x communication speedup on 4096 cores.
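
    For readers unfamiliar with FP-Growth, the sketch below shows the algorithm's first pass on a single process: count item frequencies, prune items below the support threshold, and reorder each transaction by descending frequency before FP-tree construction. It is a minimal illustration only; the LibWS contribution is the distributed, work-stealing MPI version of this workload, which is not reproduced here.

      # Minimal single-process sketch of FP-Growth's first pass: count item
      # frequencies, prune items below the support threshold, and reorder each
      # transaction by descending frequency before FP-tree insertion.
      # (The LibWS work is the distributed MPI version of this; not shown here.)
      from collections import Counter

      def first_pass(transactions, min_support):
          counts = Counter(item for t in transactions for item in t)
          frequent = {i: c for i, c in counts.items() if c >= min_support}
          ordered = [
              sorted((i for i in t if i in frequent), key=lambda i: (-frequent[i], i))
              for t in transactions
          ]
          return frequent, ordered

      txns = [{"a", "b", "c"}, {"b", "c"}, {"a", "c", "d"}, {"b", "d"}]
      print(first_pass(txns, min_support=2))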

  10. Efficiency of a new bioaerosol sampler in sampling Betula pollen for antigen analyses.

    PubMed

    Rantio-Lehtimäki, A; Kauppinen, E; Koivikko, A

    1987-01-01

    A new bioaerosol sampler consisting of a Liu-type atmospheric aerosol sampling inlet, a coarse-particle inertial impactor, a two-stage high-efficiency virtual impactor (aerodynamic particle diameters of greater than or equal to 8 microns, 8-2.5 microns, and below 2.5 microns, respectively; sampling on filters) and a liquid-cooled condenser was designed, fabricated and field-tested in sampling birch (Betula) pollen grains and smaller particles containing Betula antigens. Both microscopical (pollen counts) and immunochemical (enzyme-linked immunosorbent assay) analyses of each stage were carried out. The new sampler was significantly more efficient than the Burkard trap, e.g. in sampling particles of Betula pollen size (ca. 25 microns in diameter). This was prominent during pollen peak periods (e.g. on May 19th, 1985, 9482 Betula pollen grains per m3 of air in the virtual impactor versus 2540 in the Burkard trap). Betula antigens were also detected in filter stages where no intact pollen grains were found; in the condenser unit, the antigen concentrations were instead very low.

  11. High rates of viral suppression in adults and children with high CD4+ counts using a streamlined ART delivery model in the SEARCH trial in rural Uganda and Kenya

    PubMed Central

    Kwarisiima, Dalsone; Kamya, Moses R.; Owaraganise, Asiphas; Mwangwa, Florence; Byonanebye, Dathan M.; Ayieko, James; Plenty, Albert; Black, Doug; Clark, Tamara D.; Nzarubara, Bridget; Snyman, Katherine; Brown, Lillian; Bukusi, Elizabeth; Cohen, Craig R.; Geng, Elvin H.; Charlebois, Edwin D.; Ruel, Theodore D.; Petersen, Maya L.; Havlir, Diane; Jain, Vivek

    2017-01-01

    Introduction: The 2015 WHO recommendation of antiretroviral therapy (ART) for all HIV-positive persons calls for treatment initiation in millions of persons newly eligible with high CD4+ counts. Efficient and effective care models are urgently needed for this population. We evaluated clinical outcomes of asymptomatic HIV-positive adults and children starting ART with high CD4+ counts using a novel streamlined care model in rural Uganda and Kenya. Methods: In the 16 intervention communities of the HIV test-and-treat Sustainable East Africa Research for Community Health Study (NCT01864603), all HIV-positive individuals irrespective of CD4 were offered ART (efavirenz [EFV]/tenofovir disoproxil fumarate + emtricitabine [FTC] or lamivudine [3TC]). We studied adults (≥15 years) with CD4 ≥ 350/μL and children (two to fourteen years) with CD4 > 500/μL otherwise ineligible for ART by country guidelines. Clinics implemented a patient-centred streamlined care model designed to reduce patient-level barriers and maximize health system efficiency. It included (1) nurse-conducted visits with physician referral of complex cases, (2) multi-disease chronic care (including for hypertension/diabetes), (3) patient-centred, friendly staff, (4) viral load (VL) testing and counselling, (5) three-month return visits and ART refills, (6) appointment reminders, (7) tiered tracking for missed appointments, (8) flexible clinic hours (outside routine schedule) and (9) telephone access to clinicians. Primary outcomes were 48-week retention in care, viral suppression (% with measured week 48 VL ≤ 500 copies/mL) and adverse events. Results: Overall, 972 HIV-positive adults with CD4+ ≥ 350/μL initiated ART with streamlined care. Patients were 66% female and had median age thirty-four years (IQR, 28–42), CD4+ 608/μL (IQR, 487–788/μL) and VL 6775 copies/mL (IQR, <500–37,003 c/mL). At week 48, retention was 92% (897/972; 2 died, 40 moved, 8 withdrew, 4 transferred care, and 21/964 [2%] were lost to follow-up). Viral suppression occurred in 778/838 (93%) and in 800/972 (82%) in intention-to-treat analysis. Grade III/IV clinical/laboratory adverse events were rare: 95 occurred in 74/972 patients (7.6%). Only 8/972 adults (0.8%) switched ART from EFV to lopinavir (LPV) (n = 2 for dizziness, n = 2 for gynaecomastia, n = 4 for other reasons). Among 83 children, week 48 retention was 89% (74/83), viral suppression was 92% (65/71) and grade III/IV adverse events occurred in 4/83 (4.8%). Conclusions: Using a streamlined care model, viral suppression, retention and ART safety were high among asymptomatic East African adults and children with high CD4+ counts initiating treatment. Clinical Trial Number: NCT01864603 PMID:28770596

  12. Characterization of photon-counting multislit breast tomosynthesis.

    PubMed

    Berggren, Karl; Cederström, Björn; Lundqvist, Mats; Fredenberg, Erik

    2018-02-01

    It has been shown that breast tomosynthesis may improve sensitivity and specificity compared to two-dimensional mammography, resulting in an increased detection rate of cancers or lowered call-back rates. The purpose of this study is to characterize a spectral photon-counting multislit breast tomosynthesis system capable of single-scan spectral imaging with multiple collimated x-ray beams. The system differs in many aspects from conventional tomosynthesis using energy-integrating flat-panel detectors. The investigated system was a prototype consisting of a dual-threshold photon-counting detector with 21 collimated line detectors scanning across the compressed breast. A review of the system is given in terms of detector, acquisition geometry, and reconstruction methods. Three reconstruction methods were used: simple back-projection, filtered back-projection and an iterative algebraic reconstruction technique. The image quality was evaluated by measuring the modulation transfer function (MTF), normalized noise power spectrum (NNPS), detective quantum efficiency (DQE), and artifact spread function (ASF) on reconstructed spectral tomosynthesis images for a total-energy bin (defined by a low-energy threshold calibrated to remove electronic noise) and for a high-energy bin (with a threshold calibrated to split the spectrum into roughly equal parts). Acquisition was performed using a 29 kVp W/Al x-ray spectrum at a 0.24 mGy exposure. The difference in MTF between the two energy bins was negligible, that is, there was no energy dependence of the resolution. The MTF dropped to 50% at 1.5 lp/mm to 2.3 lp/mm in the scan direction and 2.4 lp/mm to 3.3 lp/mm in the slit direction, depending on the reconstruction method. The full width at half maximum of the ASF was found to range from 13.8 mm to 18.0 mm for the different reconstruction methods. The zero-frequency DQE of the system was found to be 0.72. The fraction of counts in the high-energy bin was measured to be 59% of the total detected spectrum. Scan times ranged from 4 s to 16.5 s depending on voltage and current settings. The characterized system generates spectral tomosynthesis images with a dual-energy photon-counting detector. Measurements show a high DQE, enabling high image quality at a low dose, which is beneficial for low-dose applications such as screening. The single-scan spectral images open up applications such as quantitative material decomposition and contrast-enhanced tomosynthesis. © 2017 American Association of Physicists in Medicine.
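
    The frequency-dependent DQE reported above is conventionally assembled from the MTF, the normalized noise power spectrum and the incident photon fluence. The sketch below uses that textbook relation, DQE(f) = MTF(f)^2 / (q * NNPS(f)); the prototype's exact analysis chain may differ, and the numbers are hypothetical, chosen only so that the zero-frequency value lands near the reported 0.72.

      # Conventional frequency-dependent DQE estimate from measured MTF and
      # normalized noise power spectrum (NNPS), with q the incident photon fluence
      # per unit area (photons/mm^2). This is the textbook relation; the prototype's
      # exact analysis chain may differ. All numbers below are hypothetical.
      import numpy as np

      def dqe(mtf, nnps, q):
          return mtf ** 2 / (q * nnps)

      freq = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                  # lp/mm
      mtf  = np.array([1.0, 0.90, 0.75, 0.58, 0.42])              # measured MTF
      nnps = np.array([2.9e-5, 2.4e-5, 1.9e-5, 1.5e-5, 1.2e-5])   # mm^2
      q    = 4.8e4                                                # photons/mm^2

      for f, d in zip(freq, dqe(mtf, nnps, q)):
          print(f"{f:.1f} lp/mm: DQE = {d:.2f}")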

  13. Free-running InGaAs single photon detector with 1 dark count per second at 10% efficiency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korzh, B., E-mail: Boris.Korzh@unige.ch; Walenta, N.; Lunghi, T.

    We present a free-running single photon detector for telecom wavelengths based on a negative feedback avalanche photodiode (NFAD). A dark count rate as low as 1 cps was obtained at a detection efficiency of 10%, with an afterpulse probability of 2.2% for 20 μs of deadtime. This was achieved by using an active hold-off circuit and cooling the NFAD with a free-piston Stirling cooler down to temperatures of −110 °C. We integrated two detectors into a practical, 625 MHz clocked quantum key distribution system. Stable, real-time key distribution in the presence of 30 dB channel loss was possible, yielding a secret key rate of 350 bps.
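
    As a rough guide to how such a detector's registered count rate relates to the incident photon rate, the sketch below applies the usual nonparalyzable dead-time relation with the efficiency, dark count rate and hold-off time quoted above. This is a generic textbook model, not one published with this detector, and the incident photon rates are hypothetical.

      # Expected registered count rate for a free-running single-photon detector with
      # a nonparalyzable dead time (hold-off): the usual textbook relation, not a
      # model published with this detector. Efficiency, dark rate and hold-off are
      # taken from the abstract; the incident photon rates are hypothetical.

      def registered_rate(photon_rate, efficiency=0.10, dark_cps=1.0, dead_time=20e-6):
          true_rate = efficiency * photon_rate + dark_cps     # detectable events/s
          return true_rate / (1.0 + true_rate * dead_time)    # dead-time saturation

      for r in (1e2, 1e4, 1e6):
          print(f"{r:.0e} photons/s -> {registered_rate(r):.1f} counts/s")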

  14. An Extremely Low Power Quantum Optical Communication Link for Autonomous Robotic Explorers

    NASA Technical Reports Server (NTRS)

    Lekki, John; Nguyen, Quang-Viet; Bizon, Tom; Nguyen, Binh; Kojima, Jun

    2007-01-01

    One concept for planetary exploration involves using many small robotic landers that can cover more ground than a single conventional lander. In addressing this vision, NASA has been challenged in the National Nanotechnology Initiative to research the development of miniature robots built from nano-sized components. These robots have very significant challenges, such as mobility and communication, given the small size and limited power generation capability. The research presented here has been focused on developing a communications system that has the potential for providing ultra-low power communications for robots such as these. In this paper an optical communications technique that is based on transmitting recognizable sets of photons is presented. Previously pairs of photons that have an entangled quantum state have been shown to be recognizable in ambient light. The main drawback to utilizing entangled photons is that they can only be generated through a very energy inefficient nonlinear process. In this paper a new technique that generates sets of photons from pulsed sources is described and an experimental system demonstrating this technique is presented. This technique of generating photon sets from pulsed sources has the distinct advantage in that it is much more flexible and energy efficient, and is well suited to take advantage of the very high energy efficiencies that are possible when using nano scale sources. For these reasons the communication system presented in this paper is well suited for use in very small, low power landers and rovers. In this paper a very low power optical communications system for miniature robots, as small as 1 cu cm is addressed. The communication system is a variant of photon counting communications. Instead of counting individual photons the system only counts the arrival of time coincident sets of photons. Using sets of photons significantly decreases the bit error rate because they are highly identifiable in the presence of ambient light. An experiment demonstrating reliable communication over a distance of 70 meters using less than a billionth of a watt of radiated power is presented. The components used in this system were chosen so that they could in the future be integrated into a cubic centimeter device.
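
    The receiver-side operation described, registering only detections that arrive together within a short window so that uncorrelated ambient-light counts are rejected, can be illustrated with a small coincidence counter over two detector timestamp streams. The window and timestamps below are hypothetical.

      # Illustration of the receiver-side idea described above: count only photon
      # detections that arrive within a short coincidence window on two detectors,
      # so uncorrelated ambient-light counts are rejected. Timestamps (in ns) and
      # the window are hypothetical.

      def coincidences(times_a, times_b, window_ns=2.0):
          hits, j = 0, 0
          for ta in times_a:                       # both lists assumed time-sorted
              while j < len(times_b) and times_b[j] < ta - window_ns:
                  j += 1
              if j < len(times_b) and abs(times_b[j] - ta) <= window_ns:
                  hits += 1
          return hits

      det_a = [10.0, 55.2, 120.4, 300.1, 451.0]
      det_b = [10.9, 80.3, 119.8, 452.1, 600.0]
      print(coincidences(det_a, det_b))   # -> 3 coincident pairs (near 10, 120, 451 ns)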

  15. Bidirectional DC/DC Converter

    NASA Astrophysics Data System (ADS)

    Pedersen, F.

    2008-09-01

    The presented bidirectional DC/DC converter design concept is a further development of an already existing converter used for low-battery-voltage operation. For low-battery-voltage operation, a highly efficient, low-parts-count DC/DC converter was developed and used in a satellite for the battery charge and battery discharge functions. The converter consists of a bidirectional, non-regulating DC/DC converter connected to a discharge-regulating Buck converter and a charge-regulating Buck converter. The bidirectional non-regulating DC/DC converter performs with relatively high efficiency even at relatively high currents, which here means up to 35 A. This performance was obtained through the use of power MOSFETs with on-resistances of only a few milliohms connected to a special transformer allowing several transistor stages to be paralleled on the low-voltage side of the transformer. The design is patent protected. Synchronous rectification leads to high efficiency at the low battery voltages considered, which were in the range 2.7-4.3 V DC. The converter performs with low switching losses, as zero-voltage, zero-current switching is implemented in all switching positions of the converter. The drive power needed to switch a relatively large number of low-resistance, hence high-drive-capacitance, power MOSFETs using conventional drive techniques would limit the overall conversion efficiency; therefore a resonant drive consuming considerably less power than a conventional drive circuit was implemented in the converter. To the originally built and patent-protected bidirectional non-regulating DC/DC converter, the functionality of regulation has been added, which eliminates the need for additional converter stages in the form of a Charge Buck regulator and a Discharge Buck regulator. The bidirectional DC/DC converter can be used in connection with batteries, motors, etc., where the bidirectional feature, simple design and high performance may be useful.

  16. Advanced Multilayer Composite Heavy-Oxide Scintillator Detectors for High Efficiency Fast Neutron Detection

    NASA Astrophysics Data System (ADS)

    Ryzhikov, Vladimir D.; Naydenov, Sergei V.; Pochet, Thierry; Onyshchenko, Gennadiy M.; Piven, Leonid A.; Smith, Craig F.

    2018-01-01

    We have developed and evaluated a new approach to fast neutron and neutron-gamma detection based on large-area multilayer composite heterogeneous detection media consisting of dispersed granules of small-crystalline scintillators contained in a transparent organic (plastic) matrix. Layers of the composite material are alternated with layers of transparent plastic scintillator material serving as light guides. The resulting detection medium - designated as ZEBRA - serves as both an active neutron converter and a detection scintillator which is designed to detect both neutrons and gamma-quanta. The composite layers of the ZEBRA detector consist of small heavy-oxide scintillators in the form of granules of crystalline BGO, GSO, ZWO, PWO and other materials. We have produced and tested the ZEBRA detector of sizes 100x100x41 mm and greater, and determined that they have very high efficiency of fast neutron detection (up to 49% or greater), comparable to that which can be achieved by large sized heavy-oxide single crystals of about Ø40x80 cm3 volume. We have also studied the sensitivity variation to fast neutron detection by using different types of multilayer ZEBRA detectors of 100 cm2 surface area and 41 mm thickness (with a detector weight of about 1 kg) and found it to be comparable to the sensitivity of a 3He-detector representing a total cross-section of about 2000 cm2 (with a weight of detector, including its plastic moderator, of about 120 kg). The measured count rate in response to a fast neutron source of 252Cf at 2 m for the ZEBRA-GSO detector of size 100x100x41 mm3 was 2.84 cps/ng, and this count rate can be doubled by increasing the detector height (and area) up to 200x100 mm2. In summary, the ZEBRA detectors represent a new type of high efficiency and low cost solid-state neutron detector that can be used for stationary neutron/gamma portals. They may represent an interesting alternative to expensive, bulky gas counters based on 3He or 10B neutron detection technologies.

  17. Comparison of Drive Counts and Mark-Resight As Methods of Population Size Estimation of Highly Dense Sika Deer (Cervus nippon) Populations.

    PubMed

    Takeshita, Kazutaka; Ikeda, Takashi; Takahashi, Hiroshi; Yoshida, Tsuyoshi; Igota, Hiromasa; Matsuura, Yukiko; Kaji, Koichi

    2016-01-01

    Assessing temporal changes in abundance indices is an important issue in the management of large herbivore populations. The drive counts method has been frequently used as a deer abundance index in mountainous regions. However, despite an inherent risk for observation errors in drive counts, which increase with deer density, evaluations of the utility of drive counts at a high deer density remain scarce. We compared the drive counts and mark-resight (MR) methods in the evaluation of a highly dense sika deer population (MR estimates ranged between 11 and 53 individuals/km2) on Nakanoshima Island, Hokkaido, Japan, between 1999 and 2006. This deer population experienced two large reductions in density; approximately 200 animals in total were taken from the population through a large-scale population removal and a separate winter mass mortality event. Although the drive counts tracked temporal changes in deer abundance on the island, they overestimated the counts for all years in comparison to the MR method. Increased overestimation in drive count estimates after the winter mass mortality event may be due to a double count derived from increased deer movement and recovery of body condition secondary to the mitigation of density-dependent food limitations. Drive counts are unreliable because they are affected by unfavorable factors such as bad weather, and they are cost-prohibitive to repeat, which precludes the calculation of confidence intervals. Therefore, the use of drive counts to infer the deer abundance needs to be reconsidered.
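
    One practical advantage of mark-resight noted above is that it yields an abundance estimate with a quantifiable uncertainty, unlike drive counts. The sketch below illustrates this with a simple Chapman-corrected Lincoln-Petersen estimator and a bootstrap interval; the study itself used a more elaborate mark-resight model, and all numbers are hypothetical.

      # Simple mark-resight abundance estimate (Chapman-corrected Lincoln-Petersen)
      # with a bootstrap confidence interval. The study used a more elaborate MR
      # model; this sketch only illustrates why MR, unlike drive counts, comes with
      # a measure of uncertainty. All numbers are hypothetical.
      import numpy as np

      def chapman(marked, sighted, resighted):
          return (marked + 1) * (sighted + 1) / (resighted + 1) - 1

      def bootstrap_ci(marked, sighted, resighted, n_boot=10_000, seed=0):
          rng = np.random.default_rng(seed)
          p_marked = resighted / sighted
          resamples = rng.binomial(sighted, p_marked, size=n_boot)
          estimates = chapman(marked, sighted, resamples)
          return np.percentile(estimates, [2.5, 97.5])

      marked, sighted, resighted = 60, 180, 35        # hypothetical survey
      print(chapman(marked, sighted, resighted))       # point estimate, about 306
      print(bootstrap_ci(marked, sighted, resighted))  # approximate 95% interval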

  18. Microfluidic Mixing Technology for a Universal Health Sensor

    NASA Technical Reports Server (NTRS)

    Chan, Eugene Y.; Bae, Candice

    2009-01-01

    A highly efficient means of microfluidic mixing has been created for use with the rHEALTH sensor: an elliptical mixer and passive curvilinear mixing patterns. The rHEALTH sensor provides rapid, handheld, complete blood counts, cell differential counts, electrolyte measurements, and other lab tests based on a reusable, flow-based microfluidic platform. These geometries allow for cleaning in a reusable manner, and also allow for complete mixing of fluid streams. The microfluidic mixing is performed by flowing two streams of fluid into an elliptical or curvilinear design that combines the flows into one channel. The mixing is accomplished by chaotic advection around microfluidic loops. All components of the microfluidic chip are flow-through, meaning that cleaning solution can be introduced into the chip to flush out cells, plasma proteins, and dye. Tests were performed on multiple chip geometries to show that cleaning is efficient in any flow-through design. The conclusion from these experiments is that the chip can indeed be flushed out with microliter volumes of solution and that biological samples are readily cleaned from the chip with minimal effort. The technology can be applied to real-time health monitoring at a patient's bedside or in a doctor's office, and to real-time clinical intervention in acute situations. It also can be used for daily measurement of hematocrit for patients on anticoagulant drugs, or to detect acute myocardial damage outside a hospital.

  19. Learning linear transformations between counting-based and prediction-based word embeddings

    PubMed Central

    Hayashi, Kohei; Kawarabayashi, Ken-ichi

    2017-01-01

    Despite the growing interest in prediction-based word embedding learning methods, it remains unclear how the vector spaces learnt by the prediction-based methods differ from those of the counting-based methods, or whether one can be transformed into the other. To study the relationship between counting-based and prediction-based embeddings, we propose a method for learning a linear transformation between two given sets of word embeddings. Our proposal contributes to word embedding learning research in three ways: (a) we propose an efficient method to learn a linear transformation between two sets of word embeddings, (b) using the transformation learnt in (a), we empirically show that it is possible to predict distributed word embeddings for novel unseen words, and (c) empirically it is possible to linearly transform counting-based embeddings to prediction-based embeddings, for frequent words, different POS categories, and varying degrees of ambiguity. PMID:28926629
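
    A minimal sketch of the core idea, assuming nothing about the paper's actual optimization: fitting a linear map between two embedding matrices over a shared vocabulary by ordinary least squares. All array names, dimensions, and data below are illustrative stand-ins.

        import numpy as np

        rng = np.random.default_rng(0)
        vocab_size, dim = 1000, 300

        # Stand-ins for two pre-trained embedding matrices over the same vocabulary.
        X_count = rng.normal(size=(vocab_size, dim))   # counting-based embeddings
        Y_pred = rng.normal(size=(vocab_size, dim))    # prediction-based embeddings

        # Ordinary least-squares fit of M in  X_count @ M ~= Y_pred.
        M, _, _, _ = np.linalg.lstsq(X_count, Y_pred, rcond=None)

        # Project a "novel" word, seen only in the counting-based space,
        # into the prediction-based space.
        x_new = rng.normal(size=(1, dim))
        y_hat = x_new @ M
        print(y_hat.shape)  # (1, 300)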

  20. Photon-counting detector arrays based on microchannel array plates. [for image enhancement

    NASA Technical Reports Server (NTRS)

    Timothy, J. G.

    1975-01-01

    The recent development of the channel electron multiplier (CEM) and its miniaturization into the microchannel array plate (MCP) offers the possibility of fully combining the advantages of the photographic and photoelectric detection systems. The MCP has an image-intensifying capability and the potential of being developed to yield signal outputs superior to those of conventional photomultipliers. In particular, the MCP has a photon-counting capability with a negligible dark-count rate. Furthermore, the MCP can operate stably and efficiently at extreme-ultraviolet and soft X-ray wavelengths in a windowless configuration or can be integrated with a photo-cathode in a sealed tube for use at ultraviolet and visible wavelengths. The operation of one- and two-dimensional photon-counting detector arrays based on the MCP at extreme-ultraviolet wavelengths is described, and the design of sealed arrays for use at ultraviolet and visible wavelengths is briefly discussed.

  1. Noise limitations of multiplier phototubes in the radiation environment of space

    NASA Technical Reports Server (NTRS)

    Viehmann, W.; Eubanks, A. G.

    1976-01-01

    The contributions of Cerenkov emission, luminescence, secondary electron emission, and bremsstrahlung to the radiation-induced dark current and noise of multiplier phototubes were analyzed quantitatively. Fluorescence and Cerenkov emission in the tube window are the major contributors and can quantitatively account for dark count levels observed in orbit. Radiation-induced noise can be minimized by shielding, tube selection, and mode of operation. Optical decoupling of windows and cathode (side-window tubes) leads to further reduction of radiation-induced dark counts, as does reducing the window thickness and effective cathode area, and selecting window/cathode combinations of low fluorescence efficiency. In trapped-radiation-free regions of near-earth orbits and in free space, Cerenkov emission by relativistic particles contributes predominantly to the photoelectron yield per event. Operating multiplier phototubes in the photon (pulse) counting mode will discriminate against these large pulses and substantially reduce the dark count and noise to levels determined by fluorescence.

  2. Standardization of Ga-68 by coincidence measurements, liquid scintillation counting and 4πγ counting.

    PubMed

    Roteta, Miguel; Peyres, Virginia; Rodríguez Barquero, Leonor; García-Toraño, Eduardo; Arenillas, Pablo; Balpardo, Christian; Rodrígues, Darío; Llovera, Roberto

    2012-09-01

    The radionuclide (68)Ga is one of the few positron emitters that can be prepared in-house without the use of a cyclotron. It disintegrates to the ground state of (68)Zn partially by positron emission (89.1%) with a maximum energy of 1899.1 keV, and partially by electron capture (10.9%). This nuclide has been standardized in the frame of a cooperation project between the Radionuclide Metrology laboratories from CIEMAT (Spain) and CNEA (Argentina). Measurements involved several techniques: 4πβ-γ coincidences, integral gamma counting and Liquid Scintillation Counting using the triple to double coincidence ratio and the CIEMAT/NIST methods. Given the short half-life of the radionuclide assayed, a direct comparison between results from both laboratories was excluded and a comparison of experimental efficiencies of similar NaI detectors was used instead. Copyright © 2012 Elsevier Ltd. All rights reserved.

  3. Spoilage of Microfiltered and Pasteurized Extended Shelf Life Milk Is Mainly Induced by Psychrotolerant Spore-Forming Bacteria that often Originate from Recontamination

    PubMed Central

    Doll, Etienne V.; Scherer, Siegfried; Wenning, Mareike

    2017-01-01

    Premature spoilage and varying product quality due to microbial contamination still constitute major problems in the production of microfiltered and pasteurized extended shelf life (ESL) milk. Spoilage-associated bacteria may enter the product either as part of the raw milk microbiota or as recontaminants in the dairy plant. To identify spoilage-inducing bacteria and their routes of entry, we analyzed end products for their predominant microbiota as well as the prevalence and biodiversity of psychrotolerant spores in bulk tank milk. Process analyses were performed to determine the removal of psychrotolerant spores at each production step. To detect transmission and recontamination events, strain typing was conducted with isolates obtained from all process stages. Microbial counts in 287 ESL milk packages at the end of shelf life were highly diverse ranging from <1 to 7.9 log cfu/mL. In total, 15% of samples were spoiled. High G+C Gram-positive bacteria were the most abundant taxonomic group, but were responsible for only 31% of spoilage. In contrast, psychrotolerant spores were isolated from 55% of spoiled packages. In 90% of samples with pure cultures of Bacillus cereus sensu lato and Paenibacillus spp., counts exceeded 6 log cfu/mL. In bulk tank milk, the concentration of psychrotolerant spores was low, accounting for merely 0.5 ± 0.8 MPN/mL. Paenibacillus amylolyticus/xylanexedens was by far the most dominant species in bulk tank milk (48% of all isolates), but was never detected in ESL milk, pointing to efficient removal during manufacturing. Six large-scale process analyses confirmed a high removal rate for psychrotolerant spores (reduction by nearly 4 log-units). B. cereus sensu lato, on the contrary, was frequently found in spoiled end products, but was rarely detected in bulk tank milk. Due to low counts in bulk tank samples and efficient spore removal during production, we suggest that shelf life is influenced only to a minor extent by raw-milk-associated factors. In contrast, recontamination with spores, particularly from the B. cereus complex, seems to occur. To enhance milk quality throughout the entire shelf life, improved plant sanitation and disinfection that target the elimination of spores are necessary. PMID:28197147

  4. Spoilage of Microfiltered and Pasteurized Extended Shelf Life Milk Is Mainly Induced by Psychrotolerant Spore-Forming Bacteria that often Originate from Recontamination.

    PubMed

    Doll, Etienne V; Scherer, Siegfried; Wenning, Mareike

    2017-01-01

    Premature spoilage and varying product quality due to microbial contamination still constitute major problems in the production of microfiltered and pasteurized extended shelf life (ESL) milk. Spoilage-associated bacteria may enter the product either as part of the raw milk microbiota or as recontaminants in the dairy plant. To identify spoilage-inducing bacteria and their routes of entry, we analyzed end products for their predominant microbiota as well as the prevalence and biodiversity of psychrotolerant spores in bulk tank milk. Process analyses were performed to determine the removal of psychrotolerant spores at each production step. To detect transmission and recontamination events, strain typing was conducted with isolates obtained from all process stages. Microbial counts in 287 ESL milk packages at the end of shelf life were highly diverse ranging from <1 to 7.9 log cfu/mL. In total, 15% of samples were spoiled. High G+C Gram-positive bacteria were the most abundant taxonomic group, but were responsible for only 31% of spoilage. In contrast, psychrotolerant spores were isolated from 55% of spoiled packages. In 90% of samples with pure cultures of Bacillus cereus sensu lato and Paenibacillus spp., counts exceeded 6 log cfu/mL. In bulk tank milk, the concentration of psychrotolerant spores was low, accounting for merely 0.5 ± 0.8 MPN/mL. Paenibacillus amylolyticus/xylanexedens was by far the most dominant species in bulk tank milk (48% of all isolates), but was never detected in ESL milk, pointing to efficient removal during manufacturing. Six large-scale process analyses confirmed a high removal rate for psychrotolerant spores (reduction by nearly 4 log-units). B. cereus sensu lato, on the contrary, was frequently found in spoiled end products, but was rarely detected in bulk tank milk. Due to low counts in bulk tank samples and efficient spore removal during production, we suggest that shelf life is influenced only to a minor extent by raw-milk-associated factors. In contrast, recontamination with spores, particularly from the B. cereus complex, seems to occur. To enhance milk quality throughout the entire shelf life, improved plant sanitation and disinfection that target the elimination of spores are necessary.

  5. Usability of small impact craters on small surface areas in crater count dating: Analysing examples from the Harmakhis Vallis outflow channel, Mars

    NASA Astrophysics Data System (ADS)

    Kukkonen, S.; Kostama, V.-P.

    2018-05-01

    The availability of very high-resolution images has made it possible to extend crater size-frequency distribution studies to small, deca/hectometer-scale craters. This has enabled the dating of small and young surface units, as well as recent, short-time and small-scale geologic processes that have occurred on the units. Usually, however, the higher the spatial resolution of space images is, the smaller the area covered by the images. Thus the use of single, very high-resolution images in crater count age determination may be debatable if the images do not cover the studied region entirely. Here we compare the crater count results for the floor of the Harmakhis Vallis outflow channel obtained from the images of the ConTeXt camera (CTX) and High Resolution Imaging Science Experiment (HiRISE) aboard the Mars Reconnaissance Orbiter (MRO). The CTX images enable crater counts for entire units on the Harmakhis Vallis main valley, whereas the coverage of the higher-resolution HiRISE images is limited and thus the images can only be used to date small parts of the units. Our case study shows that the crater count data based on small impact craters and small surface areas mainly correspond with the crater count data based on larger craters and more extensive counting areas on the same unit. If differences between the results were found, they could usually be explained by the regional geology. Usually, these differences appeared when at least one cratering model age is missing from either of the crater datasets. On the other hand, we found only a few cases in which the cratering model ages were completely different. We conclude that crater counts using small impact craters on small counting areas provide useful information about the geological processes which have modified the surface. However, it is important to remember that all crater count results obtained from a specific counting area always primarily represent that counting area, not the whole unit. On the other hand, together with crater count results from extensive counting areas and lower-resolution images, crater counts on small counting areas using very high-resolution images are a very valuable tool for obtaining unique additional information about the local processes on the surface units.

  6. User Guide for the LORE1 Insertion Mutant Resource.

    PubMed

    Mun, Terry; Małolepszy, Anna; Sandal, Niels; Stougaard, Jens; Andersen, Stig U

    2017-01-01

    Lotus japonicus is a model legume used in the study of plant-microbe interactions, especially in the field of biological nitrogen fixation due to its ability to enter into a symbiotic relationship with a soil bacterium, Mesorhizobium loti. The LORE1 mutant population is a valuable resource for reverse genetics in L. japonicus due to its non-transgenic nature, high tagging efficiency, and low copy count. Here, we outline a workflow for identifying, ordering, and establishing homozygous LORE1 mutant lines for a gene of interest, LjFls2, including protocols for growth and genotyping of a segregating LORE1 population.

  7. Flow Control on Low-Pressure Turbine Airfoils Using Vortex Generator Jets

    NASA Technical Reports Server (NTRS)

    Volino, Ralph J.; Ibrahim, Mounir B.; Kartuzova, Olga

    2010-01-01

    Motivation: Higher loading on Low-Pressure Turbine (LPT) airfoils can reduce airfoil count, weight, and cost and increase efficiency, but is limited by suction-side separation. A growing understanding of transition, separation, and wake effects has produced improved models, allowed designs to take advantage of wakes, and put higher-lift airfoils into use. Further loading increases may require flow control, either passive (trips, dimples, etc.) or active (plasma actuators, vortex generator jets (VGJs)); the open question is whether increased loading can offset the higher losses of high-lift airfoils. Objectives: advance knowledge of boundary layer separation and transition under LPT conditions; demonstrate and improve understanding of separation control with pulsed VGJs; produce a detailed experimental database; and test and develop computational models.

  8. Ultrabright femtosecond source of biphotons based on a spatial mode inverter.

    PubMed

    Jarutis, Vygandas; Juodkazis, Saulius; Mizeikis, Vygantas; Sasaki, Keiji; Misawa, Hiroaki

    2005-02-01

    A method of enhancing the efficiency of entangled biphoton sources based on a type II femtosecond spontaneous parametric downconversion (SPDC) process is proposed and implemented experimentally. Enhancement is obtained by mode inversion of one of the SPDC output beams, which allows the beams to overlap completely, thus maximizing the number of SPDC photon pairs with optimum spatiotemporal overlap. By use of this method, biphoton count rates as high as 16 kHz from a single 0.5-mm-long beta-barium borate crystal pumped by second-harmonic radiation from a Ti:sapphire laser were obtained.

  9. Neutron coincidence counting based on time interval analysis with one- and two-dimensional Rossi-alpha distributions: an application for passive neutron waste assay

    NASA Astrophysics Data System (ADS)

    Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.

    1996-02-01

    Neutron coincidence counting is commonly used for the non-destructive assay of plutonium bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is related to the fact that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass ( 240Pu eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste however may have quite different matrices and source distributions compared to the calibration samples. This often results in a bias of the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are aimed at being measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu eq mass. The presented theory—which will be indicated as Time Interval Analysis (TIA)—is complementary to Time Correlation Analysis (TCA) theories which were developed in the past, but is from the theoretical point of view much simpler and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
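
    As an illustration only, not the TIA implementation described above, the sketch below builds a one-dimensional Rossi-alpha histogram from a list of pulse arrival times: each pulse acts as a trigger, and the time differences to all subsequent pulses inside a fixed window are accumulated. The pulse train and window length are simulated and hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)
        # Simulated pulse train: Poisson-like background arrival times in seconds.
        times = np.sort(rng.uniform(0.0, 10.0, size=20000))

        window = 512e-6           # Rossi-alpha window length (s), illustrative
        n_bins = 128
        hist = np.zeros(n_bins)

        for i, t0 in enumerate(times):
            j = i + 1
            # Accumulate time differences to subsequent pulses within the window.
            while j < times.size and times[j] - t0 < window:
                hist[int((times[j] - t0) / window * n_bins)] += 1
                j += 1

        # For a correlated (fission) source, the histogram follows A*exp(-t/tau) + B,
        # separating real coincidences (exponential term) from accidentals (flat term).
        print(hist[:8])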

  10. Platelet count and total and cause-specific mortality in the Women's Health Initiative.

    PubMed

    Kabat, Geoffrey C; Kim, Mimi Y; Verma, Amit K; Manson, JoAnn E; Lin, Juan; Lessin, Lawrence; Wassertheil-Smoller, Sylvia; Rohan, Thomas E

    2017-04-01

    We used data from the Women's Health Initiative to examine the association of platelet count with total mortality, coronary heart disease (CHD) mortality, cancer mortality, and non-CHD/noncancer mortality. Platelet count was measured at baseline in 159,746 postmenopausal women and again in year 3 in 75,339 participants. Participants were followed for a median of 15.9 years. Cox proportional hazards models were used to estimate the relative mortality hazards associated with deciles of baseline platelet count and of the mean of baseline + year 3 platelet count. Low and high deciles of both baseline and mean platelet count were positively associated with total mortality, CHD mortality, cancer mortality, and non-CHD/noncancer mortality. The association was robust and was not affected by adjustment for a number of potential confounding factors, exclusion of women with comorbidity, or allowance for reverse causality. Low- and high-platelet counts were associated with all four outcomes in never smokers, former smokers, and current smokers. In this large study of postmenopausal women, both low- and high-platelet counts were associated with total and cause-specific mortality. Copyright © 2017 Elsevier Inc. All rights reserved.

  11. Population trends, survival, and sampling methodologies for a population of Rana draytonii

    USGS Publications Warehouse

    Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A.W.; Halstead, Brian J.

    2017-01-01

    Estimating population trends provides valuable information for resource managers, but monitoring programs face trade-offs between the quality and quantity of information gained and the number of sites surveyed. We compared the effectiveness of monitoring techniques for estimating population trends of Rana draytonii (California Red-legged Frog) at Point Reyes National Seashore, California, USA, over a 13-yr period. Our primary goals were to: 1) estimate trends for a focal pond at Point Reyes National Seashore, and 2) evaluate whether egg mass counts could reliably estimate an index of abundance relative to more-intensive capture–mark–recapture methods. Capture–mark–recapture (CMR) surveys of males indicated a stable population from 2005 to 2009, despite low annual apparent survival (26.3%). Egg mass counts from 2000 to 2012 indicated that despite some large fluctuations, the breeding female population was generally stable or increasing, with annual abundance varying between 26 and 130 individuals. Minor modifications to egg mass counts, such as marking egg masses, can allow estimation of egg mass detection probabilities necessary to convert counts to abundance estimates, even when closure of egg mass abundance cannot be assumed within a breeding season. High egg mass detection probabilities (mean per-survey detection probability = 0.98 [0.89–0.99]) indicate that egg mass surveys can be an efficient and reliable method for monitoring population trends of federally threatened R. draytonii. Combining egg mass surveys to estimate trends at many sites with CMR methods to evaluate factors affecting adult survival at focal populations is likely a profitable path forward to enhance understanding and conservation of R. draytonii.
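
    A minimal sketch of the count-to-abundance conversion implied above (the detection probability and count are made-up numbers, not the study's data): with an estimated per-survey detection probability p, a raw count C converts to an abundance estimate of roughly C / p.

        # Hypothetical numbers for illustration only.
        egg_mass_count = 95          # egg masses counted in one survey
        detection_prob = 0.98        # estimated per-survey detection probability

        abundance_estimate = egg_mass_count / detection_prob
        print(f"estimated egg masses: {abundance_estimate:.1f}")   # ~96.9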

  12. Large-scale femtoliter droplet array for digital counting of single biomolecules.

    PubMed

    Kim, Soo Hyeon; Iwai, Shino; Araki, Suguru; Sakakihara, Shouichi; Iino, Ryota; Noji, Hiroyuki

    2012-12-07

    We present a novel device employing one million femtoliter droplets immobilized on a substrate for the quantitative detection of extremely low concentrations of biomolecules in a sample. Surface-modified polystyrene beads carrying either zero or a single biomolecule-reporter enzyme complex are efficiently isolated into femtoliter droplets formed on hydrophilic-in-hydrophobic surfaces. Using a conventional micropipette, this is achieved by sequential injection first with an aqueous solution containing beads, and then with fluorinated oil. The concentration of target biomolecules is estimated from the ratio of the number of signal-emitting droplets to the total number of trapped beads (digital counting). The performance of our digital counting device was demonstrated by detecting a streptavidin-β-galactosidase conjugate with a limit of detection (LOD) of 10 zM. The sensitivity of our device was >20-fold higher than that noted in previous studies where a smaller number of reactors (fifty thousand reactors) were used. Such a low LOD was achieved because of the large number of droplets in an array, allowing simultaneous examination of a large number of beads. When combined with bead-based enzyme-linked immunosorbent assay (digital ELISA), the LOD for the detection of prostate specific antigen reached 2 aM. This value, again, was improved over that noted in a previous study, because of the decreased coefficient of variance of the background measurement determined by the Poisson noise. Our digital counting device using one million droplets has great potential as a highly sensitive, portable immunoassay device that could be used to diagnose diseases.
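
    The ratio-based readout lends itself to the standard digital-assay Poisson correction; the sketch below is that textbook calculation with invented numbers, not the paper's own analysis pipeline.

        import math

        n_beads = 1_000_000      # total trapped beads examined (illustrative)
        n_on = 1200              # droplets emitting enzyme signal (illustrative)

        f_on = n_on / n_beads
        # Poisson occupancy: P(bead carries >= 1 target) = 1 - exp(-lam)
        lam = -math.log(1.0 - f_on)
        print(f"on-fraction = {f_on:.4%}, mean targets per bead = {lam:.6f}")
        # Converting lam to a molar concentration additionally requires the sample
        # volume interrogated per bead and Avogadro's number.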

  13. High linearity SPAD and TDC array for TCSPC and 3D ranging applications

    NASA Astrophysics Data System (ADS)

    Villa, Federica; Lussana, Rudi; Bronzi, Danilo; Dalla Mora, Alberto; Contini, Davide; Tisa, Simone; Tosi, Alberto; Zappa, Franco

    2015-01-01

    An array of 32x32 Single-Photon Avalanche Diodes (SPADs) and Time-to-Digital Converters (TDCs) has been fabricated in a 0.35 μm automotive-certified CMOS technology. The overall dimension of the chip is 9x9 mm2. Each pixel is able to detect photons in the 300 nm - 900 nm wavelength range with a fill-factor of 3.14% and either to count them or to time-stamp their arrival. In photon-counting mode an in-pixel 6-bit counter provides photon-number-resolved intensity movies at 100 kfps, whereas in photon-timing mode the 10-bit in-pixel TDC provides time-resolved maps (Time-Correlated Single-Photon Counting measurements) or 3D depth-resolved (through the direct time-of-flight technique) images and movies, with 312 ps resolution. The photodetector is a 30 μm diameter SPAD with low Dark Count Rate (120 cps at room temperature, 3% hot-pixels) and 55% peak Photon Detection Efficiency (PDE) at 450 nm. The TDC has a 6-bit counter and a 4-bit fine interpolator, based on a Delay-Locked Loop (DLL) line, which makes the TDC insensitive to process, voltage, and temperature drifts. The implemented sliding-scale technique improves linearity, giving 2% LSB DNL and 10% LSB INL. The single-shot precision is 260 ps rms, comprising SPAD, TDC and driving board jitter. Both optical and electrical crosstalk among SPADs and TDCs are negligible. 2D fast movies and 3D reconstructions with centimeter resolution are reported.

  14. Susceptibility constants of airborne bacteria to dielectric barrier discharge for antibacterial performance evaluation.

    PubMed

    Park, Chul Woo; Hwang, Jungho

    2013-01-15

    Dielectric barrier discharge (DBD) is a promising method to remove contaminant bioaerosols. The collection efficiency of a DBD reactor is an important factor in determining a reactor's removal efficiency. Without accounting for collection, simply defining the inactivation efficiency from colony counts with the DBD switched on and off may lead to overestimation of the inactivation efficiency of the DBD reactor. One-pass removal tests of bioaerosols were carried out to deduce the inactivation efficiency of the DBD reactor using both aerosol- and colony-counting methods. Our DBD reactor showed good performance for removing test bioaerosols at an applied voltage of 7.5 kV and a residence time of 0.24 s, with η(CFU), η(Number), and η(Inactivation) values of 94%, 64%, and 83%, respectively. Additionally, we introduce the susceptibility constant of bioaerosols to DBD as a quantitative parameter for the performance evaluation of a DBD reactor. The modified susceptibility constant, which is the ratio of the susceptibility constant to the volume of the plasma reactor, has been successfully demonstrated for the performance evaluation of different-sized DBD reactors under different DBD operating conditions. Our methodology will be used for design optimization, performance evaluation, and prediction of power consumption of DBD for industrial applications. Copyright © 2012 Elsevier B.V. All rights reserved.

  15. Efficiency of aerosol collection on wires exposed in the stratosphere

    NASA Technical Reports Server (NTRS)

    Lem, H. Y.; Farlow, N. H.

    1979-01-01

    The theory of inertial impaction is briefly presented. Stratospheric aerosol research experiments were performed duplicating the experiments of Wong et al. The use of the curve of inertial parameters vs particle collection efficiency, derived from Wong et al., was found to be justified. The results show that stratospheric aerosol particles of all sizes are collectible by the wire impaction technique. Curves and tables are presented and used to correct particle counts for collection efficiencies less than 100%.

  16. High resolution gamma-ray spectroscopy at high count rates with a prototype High Purity Germanium detector

    NASA Astrophysics Data System (ADS)

    Cooper, R. J.; Amman, M.; Vetter, K.

    2018-04-01

    High-resolution gamma-ray spectrometers are required for applications in nuclear safeguards, emergency response, and fundamental nuclear physics. To overcome one of the shortcomings of conventional High Purity Germanium (HPGe) detectors, we have developed a prototype device capable of achieving high event throughput and high energy resolution at very high count rates. This device, the design of which we have previously reported on, features a planar HPGe crystal with a reduced-capacitance strip electrode geometry. This design is intended to provide good energy resolution at the short shaping or digital filter times that are required for high rate operation and which are enabled by the fast charge collection afforded by the planar geometry crystal. In this work, we report on the initial performance of the system at count rates up to and including two million counts per second.

  17. [The effect of selected antibiotics on microorganisms contaminating boar ejaculate].

    PubMed

    Mazurová, J; Vinter, P

    1991-04-01

    The occurrence of microorganisms, including their total counts in boar native ejaculates, was investigated in two stages; the objective of this investigation was also to determine contamination after the sperm was treated with diluents containing the antibiotics ampicillin, gentamycin, apramycin, cefoxitin, or the antibiotic combinations penicillin + streptomycin, ampicillin + cefoxitin, gentamycin + cefoxitin and ampicillin + gentamycin. The representation of bacterial species and total counts of microbes in 1 ml of diluted sperm stored at a temperature of about 18 degrees C were determined 24, 48 and 72 h after dilution. Microorganisms were cultivated from all native ejaculates. Proteus sp. (63.3%) and Pseudomonas aeruginosa (51.5% of the total number of examined samples) were the most frequent species. The number of contaminated diluted ejaculates ranged from 12.5 to 95.8% at 24 h after dilution, from 12.5 to 98.5% at 48 h and from 16.8 to 95.8% of the total number of examined ejaculates at 72 h. The occurrence of microorganisms correlated mostly with the efficiency spectrum of the antibiotics or their combinations. The average counts of microorganisms in 1 ml of native ejaculate were 2,363,000 in stage I and 1,472,108 in stage II. The highest average counts in 1 ml of diluted sperm were found in ejaculates containing cefoxitin and apramycin. Gentamycin was the most effective antibiotic used as a sole component (average counts of microorganisms per 1 ml were 416 at 24 h, 955 at 48 h and 2260 at 72 h after dilution); ampicillin and gentamycin were the most efficient combination (average counts of 14, 20 and 21, respectively). This combination also exerted very good effects on Proteus sp. and Pseudomonas aeruginosa.

  18. Wedge-shaped microfluidic chip for circulating tumor cells isolation and its clinical significance in gastric cancer.

    PubMed

    Yang, Chaogang; Zhang, Nangang; Wang, Shuyi; Shi, Dongdong; Zhang, Chunxiao; Liu, Kan; Xiong, Bin

    2018-05-23

    Circulating tumor cells (CTCs) have great potential in both basic research and clinical application for the management of cancer. However, the complicated fabrication processes and expensive materials of existing CTC isolation devices, to a large extent, limit their clinical translation and the clinical value of CTCs. Therefore, there remains an urgent need to develop a new platform that achieves CTC detection with low cost and mass producibility but high performance. In the present study, we introduce a novel wedge-shaped microfluidic chip (named CTC-ΔChip) fabricated from two pieces of glass through wet etching and a thermal bonding technique for CTC isolation, which achieves CTC enrichment by size, without relying on cell surface expression markers, and CTC identification with a three-color immunocytochemistry method (CK+/CD45-/Nucleus+). We validated the feasibility of the CTC-ΔChip for detecting CTCs from different types of solid tumor. Furthermore, we applied the newly developed platform to investigate the clinical significance of CTCs in gastric cancer (GC). Owing to its "label-free" characteristic, the capture efficiency of the CTC-ΔChip can be as high as 93.7 ± 3.2% in DMEM and 91.0 ± 3.0% in whole blood samples under optimized conditions. Clinically, the CTC-ΔChip demonstrated the feasibility of detecting CTCs from different types of solid tumor, and it identified 7.30 ± 7.29 CTCs from 2 mL of peripheral blood with a positive rate of 75% (30/40) in GC patients. Interestingly, we found that the GC CTC count was significantly correlated with multiple systemic inflammation indexes, including the lymphocyte count, platelet count, neutrophil-to-lymphocyte ratio and platelet-to-lymphocyte ratio. In addition, we also found that both the positivity rate and the CTC count were significantly associated with multiple clinicopathology parameters. Our novel CTC-ΔChip shows high performance for detecting CTCs from small volumes of blood from cancer patients and has important clinical significance in GC. Owing to its advantages of low cost and mass producibility, the CTC-ΔChip holds great potential for clinical application in cancer therapeutic guidance and prognostic monitoring in the future.

  19. A statistical treatment of bioassay pour fractions

    NASA Astrophysics Data System (ADS)

    Barengoltz, Jack; Hughes, David

    A bioassay is a method for estimating the number of bacterial spores on a spacecraft surface for the purpose of demonstrating compliance with planetary protection (PP) requirements (Ref. 1). The details of the process may be seen in the appropriate PP document (e.g., for NASA, Ref. 2). In general, the surface is mechanically sampled with a damp sterile swab or wipe. The completion of the process is colony formation in a growth medium in a plate (Petri dish); the colonies are counted. Consider, for simplicity, a set of samples from randomly selected, known areas of one spacecraft surface. One may calculate the mean and standard deviation of the bioburden density, which is the ratio of counts to area sampled. The standard deviation represents an estimate of the variation from place to place of the true bioburden density commingled with the precision of the individual sample counts. The accuracy of individual sample results depends on the equipment used, the collection method, and the culturing method. One aspect that greatly influences the result is the pour fraction, which is the quantity of fluid added to the plates divided by the total fluid used in extracting spores from the sampling equipment. In analyzing a single sample's counts with respect to the pour fraction, one seeks to answer the question: if a certain number of spores is counted with a known pour fraction, what is the probability that there is an additional number of spores in the part of the rinse not poured? This is given for specific values by the binomial distribution density, where detection (of culturable spores) is success and the probability of success is the pour fraction. A special summation over the binomial distribution, equivalent to adding over all possible values of the true total number of spores, is performed. This distribution, when normalized, will almost yield the desired quantity: the probability that the additional number of spores does not exceed a certain value. Of course, for a desired value of uncertainty, one must invert the calculation. However, this probability of finding exactly the number of spores in the poured part is correct only in the case where all values of the true number of spores greater than or equal to the adjusted count are equally probable. This is not realistic, of course, but the result can only overestimate the uncertainty, so it is useful. In probability terms, one has the conditional probability given any true total number of spores. Therefore one must multiply it by the probability of each possible true count before the summation. If the counts for a sample set (of which this is one sample) are available, one may use the calculated variance and the normal probability distribution. In this approach, one assumes a normal distribution and neglects the contribution from spatial variation. The former is a common assumption. The latter can only add to the conservatism (overestimate the number of spores at some level of confidence). A more straightforward approach is to assume a Poisson probability distribution for the measured total sample set counts, and use the product of the number of samples and the mean number of counts per sample as the mean of the Poisson distribution. It is necessary to set the total count to 1 in the Poisson distribution when the actual total count is zero.
Finally, even when the planetary protection requirements for spore burden refer only to the mean values, they require an adjustment for pour fraction and method efficiency (a PP specification based on independent data). The adjusted mean values are a 50/50 proposition (e.g., the probability of the true total counts in the sample set exceeding the estimate is 0.50). However, this is highly unconservative when the total counts are zero. No adjustment to the mean values occurs for either pour fraction or efficiency. The recommended approach is once again to set the total counts to 1, but now applied to the mean values. Then one may apply the corrections to the revised counts. It can be shown by the methods developed in this work that this change is usually conservative enough to increase the level of confidence in the estimate to 0.5. 1. NASA. (2005) Planetary protection provisions for robotic extraterrestrial missions. NPR 8020.12C, April 2005, National Aeronautics and Space Administration, Washington, DC. 2. NASA. (2010) Handbook for the Microbiological Examination of Space Hardware, NASA-HDBK-6022, National Aeronautics and Space Administration, Washington, DC.
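
    A small sketch of the flat-prior pour-fraction calculation outlined in this abstract (illustrative code, not the authors' own): given k colonies counted and pour fraction p, it sums binomial weights over candidate true totals to get the probability that the un-poured portion holds at most m additional spores.

        from math import comb

        def prob_additional_at_most(m, k, p, n_extra_max=10000):
            """P(no more than m extra spores in the un-poured rinse | k counted, pour fraction p)."""
            def weight(extra):
                n = k + extra          # candidate true total number of spores
                # Binomial density: "success" = a spore lands in the poured fraction.
                return comb(n, k) * (p ** k) * ((1.0 - p) ** extra)
            total = sum(weight(e) for e in range(n_extra_max))   # normalization over all totals
            head = sum(weight(e) for e in range(m + 1))          # totals with at most m extras
            return head / total

        # Example: 3 colonies counted with 80% of the rinse poured.
        print(prob_additional_at_most(2, k=3, p=0.8))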

  20. Use of RNA amplification and electrophoresis for studying virus aerosol collection efficiency and their comparison with plaque assays.

    PubMed

    Jiang, Xiao; Pan, Maohua; Hering, Susanne V; Lednicky, John A; Wu, Chang-Yu; Fan, Z Hugh

    2016-10-01

    The spread of virus-induced infectious diseases through airborne routes of transmission is a global concern for economic and medical reasons. To study virus transmission, it is essential to have an effective aerosol collector such as the growth tube collector (GTC) system, which utilizes water-based condensation for collecting virus-containing aerosols. In this work, we characterized the GTC system using bacteriophage MS2 as a surrogate for a small RNA virus. We investigated using RNA extraction and reverse transcription-polymerase chain reaction (RT-PCR) to study the total virus collection efficiency of the GTC system. Plaque assays were also used to enumerate viable viruses collected by the GTC system compared with those collected by a commercially available apparatus, the SKC® Biosampler. The plaque assay counts were used to enumerate viable viruses, whereas RT-PCR provides a total virus count, including those viruses inactivated during collection. The effects of relative humidity (RH) and other conditions on collection efficiency were also investigated. Our results suggest that the GTC has a collection efficiency for viable viruses between 0.24 and 1.8% and a total virus collection efficiency between 18.3 and 79.0%, which is 1-2 orders of magnitude higher than that of the SKC® Biosampler. Moreover, higher RH significantly increases both the viable and total collection efficiency of the GTC, while its effect on the collection efficiency of the SKC® Biosampler is not significant. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Composting Explosives/Organics Contaminated Soils

    DTIC Science & Technology

    1986-05-01

    Report excerpt (fragments from the table of contents and text): Quantitation of 14C Trapped by Activated Carbon; Preliminary Extraction Trials; Tetryl Product ... ppm (standard deviation 1892 ppm). All samples of soil from Letterkenny AD were pooled to yield one composite sample. Pooled samples from Louisiana ... combustion efficiency, and counting efficiency. Quantitation of 14C Trapped by Activated Carbon: random subsamples of carbon from the air intake ...

  2. Investigation of a functional role for Titin in the bovine ovary based on the results of an initial whole genome scan for antral follicle count

    USDA-ARS?s Scientific Manuscript database

    A world-wide food shortage is predicted by the year 2050, and biotechnologies are needed to improve production efficiency in agriculture. Biotechnologies that improve reproductive efficiency in domestic farm species will improve the availability and price of food for the growing world population. ...

  3. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
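
    A much-simplified, hedged illustration of threshold-and-label particle counting with an area filter (synthetic image and invented size window; not the open-access image-analysis workflow actually used in the study):

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(2)
        image = rng.normal(200.0, 5.0, size=(512, 512))      # bright background stand-in
        for _ in range(25):                                   # paint 25 dark "parasites"
            r, c = rng.integers(10, 500, size=2)
            image[r:r + 4, c:c + 4] = 60.0

        mask = image < 120.0                                  # stained objects are darker
        labels, n_objects = ndimage.label(mask)
        areas = ndimage.sum(mask, labels, index=range(1, n_objects + 1))

        # Size window taken from the parasite size-frequency distribution (values invented).
        n_parasites = int(np.sum((areas >= 6) & (areas <= 100)))
        print(n_objects, n_parasites)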

  4. Clustering method for counting passengers getting in a bus with single camera

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Zhang, Yanning; Shao, Dapei; Li, Ying

    2010-03-01

    Automatic counting of passengers is very important for both business and security applications. We present a single-camera-based vision system that is able to count passengers in a highly crowded situation at the entrance of a traffic bus. The unique characteristics of the proposed system are as follows. First, a novel feature-point-tracking and online-clustering-based passenger counting framework, which performs much better than background-modeling and foreground-blob-tracking-based methods. Second, a simple and highly accurate clustering algorithm is developed that projects the high-dimensional feature point trajectories into a 2-D feature space by their appearance and disappearance times and counts the number of people through online clustering. Finally, all test video sequences in the experiment were captured from a real traffic bus in Shanghai, China. The results show that the system can process two 320×240 video sequences simultaneously at a frame rate of 25 fps, and can count passengers reliably in various difficult scenarios with complex interaction and occlusion among people. The method achieves accuracy rates of up to 96.5%.
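
    A toy sketch of the projection-and-clustering idea (hypothetical data and thresholds; not the authors' algorithm): each feature-point trajectory is reduced to its appearance and disappearance times, and nearby points in that 2-D space are merged online into one cluster per passenger.

        import numpy as np

        def count_passengers(trajectories, radius=0.5):
            """trajectories: iterable of (appear_time, disappear_time) pairs in seconds."""
            clusters = []                                # each cluster: [t_appear, t_disappear, size]
            for t_a, t_d in trajectories:
                point = np.array([t_a, t_d])
                for c in clusters:
                    if np.linalg.norm(point - np.array(c[:2])) < radius:
                        n = c[2] + 1                     # fold the point into the centroid
                        c[0] = (c[0] * c[2] + t_a) / n
                        c[1] = (c[1] * c[2] + t_d) / n
                        c[2] = n
                        break
                else:
                    clusters.append([t_a, t_d, 1])       # start a new cluster
            return len(clusters)

        # Two passengers, several tracked feature points each (made-up times).
        points = [(1.0, 3.1), (1.1, 3.0), (0.9, 3.2), (4.0, 6.1), (4.2, 6.0)]
        print(count_passengers(points))                  # -> 2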

  5. Comparison of Drive Counts and Mark-Resight As Methods of Population Size Estimation of Highly Dense Sika Deer (Cervus nippon) Populations

    PubMed Central

    Takeshita, Kazutaka; Yoshida, Tsuyoshi; Igota, Hiromasa; Matsuura, Yukiko

    2016-01-01

    Assessing temporal changes in abundance indices is an important issue in the management of large herbivore populations. The drive counts method has been frequently used as a deer abundance index in mountainous regions. However, despite an inherent risk for observation errors in drive counts, which increase with deer density, evaluations of the utility of drive counts at a high deer density remain scarce. We compared the drive counts and mark-resight (MR) methods in the evaluation of a highly dense sika deer population (MR estimates ranged between 11 and 53 individuals/km2) on Nakanoshima Island, Hokkaido, Japan, between 1999 and 2006. This deer population experienced two large reductions in density; approximately 200 animals in total were taken from the population through a large-scale population removal and a separate winter mass mortality event. Although the drive counts tracked temporal changes in deer abundance on the island, they overestimated the counts for all years in comparison to the MR method. Increased overestimation in drive count estimates after the winter mass mortality event may be due to a double count derived from increased deer movement and recovery of body condition secondary to the mitigation of density-dependent food limitations. Drive counts are unreliable because they are affected by unfavorable factors such as bad weather, and they are cost-prohibitive to repeat, which precludes the calculation of confidence intervals. Therefore, the use of drive counts to infer the deer abundance needs to be reconsidered. PMID:27711181

  6. Low CD1c + myeloid dendritic cell counts correlated with a high risk of rapid disease progression during early HIV-1 infection.

    PubMed

    Diao, Yingying; Geng, Wenqing; Fan, Xuejie; Cui, Hualu; Sun, Hong; Jiang, Yongjun; Wang, Yanan; Sun, Amy; Shang, Hong

    2015-08-19

    During early HIV-1 infection (EHI), the interaction between the immune response and the virus determines disease progression. Although CD1c + myeloid dendritic cells (mDCs) can trigger the immune response, the relationship between CD1c + mDC alteration and disease progression has not yet been defined. Changes in CD1c + mDC counts, surface marker (CD40, CD86, CD83) expression, and IL-12 secretion during EHI were assessed by flow cytometry in 29 patients. When compared with normal controls, patients with EHI displayed significantly lower CD1c + mDC counts and IL-12 secretion and increased surface marker expression. CD1c + mDC counts were positively correlated with CD4+ T cell counts and inversely associated with viral loads. IL-12 secretion was only positively associated with CD4+ T cell counts. Rapid progressors had lower CD1c + mDC counts, CD86 expression, and IL-12 secretion compared with typical progressors. Kaplan-Meier analysis and Cox regression models suggested that patients with low CD1c + mDC counts (<10 cells/μL) had a 4-fold higher risk of rapid disease progression than those with high CD1c + mDC counts. However, no relationship was found between surface markers or IL-12 secretion and disease progression. During EHI, patients with low CD1c + mDC counts were more likely to experience rapid disease progression than those with high CD1c + mDC counts.

  7. Triple-Label β Liquid Scintillation Counting

    PubMed Central

    Bukowski, Thomas R.; Moffett, Tyler C.; Revkin, James H.; Ploger, James D.; Bassingthwaighte, James B.

    2010-01-01

    The detection of radioactive compounds by liquid scintillation has revolutionized modern biology, yet few investigators make full use of the power of this technique. Even though multiple isotope counting is considerably more difficult than single isotope counting, many experimental designs would benefit from using more than one isotope. The development of accurate isotope counting techniques enabling the simultaneous use of three β-emitting tracers has facilitated studies in our laboratory using the multiple tracer indicator dilution technique for assessing rates of transmembrane transport and cellular metabolism. The details of sample preparation, and of stabilizing the liquid scintillation spectra of the tracers, are critical to obtaining good accuracy. Reproducibility is enhanced by obtaining detailed efficiency/quench curves for each particular set of tracers and solvent media. The numerical methods for multiple-isotope quantitation depend on avoiding error propagation (inherent to successive subtraction techniques) by using matrix inversion. Experimental data obtained from triple-label β counting illustrate reproducibility and good accuracy even when the relative amounts of different tracers in samples of protein/electrolyte solutions, plasma, and blood are changed. PMID:1514684
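
    A minimal sketch of the matrix-inversion step described above, with invented efficiencies and counts (the real procedure builds the efficiency matrix from measured quench curves for each tracer and counting window):

        import numpy as np

        # Rows = counting windows A, B, C; columns = tracers 1, 2, 3.
        # Entries are counting efficiencies (hypothetical values).
        E = np.array([
            [0.30, 0.05, 0.01],
            [0.02, 0.45, 0.10],
            [0.00, 0.03, 0.70],
        ])

        counts = np.array([1500.0, 5200.0, 9100.0])   # background-corrected CPM per window

        # Solve  E @ activities = counts  directly instead of by successive subtraction.
        activities = np.linalg.solve(E, counts)        # disintegrations per minute per tracer
        print(activities)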

  8. A unified genetic association test robust to latent population structure for a count phenotype.

    PubMed

    Song, Minsun

    2018-06-04

    Confounding caused by latent population structure in genome-wide association studies has been a big concern despite the success of genome-wide association studies at identifying genetic variants associated with complex diseases. In particular, because of the growing interest in association mapping using count phenotype data, it would be interesting to develop a testing framework for genetic associations that is immune to population structure when phenotype data consist of count measurements. Here, I propose a solution for testing associations between single nucleotide polymorphisms and a count phenotype in the presence of an arbitrary population structure. I consider a classical range of models for count phenotype data. Under these models, a unified test for genetic associations that protects against confounding was derived. An algorithm was developed to efficiently estimate the parameters that are required to fit the proposed model. I illustrate the proposed approach using simulation studies and an empirical study. Both simulated and real-data examples suggest that the proposed method successfully corrects population structure. Copyright © 2018 John Wiley & Sons, Ltd.
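
    For orientation only, the sketch below fits a conventional Poisson regression of a count phenotype on genotype with principal components as covariates, a common baseline adjustment; it is not the unified, structure-robust test proposed in the paper. All data are simulated.

        import numpy as np
        import statsmodels.api as sm

        rng = np.random.default_rng(3)
        n = 2000
        genotype = rng.integers(0, 3, size=n).astype(float)    # minor-allele counts 0/1/2
        pcs = rng.normal(size=(n, 2))                           # stand-in ancestry covariates
        eta = 0.1 + 0.15 * genotype + pcs @ np.array([0.3, -0.2])
        y = rng.poisson(np.exp(eta))                            # simulated count phenotype

        X = sm.add_constant(np.column_stack([genotype, pcs]))
        fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
        print(fit.params[1], fit.pvalues[1])                    # genotype effect and its p-value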

  9. Rapid and automated enumeration of viable bacteria in compost using a micro-colony auto counting system.

    PubMed

    Wang, Xiaodan; Yamaguchi, Nobuyasu; Someya, Takashi; Nasu, Masao

    2007-10-01

    The micro-colony method was used to enumerate viable bacteria in composts. Cells were vacuum-filtered onto polycarbonate filters and incubated for 18 h on LB medium at 37 degrees C. Bacteria on the filters were stained with SYBR Green II, and enumerated using a newly developed micro-colony auto counting system which can automatically count micro-colonies on half the area of the filter within 90 s. A large number of bacteria in samples retained physiological activity and formed micro-colonies within 18 h, whereas most could not form large colonies on conventional media within 1 week. The results showed that this convenient technique can enumerate viable bacteria in compost rapidly for its efficient quality control.

  10. SOI metal-oxide-semiconductor field-effect transistor photon detector based on single-hole counting.

    PubMed

    Du, Wei; Inokawa, Hiroshi; Satoh, Hiroaki; Ono, Atsushi

    2011-08-01

    In this Letter, a scaled-down silicon-on-insulator (SOI) metal-oxide-semiconductor field-effect transistor (MOSFET) is characterized as a photon detector, where photogenerated individual holes are trapped below the negatively biased gate and modulate stepwise the electron current flowing in the bottom channel induced by the positive substrate bias. The output waveforms exhibit clear separation of current levels corresponding to different numbers of trapped holes. Considering this capability of single-hole counting, a small dark count rate of less than 0.02 per second at room temperature, and a low operation voltage of 1 V, the SOI MOSFET could be a unique photon-number-resolving detector if the small quantum efficiency were improved. © 2011 Optical Society of America

  11. Marine phages as excellent tracers for reactive colloidal transport in porous media

    NASA Astrophysics Data System (ADS)

    Ghanem, Nawras; Chatzinotas, Antonis; Harms, Hauke; Wick, Lukas Y.

    2016-04-01

    Question: Here we evaluate marine phages as specific markers of hydrological flow and reactive transport of colloidal particles in the Earth's critical zone (CZ). Marine phages and their bacterial hosts are naturally absent from the CZ and can be detected with extremely high sensitivity. In the framework of the DFG Collaborative Research Center AquaDiva, we asked the following questions: (1) Are marine phages useful specific markers of hydrological flow and reactive transport in porous media? and (2) Which phage properties are relevant drivers for the transport of marine phages in porous media? Methods: Seven marine phages from different families (as well as two commonly used terrestrial phages) were selected based on their morphology, size and physico-chemical surface properties (surface charge and hydrophobicity). Phage properties were assessed by electron microscopy, dynamic light scattering and water contact angle analysis (CA). Sand-filled laboratory percolation columns were used to study transport. The breakthrough curves of the phages were analyzed using clean bed filtration theory and the XDLVO theory of colloid stability, respectively. Phages were quantified by a modified high-throughput plaque assay and a culture-independent particle counting approach. Results: Our data show that most of the tested marine phages exhibited highly variable transport rates and deposition efficiencies, yet generally high colloidal stability and viability. We find that size, morphology and hydrophobicity are key factors shaping the transport efficiency of phages. The differing deposition efficiencies of the phages were also supported by the calculated XDLVO interaction energy profiles. Conclusion: Marine phages have a high potential for use as sensitive tracers in terrestrial habitats, with their surface properties playing a crucial role in their transport. Marine phages, however, exhibit differences in their deposition efficiency depending on their morphology, hydrophobicity and availability.

  12. Performance Evaluation of High Fluorescence Lymphocyte Count: Comparability to Atypical Lymphocyte Count and Clinical Significance.

    PubMed

    Tantanate, Chaicharoen; Klinbua, Cherdsak

    2018-06-15

    To investigate the association between high-fluorescence lymphocyte cell (HFLC) and atypical lymphocyte (AL) counts, and to determine the clinical significance of HFLC. We compared automated HFLC and microscopic AL counts and analyzed the findings. Patient clinical data for each specimen were reviewed. A total of 320 blood specimens were included. The correlation between HFLC and microscopic AL counts was 0.865 and 0.893 for absolute and percentage counts, respectively. The sensitivity, specificity, and accuracy of HFLC at a cutoff value of 0.1 × 10^9 per L for detection of AL were 0.8, 0.77, and 0.8, respectively. The studied patients were classified into 4 groups: infection, immunological disorders, malignant neoplasms, and others. Patients with infections had the highest HFLC. Most of those patients (67.7%) had dengue infection. HFLC counts were well correlated with AL counts, with acceptable test characteristics. Applying HFLC flagging may alert laboratory staff to be aware of ALs.
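
    For clarity, the sketch below shows how sensitivity, specificity, and accuracy at a count cutoff are computed from a 2x2 confusion table (the counts are invented and do not reproduce the study's data).

        # Hypothetical 2x2 table: automated HFLC flag vs. microscopic atypical lymphocytes.
        tp, fp, fn, tn = 80, 46, 20, 154     # invented counts summing to 300

        sensitivity = tp / (tp + fn)         # flagged among truly AL-positive specimens
        specificity = tn / (tn + fp)         # not flagged among AL-negative specimens
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        print(round(sensitivity, 2), round(specificity, 2), round(accuracy, 2))  # 0.8 0.77 0.78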

  13. HgCdTe APD-based linear-mode photon counting components and ladar receivers

    NASA Astrophysics Data System (ADS)

    Jack, Michael; Wehner, Justin; Edwards, John; Chapman, George; Hall, Donald N. B.; Jacobson, Shane M.

    2011-05-01

    Linear mode photon counting (LMPC) provides significant advantages in comparison with Geiger Mode (GM) photon counting, including the absence of after-pulsing, nanosecond pulse-to-pulse temporal resolution, and robust operation in the presence of high-density obscurants or variable-reflectivity objects. For this reason Raytheon has developed and previously reported on unique linear mode photon counting components and modules based on combining advanced APDs and advanced high-gain circuits. By using HgCdTe APDs we enable Poisson-number-preserving photon counting. Metrics of photon counting technology are dark count rate and detection probability. In this paper we report on a performance breakthrough resulting from improvements in design, process and readout operation enabling a >10x reduction in dark count rate to ~10,000 cps and a >10^4x reduction in surface dark current, enabling long 10 ms integration times. Our analysis of key dark current contributors suggests that a substantial further reduction in DCR to ~1/sec or less can be achieved by optimizing wavelength, operating voltage and temperature.

  14. 3D Silicon Coincidence Avalanche Detector (3D-SiCAD) for charged particle detection

    NASA Astrophysics Data System (ADS)

    Vignetti, M. M.; Calmon, F.; Pittet, P.; Pares, G.; Cellier, R.; Quiquerez, L.; Chaves de Albuquerque, T.; Bechetoille, E.; Testa, E.; Lopez, J.-P.; Dauvergne, D.; Savoy-Navarro, A.

    2018-02-01

    Single-Photon Avalanche Diodes (SPADs) are p-n junctions operated in Geiger mode by applying a reverse bias above the breakdown voltage. SPADs have the advantage of featuring single-photon sensitivity with timing resolution in the picosecond range. Nevertheless, their relatively high Dark Count Rate (DCR) is a major issue for charged particle detection, especially when it is much higher than the incoming particle rate. To tackle this issue, we have developed a 3D Silicon Coincidence Avalanche Detector (3D-SiCAD). This novel device implements two vertically aligned SPADs featuring on-chip electronics for the detection of coincident avalanche events occurring on both SPADs. Such a coincidence detection mode allows an efficient discrimination of events related to an incoming charged particle (which produces a quasi-simultaneous activation of both SPADs) from dark counts occurring independently on each SPAD. A 3D-SiCAD detector prototype has been fabricated in CMOS technology adopting a 3D flip-chip integration technique, and the main results of its characterization are reported in this work. The particle detection efficiency and noise rejection capability of this novel device have been evaluated by means of a β⁻ strontium-90 radioactive source. Moreover, the impact of the main operating parameters (i.e. the hold-off time, the coincidence window duration, and the SPAD excess bias voltage) on the particle detection efficiency has been studied. Measurements have been performed with different β⁻ particle rates and show that a 3D-SiCAD device outperforms single-SPAD detectors: it is capable of detecting particle rates much lower than the individual DCR observed in a single SPAD-based detector (i.e. 2 to 3 orders of magnitude lower).
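
    The advantage of the coincidence requirement can be illustrated with a toy simulation: dark counts fire each SPAD independently and only rarely fall inside the same coincidence window, while a traversing particle fires both SPADs essentially simultaneously. The sketch below uses assumed rates and an assumed window length, not the device's measured parameters.

      import numpy as np

      rng = np.random.default_rng(0)

      T = 1.0               # observation time [s]
      dcr = 50e3            # assumed dark count rate per SPAD [counts/s]
      particle_rate = 200.0 # assumed incident particle rate [Hz]
      window = 10e-9        # assumed coincidence window (+/-) [s]

      # Dark counts: independent Poisson processes on each SPAD.
      dark_top = rng.uniform(0, T, rng.poisson(dcr * T))
      dark_bot = rng.uniform(0, T, rng.poisson(dcr * T))

      # Particles fire both SPADs at (essentially) the same instant.
      hits = rng.uniform(0, T, rng.poisson(particle_rate * T))
      top = np.sort(np.concatenate([dark_top, hits]))
      bot = np.sort(np.concatenate([dark_bot, hits]))

      # A top pulse is "coincident" if some bottom pulse lies within the window.
      idx = np.searchsorted(bot, top)
      left = np.abs(top - bot[np.clip(idx - 1, 0, len(bot) - 1)])
      right = np.abs(top - bot[np.clip(idx, 0, len(bot) - 1)])
      coincidences = int(np.sum(np.minimum(left, right) <= window))

      accidentals = 2 * dcr * dcr * window * T   # rough expectation from uncorrelated dark counts
      print(f"coincidences: {coincidences}, true particles: {len(hits)}, "
            f"expected accidentals: ~{accidentals:.0f}")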

  15. MANTA--an open-source, high density electrophysiology recording suite for MATLAB.

    PubMed

    Englitz, B; David, S V; Sorenson, M D; Shamma, S A

    2013-01-01

    The distributed nature of nervous systems makes it necessary to record from a large number of sites in order to decipher the neural code, whether single cell, local field potential (LFP), micro-electrocorticograms (μECoG), electroencephalographic (EEG), magnetoencephalographic (MEG) or in vitro micro-electrode array (MEA) data are considered. High channel-count recordings also optimize the yield of a preparation and the efficiency of time invested by the researcher. Currently, data acquisition (DAQ) systems with high channel counts (>100) can be purchased from a limited number of companies at considerable prices. These systems are typically closed-source and thus prohibit custom extensions or improvements by end users. We have developed MANTA, an open-source MATLAB-based DAQ system, as an alternative to existing options. MANTA combines high channel counts (up to 1440 channels/PC), usage of analog or digital headstages, low per channel cost (<$90/channel), feature-rich display and filtering, a user-friendly interface, and a modular design permitting easy addition of new features. MANTA is licensed under the GPL and free of charge. The system has been tested by daily use in multiple setups for >1 year, recording reliably from 128 channels. It offers a growing list of features, including integrated spike sorting, PSTH and CSD display and fully customizable electrode array geometry (including 3D arrays), some of which are not available in commercial systems. MANTA runs on a typical PC and communicates via TCP/IP and can thus be easily integrated with existing stimulus generation/control systems in a lab at a fraction of the cost of commercial systems. With modern neuroscience developing rapidly, MANTA provides a flexible platform that can be rapidly adapted to the needs of new analyses and questions. Being open-source, the development of MANTA can outpace commercial solutions in functionality, while maintaining a low price-point.

  16. MANTA—an open-source, high density electrophysiology recording suite for MATLAB

    PubMed Central

    Englitz, B.; David, S. V.; Sorenson, M. D.; Shamma, S. A.

    2013-01-01

    The distributed nature of nervous systems makes it necessary to record from a large number of sites in order to decipher the neural code, whether single cell, local field potential (LFP), micro-electrocorticograms (μECoG), electroencephalographic (EEG), magnetoencephalographic (MEG) or in vitro micro-electrode array (MEA) data are considered. High channel-count recordings also optimize the yield of a preparation and the efficiency of time invested by the researcher. Currently, data acquisition (DAQ) systems with high channel counts (>100) can be purchased from a limited number of companies at considerable prices. These systems are typically closed-source and thus prohibit custom extensions or improvements by end users. We have developed MANTA, an open-source MATLAB-based DAQ system, as an alternative to existing options. MANTA combines high channel counts (up to 1440 channels/PC), usage of analog or digital headstages, low per channel cost (<$90/channel), feature-rich display and filtering, a user-friendly interface, and a modular design permitting easy addition of new features. MANTA is licensed under the GPL and free of charge. The system has been tested by daily use in multiple setups for >1 year, recording reliably from 128 channels. It offers a growing list of features, including integrated spike sorting, PSTH and CSD display and fully customizable electrode array geometry (including 3D arrays), some of which are not available in commercial systems. MANTA runs on a typical PC and communicates via TCP/IP and can thus be easily integrated with existing stimulus generation/control systems in a lab at a fraction of the cost of commercial systems. With modern neuroscience developing rapidly, MANTA provides a flexible platform that can be rapidly adapted to the needs of new analyses and questions. Being open-source, the development of MANTA can outpace commercial solutions in functionality, while maintaining a low price-point. PMID:23653593

  17. [In vitro comparison of epidural bacteria filters permeability and screening scanning electron microscopy].

    PubMed

    Sener, Aysin; Erkin, Yuksel; Sener, Alper; Tasdogen, Aydin; Dokumaci, Esra; Elar, Zahide

    2015-01-01

    Epidural catheter bacteria filters are barriers in patient-controlled analgesia/anaesthesia for preventing contamination at the epidural insertion site. The efficiency of these filters varies according to pore size and material. The bacterial adhesion capability of two filters was measured in an in vitro experiment. The adhesion capacities of two different filters (Portex and Rusch) with the same pore size were examined for standard Staphylococcus aureus (ATCC 25923) and Pseudomonas aeruginosa (ATCC 27853) strains. A 0.5 McFarland bacterial suspension was placed in the patient-controlled analgesia pump, filtered at a rate of 5 mL/h in continuous infusion for 48 h, and collected in a bottle. The two filters were compared by colony counts of bacteria from the filters and bottles. At the same time, the filters and adhered bacteria were monitored by scanning electron microscopy. Electron microscopic examination showed that the Portex filter had a granular structure and the Rusch filter a fibrillary structure. Colony counting from the catheter and bottle showed that both filters have significant bacterial adhesion capability (p<0.001). After the bacterial suspension infusion, colony counts showed that the Portex filter was more efficient (p<0.001). There was no difference between S. aureus and P. aeruginosa adhesion. In the SEM monitoring after the infusion, it was physically shown that bacteria were adhered efficiently by both filters. The granular-structured filter was found to be statistically significantly more successful than the fibrillary one. Although the pore sizes of the filters were the same, and their structural differences were shown by SEM, it would not be right to attribute the differences in efficiency to structural differences alone. Using microbiological and physical evidence of efficiency at the same time has been another important aspect of this experiment. Copyright © 2013 Sociedade Brasileira de Anestesiologia. Publicado por Elsevier Editora Ltda. All rights reserved.

  18. In vitro comparison of epidural bacteria filters permeability and screening scanning electron microscopy.

    PubMed

    Sener, Aysin; Erkin, Yuksel; Sener, Alper; Tasdogen, Aydin; Dokumaci, Esra; Elar, Zahide

    2015-01-01

    Epidural catheter bacteria filters are barriers in patient-controlled analgesia/anaesthesia for preventing contamination at the epidural insertion site. The efficiency of these filters varies according to pore size and material. The bacterial adhesion capability of two filters was measured in an in vitro experiment. The adhesion capacities of two different filters (Portex and Rusch) with the same pore size were examined for standard Staphylococcus aureus (ATCC 25923) and Pseudomonas aeruginosa (ATCC 27853) strains. A 0.5 McFarland bacterial suspension was placed in the patient-controlled analgesia pump, filtered at a rate of 5 mL/h in continuous infusion for 48 h, and collected in a bottle. The two filters were compared by colony counts of bacteria from the filters and bottles. At the same time, the filters and adhered bacteria were monitored by scanning electron microscopy. Electron microscopic examination showed that the Portex filter had a granular structure and the Rusch filter a fibrillary structure. Colony counting from the catheter and bottle showed that both filters have significant bacterial adhesion capability (p<0.001). After the bacterial suspension infusion, colony counts showed that the Portex filter was more efficient (p<0.001). There was no difference between S. aureus and P. aeruginosa adhesion. In the SEM monitoring after the infusion, it was physically shown that bacteria were adhered efficiently by both filters. The granular-structured filter was found to be statistically significantly more successful than the fibrillary one. Although the pore sizes of the filters were the same, and their structural differences were shown by SEM, it would not be right to attribute the differences in efficiency to structural differences alone. Using microbiological and physical evidence of efficiency at the same time has been another important aspect of this experiment. Copyright © 2013 Sociedade Brasileira de Anestesiologia. Published by Elsevier Editora Ltda. All rights reserved.

  19. Radiation shielding materials characterization in the MoMa-Count program and further evolutions

    NASA Astrophysics Data System (ADS)

    Lobascio, Cesare

    In the frame of the space research programme MoMa (From Molecules to Man)-Count (Countermeasures), funded by the Italian Space Agency, multi-functional protections for human space exploration have been investigated, paying particular attention to flexible materials, selected also for their excellent structural, thermal and ballistic performances. Flexible materials such as Kevlar® are qualified for space application, but have poorly known space radiation properties, with consequent uncertainties about their shielding efficiency against the radiation environment. The necessary evaluation of their shielding efficiency has been chiefly based on dedicated ground experiments in accelerators, supplemented by Monte Carlo simulations of the particle transport in the materials or multi-layers. In addition, flight experiments have been performed in Low Earth Orbit (LEO), onboard the International Space Station (ISS) and the re-entry capsule Foton, to measure the shielding behaviour in the actual operating environment of space, via dedicated detectors and dosimeters. This paper aims at presenting the results and lessons learned accrued within the MoMa-Count program, as well as the future actions planned for improving radiation shielding in long duration human exploration missions.

  20. Influence of dietary supplementation of prebiotics (mannanoligosaccharide) on the performance of crossbred calves.

    PubMed

    Ghosh, Sudipta; Mehla, Ram Kumar

    2012-03-01

    Thirty-six Holstein cross calves, 5 days of age and in their preruminant stage, were used to study the effect of feeding a prebiotic (mannanoligosaccharide) on their performance up to the age of 2 months. The treatment and control groups consisted of 18 calves each. The treatment group was supplemented with 4 g prebiotic (mannanoligosaccharide)/calf/day. Performance was evaluated by measuring average body weight (BW) gain, feed intake [dry matter (DM), total digestible nutrient (TDN) and crude protein (CP)], feed conversion efficiency (DM, TDN, and CP), fecal score, fecal coliform count and feeding cost. Body weight was measured weekly, feed intake twice daily; proximate analysis of feeds and fodders was performed weekly, fecal score was monitored daily, and fecal coliform count was done weekly. There was a significant increase in average body weight gain, feed intake and feed conversion efficiency, and a significant decrease in severity of scours as measured by fecal score and fecal coliform count, in the treatment group compared with the control group (P < 0.01). Feed cost/kg BW gain was significantly lower in the treatment group compared to the control group (P < 0.01). The results suggest that the prebiotic (mannanoligosaccharide) can be supplemented to calves for better performance.

  1. Liquid scintillation counting methodology for 99Tc analysis. A remedy for radiopharmaceutical waste

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Khan, Mumtaz; Um, Wooyong

    2015-08-13

    This paper presents a new approach for liquid scintillation counting (LSC) analysis of single-radionuclide samples containing appreciable organic or inorganic quench. This work offers better analytical results than existing LSC methods for technetium-99 (99gTc) analysis, with significant savings in analysis cost and time. The method was developed to quantify 99gTc in environmental liquid and urine samples using LSC. Method efficiency was measured in the presence of 1.9 to 11,900 ppm total dissolved solids. The quench curve proved effective for the calculation of spiked 99gTc activity in deionized water, tap water, groundwater, seawater, and urine samples. Counting efficiency was found to be 91.66% for Ultima Gold LLT (ULG-LLT) and Ultima Gold (ULG). Relative error in spiked 99gTc samples was ±3.98% in ULG and ULG-LLT cocktails. Minimum detectable activity was determined to be 25.3 mBq and 22.7 mBq for ULG-LLT and ULG cocktails, respectively. A pre-concentration factor of 1000 was achieved at 100°C with 100% chemical recovery.
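
    Minimum detectable activity values such as those quoted above are commonly obtained from the Currie formula, which combines the blank (background) counts, counting time, counting efficiency and any aliquot factor. The sketch below is a generic illustration with assumed background and counting-time values; only the 91.66% efficiency is taken from the abstract.

      import math

      def mda_bq(background_counts: float, count_time_s: float,
                 efficiency: float, aliquot_fraction: float = 1.0) -> float:
          """Currie-type MDA in Bq: L_D = 2.71 + 4.65*sqrt(B) (paired blank, 95% level)."""
          detection_limit_counts = 2.71 + 4.65 * math.sqrt(background_counts)
          return detection_limit_counts / (efficiency * count_time_s * aliquot_fraction)

      # Assumed illustration: 30 background counts in a 3600 s count, 91.66% efficiency.
      print(f"MDA ~ {mda_bq(30.0, 3600.0, 0.9166) * 1000:.1f} mBq")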

  2. Real-time yield estimation based on deep learning

    NASA Astrophysics Data System (ADS)

    Rahnemoonfar, Maryam; Sheppard, Clay

    2017-05-01

    Crop yield estimation is an important task in product management and marketing. Accurate yield prediction helps farmers make better decisions on cultivation practices, plant disease prevention, and the size of the harvest labor force. The current practice of yield estimation based on manual counting of fruits is a very time-consuming and expensive process, and it is not practical for large fields. Robotic systems, including Unmanned Aerial Vehicles (UAV) and Unmanned Ground Vehicles (UGV), provide an efficient, cost-effective, flexible, and scalable solution for product management and yield prediction. Recently, huge amounts of data have been gathered from agricultural fields; however, efficient analysis of those data is still a challenging task. Computer vision approaches currently face several challenges in the automatic counting of fruits or flowers, including occlusion caused by leaves, branches or other fruits, variance in natural illumination, and scale. In this paper a novel deep convolutional network algorithm was developed to facilitate accurate yield prediction and automatic counting of fruits and vegetables in images. Our method is robust to occlusion, shadow, uneven illumination and scale. Experimental results in comparison to the state of the art show the effectiveness of our algorithm.
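
    One generic way to cast image-based yield estimation as a learning problem is to regress a count directly from an image patch with a small convolutional network. The PyTorch sketch below illustrates that idea on dummy data; it is not the architecture or training setup used by the authors.

      import torch
      import torch.nn as nn

      class CountRegressor(nn.Module):
          """Tiny CNN mapping an RGB patch to a non-negative fruit count."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.AdaptiveAvgPool2d(1),
              )
              self.head = nn.Linear(32, 1)

          def forward(self, x):
              z = self.features(x).flatten(1)
              return torch.relu(self.head(z)).squeeze(1)   # counts are non-negative

      model = CountRegressor()
      optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
      loss_fn = nn.MSELoss()

      # Dummy batch standing in for labeled patches and ground-truth counts.
      images = torch.rand(8, 3, 64, 64)
      counts = torch.randint(0, 20, (8,)).float()

      optimizer.zero_grad()
      loss = loss_fn(model(images), counts)
      loss.backward()
      optimizer.step()
      print(f"batch loss: {loss.item():.3f}")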

  3. Mass-dependent channel electron multiplier operation. [for ion detection

    NASA Technical Reports Server (NTRS)

    Fields, S. A.; Burch, J. L.; Oran, W. A.

    1977-01-01

    The absolute counting efficiency and pulse height distributions of a continuous-channel electron multiplier used in the detection of hydrogen, argon and xenon ions are assessed. The assessment technique, which involves the post-acceleration of 8-eV ion beams to energies from 100 to 4000 eV, provides information on counting efficiency versus post-acceleration voltage characteristics over a wide range of ion mass. The charge pulse height distributions for H2 (+), A (+) and Xe (+) were measured by operating the experimental apparatus in a marginally gain-saturated mode. It was found that gain saturation occurs at lower channel multiplier operating voltages for light ions such as H2 (+) than for the heavier ions A (+) and Xe (+), suggesting that the technique may be used to discriminate between these two classes of ions in electrostatic analyzers.

  4. Adrenaline administration promotes the efficiency of granulocyte colony stimulating factor-mediated hematopoietic stem and progenitor cell mobilization in mice.

    PubMed

    Chen, Chong; Cao, Jiang; Song, Xuguang; Zeng, Lingyu; Li, Zhenyu; Li, Yong; Xu, Kailin

    2013-01-01

    A high dose of granulocyte colony stimulating factor (G-CSF) is widely used to mobilize hematopoietic stem and progenitor cells (HSPC), but G-CSF is relatively inefficient and may cause adverse effects. Recently, adrenaline has been found to play important roles in HSPC mobilization. In this study, we explored whether adrenaline combined with G-CSF could induce HSPC mobilization in a mouse model. Mice were treated with adrenaline and either a high or low dose of G-CSF alone or in combination. Peripheral blood HSPC counts were evaluated by flow cytometry. Levels of bone marrow SDF-1 were measured by ELISA, the transcription of CXCR4 and SDF-1 was measured by real-time RT-PCR, and CXCR4 protein was detected by Western blot. Our results showed that adrenaline alone fails to mobilize HSPCs into the peripheral blood; however, when G-CSF and adrenaline are combined, the WBC counts and percentages of HSPCs are significantly higher compared to those in mice that received G-CSF alone. The combined use of adrenaline and G-CSF not only accelerated HSPC mobilization, but also enabled the efficient mobilization of HSPCs into the peripheral blood at lower doses of G-CSF. Adrenaline/G-CSF treatment also extensively downregulated levels of SDF-1 and CXCR4 in mouse bone marrow. These results demonstrated that adrenaline combined with G-CSF can induce HSPC mobilization by down-regulating the CXCR4/SDF-1 axis, indicating that the use of adrenaline may enable the use of reduced dosages or durations of G-CSF treatment, minimizing G-CSF-associated complications.

  5. Factors influencing platelet clumping during peripheral blood hematopoietic stem cell collection

    PubMed Central

    Mathur, Gagan; Bell, Sarah L.; Collins, Laura; Nelson, Gail A.; Knudson, C. Michael; Schlueter, Annette J.

    2018-01-01

    BACKGROUND Platelet clumping is a common occurrence during peripheral blood hematopoietic stem cell (HSC) collection using the Spectra Optia mononuclear cell (MNC) protocol. If clumping persists, it may prevent continuation of the collection and interfere with proper MNC separation. This study is the first to report the incidence of clumping, identify precollection factors associated with platelet clumping, and describe the degree to which platelet clumping interferes with HSC product yield. STUDY DESIGN AND METHODS In total, 258 HSC collections performed on 116 patients using the Optia MNC protocol were reviewed. Collections utilized heparin in anticoagulant citrate dextrose to facilitate large-volume leukapheresis. Linear and logistic regression models were utilized to determine which precollection factors were predictive of platelet clumping and whether clumping was associated with product yield or collection efficiency. RESULTS Platelet clumping was observed in 63% of collections. Multivariable analysis revealed that a lower white blood cell count was an independent predictor of clumping occurrence. Chemotherapy mobilization and a lower peripheral blood CD34+ cell count were predictors of the degree of clumping. Procedures with clumping had higher collection efficiency but lower blood volume processed on average, resulting in no difference in collection yields. Citrate toxicity did not correlate with clumping. CONCLUSION Although platelet clumping is a common technical problem seen during HSC collection, the total CD34+ cell-collection yields were not affected by clumping. WBC count, mobilization approach, and peripheral blood CD34+ cell count can help predict clumping and potentially drive interventions to proactively manage clumping. PMID:28150319

  6. Neutron counting with cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo

    2015-07-01

    A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transits smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras, as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rates up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, thus allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras doesn't allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is then confronted with the first experimental results. (authors)
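
    The counting scheme sketched in this abstract amounts to thresholding each frame well above the read-noise level and counting isolated bright regions as individual neutron events. The snippet below mimics that logic on a synthetic frame; the noise level, spot amplitude and threshold are arbitrary stand-ins, not measured values.

      import numpy as np
      from scipy import ndimage

      rng = np.random.default_rng(1)

      # Synthetic frame: Gaussian read noise plus a few bright neutron scintillation spots.
      frame = rng.normal(0.0, 5.0, size=(256, 256))
      true_events = 12
      for y, x in rng.integers(5, 250, size=(true_events, 2)):
          frame[y - 1:y + 2, x - 1:x + 2] += 200.0   # each event sits far above the noise

      # Threshold well above the read noise, then label connected bright regions as events.
      threshold = 50.0                               # ~10 sigma of the read noise here
      labels, n_detected = ndimage.label(frame > threshold)
      print(f"true events: {true_events}, detected events: {n_detected}")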

  7. Application of DNA Chip Scanning Technology for Automatic Detection of Chlamydia trachomatis and Chlamydia pneumoniae Inclusions

    PubMed Central

    Bogdanov, Anita; Endrész, Valeria; Urbán, Szabolcs; Lantos, Ildikó; Deák, Judit; Burián, Katalin; Önder, Kamil; Ayaydin, Ferhan; Balázs, Péter

    2014-01-01

    Chlamydiae are obligate intracellular bacteria that propagate in the inclusion, a specific niche inside the host cell. The standard method for counting chlamydiae is immunofluorescent staining and manual counting of chlamydial inclusions. High- or medium-throughput estimation of the reduction in chlamydial inclusions should be the basis of testing antichlamydial compounds and other drugs that positively or negatively influence chlamydial growth, yet low-throughput manual counting is the common approach. To overcome the time-consuming and subjective manual counting, we developed an automatic inclusion-counting system based on a commercially available DNA chip scanner. Fluorescently labeled inclusions are detected by the scanner, and the image is processed by ChlamyCount, a custom plug-in of the ImageJ software environment. ChlamyCount was able to measure the inclusion counts over a 1-log-unit dynamic range with a high correlation to the theoretical counts. ChlamyCount was capable of accurately determining the MICs of the novel antimicrobial compound PCC00213 and the already known antichlamydial antibiotics moxifloxacin and tetracycline. ChlamyCount was also able to measure the chlamydial growth-altering effect of drugs that influence host-bacterium interaction, such as gamma interferon, DEAE-dextran, and cycloheximide. ChlamyCount is an easily adaptable system for testing antichlamydial antimicrobials and other compounds that influence Chlamydia-host interactions. PMID:24189259

  8. Single photon counting linear mode avalanche photodiode technologies

    NASA Astrophysics Data System (ADS)

    Williams, George M.; Huntington, Andrew S.

    2011-10-01

    The false count rate of a single-photon-sensitive photoreceiver consisting of a high-gain, low-excess-noise linear-mode InGaAs avalanche photodiode (APD) and a high-bandwidth transimpedance amplifier (TIA) is fit to a statistical model. The peak height distribution of the APD's multiplied dark current is approximated by the weighted sum of McIntyre distributions, each characterizing dark current generated at a different location within the APD's junction. The peak height distribution approximated in this way is convolved with a Gaussian distribution representing the input-referred noise of the TIA to generate the statistical distribution of the uncorrelated sum. The cumulative distribution function (CDF) representing count probability as a function of detection threshold is computed, and the CDF model fit to empirical false count data. It is found that only k=0 McIntyre distributions fit the empirically measured CDF at high detection threshold, and that false count rate drops faster than photon count rate as detection threshold is raised. Once fit to empirical false count data, the model predicts the improvement of the false count rate to be expected from reductions in TIA noise and APD dark current. Improvement by at least three orders of magnitude is thought feasible with further manufacturing development and a capacitive-feedback TIA (CTIA).
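
    The model structure described here, a pulse-height density for the multiplied dark current convolved with a Gaussian for the TIA noise, followed by integration above a detection threshold, can be reproduced numerically. In the sketch below an exponential gain distribution stands in for the weighted sum of McIntyre distributions, so it only illustrates the convolution-plus-CDF machinery, not the paper's actual fit.

      import numpy as np

      # Peak-height axis in arbitrary units.
      dx = 0.1
      x = np.arange(-50.0, 400.0, dx)

      # Stand-in for the multiplied dark-current peak-height density (the paper uses a
      # weighted sum of McIntyre distributions; an exponential gain distribution is used
      # here only to demonstrate the convolution and CDF steps).
      mean_gain = 40.0
      dark_pdf = np.where(x >= 0.0, np.exp(-x / mean_gain) / mean_gain, 0.0)

      # Gaussian input-referred TIA noise, sampled on a symmetric kernel so that
      # np.convolve(..., mode="same") stays aligned with the x axis.
      sigma_tia = 8.0
      k = np.arange(-400, 401) * dx
      noise_pdf = np.exp(-k**2 / (2 * sigma_tia**2)) / (sigma_tia * np.sqrt(2 * np.pi))

      # Peak-height density of the uncorrelated sum = convolution of the two densities.
      summed_pdf = np.convolve(dark_pdf, noise_pdf, mode="same") * dx
      summed_pdf /= summed_pdf.sum() * dx

      # False-count probability per pulse versus detection threshold: 1 - CDF.
      cdf = np.cumsum(summed_pdf) * dx
      for thr in (20, 50, 100, 200):
          print(f"threshold {thr:>3}: P(peak > threshold) ~ {1 - cdf[np.searchsorted(x, thr)]:.3e}")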

  9. Tumor angiogenesis in advanced stage ovarian carcinoma.

    PubMed

    Hollingsworth, H C; Kohn, E C; Steinberg, S M; Rothenberg, M L; Merino, M J

    1995-07-01

    Tumor angiogenesis has been found to have prognostic significance in many tumor types for predicting an increased risk of metastasis. We assessed tumor vascularity in 43 cases of advanced stage (International Federation of Gynecologists and Obstetricians stages III and IV) ovarian cancer by using the highly specific endothelial cell marker CD34. Microvessel counts and stage were associated with disease-free survival and with overall survival by Kaplan-Meier analysis. The plots show that higher stage, higher average vessel count at 200x (200x avg) and 400x (400x avg) magnification and highest vessel count at 400x (400x high) magnification confer a worse prognosis for disease-free survival. Average vessel count of less than 16 (400x avg, P2 = 0.01) and less than 45 (200x avg, P2 = 0.026) suggested a better survival. Similarly, a high vessel count of less than 20 (400x high, P2 = 0.019) conferred a better survival as well. The plots suggest that higher stage, higher average vessel count at 200x and 400x, and highest vessel count at 200x and 400x show a trend to worse overall survival as well. With the Cox proportional hazards model, stage was the best predictor of overall survival, however, the average microvessel count at 400x was found to be the best predictor of disease-free survival. These results suggest that analysis of neovascularization in advanced stage ovarian cancer may be a useful prognostic factor.

  10. Short-Term Clinical Disease Progression in HIV-Infected Patients Receiving Combination Antiretroviral Therapy: Results from the TREAT Asia HIV Observational Database

    PubMed Central

    Srasuebkul, Preeyaporn; Lim, Poh Lian; Lee, Man Po; Kumarasamy, Nagalingeswaran; Zhou, Jialun; Sirisanthana, Thira; Li, Patrick C. K.; Kamarulzaman, Adeeba; Oka, Shinichi; Phanuphak, Praphan; Vonthanak, Saphonn; Merati, Tuti P.; Chen, Yi-Ming A.; Sungkanuparph, Somnuek; Tau, Goa; Zhang, Fujie; Lee, Christopher K. C.; Ditangco, Rossana; Pujari, Sanjay; Choi, Jun Y.; Smith, Jeffery; Law, Matthew G.

    2009-01-01

    Objective The aim of our study was to develop, on the basis of simple clinical data, predictive short-term risk equations for AIDS or death in Asian patients infected with human immunodeficiency virus (HIV) who were included in the TREAT Asia HIV Observational Database. Methods Inclusion criteria were highly active antiretroviral therapy initiation and completion of required laboratory tests. Predictors of short-term AIDS or death were assessed using Poisson regression. Three different models were developed: a clinical model, a CD4 cell count model, and a CD4 cell count and HIV RNA level model. We separated patients into low-risk, high-risk, and very high-risk groups according to the key risk factors identified. Results In the clinical model, patients with severe anemia or a body mass index (BMI; calculated as the weight in kilograms divided by the square of the height in meters) ≤18 were at very high risk, and patients who were aged <40 years or were male and had mild anemia were at high risk. In the CD4 cell count model, patients with a CD4 cell count <50 cells/µL, severe anemia, or a BMI ≤18 were at very high risk, and patients who had a CD4 cell count of 51–200 cells/µL, were aged <40 years, or were male and had mild anemia were at high risk. In the CD4 cell count and HIV RNA level model, patients with a CD4 cell count <50 cells/µL, a detectable viral load, severe anemia, or a BMI ≤18 were at very high risk, and patients with a CD4 cell count of 51–200 cells/µL and mild anemia were at high risk. The incidence of new AIDS or death in the clinical model was 1.3, 4.9, and 15.6 events per 100 person-years in the low-risk, high-risk, and very high-risk groups, respectively. In the CD4 cell count model the respective incidences were 0.9, 2.7, and 16.02 events per 100 person-years; in the CD4 cell count and HIV RNA level model, the respective incidences were 0.8, 1.8, and 6.2 events per 100 person-years. Conclusions These models are simple enough for widespread use in busy clinics and should allow clinicians to identify patients who are at high risk of AIDS or death in Asia and the Pacific region and in resource-poor settings. PMID:19226231
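
    Risk equations of this type are fitted as Poisson regressions of event counts on clinical covariates, with person-years of follow-up entering as an exposure (offset) term. The statsmodels sketch below shows that general form on invented data; the covariates and coefficients are not those of the TREAT Asia models.

      import numpy as np
      import pandas as pd
      import statsmodels.api as sm

      rng = np.random.default_rng(42)
      n = 500

      # Invented cohort: CD4 stratum, anemia flag, person-years of follow-up.
      df = pd.DataFrame({
          "cd4_lt50": rng.integers(0, 2, n),
          "severe_anemia": rng.integers(0, 2, n),
          "person_years": rng.uniform(0.5, 3.0, n),
      })
      rate = 0.02 * np.exp(1.5 * df["cd4_lt50"] + 1.0 * df["severe_anemia"])
      df["events"] = rng.poisson((rate * df["person_years"]).to_numpy())

      # Poisson regression of AIDS/death events with person-years as exposure.
      X = sm.add_constant(df[["cd4_lt50", "severe_anemia"]])
      fit = sm.GLM(df["events"], X, family=sm.families.Poisson(),
                   exposure=df["person_years"]).fit()
      print(fit.summary().tables[1])   # exp(coef) gives incidence-rate ratios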

  11. A comparison of methods for counting viruses in aquatic systems.

    PubMed

    Bettarel, Y; Sime-Ngando, T; Amblard, C; Laveran, H

    2000-06-01

    In this study, we compared different methods, including transmission electron microscopy and various nucleic acid labeling methods in which we used the fluorochromes 4',6'-diamidino-2-phenylindole (DAPI), 4-[3-methyl-2,3-dihydro-(benzo-1,3-oxazole)-2-methylmethyledene]-1-(3'-trimethylammoniumpropyl)-quinolinium diiodide (YOPRO-1), and SYBR Green I, which can be detected by epifluorescence microscopy (EM), for counting viruses in samples obtained from freshwater ecosystems whose trophic status varied and from a culture of T7 phages. From a quantitative and qualitative viewpoint, our results showed that the greatest efficiency for all ecosystems was obtained when we used the EM counting protocol in which YOPRO-1 was the label, as this fluorochrome exhibited strong and very stable fluorescence. A modification of the original protocol in which YOPRO-1 was used is recommended, because this modification makes the protocol faster and allows it to be used for routine analysis of fixed samples. Because SYBR Green I fades very quickly, the use of this fluorochrome is not recommended for systems in which the viral content is very high (>10(8) particles/ml), such as treated domestic sewage effluents. Experiments in which we used DNase and RNase revealed that the number of viruses determined by EM was slightly overestimated (by approximately 15%) because of interference caused by the presence of free nucleic acids.

  12. Allele-specific copy-number discovery from whole-genome and whole-exome sequencing.

    PubMed

    Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J; Szatkiewicz, Jin P

    2015-08-18

    Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol is freely available at https://sourceforge.net/projects/asgenseng/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  13. A Comparison of Methods for Counting Viruses in Aquatic Systems

    PubMed Central

    Bettarel, Yvan; Sime-Ngando, Telesphore; Amblard, Christian; Laveran, Henri

    2000-01-01

    In this study, we compared different methods, including transmission electron microscopy and various nucleic acid labeling methods in which we used the fluorochromes 4′,6′-diamidino-2-phenylindole (DAPI), 4-[3-methyl-2,3-dihydro-(benzo-1,3-oxazole)-2-methylmethyledene]-1-(3′-trimethylammoniumpropyl)-quinolinium diiodide (YOPRO-1), and SYBR Green I, which can be detected by epifluorescence microscopy (EM), for counting viruses in samples obtained from freshwater ecosystems whose trophic status varied and from a culture of T7 phages. From a quantitative and qualitative viewpoint, our results showed that the greatest efficiency for all ecosystems was obtained when we used the EM counting protocol in which YOPRO-1 was the label, as this fluorochrome exhibited strong and very stable fluorescence. A modification of the original protocol in which YOPRO-1 was used is recommended, because this modification makes the protocol faster and allows it to be used for routine analysis of fixed samples. Because SYBR Green I fades very quickly, the use of this fluorochrome is not recommended for systems in which the viral content is very high (>10⁸ particles/ml), such as treated domestic sewage effluents. Experiments in which we used DNase and RNase revealed that the number of viruses determined by EM was slightly overestimated (by approximately 15%) because of interference caused by the presence of free nucleic acids. PMID:10831400

  14. Effects of lek count protocols on greater sage-grouse population trend estimates

    USGS Publications Warehouse

    Monroe, Adrian; Edmunds, David; Aldridge, Cameron L.

    2016-01-01

    Annual counts of males displaying at lek sites are an important tool for monitoring greater sage-grouse populations (Centrocercus urophasianus), but seasonal and diurnal variation in lek attendance may increase variance and bias of trend analyses. Recommendations for protocols to reduce observation error have called for restricting lek counts to within 30 minutes of sunrise, but this may limit the number of lek counts available for analysis, particularly from years before monitoring was widely standardized. Reducing the temporal window for conducting lek counts also may constrain the ability of agencies to monitor leks efficiently. We used lek count data collected across Wyoming during 1995−2014 to investigate the effect of lek counts conducted between 30 minutes before and 30, 60, or 90 minutes after sunrise on population trend estimates. We also evaluated trends across scales relevant to management, including statewide, within Working Group Areas and Core Areas, and for individual leks. To further evaluate accuracy and precision of trend estimates from lek count protocols, we used simulations based on a lek attendance model and compared simulated and estimated values of annual rate of change in population size (λ) from scenarios of varying numbers of leks, lek count timing, and count frequency (counts/lek/year). We found that restricting analyses to counts conducted within 30 minutes of sunrise generally did not improve precision of population trend estimates, although differences among timings increased as the number of leks and count frequency decreased. Lek attendance declined >30 minutes after sunrise, but simulations indicated that including lek counts conducted up to 90 minutes after sunrise can increase the number of leks monitored compared to trend estimates based on counts conducted within 30 minutes of sunrise. This increase in leks monitored resulted in greater precision of estimates without reducing accuracy. Increasing count frequency also improved precision. These results suggest that the current distribution of count timings available in lek count databases such as that of Wyoming (conducted up to 90 minutes after sunrise) can be used to estimate sage-grouse population trends without reducing precision or accuracy relative to trends from counts conducted within 30 minutes of sunrise. However, only 10% of all Wyoming counts in our sample (1995−2014) were conducted 61−90 minutes after sunrise, and further increasing this percentage may still bias trend estimates because of declining lek attendance. 
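
    Trend analyses of this kind typically estimate the annual finite rate of change λ from a log-linear fit of counts against year, and simulations generate counts under a known λ to check how well it is recovered. The sketch below is a stripped-down illustration of that idea with an assumed λ and Poisson observation noise; it is not the authors' lek attendance model.

      import numpy as np

      rng = np.random.default_rng(7)

      true_lambda = 0.97            # assumed 3% annual decline
      n_leks, n_years = 50, 20
      years = np.arange(n_years)

      # Expected males per lek decline geometrically; observed counts are Poisson.
      lek_sizes = rng.uniform(10, 60, n_leks)
      expected = lek_sizes[:, None] * true_lambda ** years[None, :]
      counts = rng.poisson(expected)

      # Estimate lambda from a log-linear fit to yearly totals (+0.5 avoids log(0)).
      totals = counts.sum(axis=0)
      slope, _ = np.polyfit(years, np.log(totals + 0.5), 1)
      print(f"true lambda = {true_lambda}, estimated lambda = {np.exp(slope):.3f}")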

  15. Effects of household washing on bacterial load and removal of Escherichia coli from lettuce and "ready-to-eat" salads.

    PubMed

    Uhlig, Elisabeth; Olsson, Crister; He, Jiayi; Stark, Therese; Sadowska, Zuzanna; Molin, Göran; Ahrné, Siv; Alsanius, Beatrix; Håkansson, Åsa

    2017-11-01

    Consumer demand for fresh salads is increasing, but leafy green vegetables have also been linked to food-borne illness due to pathogens such as Escherichia coli O157:H7. As a safety measure, consumers often wash leafy vegetables in water before consumption. In this study, we analyzed the efficiency of household washing in reducing the bacterial content. Romaine lettuce and ready-to-eat mixed salad were washed several times in flowing water at different rates and by immersing the leaves in water. Lettuce was also inoculated with E. coli before washing. Only washing at a high flow rate (8 L/min) resulted in statistically significant reductions (p < .05): total aerobic count was reduced by 80%, and Enterobacteriaceae count by 68%, after the first rinse. The number of contaminating E. coli was not significantly reduced. The dominant part of the culturable microbiota of the washed lettuce was identified by 16S rRNA sequencing of randomly picked colonies. The majority belonged to Pseudomonadaceae, but isolates from Enterobacteriaceae and Staphylococcaceae were also frequently found. This study shows the inefficiency of the tap water washing methods available to the consumer when it comes to removal of bacteria from lettuce. Even after washing, the lettuce contained high levels of bacteria that in a high dose and under certain circumstances may constitute a health risk.

  16. Expected count rate for the Self- Interrogation Neutron Resonance Densitometry measurements of spent nuclear fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rossa, Riccardo; Universite libre de Bruxelles, Ecole polytechnique de Bruxelles - Service de Metrologie Nucleaire, CP 165/84, Avenue F.D. Roosevelt, 50 - B1050 Brussels; Borella, Alessandro

    The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive neutron technique that aims at a direct quantification of 239Pu in fuel assemblies by measuring the attenuation of the neutron flux in the energy region close to the 0.3 eV resonance of 239Pu. The 239Pu mass is estimated by calculating the SINRD signature, that is, the ratio between the neutron flux integrated over the fast energy region and around the 0.3 eV resonance region. The SINRD measurement approach considered in this study consists in introducing a small neutron detector in the central guide tube of a PWR 17x17 fuel assembly. In order to measure the neutron flux in the energy regions defined in the SINRD signature, different detector types are used. The response of a bare 238U fission chamber is considered for the determination of the fast neutron flux, while other thermal-epithermal detectors wrapped in neutron absorbers are envisaged to measure the neutron flux around the resonance region. This paper provides an estimation of the count rate that can be achieved with the detector types proposed for the SINRD measurement. In the first section a set of detectors are evaluated in terms of count rate and sensitivity to the 239Pu content, in order to identify the optimal measurement configuration for each detector type. Then a study is performed to increase the count rate by increasing the detector size. The study shows that the highest count rate is achieved by using either 3He or 10B proportional counters because of the high neutron efficiency of these detectors. However, the calculations indicate that the biggest contribution to the measurement uncertainty is due to the measurement of the fast neutron flux. Finally, similar sensitivity to the 239Pu content is obtained by using the different detector types for the measurement of the neutron flux close to the resonance region. Therefore, the count rate associated with each detector type will play a major role in the selection of the detector types used for the SINRD measurement. (authors)
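
    The SINRD signature itself is simple bookkeeping: the detector response integrated over a fast-neutron window divided by the response integrated over the window around the 0.3 eV 239Pu resonance. A minimal sketch of that ratio, with invented counts and window boundaries, is shown below.

      import numpy as np

      # Invented per-energy-group detector responses (counts) for illustration.
      energy_ev = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 1e3, 1e5, 1e6, 2e6])
      counts = np.array([120.0, 90.0, 40.0, 70.0, 95.0, 300.0, 500.0, 800.0, 600.0])

      def sinrd_signature(e, c, resonance_window=(0.2, 0.4), fast_threshold=1e5):
          """Ratio of fast-window counts to counts around the 0.3 eV 239Pu resonance."""
          fast = c[e >= fast_threshold].sum()
          resonance = c[(e >= resonance_window[0]) & (e <= resonance_window[1])].sum()
          return fast / resonance

      print(f"SINRD signature = {sinrd_signature(energy_ev, counts):.2f}")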

  17. Assessing the transferability of a hybrid Taguchi-objective function method to optimize image segmentation for detecting and counting cave roosting birds using terrestrial laser scanning data

    NASA Astrophysics Data System (ADS)

    Idrees, Mohammed Oludare; Pradhan, Biswajeet; Buchroithner, Manfred F.; Shafri, Helmi Zulhaidi Mohd; Khairunniza Bejo, Siti

    2016-07-01

    As far back as the early 15th century, during the reign of the Ming Dynasty (1368 to 1634 AD), Gomantong cave in Sabah (Malaysia) has been known as one of the largest roosting sites for wrinkle-lipped bats (Chaerephon plicata) and swiftlet birds (Aerodramus maximus and Aerodramus fuciphagus) in very large colonies. Until recently, no study had been done to quantify or estimate the colony sizes of these inhabitants, in spite of the grave danger posed to this avifauna by human activities and potential habitat loss to postspeleogenetic processes. This paper evaluates the transferability of a hybrid optimization, image analysis-based method developed to detect and count cave-roosting birds. The method utilizes high-resolution terrestrial laser scanning intensity images. First, segmentation parameters were optimized by integrating the objective function and statistical Taguchi methods. Thereafter, the optimized parameters were used as input to the segmentation and classification processes using two images selected from Simud Hitam (lower cave) and Simud Putih (upper cave) of the Gomantong cave. The result shows that the method is capable of detecting birds (and bats) in the images for accurate population censusing. A total of 9998 swiftlet birds were counted in the first image, while 1132 individuals, comprising both bats and birds, were obtained from the second image. Furthermore, the transferability evaluation yielded overall accuracies of 0.93 and 0.94 (area under the receiver operating characteristic curve) for the first and second image, respectively, with a p value of <0.0001 at the 95% confidence level. The findings indicate that the method is not only efficient for the detection and counting of cave birds, for which it was developed, but is also useful for counting bats; thus, it can be adopted in any cave.

  18. An Energy-Efficient Underground Localization System Based on Heterogeneous Wireless Networks

    PubMed Central

    Yuan, Yazhou; Chen, Cailian; Guan, Xinping; Yang, Qiuling

    2015-01-01

    A precision positioning system with energy efficiency is of great necessity for guaranteeing personnel safety in underground mines. The miners' location information should be transmitted to the control center in a timely and reliable manner; therefore, a heterogeneous network with a backbone based on high-speed Industrial Ethernet is deployed. Since the mobile wireless nodes are working in an irregular tunnel, a specific wireless propagation model cannot fit all situations. In this paper, an underground localization system is designed not only to adapt to various kinds of harsh tunnel environments, but also to reduce energy consumption and thus prolong the lifetime of the network. Three key techniques are developed and implemented to improve the system performance: a step counting algorithm with accelerometers, a power control algorithm and an adaptive packet scheduling scheme. The simulation study and experimental results show the effectiveness of the proposed algorithms and the implementation. PMID:26016918
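
    Accelerometer-based step counting of the kind mentioned here typically computes the acceleration magnitude, removes the gravity baseline, and counts peaks above a threshold with a minimum spacing between steps. The sketch below shows that generic pipeline on synthetic data; the threshold and minimum step interval are assumptions, not values from the paper.

      import numpy as np
      from scipy.signal import find_peaks

      rng = np.random.default_rng(3)

      fs = 50.0                          # sample rate [Hz]
      t = np.arange(0, 20, 1 / fs)       # 20 s of walking
      true_steps = 30                    # ~1.5 steps per second

      # Synthetic acceleration magnitude: gravity + step impacts + sensor noise.
      step_rate = true_steps / t[-1]
      accel = 9.81 + 2.5 * np.maximum(0.0, np.sin(2 * np.pi * step_rate * t)) ** 4
      accel += rng.normal(0.0, 0.2, t.size)

      # Remove the gravity baseline, then count prominent peaks that are at
      # least 0.3 s apart (an assumed minimum step interval).
      signal = accel - accel.mean()
      peaks, _ = find_peaks(signal, height=0.8, distance=int(0.3 * fs))
      print(f"true steps: {true_steps}, counted steps: {len(peaks)}")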

  19. Efficient and accurate treatment of electron correlations with correlation matrix renormalization theory

    DOE PAGES

    Yao, Y. X.; Liu, J.; Liu, C.; ...

    2015-08-28

    We present an efficient method for calculating the electronic structure and total energy of strongly correlated electron systems. The method extends the traditional Gutzwiller approximation for one-particle operators to the evaluation of the expectation values of two-particle operators in the many-electron Hamiltonian. The method is free of adjustable Coulomb parameters, has no double counting issues in the calculation of the total energy, and has the correct atomic limit. We demonstrate that the method describes well the bonding and dissociation behaviors of hydrogen and nitrogen clusters, as well as ammonia composed of hydrogen and nitrogen atoms. We also show that the method can satisfactorily tackle challenging problems faced by density functional theory that have recently been discussed in the literature. The computational workload of our method is similar to the Hartree-Fock approach, while the results are comparable to high-level quantum chemistry calculations.

  20. Membrane technology revolutionizes water treatment.

    PubMed

    Wilderer, P A; Paris, S

    2007-01-01

    Membranes play a crucial role in living cells, plants and animals. They not only serve as barriers between the inside and outside world of cells and organs; more importantly, they are means of selective transport of materials and hosts for biochemical conversion. Natural membrane systems have demonstrated efficiency and reliability for millions of years, and it is remarkable that most of these systems are small, efficient and highly reliable even under rapidly changing ambient conditions. Thus, it appears advisable for technology developers to keep a close eye on Mother Nature. By doing so, it is most likely that ideas for novel technical solutions are born. Following the concept of natural systems, it is hypothesized that the Millennium Development Goals can best be met when counting on small water and wastewater treatment systems. The core of such systems could be membranes in which chemical reactions are integrated, allowing recovery and direct utilization of valuable substances.

  1. A Prototype {sup 212}Pb Medical Dose Calibrator for Alpha Radioimmunotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mueller, W.F.; Patil, A.; Russ, W.R.

    AREVA Med, an AREVA group subsidiary, is developing innovative cancer-fighting therapies involving the use of 212Pb for alpha radioimmunotherapy. Canberra Industries, the nuclear measurement subsidiary of AREVA, has been working with AREVA Med to develop a prototype measurement system to assay syringes containing a 212Pb solution following production by an elution system. The relative fraction of emitted radiation from the source associated directly with the 212Pb remains dynamic for approximately 6 hours after the parent is chemically purified. A significant challenge for this measurement application is that the short half-life of the parent nuclide requires assay prior to reaching equilibrium with progeny nuclides. A gross counting detector was developed to minimize system costs and meet the large dynamic range of source activities. Prior to equilibrium, a gross counting system must include the period since the 212Pb was pure to calculate the count rate attributable to the parent rather than the progeny. The dynamic state is determined by solving the set of differential equations, or Bateman equations, describing the source decay behavior while also applying the component measurement efficiencies for each nuclide. The efficiencies were initially estimated using mathematical modeling (MCNP) but were then benchmarked with source measurements. The goal accuracy of the system was required to be within 5%. Independent measurements of the source using a high-resolution spectroscopic detector have shown good agreement with the prototype system results. The prototype design was driven by cost, compactness and simplicity. The detector development costs were minimized by using existing electronics and firmware with a Geiger-Mueller tube derived from Canberra's EcoGamma environmental monitoring product. The acquisition electronics, communications and interface were controlled using Python with the EcoGamma software development kit on a Raspberry Pi Linux computer mounted inside a standard project box. The results of initial calibration measurements are presented. (authors)
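
    Attributing the detected count rate to 212Pb versus its in-growing progeny at a given time after purification requires solving the Bateman decay equations for the chain. The sketch below integrates a simplified 212Pb to 212Bi chain numerically with approximate half-lives; branching to 208Tl/212Po and the per-nuclide counting efficiencies are omitted, so it only illustrates the in-growth calculation.

      import numpy as np
      from scipy.integrate import solve_ivp

      # Approximate half-lives: 212Pb ~ 10.64 h, 212Bi ~ 60.55 min.
      lam_pb = np.log(2) / (10.64 * 3600.0)   # [1/s]
      lam_bi = np.log(2) / (60.55 * 60.0)     # [1/s]

      def chain(t, n):
          """Bateman equations for a two-member 212Pb -> 212Bi chain."""
          n_pb, n_bi = n
          return [-lam_pb * n_pb, lam_pb * n_pb - lam_bi * n_bi]

      n0 = [1.0e9, 0.0]                        # start from chemically pure 212Pb
      hours = np.linspace(0.0, 12.0, 25)
      sol = solve_ivp(chain, (0.0, hours[-1] * 3600.0), n0,
                      t_eval=hours * 3600.0, rtol=1e-8)

      activity_pb = lam_pb * sol.y[0]          # decays per second
      activity_bi = lam_bi * sol.y[1]
      for h, a_pb, a_bi in zip(hours[::6], activity_pb[::6], activity_bi[::6]):
          print(f"t = {h:4.1f} h: 212Pb fraction of total activity = "
                f"{a_pb / (a_pb + a_bi):.2f}")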

  2. Real-time passenger counting by active linear cameras

    NASA Astrophysics Data System (ADS)

    Khoudour, Louahdi; Duvieubourg, Luc; Deparis, Jean-Pierre

    1996-03-01

    The companies operating subways are very much concerned with counting the passengers traveling through their transport systems. One of the most widely used systems for counting passengers consists of a mechanical gate equipped with a counter. However, such simple systems are not able to count passengers jumping over the gates. Moreover, passengers carrying large luggage or bags may meet some difficulties when going through such gates. The ideal solution is a contact-free counting system that would bring more comfort of use for the passengers. For these reasons, we propose to use a video processing system instead of these mechanical gates. The optical sensors discussed in this paper offer several advantages including well-defined detection areas, fast response time and reliable counting capability. A new technology has been developed and tested, based on linear cameras. Preliminary results show that this system is very efficient when the passengers crossing the optical gate are well separated. In other cases, such as in compact crowd conditions, reasonable accuracy has been demonstrated. These results are illustrated by means of a number of sequences shot in field conditions. It is our belief that more precise measurements could be achieved, in the case of a compact crowd, by other algorithms and acquisition techniques for the line images that we are presently developing.

  3. Making It Count: Understanding the Value of Energy Efficiency Financing Programs Funded by Utility Customers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kramer, Chris; Fadrhonc, Emily Martin; Goldman, Charles

    Utility customer-supported financing programs are receiving increased attention as a strategy for achieving energy saving goals. Rationales for using utility customer funds to support financing initiatives

  4. Determination of esophageal eosinophil counts and other histologic features of eosinophilic esophagitis by pathology trainees is highly accurate

    PubMed Central

    Rusin, Spencer; Covey, Shannon; Perjar, Irina; Hollyfield, Johnny; Speck, Olga; Woodward, Kimberly; Woosley, John T.; Dellon, Evan S.

    2017-01-01

    Summary Many studies of eosinophilic esophagitis (EoE) utilize expert pathology review, but it is unknown whether less experienced pathologists can reliably assess EoE histology. We aimed to determine whether trainee pathologists can accurately quantify esophageal eosinophil counts and identify associated histologic features of EoE, as compared to expert pathologists. We used a set of 40 digitized slides from patients with varying degrees of esophageal eosinophilia. Each of six trainee pathologists underwent a teaching session and used our validated protocol to determine eosinophil counts and associated EoE findings. The same slides had previously been evaluated by expert pathologists, and these results comprised the gold standard. Eosinophil counts were correlated, and agreement was calculated for the diagnostic threshold of 15 eosinophils per high-power field (eos/hpf) as well as for associated EoE findings. Peak eosinophil counts were highly correlated between the trainees and the gold standard (Rho ranged from 0.87–0.92; p<0.001 for all). Peak counts were also highly correlated between trainees (0.75–0.91; p<0.001), and results were similar for mean counts. Agreement was excellent for determining if a count exceeded the diagnostic threshold (kappa ranged from 0.83 to 0.89; p<0.001). Agreement was very good for eosinophil degranulation (kappa 0.54 to 0.83; p<0.01) and spongiosis (kappa 0.44–0.87; p<0.01), but was lower for eosinophil microabscesses (kappa 0.37–0.64; p<0.01). In conclusion, using a teaching session, digitized slide set, and validated protocol, the agreement between pathology trainees and expert pathologists for determining eosinophil counts was excellent. Agreement was very good for eosinophil degranulation and spongiosis, but less so for microabscesses. PMID:28041975

  5. Determination of esophageal eosinophil counts and other histologic features of eosinophilic esophagitis by pathology trainees is highly accurate.

    PubMed

    Rusin, Spencer; Covey, Shannon; Perjar, Irina; Hollyfield, Johnny; Speck, Olga; Woodward, Kimberly; Woosley, John T; Dellon, Evan S

    2017-04-01

    Many studies of eosinophilic esophagitis (EoE) use expert pathology review, but it is unknown whether less experienced pathologists can reliably assess EoE histology. We aimed to determine whether trainee pathologists can accurately quantify esophageal eosinophil counts and identify associated histologic features of EoE, as compared with expert pathologists. We used a set of 40 digitized slides from patients with varying degrees of esophageal eosinophilia. Each of 6 trainee pathologists underwent a teaching session and used our validated protocol to determine eosinophil counts and associated EoE findings. The same slides had previously been evaluated by expert pathologists, and these results comprised the criterion standard. Eosinophil counts were correlated, and agreement was calculated for the diagnostic threshold of 15 eosinophils per high-power field as well as for associated EoE findings. Peak eosinophil counts were highly correlated between the trainees and the criterion standard (ρ ranged from 0.87 to 0.92; P<.001 for all). Peak counts were also highly correlated between trainees (0.75-0.91; P<.001), and results were similar for mean counts. Agreement was excellent for determining if a count exceeded the diagnostic threshold (κ ranged from 0.83 to 0.89; P<.001). Agreement was very good for eosinophil degranulation (κ = 0.54-0.83; P<.01) and spongiosis (κ = 0.44-0.87; P<.01) but was lower for eosinophil microabscesses (κ = 0.37-0.64; P<.01). In conclusion, using a teaching session, digitized slide set, and validated protocol, the agreement between pathology trainees and expert pathologists for determining eosinophil counts was excellent. Agreement was very good for eosinophil degranulation and spongiosis but less so for microabscesses. Copyright © 2016 Elsevier Inc. All rights reserved.
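
    Agreement statistics of the kind reported above pair a rank correlation on the raw counts with Cohen's kappa on the dichotomized 15 eos/hpf threshold. The sketch below shows both computations on invented trainee/expert counts; the numbers are not study data.

      import numpy as np
      from scipy.stats import spearmanr
      from sklearn.metrics import cohen_kappa_score

      # Invented peak eosinophil counts (eos/hpf): trainee vs. expert on the same slides.
      trainee = np.array([0, 3, 12, 18, 25, 40, 7, 55, 14, 90, 2, 20])
      expert = np.array([0, 5, 10, 20, 22, 45, 6, 60, 16, 85, 1, 17])

      rho, p = spearmanr(trainee, expert)

      # Agreement on the dichotomized diagnostic threshold of 15 eos/hpf.
      kappa = cohen_kappa_score(trainee >= 15, expert >= 15)
      print(f"Spearman rho = {rho:.2f} (p = {p:.1e}); kappa at >=15 eos/hpf = {kappa:.2f}")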

  6. Microchannel plate detector technology potential for LUVOIR and HabEx

    NASA Astrophysics Data System (ADS)

    Siegmund, O. H. W.; Ertley, C.; Vallerga, J. V.; Schindhelm, E. R.; Harwit, A.; Fleming, B. T.; France, K. C.; Green, J. C.; McCandliss, S. R.; Harris, W. M.

    2017-08-01

    Microchannel plate (MCP) detectors have been the detector of choice for ultraviolet (UV) instruments onboard many NASA missions. These detectors have many advantages, including high spatial resolution (<20 μm), photon counting, radiation hardness, large formats (up to 20 cm), and the ability to match curved focal planes. Novel borosilicate glass MCPs with atomic layer deposition combine extremely low backgrounds, high strength, and tunable secondary electron yield. GaN and combinations of bialkali/alkali halide photocathodes show promise for broadband, higher quantum efficiency. Cross-strip anodes combined with compact ASIC readout electronics enable high spatial resolution over large formats with high dynamic range. The technology readiness levels of these technologies are each being advanced through research grants for laboratory testing and rocket flights. Combining these capabilities would be ideal for UV instruments onboard the Large UV/Optical/IR Surveyor (LUVOIR) and the Habitable Exoplanet Imaging Mission (HabEx) concepts currently under study for NASA's Astrophysics Decadal Survey.

  7. Influence of observers and stream flow on northern two-lined salamander (Eurycea bislineata bislineata) relative abundance estimates in Acadia and Shenandoah National Parks, USA

    USGS Publications Warehouse

    Crocker, J.B.; Bank, M.S.; Loftin, C.S.; Jung Brown, R.E.

    2007-01-01

    We investigated effects of observers and stream flow on Northern Two-Lined Salamander (Eurycea bislineata bislineata) counts in streams in Acadia (ANP) and Shenandoah National Parks (SNP). We counted salamanders in 22 ANP streams during high flow (May to June 2002) and during low flow (July 2002). We also counted salamanders in SNP in nine streams during high flow (summer 2003) and 11 streams during low flow (summers 2001–02, 2004). In 2002, we used a modified cover-controlled active search method with a first and second observer. In succession, observers turned over 100 rocks along five 1-m belt transects across the streambed. The difference between observers in total salamander counts was not significant. We counted fewer E. b. bislineata during high flow conditions, confirming that detection of this species is reduced during high flow periods and that assessment of stream salamander relative abundance is likely more reliable during low or base flow conditions.

  8. Expected total counts for the Self-Interrogation Neutron Resonance Densitometry measurements of spent nuclear fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rossa, Riccardo; Universite Libre de Bruxelles; Borella, Alessandro

    2015-07-01

    The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive neutron technique that aims at a direct quantification of 239Pu in spent fuel assemblies by measuring the attenuation of the neutron flux in the energy region close to the 0.3 eV resonance of 239Pu. The 239Pu mass is estimated by calculating the SINRD signature, that is, the ratio between the neutron counts in the fast energy region and those in the region around the 0.3 eV resonance. The SINRD measurement approach in this study consisted of introducing a small neutron detector in the central guide tube of a PWR 17x17 fuel assembly. In order to measure the neutron flux in the energy regions defined in the SINRD signature, different detector types were used. The response of a bare 238U fission chamber is considered for the determination of the fast neutron flux, while other thermal-epithermal detectors wrapped in neutron absorbers are envisaged to measure the neutron flux around the resonance region. This paper provides an estimation of the total neutron counts that can be achieved with the detector types proposed for the SINRD measurement. In the first section a set of detectors is evaluated in terms of total neutron counts and sensitivity to the 239Pu content, in order to identify the optimal measurement configuration for each detector type. Then a study is performed to increase the total neutron counts by increasing the detector size. The study shows that the highest total neutron counts are achieved by using either 3He or 10B proportional counters because of the high neutron detection efficiency of these detectors. However, the calculations indicate that the biggest contribution to the measurement uncertainty is due to the measurement of the fast neutron flux. Finally, similar sensitivity to the 239Pu content is obtained by using the different detector types for the measurement of the neutron flux close to the resonance region. Therefore, the total neutron counts associated with each detector type will play a major role in the selection of the detector types used for the SINRD measurement. (authors)
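
    For illustration, the SINRD signature described above is simply a ratio of two count integrals; a minimal sketch of the ratio and its purely statistical (Poisson) uncertainty, using hypothetical counts, might look as follows.

      import numpy as np

      def sinrd_signature(fast_counts, resonance_counts):
          """SINRD signature as the ratio of counts in the fast-neutron window to
          counts in the window around the 0.3 eV 239Pu resonance, with the Poisson
          counting uncertainty propagated through the ratio."""
          ratio = fast_counts / resonance_counts
          rel_unc = np.sqrt(1.0 / fast_counts + 1.0 / resonance_counts)
          return ratio, ratio * rel_unc

      # Hypothetical totals; the fast-window counts dominate the uncertainty,
      # consistent with the observation reported in the abstract.
      print(sinrd_signature(fast_counts=4.0e4, resonance_counts=2.5e5))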

  9. Blocking Losses With a Photon Counter

    NASA Technical Reports Server (NTRS)

    Moision, Bruce E.; Piazzolla, Sabino

    2012-01-01

    It was not known how to assess accurately losses in a communications link due to photodetector blocking, a phenomenon wherein a detector is rendered inactive for a short time after the detection of a photon. When used to detect a communications signal, blocking leads to losses relative to an ideal detector, which may be measured as a reduction in the communications rate for a given received signal power, or an increase in the signal power required to support the same communications rate. This work involved characterizing blocking losses for single detectors and arrays of detectors. Blocking may be mitigated by spreading the signal intensity over an array of detectors, reducing the count rate on any one detector. A simple approximation was made to the blocking loss as a function of the probability that a detector is unblocked at a given time, essentially treating the blocking probability as a scaling of the detection efficiency. An exact statistical characterization was derived for a single detector, and an approximation for multiple detectors. This allowed derivation of several accurate approximations to the loss. Methods were also derived to account for a rise time in recovery, and non-uniform illumination due to diffraction and atmospheric distortion of the phase front. It was assumed that the communications signal is intensity modulated and received by an array of photon-counting photodetectors. For the purpose of this analysis, it was assumed that the detectors are ideal, in that they produce a signal that allows one to reproduce the arrival times of electrons, produced either as photoelectrons or from dark noise, exactly. For single detectors, the performance of the maximum-likelihood (ML) receiver in blocking is illustrated, as well as a maximum-count (MC) receiver, that, when receiving a pulse-position-modulated (PPM) signal, selects the symbol corresponding to the slot with the largest electron count. Whereas the MC receiver saturates at high count rates, the ML receiver may not. The loss in capacity, symbol-error-rate (SER), and count-rate were numerically computed. It was shown that the capacity and symbol-error-rate losses track, whereas the count-rate loss does not generally reflect the SER or capacity loss, as the slot-statistics at the detector output are no longer Poisson. It is also shown that the MC receiver loss may be accurately predicted for dead times on the order of a slot.
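
    As a concrete illustration of the idea above of treating the unblocked probability as a scaling of the detection efficiency, the sketch below uses the standard non-paralyzable dead-time model for a detector array that splits the incident rate evenly. The rates, dead time, and detector counts are hypothetical, and this is not the exact loss derivation from the reported work.

      import numpy as np

      def unblocked_fraction(rate_per_detector, dead_time):
          """Steady-state probability that a single non-paralyzable detector is
          unblocked, for a Poisson arrival rate (counts/s) and fixed dead time (s)."""
          return 1.0 / (1.0 + rate_per_detector * dead_time)

      def blocking_loss_db(total_rate, n_detectors, dead_time):
          """Approximate blocking loss (dB) for an array of n_detectors, treating
          the unblocked probability as a scaling of the detection efficiency."""
          p = unblocked_fraction(total_rate / n_detectors, dead_time)
          return -10.0 * np.log10(p)

      # Example: 50 Mcount/s incident, 50 ns dead time, 1 vs 4 vs 16 detectors
      for n in (1, 4, 16):
          print(n, round(blocking_loss_db(50e6, n, 50e-9), 2), "dB")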

  10. PSA discriminator influence on (222)Rn efficiency detection in waters by liquid scintillation counting.

    PubMed

    Stojković, Ivana; Todorović, Nataša; Nikolov, Jovana; Tenjović, Branislava

    2016-06-01

    A procedure for the (222)Rn determination in aqueous samples using liquid scintillation counting (LSC) was evaluated and optimized. Measurements were performed with the ultra-low background spectrometer Quantulus 1220™, equipped with a PSA (Pulse Shape Analysis) circuit that discriminates alpha from beta spectra. Since the calibration procedure is carried out with a (226)Ra standard, which has both alpha- and beta-emitting progeny, the PSA discriminator level is of vital importance for precise spectrum separation. The calibration procedure was improved by investigating how the PSA discriminator level and, consequently, the activity of the (226)Ra calibration standard influence the (222)Rn detection efficiency. Quench effects on the generated spectra, i.e., on the determination of the radon detection efficiency, were also investigated and a quench calibration curve was obtained. The modified procedure for radon determination in waters, which depends on the activity of the (226)Ra standard used and on the PSA setting, was evaluated with prepared (226)Ra solution samples and with drinking water samples, including an assessment of the variation in measurement uncertainty. Copyright © 2016 Elsevier Ltd. All rights reserved.
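
    As a generic illustration of how a detection efficiency obtained from a calibration standard feeds into the activity of an unknown sample (this is not the Quantulus-specific procedure of the paper; the count rates, standard activity, and sample volume are hypothetical):

      def detection_efficiency(std_cpm, bkg_cpm, std_activity_bq):
          """Counting efficiency (counts per decay of the calibrated nuclide) from a
          standard of known activity; cpm values are converted to counts per second.
          For 222Rn counted with its progeny in equilibrium this value can exceed 1."""
          net_cps = (std_cpm - bkg_cpm) / 60.0
          return net_cps / std_activity_bq

      def radon_concentration(sample_cpm, bkg_cpm, efficiency, volume_l):
          """Activity concentration (Bq/L) of a water sample."""
          net_cps = (sample_cpm - bkg_cpm) / 60.0
          return net_cps / (efficiency * volume_l)

      eff = detection_efficiency(std_cpm=1142.5, bkg_cpm=2.5, std_activity_bq=20.0)
      print(round(radon_concentration(sample_cpm=120.0, bkg_cpm=2.5,
                                      efficiency=eff, volume_l=0.01), 1))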

  11. A splay tree-based approach for efficient resource location in P2P networks.

    PubMed

    Zhou, Wei; Tan, Zilong; Yao, Shaowen; Wang, Shipu

    2014-01-01

    Resource location in structured P2P systems has a critical influence on system performance. Existing analytical studies of the Chord protocol have shown some potential improvements in performance. In this paper a new splay tree-based Chord structure called SChord is proposed to improve the efficiency of locating resources. We consider a novel implementation of the Chord finger table (routing table) based on the splay tree. This approach extends the Chord finger table with additional routing entries. An adaptive routing algorithm is proposed for the implementation, and it can be shown that the hop count is significantly reduced without introducing any other protocol overheads. We analyze the hop count of the adaptive routing algorithm, as compared to Chord variants, and demonstrate sharp upper and lower bounds for both worst-case and average-case settings. In addition, we theoretically analyze the hop reduction in SChord and show that SChord can significantly reduce the routing hops as compared to Chord. Several simulations are presented to evaluate the performance of the algorithm and support our analytical findings. The simulation results show the efficiency of SChord.
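
    The paper's SChord structure is not reproduced here; the sketch below only shows the baseline mechanism it builds on, namely greedy finger-table routing on a Chord-style identifier ring, where each additional routing entry lets a lookup cover more of the ring per hop. The ring size, node IDs, and helper names are hypothetical.

      # Minimal, illustrative Chord-style lookup over an m-bit identifier ring.
      M = 8
      RING = 2 ** M

      def in_interval(x, a, b):
          """True if x lies in the half-open ring interval (a, b]."""
          if a < b:
              return a < x <= b
          return x > a or x <= b

      class Node:
          def __init__(self, node_id, all_ids):
              self.id = node_id
              ids = sorted(all_ids)
              # finger[i] = first live node >= id + 2**i (mod RING)
              self.fingers = []
              for i in range(M):
                  start = (node_id + 2 ** i) % RING
                  succ = min((n for n in ids if n >= start), default=ids[0])
                  self.fingers.append(succ)

      def lookup(nodes, start_id, key):
          """Greedy finger-table routing; returns (responsible node, hop count)."""
          current, hops = start_id, 0
          while not in_interval(key, current, nodes[current].fingers[0]):
              nxt = current
              for f in reversed(nodes[current].fingers):
                  # jump to the closest preceding finger of the key
                  if in_interval(f, current, (key - 1) % RING):
                      nxt = f
                      break
              if nxt == current:
                  break
              current, hops = nxt, hops + 1
          return nodes[current].fingers[0], hops

      ids = [1, 18, 44, 69, 101, 150, 200, 240]
      nodes = {i: Node(i, ids) for i in ids}
      print(lookup(nodes, 1, 130))   # key 130 is owned by node 150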

  12. Characteristics and performance of thin LaBr3(Ce) crystal for X-ray astronomy

    NASA Astrophysics Data System (ADS)

    Manchanda, R. K.

    Lanthanum bromide is the latest crystal in the family of scintillation counters and has advantages over conventional room-temperature detectors. It has a high atomic number, high light yield, and fast decay time compared to NaI(Tl); therefore, the energy resolution of a LaBr3 detector is superior and its detection efficiency is higher. In the recent past, laboratory studies have generally been made using thick crystal geometries (1.5×1.5-inch and 2×2-inch). Similarly, simulation studies are in progress for the use of LaBr3 detectors in ground-based high energy physics experiments. The background counting rate of a LaBr3 crystal is affected by internal radioactivity from the naturally occurring radioisotopes 138La and 227Ac, much as the sodium iodide detector is affected by iodine isotopes. We have developed a new detector using a thin lanthanum bromide crystal (3×30-mm) for use in X-ray astronomy. The instrument was launched on a high altitude balloon flight on Dec. 21, 2007, which reached a ceiling altitude of 4.3 mb. A background counting rate of 1.6 × 10⁻² ct cm⁻² s⁻¹ keV⁻¹ sr⁻¹ was observed at the ceiling altitude. This paper describes the details of the electronics hardware, the energy resolution, and the background characteristics of the detector at ceiling altitude.

  13. Use of immunomagnetic separation for the detection of Desulfovibrio vulgaris from environmental samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, R.; Hazen, T.C.; Joyner, D.C.

    2011-04-15

    Immunomagnetic separation (IMS) has proved highly efficient for recovering microorganisms from heterogeneous samples. Current investigation targeted the separation of viable cells of the sulfate-reducing bacterium, Desulfovibrio vulgaris. Streptavidin-coupled paramagnetic beads and biotin labeled antibodies raised against surface antigens of this microorganism were used to capture D. vulgaris cells in both bioreactor grown laboratory samples and from extremely low-biomass environmental soil and subsurface drilling samples. Initial studies on detection, recovery efficiency and viability for IMS were performed with laboratory grown D. vulgaris cells using various cell densities. Efficiency of cell isolation and recovery (i.e., release of the microbial cells from the beads following separation) was followed by microscopic imaging and acridine orange direct counts (AODC). Excellent recovery efficiency encouraged the use of IMS to capture Desulfovibrio spp. cells from low-biomass environmental samples. The environmental samples were obtained from a radionuclide-contaminated site in Germany and the chromium (VI)-contaminated Hanford site, an ongoing bioremediation project of the U.S. Department of Energy. Field deployable IMS technology may greatly facilitate environmental sampling and bioremediation process monitoring and enable transcriptomics and proteomics/metabolomics-based studies directly on cells collected from the field.

  14. QUENCH: A software package for the determination of quenching curves in Liquid Scintillation counting.

    PubMed

    Cassette, Philippe

    2016-03-01

    In Liquid Scintillation Counting (LSC), the scintillating source is part of the measurement system, and its detection efficiency varies with the scintillator used, the vial, and the volume and chemistry of the sample. The detection efficiency is generally determined using a quenching curve, describing, for a specific radionuclide, the relationship between a quenching index given by the counter and the detection efficiency. A set of quenched LS standard sources is prepared by adding a quenching agent, and the quenching index and detection efficiency are determined for each source. A simple formula is then fitted to the experimental points to define the quenching curve function. The paper describes a software package specifically devoted to the determination of quenching curves with uncertainties. The experimental measurements are described by their quenching index and detection efficiency, with uncertainties on both quantities. Random Gaussian fluctuations of these experimental measurements are sampled, and a polynomial or logarithmic function is fitted to each fluctuation by χ² minimization. This Monte Carlo procedure is repeated many times, and eventually the arithmetic mean and the experimental standard deviation of each parameter are calculated, together with the covariances between these parameters. Using these parameters, the detection efficiency corresponding to an arbitrary quenching index within the measured range can be calculated. The associated uncertainty is calculated with the law of propagation of variances, including the covariance terms. Copyright © 2015 Elsevier Ltd. All rights reserved.
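
    A minimal sketch of the Monte Carlo procedure described above is given below. The calibration points are hypothetical, and an unweighted polynomial least-squares fit stands in for the package's χ² minimization; it is an illustration of the idea, not the QUENCH implementation.

      import numpy as np

      # Hypothetical calibration points: quench index vs. measured detection
      # efficiency, each with a standard uncertainty.
      q    = np.array([850.0, 800.0, 750.0, 700.0, 650.0])
      u_q  = np.array([3.0, 3.0, 3.0, 3.0, 3.0])
      eff  = np.array([0.95, 0.92, 0.87, 0.80, 0.70])
      u_e  = np.array([0.01, 0.01, 0.01, 0.012, 0.015])

      def mc_quench_curve(n_trials=5000, degree=2, rng=np.random.default_rng(1)):
          """Fit a polynomial efficiency(quench index) to Gaussian fluctuations of
          the calibration points; return the mean parameters and their covariance."""
          params = []
          for _ in range(n_trials):
              qi = rng.normal(q, u_q)                      # sample the quench indices
              ei = rng.normal(eff, u_e)                    # sample the efficiencies
              params.append(np.polyfit(qi, ei, degree))    # least-squares fit
          params = np.asarray(params)
          return params.mean(axis=0), np.cov(params, rowvar=False)

      def efficiency_at(q0, mean_params, cov_params):
          """Efficiency at quench index q0, with uncertainty propagated from the
          fitted-parameter covariance (law of propagation of variances)."""
          powers = np.array([q0 ** i for i in range(len(mean_params) - 1, -1, -1)])
          e = float(powers @ mean_params)
          var = float(powers @ cov_params @ powers)
          return e, np.sqrt(var)

      mean_p, cov_p = mc_quench_curve()
      print(efficiency_at(775.0, mean_p, cov_p))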

  15. Large-Area, Low-Cost, High-Efficiency Neutron Detector for Vehicle-Mounted Operation

    NASA Astrophysics Data System (ADS)

    Lacy, Jeffrey L.; Martin, Christopher S.; Athanasiades, Athanasios; Regmi, Murari; Vazquez-Flores, Gerson J.; Davenport, Stephen; King, Nicholas S.; Lyons, Tom

    2017-07-01

    We have developed a large-area, low-cost, high-efficiency neutron detector for vehicle-mounted operation. The detector, which has overall dimensions 12.7 cm x 91.4 cm x 102 cm (5”x36”x40”), a sensitive area equal to 0.85 m2 (1320 in2), and weight of 110 kg (242 lbs), employs an array of 90 boron-coated straw (BCS) detectors. PTI has also developed electronics to minimize cost and space while providing low-noise signal conditioning for both neutron and gamma detection channels, as well as low energy Bluetooth communication with handheld devices. Extremely low power consumption allows continuous use for 225 hours (~10 days) using three AAA lithium-ion rechargeable batteries. We present radiological, mechanical, and environmental tests, collected from four full-scale prototypes. Outdoor neutron-counting tests with a moderated 252Cf source 2 m away from the center of the detector face showed an average detection rate of 5.5 cps/ng with a standard deviation of 0.09 cps/ng over the four individual detector measurements. Measurements showed a gamma rejection ratio of 1.0 × 10⁻⁸, and gamma absolute rejection ratio (GARRn) of 0.93. The prototypes were also operated successfully onboard a moving vehicle for high-speed tests and a long-range 1433-mile, two-day road trip from Houston, TX, USA, to Laurel, MD, USA. Using auxiliary DARPA SIGMA equipment, the GPS, timestamp, gamma and neutron data were transmitted over the cellular network with 10 Hz resolution to a server and real-time tracking website. Mechanical impact and electrostatic discharge testing produced no spurious counts in either the neutron or gamma channels. Ambient environmental temperature testing showed less than ±1% response variation over the range from -30°C to +55°C.

  16. Data indexing techniques for the EUVE all-sky survey

    NASA Technical Reports Server (NTRS)

    Lewis, J.; Saba, V.; Dobson, C.

    1992-01-01

    This poster describes techniques developed for manipulating large full-sky data sets for the Extreme Ultraviolet Explorer project. The authors adapted the quadrilateralized cubic sphere indexing algorithm to efficiently store and process several types of large data sets, such as full-sky maps of photon counts, exposure time, and count rates. A variation of this scheme is used to index sparser data such as individual photon events and viewing times for selected areas of the sky, which are eventually used to create EUVE source catalogs.
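
    As a rough illustration of cube-based sky indexing, the sketch below projects a sky direction onto one of the six faces of a cube and returns a face and pixel index. It is a bare gnomonic face projection; the actual quadrilateralized cubic sphere scheme adds a distortion correction to make pixels nearly equal-area, which is omitted here, and the resolution parameter is hypothetical.

      import numpy as np

      def radec_to_face_pixel(ra_deg, dec_deg, n_side=512):
          """Simplified cube-face indexing: map (RA, Dec) to (face, ix, iy)."""
          ra, dec = np.radians(ra_deg), np.radians(dec_deg)
          v = np.array([np.cos(dec) * np.cos(ra),
                        np.cos(dec) * np.sin(ra),
                        np.sin(dec)])
          axis = int(np.argmax(np.abs(v)))          # dominant axis picks the face pair
          sign = 1 if v[axis] > 0 else -1
          face = axis * 2 + (0 if sign > 0 else 1)  # faces 0..5
          others = [i for i in range(3) if i != axis]
          # tangent-plane coordinates in [-1, 1] on the chosen face
          a = v[others[0]] / (sign * v[axis])
          b = v[others[1]] / (sign * v[axis])
          ix = int((a + 1.0) / 2.0 * (n_side - 1))
          iy = int((b + 1.0) / 2.0 * (n_side - 1))
          return face, ix, iy

      print(radec_to_face_pixel(10.0, 45.0))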

  17. 34 CFR 410.5 - What definitions apply?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... student has enrolled. Credit earned by the student for purposes of obtaining a high school degree or its... classes offered during a summer term must be counted toward the computation of the Indian student count in... summer term must be counted toward the computation of the Indian student count if the institution at...

  18. 34 CFR 410.5 - What definitions apply?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... student has enrolled. Credit earned by the student for purposes of obtaining a high school degree or its... classes offered during a summer term must be counted toward the computation of the Indian student count in... summer term must be counted toward the computation of the Indian student count if the institution at...

  19. Royal jelly attenuates azathioprine induced toxicity in rats.

    PubMed

    Ahmed, Walaa M S; Khalaf, A A; Moselhy, Walaa A; Safwat, Ghada M

    2014-01-01

    In the present study, we investigated the potential protective effects of royal jelly against azathioprine-induced toxicity in the rat. Intraperitoneal administration of azathioprine (50 mg/kg B.W.) induced a significant decrease in RBC count, Hb concentration, PCV%, WBC count, differential count, platelet count, and hepatic antioxidant enzymes (reduced glutathione and glutathione S-transferase), as well as an increase in serum transaminase activities (alanine aminotransferase and aspartate aminotransferase), alkaline phosphatase, and malondialdehyde formation. Azathioprine-induced hepatotoxicity was reflected by marked pathological changes in the liver. Oral administration of royal jelly (200 mg/kg B.W.) was efficient in counteracting azathioprine toxicity, as it ameliorated the anemia, leucopenia and thrombocytopenia induced by azathioprine. Furthermore, royal jelly exerted significant protection against liver damage induced by azathioprine through reduction of the elevated activities of serum hepatic enzymes. Moreover, royal jelly blocked azathioprine-induced lipid peroxidation by decreasing malondialdehyde formation. In conclusion, royal jelly possesses a capability to attenuate azathioprine-induced toxicity. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. A multispectral photon-counting double random phase encoding scheme for image authentication.

    PubMed

    Yi, Faliu; Moon, Inkyu; Lee, Yeon H

    2014-05-20

    In this paper, we propose a new method for color image-based authentication that combines multispectral photon-counting imaging (MPCI) and double random phase encoding (DRPE) schemes. The sparsely distributed information from MPCI and the stationary white noise signal from DRPE make intruder attacks difficult. In this authentication method, the original multispectral RGB color image is down-sampled into a Bayer image. The three types of color samples (red, green and blue color) in the Bayer image are encrypted with DRPE and the amplitude part of the resulting image is photon counted. The corresponding phase information that has nonzero amplitude after photon counting is then kept for decryption. Experimental results show that the retrieved images from the proposed method do not visually resemble their original counterparts. Nevertheless, the original color image can be efficiently verified with statistical nonlinear correlations. Our experimental results also show that different interpolation algorithms applied to Bayer images result in different verification effects for multispectral RGB color images.
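
    For illustration, photon-counting an amplitude image is commonly modeled as Poisson sampling of the normalized irradiance; a minimal sketch of that model, with a hypothetical photon budget and a random field standing in for the DRPE-encrypted output, might look like this. It is not the authors' implementation.

      import numpy as np

      def photon_count_amplitude(amplitude, n_photons, rng=np.random.default_rng(0)):
          """Photon-limited version of an amplitude image: the expected photon number
          at each pixel is proportional to the normalized irradiance and the realized
          counts are Poisson distributed. Returns the counts and a mask of pixels
          whose phase would be retained for decryption (counts > 0)."""
          irradiance = np.abs(amplitude) ** 2
          probs = irradiance / irradiance.sum()
          counts = rng.poisson(n_photons * probs)
          return counts, counts > 0

      # Random complex field standing in for one encrypted color channel
      field = (np.random.default_rng(1).normal(size=(64, 64))
               + 1j * np.random.default_rng(2).normal(size=(64, 64)))
      counts, keep_phase = photon_count_amplitude(field, n_photons=2000)
      print(counts.sum(), round(keep_phase.mean(), 3))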
