Sample records for background count rates

  1. Compton suppression gamma-counting: The effect of count rate

    USGS Publications Warehouse

    Millard, H.T.

    1984-01-01

Past research has shown that anti-coincidence shielded Ge(Li) spectrometers enhance the signal-to-background ratios for gamma photopeaks situated on high Compton backgrounds. Ordinarily, an anti- or non-coincidence spectrum (A) and a coincidence spectrum (C) are collected simultaneously with these systems. To be useful in neutron activation analysis (NAA), the fractions of the photopeak counts routed to the two spectra must be constant from sample to sample, or variations must be corrected quantitatively. Most Compton suppression counting has been done at low count rates, but in NAA applications, count rates may be much higher. To operate over the wider dynamic range, the effect of count rate on the ratio of the photopeak counts in the two spectra (A/C) was studied. It was found that as the count rate increases, A/C decreases for gammas not coincident with other gammas from the same decay. For gammas coincident with other gammas, A/C increases to a maximum and then decreases. These results suggest that calibration curves are required to correct photopeak areas so quantitative data can be obtained at higher count rates.

  2. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage of this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (e.g., different efficiencies at different count rates and energies), etc. This is therefore a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution.

  3. Initial Characterization of Unequal-Length, Low-Background Proportional Counters for Absolute Gas-Counting Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, Emily K.; Aalseth, Craig E.; Bonicalzi, Ricco

Characterization of two sets of custom unequal-length proportional counters is underway at Pacific Northwest National Laboratory (PNNL). These detectors will be used in measurements to determine the absolute activity concentration of gaseous radionuclides (e.g., 37Ar). A set of three detectors has been fabricated based on previous PNNL ultra-low-background proportional counter (ULBPC) designs and now operates in PNNL’s shallow underground counting laboratory. A second set of four counters has also been fabricated using clean assembly of OFHC copper components for use in an above-ground counting laboratory. Characterization of both sets of detectors is underway with measurements of background rates, gas gain, energy resolution, and shielding considerations. These results will be presented along with uncertainty estimates of future absolute gas-counting measurements.

  4. Low Background Counting at LBNL

    DOE PAGES

    Smith, A. R.; Thomas, K. J.; Norman, E. B.; ...

    2015-03-24

The Low Background Facility (LBF) at Lawrence Berkeley National Laboratory in Berkeley, California provides low background gamma spectroscopy services to a wide array of experiments and projects. The analysis of samples takes place within two unique facilities: locally, within a carefully constructed, low background cave, and remotely, at an underground location that historically operated in Oroville, CA, but has recently been relocated to the Sanford Underground Research Facility (SURF) in Lead, SD. These facilities provide a variety of gamma spectroscopy services to low background experiments, primarily in the form of passive material screening for primordial radioisotopes (U, Th, K) or common cosmogenic/anthropogenic products, as well as active screening via Neutron Activation Analysis for specific applications. The LBF also provides hosting services for general R&D testing in low background environments on the surface or underground, such as background testing of detector systems or similar prototyping. A general overview of the facilities, services, and sensitivities is presented. Recent activities and upgrades will also be presented, such as the completion of a 3π anticoincidence shield at the surface station and environmental monitoring of Fukushima fallout. The LBF is open to any users for counting services or collaboration on a wide variety of experiments and projects.

  5. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    PubMed

    Klumpp, John; Brandl, Alexander

    2015-03-01

A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources than analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a detector flown over a waste storage facility. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection system are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
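The time-interval idea above can be sketched numerically. This is a hedged illustration, not code from the paper: under a pure-background Poisson hypothesis with rate b, the waiting time T for N counts satisfies P(T ≤ t) = P(Poisson(bt) ≥ N), so a suspiciously short waiting time yields a small p-value. All function names and rates here are invented for illustration.

```python
import math
import random

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), via the complement of the CDF."""
    cdf = sum(math.exp(-mu) * mu**i / math.factorial(i) for i in range(k))
    return 1.0 - cdf

def interval_pvalue(intervals, background_rate):
    """p-value for the total waiting time of N counts under background alone.

    Uses the duality P(T_N <= t) = P(Poisson(b*t) >= N): too short a waiting
    time for N counts is the same event as too many counts in time t."""
    n = len(intervals)
    t = sum(intervals)
    return poisson_sf(n, background_rate * t)

random.seed(7)
background_rate = 1.0   # counts per second (illustrative)
source_rate = 4.0       # extra counts per second from a weak source (illustrative)
intervals = [random.expovariate(background_rate + source_rate) for _ in range(20)]
p = interval_pvalue(intervals, background_rate)
```

With a source present, the 20 inter-count intervals are short and the p-value is tiny; the same 20 intervals drawn from background alone would give an unremarkable p-value.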

  6. Soudan Low Background Counting Facility (SOLO)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Attisha, Michael; Viveiros, Luiz de; Gaitskell, Richard

    2005-09-08

    The Soudan Low Background Counting Facility (SOLO) has been in operation at the Soudan Mine, MN since March 2003. In the past two years, we have gamma-screened samples for the Majorana, CDMS and XENON experiments. With individual sample exposure times of up to two weeks we have measured sample contamination down to the 0.1 ppb level for 238U / 232Th, and down to the 0.25 ppm level for 40K.

  7. LINEAR COUNT-RATE METER

    DOEpatents

    Henry, J.J.

    1961-09-01

    A linear count-rate meter is designed to provide a highly linear output while receiving counting rates from one cycle per second to 100,000 cycles per second. Input pulses enter a linear discriminator and then are fed to a trigger circuit which produces positive pulses of uniform width and amplitude. The trigger circuit is connected to a one-shot multivibrator. The multivibrator output pulses have a selected width. Feedback means are provided for preventing transistor saturation in the multivibrator which improves the rise and decay times of the output pulses. The multivibrator is connected to a diode-switched, constant current metering circuit. A selected constant current is switched to an averaging circuit for each pulse received, and for a time determined by the received pulse width. The average output meter current is proportional to the product of the counting rate, the constant current, and the multivibrator output pulse width.

  8. Background characterization of an ultra-low background liquid scintillation counter

    DOE PAGES

    Erchinger, J. L.; Orrell, John L.; Aalseth, C. E.; ...

    2017-01-26

The Ultra-Low Background Liquid Scintillation Counter developed by Pacific Northwest National Laboratory will expand the application of liquid scintillation counting by enabling lower detection limits and smaller sample volumes. By reducing the overall count rate of the background environment approximately 2 orders of magnitude below that of commercially available systems, backgrounds on the order of tens of counts per day over an energy range of ~3–3600 keV can be realized. Initial test results of the ULB LSC show promise for ultra-low background detection with liquid scintillation counting.

  9. Compensated count-rate circuit for radiation survey meter

    DOEpatents

    Todd, Richard A.

    1981-01-01

A count-rate compensating circuit is provided which may be used in a portable Geiger-Mueller (G-M) survey meter to ideally compensate for counting loss errors in the G-M tube detector. In a G-M survey meter, wherein the pulse rate from the G-M tube is converted into a pulse rate current applied to a current meter calibrated to indicate dose rate, the compensation circuit generates and controls a reference voltage in response to the rate of pulses from the detector. This reference voltage is gated to the current-generating circuit at a rate identical to the rate of pulses coming from the detector so that the current flowing through the meter is varied in accordance with both the frequency and amplitude of the reference voltage pulses applied thereto so that the count rate is compensated ideally to indicate a true count rate within 1% up to a 50% duty cycle for the detector. A positive feedback circuit is used to control the reference voltage so that the meter output tracks true count rate indicative of the radiation dose rate.
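The patent performs this compensation in analog hardware. As a hedged software sketch (not the patented circuit), the standard nonparalyzable dead-time correction it approximates is n = m/(1 − mτ), where m is the measured rate and τ the dead time per count; the "50% duty cycle" in the abstract corresponds to mτ = 0.5.

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Correct an observed count rate for nonparalyzable dead time.

    measured_rate: observed counts per second
    dead_time:     detector dead time per count, in seconds
    Valid only while measured_rate * dead_time < 1 (the model saturates at 1/dead_time).
    """
    duty = measured_rate * dead_time
    if duty >= 1.0:
        raise ValueError("measured rate saturates the nonparalyzable model")
    return measured_rate / (1.0 - duty)
```

For example, with τ = 5 µs and a measured 40,000 counts/s (duty cycle 0.2), the corrected rate is 50,000 counts/s, a 25% counting-loss correction.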

  10. The number counts and infrared backgrounds from infrared-bright galaxies

    NASA Technical Reports Server (NTRS)

    Hacking, P. B.; Soifer, B. T.

    1991-01-01

    Extragalactic number counts and diffuse backgrounds at 25, 60, and 100 microns are predicted using new luminosity functions and improved spectral-energy distribution density functions derived from IRAS observations of nearby galaxies. Galaxies at redshifts z less than 3 that are like those in the local universe should produce a minimum diffuse background of 0.0085, 0.038, and 0.13 MJy/sr at 25, 60, and 100 microns, respectively. Models with significant luminosity evolution predict backgrounds about a factor of 4 greater than this minimum.

  11. Compensated count-rate circuit for radiation survey meter

    DOEpatents

    Todd, R.A.

    1980-05-12

A count-rate compensating circuit is provided which may be used in a portable Geiger-Mueller (G-M) survey meter to ideally compensate for counting loss errors in the G-M tube detector. In a G-M survey meter, wherein the pulse rate from the G-M tube is converted into a pulse rate current applied to a current meter calibrated to indicate dose rate, the compensation circuit generates and controls a reference voltage in response to the rate of pulses from the detector. This reference voltage is gated to the current-generating circuit at a rate identical to the rate of pulses coming from the detector so that the current flowing through the meter is varied in accordance with both the frequency and amplitude of the reference voltage pulses applied thereto so that the count rate is compensated ideally to indicate a true count rate within 1% up to a 50% duty cycle for the detector. A positive feedback circuit is used to control the reference voltage so that the meter output tracks true count rate indicative of the radiation dose rate.

  12. Initial characterization of unequal-length, low-background proportional counters for absolute gas-counting applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, E. K.; Aalseth, C. E.; Bonicalzi, R.

Characterization of two sets of custom unequal-length proportional counters is underway at Pacific Northwest National Laboratory (PNNL). These detectors will be used in measurements to determine the absolute activity concentration of gaseous radionuclides (e.g., 37Ar). A set of three detectors has been fabricated based on previous PNNL ultra-low-background proportional counter designs and now operates in PNNL's shallow underground counting laboratory. A second set of four counters has also been fabricated using clean assembly of Oxygen-Free High-Conductivity copper components for use in a shielded above-ground counting laboratory. Characterization of both sets of detectors is underway with measurements of background rates, gas gain, and energy resolution. These results will be presented along with a shielding study for the above-ground cave.

  13. The piecewise-linear dynamic attenuator reduces the impact of count rate loss with photon-counting detectors

    NASA Astrophysics Data System (ADS)

    Hsieh, Scott S.; Pelc, Norbert J.

    2014-06-01

Photon counting x-ray detectors (PCXDs) offer several advantages compared to standard energy-integrating x-ray detectors, but also face significant challenges. One key challenge is the high count rates required in CT. At high count rates, PCXDs exhibit count rate loss and show reduced detective quantum efficiency in signal-rich (or high flux) measurements. In order to reduce count rate requirements, a dynamic beam-shaping filter can be used to redistribute flux incident on the patient. We study the piecewise-linear attenuator in conjunction with PCXDs without energy discrimination capabilities. We examined three detector models: the classic nonparalyzable and paralyzable detector models, and a ‘hybrid’ detector model, a weighted average of the two that approximates an existing, real detector (Taguchi et al 2011 Med. Phys. 38 1089-102). We derive analytic expressions for the variance of the CT measurements for these detectors. These expressions are used with raw data estimated from DICOM image files of an abdomen and a thorax to estimate variance in reconstructed images for both the dynamic attenuator and a static beam-shaping (‘bowtie’) filter. By redistributing flux, the dynamic attenuator reduces dose by 40% without increasing peak variance for the ideal detector. For non-ideal PCXDs, the impact of count rate loss is also reduced. The nonparalyzable detector shows little impact from count rate loss, but with the paralyzable model, count rate loss leads to noise streaks that can be controlled with the dynamic attenuator. With the hybrid model, the characteristic count rates required before noise streaks dominate the reconstruction are reduced by a factor of 2 to 3. We conclude that the piecewise-linear attenuator can reduce the count rate requirements of the PCXD in addition to improving dose efficiency. The magnitude of this reduction depends on the detector, with paralyzable detectors showing much greater benefit than nonparalyzable detectors.
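The three detector models named above have well-known rate equations: observed rate m = n/(1 + nτ) for the nonparalyzable model, m = n·e^(−nτ) for the paralyzable model, and a weighted average for the hybrid. A hedged sketch (the parameter values are illustrative, not taken from the paper):

```python
import math

def observed_rate(true_rate, dead_time, weight_paralyzable=0.5):
    """Observed count rate for a detector with dead time.

    weight_paralyzable = 0.0 gives the classic nonparalyzable model,
    1.0 gives the paralyzable model, and values between give the
    'hybrid' weighted average of the two.
    """
    nonpar = true_rate / (1.0 + true_rate * dead_time)
    par = true_rate * math.exp(-true_rate * dead_time)
    return weight_paralyzable * par + (1.0 - weight_paralyzable) * nonpar
```

The key qualitative difference the abstract relies on is visible directly: the nonparalyzable output saturates toward 1/τ as flux grows, while the paralyzable output peaks at n = 1/τ and then rolls over, which is what produces photon-starved noise streaks at high flux.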

  14. Reference analysis of the signal + background model in counting experiments

    NASA Astrophysics Data System (ADS)

    Casadei, D.

    2012-01-01

The model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered from a Bayesian point of view. This is a widely used model for searches of rare or exotic events in the presence of a background source, as for example in the searches performed by high-energy physics experiments. Under the assumption of prior knowledge about the background yield, a reference prior is obtained for the signal alone and its properties are studied. Finally, the properties of the full solution, the marginal reference posterior, are illustrated with a few examples.

  15. Relationship between salivary flow rates and Candida albicans counts.

    PubMed

    Navazesh, M; Wood, G J; Brightman, V J

    1995-09-01

Seventy-one persons (48 women, 23 men; mean age, 51.76 years) were evaluated for salivary flow rates and Candida albicans counts. Each person was seen on three different occasions. Samples of unstimulated whole, chewing-stimulated whole, acid-stimulated parotid, and candy-stimulated parotid saliva were collected under standardized conditions. An oral rinse was also obtained and evaluated for Candida albicans counts. Unstimulated and chewing-stimulated whole flow rates were negatively and significantly (p < 0.001) related to the Candida counts. Unstimulated whole saliva significantly (p < 0.05) differed in persons with Candida counts of 0 versus <500 versus ≥500. Chewing-stimulated saliva was significantly (p < 0.05) different in persons with 0 counts compared with those with a ≥500 count. Differences in stimulated parotid flow rates were not significant among different levels of Candida counts. The results of this study reveal that whole saliva is a better predictor than parotid saliva in the identification of persons with high Candida albicans counts.

  16. Relationship of milking rate to somatic cell count.

    PubMed

    Brown, C A; Rischette, S J; Schultz, L H

    1986-03-01

    Information on milking rate, monthly bucket somatic cell counts, mastitis treatment, and milk production was obtained from 284 lactations of Holstein cows separated into three lactation groups. Significant correlations between somatic cell count (linear score) and other parameters included production in lactation 1 (-.185), production in lactation 2 (-.267), and percent 2-min milk in lactation 2 (.251). Somatic cell count tended to increase with maximum milking rate in all lactations, but correlations were not statistically significant. Twenty-nine percent of cows with milking rate measurements were treated for clinical mastitis. Treated cows in each lactation group produced less milk than untreated cows. In the second and third lactation groups, treated cows had a shorter total milking time and a higher percent 2-min milk than untreated cows, but differences were not statistically significant. Overall, the data support the concept that faster milking cows tend to have higher cell counts and more mastitis treatments, particularly beyond first lactation. However, the magnitude of the relationship was small.

  17. Pneumotachometer counts respiration rate of human subject

    NASA Technical Reports Server (NTRS)

    Graham, O.

    1964-01-01

To monitor breaths per minute, two rate-to-analog converters are used alternately to read and count the respiratory rate from an impedance pneumograph; the rate is displayed numerically on electroluminescent matrices.

  18. TU-FG-209-03: Exploring the Maximum Count Rate Capabilities of Photon Counting Arrays Based On Polycrystalline Silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, A K; Koniczek, M; Antonuk, L E

Purpose: Photon counting arrays (PCAs) offer several advantages over conventional, fluence-integrating x-ray imagers, such as improved contrast by means of energy windowing. For that reason, we are exploring the feasibility and performance of PCA pixel circuitry based on polycrystalline silicon. This material, unlike the crystalline silicon commonly used in photon counting detectors, lends itself toward the economic manufacture of radiation-tolerant, monolithic large-area (e.g., ~43×43 cm²) devices. In this presentation, exploration of maximum count rate, a critical performance parameter for such devices, is reported. Methods: Count rate performance for a variety of pixel circuit designs was explored through detailed circuit simulations over a wide range of parameters (including pixel pitch and operating conditions), with the additional goal of preserving good energy resolution. The count rate simulations assume input events corresponding to a 72 kVp x-ray spectrum with 20 mm Al filtration interacting with a CZT detector at various input flux rates. Output count rates are determined at various photon energy threshold levels, and the percentage of counts lost (e.g., due to deadtime or pile-up) is calculated from the ratio of output to input counts. The energy resolution simulations involve thermal and flicker noise originating from each circuit element in a design. Results: Circuit designs compatible with pixel pitches ranging from 250 to 1000 µm that allow count rates over a megacount per second per pixel appear feasible. Such rates are expected to be suitable for radiographic and fluoroscopic imaging. Results for the analog front-end circuitry of the pixels show that acceptable energy resolution can also be achieved. Conclusion: PCAs created using polycrystalline silicon have the potential to offer monolithic large-area detectors with count rate performance comparable to those of crystalline silicon detectors. Further improvement through detailed

  19. Multianode cylindrical proportional counter for high count rates

    DOEpatents

    Hanson, J.A.; Kopp, M.K.

    1980-05-23

A cylindrical, multiple-anode proportional counter is provided for counting of low-energy photons (<60 keV) at count rates of greater than 10⁵ counts/sec. A gas-filled proportional counter cylinder forming an outer cathode is provided with a central coaxially disposed inner cathode and a plurality of anode wires disposed in a cylindrical array in coaxial alignment with and between the inner and outer cathodes to form a virtual cylindrical anode coaxial with the inner and outer cathodes. The virtual cylindrical anode configuration improves the electron drift velocity by providing a more uniform field strength throughout the counter gas volume, thus decreasing the electron collection time following the detection of an ionizing event. This avoids pulse pile-up and coincidence losses at these high count rates. Conventional RC position encoding detection circuitry may be employed to extract the spatial information from the counter anodes.

  20. Multianode cylindrical proportional counter for high count rates

    DOEpatents

    Hanson, James A.; Kopp, Manfred K.

    1981-01-01

A cylindrical, multiple-anode proportional counter is provided for counting of low-energy photons (<60 keV) at count rates of greater than 10⁵ counts/sec. A gas-filled proportional counter cylinder forming an outer cathode is provided with a central coaxially disposed inner cathode and a plurality of anode wires disposed in a cylindrical array in coaxial alignment with and between the inner and outer cathodes to form a virtual cylindrical anode coaxial with the inner and outer cathodes. The virtual cylindrical anode configuration improves the electron drift velocity by providing a more uniform field strength throughout the counter gas volume, thus decreasing the electron collection time following the detection of an ionizing event. This avoids pulse pile-up and coincidence losses at these high count rates. Conventional RC position encoding detection circuitry may be employed to extract the spatial information from the counter anodes.

  1. Automatic vehicle counting using background subtraction method on gray scale images and morphology operation

    NASA Astrophysics Data System (ADS)

    Adi, K.; Widodo, A. P.; Widodo, C. E.; Pamungkas, A.; Putranto, A. B.

    2018-05-01

Traffic monitoring requires counting the number of vehicles passing a road, particularly for highway transportation management. Therefore, it is necessary to develop a system that can count the number of vehicles automatically, and video processing makes this possible. This research developed a vehicle counting system for a toll road. The system includes video acquisition, frame extraction, and image processing of each frame. Video was acquired in the morning, at noon, in the afternoon, and in the evening. The system employs background subtraction and morphology methods on gray-scale images for vehicle counting. The best vehicle counting results were obtained in the morning, with a counting accuracy of 86.36%, whereas the lowest accuracy, 21.43%, occurred in the evening. The difference between the morning and evening results is caused by the different illumination at those times, which changes the values of the image pixels.
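The pipeline described above (background subtraction on gray-scale images, a morphological opening, then counting the remaining blobs) can be sketched on synthetic frames. This is a hedged illustration, not the paper's implementation; all names, sizes, and thresholds are invented.

```python
import numpy as np

def morph_3x3(mask, op):
    """3x3 morphological 'erode' (AND over the neighborhood) or 'dilate' (OR)."""
    padded = np.pad(mask, 1, constant_values=False)
    h, w = mask.shape
    acc = None
    for dy in (0, 1, 2):
        for dx in (0, 1, 2):
            window = padded[dy:dy + h, dx:dx + w]
            if acc is None:
                acc = window.copy()
            elif op == "erode":
                acc &= window
            else:
                acc |= window
    return acc

def count_vehicles(frame, background, threshold=50):
    """Background subtraction, opening (erode then dilate), then blob counting."""
    mask = np.abs(frame.astype(int) - background.astype(int)) > threshold
    mask = morph_3x3(morph_3x3(mask, "erode"), "dilate")  # opening removes specks
    seen = np.zeros_like(mask, dtype=bool)
    count = 0
    for y, x in zip(*np.nonzero(mask)):          # connected components, 4-connectivity
        if seen[y, x]:
            continue
        count += 1
        stack = [(y, x)]
        while stack:
            cy, cx = stack.pop()
            if not (0 <= cy < mask.shape[0] and 0 <= cx < mask.shape[1]):
                continue
            if seen[cy, cx] or not mask[cy, cx]:
                continue
            seen[cy, cx] = True
            stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count
```

The morphological opening is what discards isolated noisy pixels (e.g., illumination flicker) while keeping vehicle-sized regions, which is the paper's motivation for combining background subtraction with morphology.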

  2. Poisson mixture model for measurements using counting.

    PubMed

    Miller, Guthrie; Justus, Alan; Vostrotin, Vadim; Dry, Donald; Bertelli, Luiz

    2010-03-01

Starting with the basic Poisson statistical model of a counting measurement process, 'extra-Poisson' variance or 'overdispersion' is included by assuming that the Poisson parameter representing the mean number of counts itself comes from another distribution. The Poisson parameter is assumed to be given by the quantity of interest in the inference process multiplied by a lognormally distributed normalising coefficient, plus an additional lognormal background that might be correlated with the normalising coefficient (shared uncertainty). The example of lognormal environmental background in uranium urine data is discussed. An additional uncorrelated background is also included. The uncorrelated background is estimated from a background count measurement using Bayesian arguments. The rather complex formulas are validated using Monte Carlo. An analytical expression is obtained for the probability distribution of gross counts coming from the uncorrelated background, which allows straightforward calculation of a classical decision level in the form of a gross-count alarm point with a desired false-positive rate. The main purpose of this paper is to derive formulas for exact likelihood calculations in the case of various kinds of backgrounds.
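The gross-count alarm point mentioned above can be illustrated in the simplest (pure Poisson) case: the decision level is the smallest integer threshold whose exceedance probability under the background distribution is at or below the desired false-positive rate. A minimal sketch, not the paper's exact likelihood machinery:

```python
import math

def gross_count_alarm(mean_background, false_positive_rate):
    """Smallest integer k with P(X >= k) <= false_positive_rate for X ~ Poisson(mean_background)."""
    cdf = 0.0
    k = 0
    term = math.exp(-mean_background)   # P(X = 0)
    while 1.0 - cdf > false_positive_rate:
        cdf += term                     # after this, cdf = P(X <= k)
        term *= mean_background / (k + 1)
        k += 1
    return k                            # 1 - cdf = P(X >= k) <= alpha on exit
```

For example, with a mean background of 10 counts per counting interval and a 5% false-positive budget, the alarm point is 16 gross counts (P(X ≥ 16 | µ = 10) ≈ 0.049, while P(X ≥ 15) ≈ 0.083 would exceed the budget).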

  3. Reference analysis of the signal + background model in counting experiments II. Approximate reference prior

    NASA Astrophysics Data System (ADS)

    Casadei, D.

    2014-10-01

The objective Bayesian treatment of a model representing two independent Poisson processes, labelled as "signal" and "background" and both contributing additively to the total number of counted events, is considered. It is shown that the reference prior for the parameter of interest (the signal intensity) can be well approximated by the widely (ab)used flat prior only when the expected background is very high. On the other hand, a very simple approximation (the limiting form of the reference prior for perfect prior background knowledge) can be safely used over a large portion of the background parameter space. The resulting approximate reference posterior is a Gamma density whose parameters are related to the observed counts. This limiting form is simpler than the result obtained with a flat prior, with the additional advantage of representing a much closer approximation to the reference posterior in all cases. Hence this limiting prior should be considered a better default or conventional prior than the uniform prior. On the computing side, it is shown that a 2-parameter fitting function is able to reproduce the reference prior extremely well for any background prior. Thus, it can be useful in applications requiring evaluation of the reference prior a very large number of times.
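A hedged numerical illustration of the signal + background model: with known background b and n observed counts, the posterior for the signal s can be evaluated on a grid for either a flat prior or, assuming for illustration a Jeffreys-type limiting form proportional to (s + b)^(−1/2), and the two compared. The grid bounds and example counts are invented; this is not the paper's fitting function.

```python
import math

def posterior_grid(n_obs, b, s_max=30.0, steps=3000, use_reference=True):
    """Posterior density of the signal s on a uniform grid.

    Likelihood: Poisson(n_obs | s + b), background b known exactly.
    Prior: (s + b)**-0.5 (assumed limiting form) or flat (use_reference=False).
    Normalized numerically with the trapezoid rule."""
    ds = s_max / steps
    grid = [i * ds for i in range(steps + 1)]

    def unnormalized(s):
        lam = s + b
        log_lik = n_obs * math.log(lam) - lam - math.lgamma(n_obs + 1)
        log_prior = -0.5 * math.log(lam) if use_reference else 0.0
        return math.exp(log_lik + log_prior)

    vals = [unnormalized(s) for s in grid]
    norm = sum((vals[i] + vals[i + 1]) * 0.5 * ds for i in range(steps))
    return grid, [v / norm for v in vals]
```

With n = 5 counts over a known background b = 2, the flat-prior posterior peaks at s = n − b = 3, while the (s + b)^(−1/2) prior pulls the mode down to s = n − 1/2 − b = 2.5, a small but visible difference of the kind the paper quantifies.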

  4. Cryogenic, high-resolution x-ray detector with high count rate capability

    DOEpatents

    Frank, Matthias; Mears, Carl A.; Labov, Simon E.; Hiller, Larry J.; Barfknecht, Andrew T.

    2003-03-04

A cryogenic, high-resolution X-ray detector with high count rate capability has been invented. The new X-ray detector is based on superconducting tunnel junctions (STJs), and operates without thermal stabilization at or below 500 mK. The X-ray detector exhibits good resolution (~5-20 eV FWHM) for soft X-rays in the keV region, and is capable of counting at count rates of more than 20,000 counts per second (cps). Simple, FET-based charge amplifiers, current amplifiers, or conventional spectroscopy shaping amplifiers can provide the electronic readout of this X-ray detector.

  5. Dynamic time-correlated single-photon counting laser ranging

    NASA Astrophysics Data System (ADS)

    Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang

    2018-03-01

We demonstrate a photon counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10⁷ cps, which makes distance measurement possible even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily distill the signal when fast-moving targets are submerged in strong background. We propose a dynamic TCSPC method for measuring fast-moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected with this method at an echo rate of 20% and background counts of more than 1.2×10⁷ cps.
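Folding photon timestamps by the laser pulse period is the core of TCSPC: an echo at a fixed delay piles up in one histogram bin while background spreads uniformly, and a coincidence window then selects the bins around that peak. A hedged sketch with synthetic timestamps and invented rates (not the experiment's four-channel hardware or its dynamic-window logic):

```python
import random

def folded_histogram(timestamps, period, bins):
    """Histogram of arrival times folded modulo the laser pulse period."""
    hist = [0] * bins
    for t in timestamps:
        hist[int((t % period) / period * bins)] += 1
    return hist

random.seed(1)
period, bins = 1e-3, 100          # 1 ms pulse period, 10 us bins (illustrative)
signal_delay = 0.425e-3           # echo delay after each pulse (illustrative)

ts = [random.uniform(0.0, 1.0) for _ in range(20000)]  # uniform background counts
ts += [k * period + signal_delay + random.gauss(0.0, 1e-6)  # one jittered echo per pulse
       for k in range(1000)]

hist = folded_histogram(ts, period, bins)
peak_bin = hist.index(max(hist))  # a coincidence window would track bins near this peak
```

Even with roughly 20 background counts for every echo, the folded peak stands far above the per-bin background level; the paper's contribution is keeping that window centered on a peak whose delay changes as the target moves.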

  6. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    NASA Technical Reports Server (NTRS)

    Lu, Wei; Krainak, Michael A.; Yang, Guangning; Sun, Xiaoli; Merritt, Scott

    2016-01-01

We present performance data for a low-noise, free-running, high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2×8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the 2-detected-photon threshold and logic AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing, and background noise effects without other methods such as time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (>50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application to space communications and ranging. We compare their performance using both the 2-detected-photon threshold and coincidence methods.

  7. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    NASA Technical Reports Server (NTRS)

Lu, Wei; Krainak, Michael A.; Yang, Guangning; Sun, Xiaoli; Merritt, Scott

    2016-01-01

We present performance data for a low-noise, free-running, high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2×8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the 2-detected-photon threshold and logic AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing, and background noise effects without other methods such as time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (>50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application to space communications and ranging. We compare their performance using both the 2-detected-photon threshold and coincidence methods.

  8. A burst-mode photon counting receiver with automatic channel estimation and bit rate detection

    NASA Astrophysics Data System (ADS)

    Rao, Hemonth G.; DeVoe, Catherine E.; Fletcher, Andrew S.; Gaschits, Igor D.; Hakimi, Farhad; Hamilton, Scott A.; Hardy, Nicholas D.; Ingwersen, John G.; Kaminsky, Richard D.; Moores, John D.; Scheinbart, Marvin S.; Yarnall, Timothy M.

    2016-04-01

    We demonstrate a multi-rate burst-mode photon-counting receiver for undersea communication at data rates up to 10.416 Mb/s over a 30-foot water channel. To the best of our knowledge, this is the first demonstration of burst-mode photon-counting communication. With added attenuation, the maximum link loss is 97.1 dB at λ=517 nm. In clear ocean water, this equates to link distances up to 148 meters. For λ=470 nm, the achievable link distance in clear ocean water is 450 meters. The receiver incorporates soft-decision forward error correction (FEC) based on a product code of an inner LDPC code and an outer BCH code. The FEC supports multiple code rates to achieve error-free performance. We have selected a burst-mode receiver architecture to provide robust performance with respect to unpredictable channel obstructions. The receiver is capable of on-the-fly data rate detection and adapts to changing levels of signal and background light. The receiver updates its phase alignment and channel estimates every 1.6 ms, allowing for rapid changes in water quality as well as motion between transmitter and receiver. We demonstrate on-the-fly rate detection, channel BER within 0.2 dB of theory across all data rates, and error-free performance within 1.82 dB of soft-decision capacity across all tested code rates. All signal processing is done in FPGAs and runs continuously in real time.
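
    Treating the quoted loss budget as pure water attenuation, the link distance follows from dividing the budget by an attenuation coefficient in dB/m. This is a simplification (a real budget also includes geometry and pointing losses), and the coefficient below is back-derived from the record's own numbers rather than quoted:

```python
def max_link_distance(max_loss_db, attenuation_db_per_m):
    """Distance at which cumulative water attenuation uses up the loss budget."""
    return max_loss_db / attenuation_db_per_m

# 97.1 dB and 148 m at 517 nm imply roughly 0.66 dB/m in clear ocean
# water (derived from the record's figures, not stated in it).
alpha_517 = 97.1 / 148.0
```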

  9. The Dependence of Tropical Cyclone Count and Size on Rotation Rate

    NASA Astrophysics Data System (ADS)

    Chavas, D. R.; Reed, K. A.

    2017-12-01

    Both theory and idealized equilibrium modeling studies indicate that tropical cyclone size decreases with background rotation rate. In contrast, in real-world observations size tends to increase with latitude. Here we seek to resolve this apparent contradiction via a set of reduced-complexity global aquaplanet simulations with varying planetary rotation rates using the NCAR Community Atmosphere Model 5. The latitudinal distribution of both storm count and size are found to vary markedly with rotation rate, yielding insight into the dynamical constraints on tropical cyclone activity on a rotating planet. Moreover, storm size is found to vary non-monotonically with latitude, indicating that non-equilibrium effects are crucial to the life-cycle evolution of size in nature. Results are then compared to experiments in idealized, time-dependent limited-area modeling simulations using CM1 in axisymmetric and three-dimensional geometry. Taken together, this hierarchy of models is used to quantify the role of equilibrium versus transient controls on storm size and the relevance of each to real storms in nature.

  10. Reducing the Child Poverty Rate. KIDS COUNT Indicator Brief

    ERIC Educational Resources Information Center

    Shore, Rima; Shore, Barbara

    2009-01-01

    In 2007, nearly one in five or 18 percent of children in the U.S. lived in poverty (KIDS COUNT Data Center, 2009). Many of these children come from minority backgrounds. African American (35 percent), American Indian (33 percent) and Latino (27 percent) children are more likely to live in poverty than their white (11 percent) and Asian (12…

  11. Exploration of maximum count rate capabilities for large-area photon counting arrays based on polycrystalline silicon thin-film transistors

    NASA Astrophysics Data System (ADS)

    Liang, Albert K.; Koniczek, Martin; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua

    2016-03-01

    Pixelated photon counting detectors with energy discrimination capabilities are of increasing clinical interest for x-ray imaging. Such detectors, presently in clinical use for mammography and under development for breast tomosynthesis and spectral CT, usually employ in-pixel circuits based on crystalline silicon - a semiconductor material that is generally not well-suited for economic manufacture of large-area devices. One interesting alternative semiconductor is polycrystalline silicon (poly-Si), a thin-film technology capable of creating very large-area, monolithic devices. Similar to crystalline silicon, poly-Si allows implementation of the type of fast, complex, in-pixel circuitry required for photon counting - operating at processing speeds that are not possible with amorphous silicon (the material currently used for large-area, active matrix, flat-panel imagers). The pixel circuits of two-dimensional photon counting arrays generally comprise four stages: amplifier, comparator, clock generator and counter. The analog front-end (in particular, the amplifier) strongly influences performance and is therefore of interest to study. In this paper, the relationship between incident and output count rate of the analog front-end is explored under diagnostic imaging conditions for a promising poly-Si based design. The input to the amplifier is modeled in the time domain assuming a realistic input x-ray spectrum. Simulations of circuits based on poly-Si thin-film transistors are used to determine the resulting output count rate as a function of input count rate, energy discrimination threshold and operating conditions.
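
    The incident-vs-output count-rate curve of a counting front-end is commonly summarized by dead-time models. The sketch below uses the textbook paralyzable and non-paralyzable formulas as stand-ins for the circuit-level simulations described in the record; the dead time τ is an assumed parameter:

```python
import math

def output_rate_paralyzable(n_in, tau):
    """Paralyzable dead time: every arrival, counted or not, extends the dead window."""
    return n_in * math.exp(-n_in * tau)

def output_rate_nonparalyzable(n_in, tau):
    """Non-paralyzable dead time: only counted events open a fixed dead window."""
    return n_in / (1.0 + n_in * tau)
```

    The paralyzable curve peaks at n_in = 1/τ and then rolls over, which is why maximum count-rate capability is the figure of merit explored here.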

  12. Material screening with HPGe counting station for PandaX experiment

    NASA Astrophysics Data System (ADS)

    Wang, X.; Chen, X.; Fu, C.; Ji, X.; Liu, X.; Mao, Y.; Wang, H.; Wang, S.; Xie, P.; Zhang, T.

    2016-12-01

    A gamma counting station based on a high-purity germanium (HPGe) detector was set up for material screening for the PandaX dark matter experiments in the China Jinping Underground Laboratory. A low background gamma rate of 2.6 counts/min within the energy range of 20 to 2700 keV is achieved thanks to the well-designed passive shield. The sensitivities of the HPGe detector reach the mBq/kg level for isotopes such as K, U and Th, and are even better for Co and Cs, resulting from the low background rate and the high relative detection efficiency of 175%. The structure and performance of the counting station are described in this article. Detailed counting results for the radioactivity in materials used by the PandaX dark-matter experiment are presented. The upgrade plan for the counting station is also discussed.

  13. Note: Operation of gamma-ray microcalorimeters at elevated count rates using filters with constraints

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alpert, B. K.; Horansky, R. D.; Bennett, D. A.

    Microcalorimeter sensors operated near 0.1 K can measure the energy of individual x- and gamma-ray photons with significantly more precision than conventional semiconductor technologies. Both microcalorimeter arrays and higher per pixel count rates are desirable to increase the total throughput of spectrometers based on these devices. The millisecond recovery time of gamma-ray microcalorimeters and the resulting pulse pileup are significant obstacles to high per pixel count rates. Here, we demonstrate operation of a microcalorimeter detector at elevated count rates by use of convolution filters designed to be orthogonal to the exponential tail of a preceding pulse. These filters allow operation at 50% higher count rates than conventional filters while largely preserving sensor energy resolution.
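
    The core trick, a filter constrained to be orthogonal to the exponential tail of a preceding pulse, can be sketched as a projection: subtract from a candidate filter its component along the tail template, so the filter's output no longer responds to that tail. The decay constant and template below are illustrative assumptions, not the instrument's values:

```python
import numpy as np

def tail_orthogonal_filter(template, decay_tau, dt=1.0):
    """Project a candidate filter orthogonal to an exponential tail exp(-t/tau)."""
    t = np.arange(len(template)) * dt
    tail = np.exp(-t / decay_tau)
    # f = template - (<template, tail> / <tail, tail>) * tail  =>  <f, tail> = 0
    return template - (template @ tail) / (tail @ tail) * tail
```

    Applying the projected filter to a record contaminated by a preceding pulse's tail then yields the same amplitude estimate as on a clean record, at the cost of slightly worse noise performance.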

  14. A Calibration of NICMOS Camera 2 for Low Count Rates

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Amanullah, R.; Barbary, K.; Dawson, K. S.; Deustua, S.; Faccioli, L.; Fadeyev, V.; Fakhouri, H. K.; Fruchter, A. S.; Gladders, M. D.; de Jong, R. S.; Koekemoer, A.; Krechmer, E.; Lidman, C.; Meyers, J.; Nordin, J.; Perlmutter, S.; Ripoche, P.; Schlegel, D. J.; Spadafora, A.; Suzuki, N.

    2015-05-01

    NICMOS 2 observations are crucial for constraining distances to most of the existing sample of z > 1 SNe Ia. Unlike conventional calibration programs, these observations involve long exposure times and low count rates. Reciprocity failure is known to exist in HgCdTe devices and a correction for this effect has already been implemented for high and medium count rates. However, observations at faint count rates rely on extrapolations. Here instead, we provide a new zero-point calibration directly applicable to faint sources. This is obtained via inter-calibration of NIC2 F110W/F160W with the Wide Field Camera 3 (WFC3) in the low count-rate regime using z ∼ 1 elliptical galaxies as tertiary calibrators. These objects have relatively simple near-IR spectral energy distributions, uniform colors, and their extended nature gives a superior signal-to-noise ratio at the same count rate than stars would. The use of extended objects also allows greater tolerances on point-spread function profiles. We find space telescope magnitude zero points (after the installation of the NICMOS cooling system, NCS) of 25.296 ± 0.022 for F110W and 25.803 ± 0.023 for F160W, both in agreement with the calibration extrapolated from count rates ≳1000 times larger (25.262 and 25.799). Before the installation of the NCS, we find 24.843 ± 0.025 for F110W and 25.498 ± 0.021 for F160W, also in agreement with the high-count-rate calibration (24.815 and 25.470). We also check the standard bandpasses of WFC3 and NICMOS 2 using a range of stars and galaxies at different colors and find mild tension for WFC3, limiting the accuracy of the zero points. To avoid human bias, our cross-calibration was “blinded” in that the fitted zero-point differences were hidden until the analysis was finalized. Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS 5-26555, under programs
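
    A zero point converts a measured detector count rate into a space telescope magnitude via the standard relation m = ZP − 2.5 log₁₀(rate); the sketch below simply applies it with the post-NCS F110W zero point quoted in the record:

```python
import math

def st_magnitude(count_rate, zero_point):
    """Space telescope magnitude from a detector count rate (counts/s)."""
    return zero_point - 2.5 * math.log10(count_rate)

# With the post-NCS F110W zero point, a 1 count/s source sits at m = 25.296;
# a source 100x brighter is exactly 5 magnitudes brighter.
```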

  15. Fast radio burst event rate counts - I. Interpreting the observations

    NASA Astrophysics Data System (ADS)

    Macquart, J.-P.; Ekers, R. D.

    2018-02-01

    The fluence distribution of the fast radio burst (FRB) population (the `source count' distribution, N(>F) ∝ F^α) is a crucial diagnostic of its distance distribution, and hence the progenitor evolutionary history. We critically reanalyse current estimates of the FRB source count distribution. We demonstrate that the Lorimer burst (FRB 010724) is subject to discovery bias, and should be excluded from all statistical studies of the population. We re-examine the evidence for flat, α > -1, source count estimates based on the ratio of single-beam to multiple-beam detections with the Parkes multibeam receiver, and show that current data imply only a very weak constraint of α ≲ -1.3. A maximum-likelihood analysis applied to the portion of the Parkes FRB population detected above the observational completeness fluence of 2 Jy ms yields α = -2.6^{+0.7}_{-1.3}. Uncertainties in the location of each FRB within the Parkes beam render estimates of the Parkes event rate uncertain in both normalizing survey area and the estimated post-beam-corrected completeness fluence; this uncertainty needs to be accounted for when comparing the event rate against event rates measured at other telescopes.
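
    For fluences above a completeness limit, the maximum-likelihood slope of a cumulative power law N(>F) ∝ F^α is the standard Pareto estimator. This sketch is the generic estimator only, not the paper's full analysis (which also treats beam and discovery biases):

```python
import math

def source_count_index(fluences, f_min):
    """MLE of alpha in N(>F) ∝ F^alpha from fluences at or above completeness f_min."""
    logs = [math.log(f / f_min) for f in fluences if f >= f_min]
    return -len(logs) / sum(logs)
```

    For reference, a non-evolving Euclidean population classically gives α = −1.5; the Parkes estimate of −2.6 is steeper, though with large uncertainties.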

  16. Neutron counting with cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo

    2015-07-01

    A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transits smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rates up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, as such allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras doesn't allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is then confronted with the first experimental results. (authors)

  17. The effects of awareness and count duration on adult respiratory rate measurements: An experimental study.

    PubMed

    Hill, Andrew; Kelly, Eliza; Horswill, Mark S; Watson, Marcus O

    2018-02-01

    To investigate whether awareness of manual respiratory rate monitoring affects respiratory rate in adults, and whether count duration influences respiratory rate estimates. Nursing textbooks typically suggest that the patient should ideally be unaware of respiratory rate observations; however, there is little published evidence of the effect of awareness on respiratory rate, and none specific to manual measurement. In addition, recommendations about the length of the respiratory rate count vary from text to text, and the relevant empirical evidence is scant, inconsistent and subject to substantial methodological limitations. Experimental study with awareness of respiration monitoring (aware, unaware; randomised between-subjects) and count duration (60 s, 30 s, 15 s; within-subjects) as the independent variables. Respiratory rate (breaths/minute) was the dependent variable. Eighty-two adult volunteers were randomly assigned to aware and unaware conditions. In the baseline block, no live monitoring occurred. In the subsequent experimental block, the researcher informed aware participants that their respiratory rate would be counted, and did so. Respirations were captured throughout via video recording, and counted by blind raters viewing 60-, 30- and 15-s extracts. The data were collected in 2015. There was no baseline difference between the groups. During the experimental block, the respiratory rates of participants in the aware condition were an average of 2.13 breaths/minute lower compared to unaware participants. Reducing the count duration from 1 min to 15 s caused respiratory rate to be underestimated by an average of 2.19 breaths/minute (and 0.95 breaths/minute for 30-s counts). The awareness effect did not depend on count duration. Awareness of monitoring appears to reduce respiratory rate, and shorter monitoring durations yield systematically lower respiratory rate estimates. When interpreting and acting upon respiratory rate data, clinicians should
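
    The count-duration effect matters because a timed count is scaled up to breaths per minute, so any breath missed at the window edges is multiplied by the scaling factor. The conversion itself is just:

```python
def respiratory_rate(breath_count, duration_s):
    """Breaths per minute from a timed manual count."""
    return breath_count * 60.0 / duration_s

# A 15 s window scales each counted breath by 4, so missing one partial
# breath at a window edge shifts the estimate by 4 breaths/minute --
# consistent in direction with the underestimation the study reports.
```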

  18. A real-time phoneme counting algorithm and application for speech rate monitoring.

    PubMed

    Aharonson, Vered; Aharonson, Eran; Raichlin-Levi, Katia; Sotzianu, Aviv; Amir, Ofer; Ovadia-Blechman, Zehava

    2017-03-01

    Adults who stutter can learn to control and improve their speech fluency by modifying their speaking rate. Existing speech therapy technologies can assist this practice by monitoring speaking rate and providing feedback to the patient, but cannot provide an accurate, quantitative measurement of speaking rate. Moreover, most technologies are too complex and costly to be used for home practice. We developed an algorithm and a smartphone application that monitor a patient's speaking rate in real time and provide user-friendly feedback to both patient and therapist. Our speaking rate computation is performed by a phoneme counting algorithm which implements spectral transition measure extraction to estimate phoneme boundaries. The algorithm is implemented in real time in a mobile application that presents its results in a user-friendly interface. The application incorporates two modes: one provides the patient with visual feedback of his/her speech rate for self-practice and another provides the speech therapist with recordings, speech rate analysis and tools to manage the patient's practice. The algorithm's phoneme counting accuracy was validated on ten healthy subjects who read a paragraph at slow, normal and fast paces, and was compared to manual counting of speech experts. Test-retest and intra-counter reliability were assessed. Preliminary results indicate differences of -4% to 11% between automatic and human phoneme counting. Differences were largest for slow speech. The application can thus provide reliable, user-friendly, real-time feedback for speaking rate control practice. Copyright © 2017 Elsevier Inc. All rights reserved.

  19. Optimization of high count rate event counting detector with Microchannel Plates and quad Timepix readout

    NASA Astrophysics Data System (ADS)

    Tremsin, A. S.; Vallerga, J. V.; McPhate, J. B.; Siegmund, O. H. W.

    2015-07-01

    Many high resolution event counting devices process one event at a time and cannot register simultaneous events. In this article a frame-based readout event counting detector consisting of a pair of Microchannel Plates and a quad Timepix readout is described. More than 10⁴ simultaneous events can be detected with a spatial resolution of 55 μm, while >10³ simultaneous events can be detected with <10 μm spatial resolution when event centroiding is implemented. The fast readout electronics is capable of processing >1200 frames/sec, while the global count rate of the detector can exceed 5×10⁸ particles/s when no timing information on every particle is required. For the first generation Timepix readout, the timing resolution is limited by the Timepix clock to 10-20 ns. Optimization of the MCP gain, rear field voltage and Timepix threshold levels is crucial for the device performance, and that is the main subject of this article. These devices can be very attractive for applications where photon/electron/ion/neutron counting with high spatial and temporal resolution is required, such as energy resolved neutron imaging, Time of Flight experiments in lidar applications, experiments on photoelectron spectroscopy and many others.

  20. Reducing the Teen Death Rate. KIDS COUNT Indicator Brief

    ERIC Educational Resources Information Center

    Shore, Rima; Shore, Barbara

    2009-01-01

    Life continues to hold considerable risk for adolescents in the United States. In 2006, the teen death rate stood at 64 deaths per 100,000 teens (13,739 teens) (KIDS COUNT Data Center, 2009). Although it has declined by 4 percent since 2000, the rate of teen death in this country remains substantially higher than in many peer nations, based…

  1. Background Conditions for the October 29, 2003 Solar Flare by the AVS-F Apparatus Data

    NASA Astrophysics Data System (ADS)

    Arkhangelskaja, I. V.; Arkhangelskiy, A. I.; Lyapin, A. R.; Troitskaya, E. V.

    The background model for the AVS-F apparatus onboard the CORONAS-F satellite for the October 29, 2003 X10-class solar flare is discussed in the present work. The background model was developed for AVS-F count rates in the low- and high-energy spectral ranges, both in individual channels and summed. Count rates were approximated by high-order polynomials, taking into account the mean count rate in the geomagnetic equatorial region on different orbit segments and the Kp-index averaged over 5 bins in the time interval from -24 to -12 hours before the geomagnetic equator crossing. The observed averaged count rates at the equator within geomagnetic latitudes ±5° and the estimated minimum count rate values coincide within statistical errors for all orbit segments selected for background modeling. This model will be used to refine the estimated energies of spectral features registered during the solar flare and for detailed analysis of their temporal behavior, both in the corresponding energy bands and in the summed energy range.
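
    The polynomial background fit described here can be sketched generically with numpy.polyfit; the polynomial order and the synthetic samples below are illustrative assumptions, not the AVS-F pipeline:

```python
import numpy as np

def fit_background(times, rates, order=4):
    """Fit a high-order polynomial background model to count-rate samples
    and return it as a callable np.poly1d."""
    return np.poly1d(np.polyfit(times, rates, order))
```

    Subtracting model(t) from the measured rate then isolates the flare counts above background.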

  2. Impact of double counting and transfer bias on estimated rates and outcomes of acute myocardial infarction.

    PubMed

    Westfall, J M; McGloin, J

    2001-05-01

    Ischemic heart disease is the leading cause of death in the United States. Recent studies report inconsistent findings on the changes in the incidence of hospitalizations for ischemic heart disease. These reports have relied primarily on hospital discharge data. Preliminary data suggest that a significant percentage of patients suffering acute myocardial infarction (MI) in rural communities are transferred to urban centers for care. Patients transferred to a second hospital may be counted twice for one episode of ischemic heart disease. To describe the impact of double counting and transfer bias on the estimation of incidence rates and outcomes of ischemic heart disease, specifically acute MI, in the United States. Analysis of state hospital discharge data from Kansas, Colorado (State Inpatient Database [SID]), Nebraska, Arizona, New Jersey, Michigan, Pennsylvania, and Illinois (SID) for the years 1995 to 1997. A matching algorithm was developed for hospital discharges to determine patients counted twice for one episode of ischemic heart disease. Validation of our matching algorithm. Patients reported to have suffered ischemic heart disease (ICD9 codes 410-414, 786.5). Number of patients counted twice for one episode of acute MI. It is estimated that double count rates range from 10% to 15% for all states and increased over the 3 years. Moderate-sized rural counties had the highest estimated double count rates at 15% to 20%, with a few counties having estimated double count rates as high as 35% to 50%. Older patients and females were less likely to be double counted (P <0.05). Double counting patients has resulted in a significant overestimation in the incidence rate for hospitalization for acute MI. Correction of this double counting reveals a significantly lower incidence rate and a higher in-hospital mortality rate for acute MI. Transferred patients differ significantly from nontransferred patients, introducing significant bias into MI outcome studies. Double

  3. Separating Spike Count Correlation from Firing Rate Correlation

    PubMed Central

    Vinci, Giuseppe; Ventura, Valérie; Smith, Matthew A.; Kass, Robert E.

    2016-01-01

    Populations of cortical neurons exhibit shared fluctuations in spiking activity over time. When measured for a pair of neurons over multiple repetitions of an identical stimulus, this phenomenon emerges as correlated trial-to-trial response variability via spike count correlation (SCC). However, spike counts can be viewed as noisy versions of firing rates, which can vary from trial to trial. From this perspective, the SCC for a pair of neurons becomes a noisy version of the corresponding firing-rate correlation (FRC). Furthermore, the magnitude of the SCC is generally smaller than that of the FRC, and is likely to be less sensitive to experimental manipulation. We provide statistical methods for disambiguating time-averaged drive from within-trial noise, thereby separating FRC from SCC. We study these methods to document their reliability, and we apply them to neurons recorded in vivo from area V4, in an alert animal. We show how the various effects we describe are reflected in the data: within-trial effects are largely negligible, while attenuation due to trial-to-trial variation dominates, and frequently produces comparisons in SCC that, because of noise, do not accurately reflect those based on the underlying FRC. PMID:26942746

  4. Relationship between salivary flow rates and Candida counts in subjects with xerostomia.

    PubMed

    Torres, Sandra R; Peixoto, Camila Bernardo; Caldas, Daniele Manhães; Silva, Eline Barboza; Akiti, Tiyomi; Nucci, Márcio; de Uzeda, Milton

    2002-02-01

    This study evaluated the relationship between salivary flow and Candida colony counts in the saliva of patients with xerostomia. Sialometry and Candida colony-forming unit (CFU) counts were taken from 112 subjects who reported xerostomia in a questionnaire. Chewing-stimulated whole saliva was collected and streaked in Candida plates and counted in 72 hours. Species identification was accomplished under standard methods. There was a significant inverse relationship between salivary flow and Candida CFU counts (P =.007) when subjects with high colony counts were analyzed (cutoff point of 400 or greater CFU/mL). In addition, the median sialometry of men was significantly greater than that of women (P =.003), even after controlling for confounding variables like underlying disease and medications. Sjögren's syndrome was associated with low salivary flow rate (P =.007). There was no relationship between the median Candida CFU counts and gender or age. There was a high frequency (28%) of mixed colonization. Candida albicans was the most frequent species, followed by C parapsilosis, C tropicalis, and C krusei. In subjects with high Candida CFU counts there was an inverse relationship between salivary flow and Candida CFU counts.

  5. A New Statistics-Based Online Baseline Restorer for a High Count-Rate Fully Digital System.

    PubMed

    Li, Hongdi; Wang, Chao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Liu, Shitao; An, Shaohui; Wong, Wai-Hoi

    2010-04-01

    The goal of this work is to develop a novel, accurate, real-time digital baseline restorer using online statistical processing for a high count-rate digital system such as positron emission tomography (PET). In high count-rate nuclear instrumentation applications, analog signals are DC-coupled for better performance. However, the detectors, pre-amplifiers and other front-end electronics cause a signal baseline drift in a DC-coupled system, which degrades energy resolution and positioning accuracy. Event pileups normally exist in a high count-rate system, and baseline drift creates errors in the event pileup correction. Hence, a baseline restorer (BLR) is required in a high count-rate system to remove the DC drift ahead of the pileup correction. Many BLR methods have been reported, from classic analog methods to digital filter solutions. However, a single-channel analog BLR can only work below a 500 kcps count rate, and an analog front-end application-specific integrated circuit (ASIC) is normally required for applications involving hundreds of BLRs, such as a PET camera. We have developed a simple statistics-based online baseline restorer (SOBLR) for a high count-rate, fully digital system. In this method, we acquire additional samples, excluding the real gamma pulses, from the existing free-running ADC in the digital system, and perform online statistical processing to generate a baseline value. This baseline value is subtracted from the digitized waveform to retrieve the original pulse with zero baseline drift. This method self-tracks the baseline without involving a micro-controller. The circuit consists of two digital counter/timers, one comparator, one register and one subtraction unit. Simulation shows a single channel works at a 30 Mcps count rate under pileup conditions. 336 baseline restorer circuits have been implemented in 12 field-programmable gate arrays (FPGAs) for our new fully digital PET system.
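
    The SOBLR idea, estimating the baseline only from ADC samples that are not part of a gamma pulse and subtracting it, can be sketched in a few lines. The pulse-discrimination threshold is an assumed parameter, and the real circuit does this with counters and a comparator in an FPGA rather than in software:

```python
def restore_baseline(samples, pulse_threshold):
    """Estimate the baseline from below-threshold samples, then subtract it.

    Samples at or above pulse_threshold are treated as gamma pulses and
    excluded from the baseline statistics, mimicking sampling the
    free-running ADC between pulses.
    """
    quiet = [s for s in samples if s < pulse_threshold]
    baseline = sum(quiet) / len(quiet)
    return [s - baseline for s in samples], baseline
```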

  6. High event rate ROICs (HEROICs) for astronomical UV photon counting detectors

    NASA Astrophysics Data System (ADS)

    Harwit, Alex; France, Kevin; Argabright, Vic; Franka, Steve; Freymiller, Ed; Ebbets, Dennis

    2014-07-01

    The next generation of astronomical photocathode / microchannel plate based UV photon counting detectors will overcome existing count rate limitations by replacing the anode arrays and external cabled electronics with anode arrays integrated into imaging Read Out Integrated Circuits (ROICs). We have fabricated a High Event Rate ROIC (HEROIC) consisting of a 32 by 32 array of 55 μm square pixels on a 60 μm pitch. The pixel sensitivity (threshold) has been designed to be globally programmable between 1 × 10³ and 1 × 10⁶ electrons. To achieve the sensitivity of 1 × 10³ electrons, parasitic capacitances had to be minimized, and this was achieved by fabricating the ROIC in a 65 nm CMOS process. The ROIC has been designed to support pixel counts up to 4096 events per integration period at rates up to 1 MHz per pixel. Integration time periods can be controlled via an external signal with a time resolution of less than 1 microsecond, enabling temporally resolved imaging and spectroscopy of astronomical sources. An electrical injection port is provided to verify functionality and performance of each ROIC prior to vacuum integration with a photocathode and microchannel plate amplifier. Test results on the first ROICs using the electrical injection port demonstrate that sensitivities between 3 × 10³ and 4 × 10⁵ electrons are achieved. A number of fixes are identified for a re-spin of this ROIC.

  7. Aging and Visual Counting

    PubMed Central

    Li, Roger W.; MacKeben, Manfred; Chat, Sandy W.; Kumar, Maya; Ngo, Charlie; Levi, Dennis M.

    2010-01-01

    Background Much previous work on how normal aging affects visual enumeration has been focused on the response time required to enumerate, with unlimited stimulus duration. There is a fundamental question, not yet addressed, of how many visual items the aging visual system can enumerate in a “single glance”, without the confounding influence of eye movements. Methodology/Principal Findings We recruited 104 observers with normal vision across the age span (age 21–85). They were briefly (200 ms) presented with a number of well-separated black dots against a gray background on a monitor screen, and were asked to judge the number of dots. By limiting the stimulus presentation time, we can determine the maximum number of visual items an observer can correctly enumerate at a criterion level of performance (counting threshold, defined as the number of visual items at which ≈63% correct rate on a psychometric curve), without confounding by eye movements. Our findings reveal a 30% decrease in the mean counting threshold of the oldest group (age 61–85: ∼5 dots) when compared with the youngest groups (age 21–40: 7 dots). Surprisingly, despite decreased counting threshold, on average counting accuracy function (defined as the mean number of dots reported for each number tested) is largely unaffected by age, reflecting that the threshold loss can be primarily attributed to increased random errors. We further expanded this interesting finding to show that both young and old adults tend to over-count small numbers, but older observers over-count more. Conclusion/Significance Here we show that age reduces the ability to correctly enumerate in a glance, but the accuracy (veridicality), on average, remains unchanged with advancing age. Control experiments indicate that the degraded performance cannot be explained by optical, retinal or other perceptual factors, but is cortical in origin. PMID:20976149

  8. ChromAIX2: A large area, high count-rate energy-resolving photon counting ASIC for a Spectral CT Prototype

    NASA Astrophysics Data System (ADS)

    Steadman, Roger; Herrmann, Christoph; Livne, Amir

    2017-08-01

    Spectral CT based on energy-resolving photon counting detectors is expected to deliver additional diagnostic value at a lower dose than current state-of-the-art CT [1]. The capability of simultaneously providing a number of spectrally distinct measurements not only allows distinguishing between photo-electric and Compton interactions but also discriminating contrast agents that exhibit a K-edge discontinuity in the absorption spectrum, referred to as K-edge Imaging [2]. Such detectors are based on direct converting sensors (e.g. CdTe or CdZnTe) and high-rate photon counting electronics. To support the development of Spectral CT and show the feasibility of obtaining rates exceeding 10 Mcps/pixel (Poissonian observed count-rate), the ChromAIX ASIC has been previously reported, showing 13.5 Mcps/pixel (150 Mcps/mm2 incident) [3]. The ChromAIX has been improved to offer the possibility of a large-area-coverage detector and increased overall performance. The new ASIC is called ChromAIX2, and delivers count rates exceeding 15 Mcps/pixel with an rms noise performance of approximately 260 e-. It has an isotropic pixel pitch of 500 μm in an array of 22×32 pixels and is tile-able on three of its sides. The pixel topology consists of a two-stage amplifier (CSA and Shaper) and a number of test features allowing the ASIC to be thoroughly characterized without a sensor. A total of 5 independent thresholds is also available within each pixel, allowing 5 spectrally distinct measurements to be acquired simultaneously. The ASIC also incorporates a baseline restorer to eliminate excess currents induced by the sensor (e.g. dark current and low frequency drifts) which would otherwise cause an energy estimation error. In this paper we report on the inherent electrical performance of the ChromAIX2 as well as measurements obtained with CZT (CdZnTe)/CdTe sensors and X-rays and radioactive sources.

  9. Accounting for orphaned aftershocks in the earthquake background rate

    USGS Publications Warehouse

    Van Der Elst, Nicholas

    2017-01-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.
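
    The Omori-law bookkeeping described above can be sketched numerically. Under a modified Omori kernel r(t) ∝ (c + t)^-p, the fraction of a main shock's aftershocks falling outside a finite observation window — the candidates for being mislabeled as background — has a closed form. The parameter values below are illustrative, not the paper's California fit:

```python
import math

def omori_tail_fraction(c, p, t_window, t_max):
    """Fraction of a sequence's aftershocks expected AFTER t_window
    (i.e. potential 'orphaned' events), under the modified Omori law
    r(t) proportional to (c + t)**-p, truncated at t_max."""
    def integral(a, b):
        # closed-form integral of (c + t)**-p from a to b
        if abs(p - 1.0) < 1e-12:
            return math.log((c + b) / (c + a))
        return ((c + b) ** (1.0 - p) - (c + a) ** (1.0 - p)) / (1.0 - p)
    return integral(t_window, t_max) / integral(0.0, t_max)

# Illustrative parameters in days: c = 0.01, p = 1, 1-year window,
# 100-year total sequence duration.
print(omori_tail_fraction(c=0.01, p=1.0, t_window=365.0, t_max=36500.0))
```

    Because the 1/t tail is heavy, a substantial fraction of aftershocks can land beyond any practical declustering window, which is the effect the extended ETAS model accounts for.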

  10. Accounting for orphaned aftershocks in the earthquake background rate

    NASA Astrophysics Data System (ADS)

    van der Elst, Nicholas J.

    2017-11-01

    Aftershocks often occur within cascades of triggered seismicity in which each generation of aftershocks triggers an additional generation, and so on. The rate of earthquakes in any particular generation follows Omori's law, going approximately as 1/t. This function decays rapidly, but is heavy-tailed, and aftershock sequences may persist for long times at a rate that is difficult to discriminate from background. It is likely that some apparently spontaneous earthquakes in the observational catalogue are orphaned aftershocks of long-past main shocks. To assess the relative proportion of orphaned aftershocks in the apparent background rate, I develop an extension of the ETAS model that explicitly includes the expected contribution of orphaned aftershocks to the apparent background rate. Applying this model to California, I find that the apparent background rate can be almost entirely attributed to orphaned aftershocks, depending on the assumed duration of an aftershock sequence. This implies an earthquake cascade with a branching ratio (the average number of directly triggered aftershocks per main shock) of nearly unity. In physical terms, this implies that very few earthquakes are completely isolated from the perturbing effects of other earthquakes within the fault system. Accounting for orphaned aftershocks in the ETAS model gives more accurate estimates of the true background rate, and more realistic expectations for long-term seismicity patterns.

  11. Controlling Low-Rate Signal Path Microdischarge for an Ultra-Low-Background Proportional Counter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, Emily K.; Aalseth, Craig E.; Bonicalzi, Ricco

    2013-05-01

    Pacific Northwest National Laboratory (PNNL) has developed an ultra-low-background proportional counter (ULBPC) made of high-purity copper. These detectors are part of an ultra-low-background counting system (ULBCS) in the newly constructed shallow underground laboratory at PNNL (at a depth of ~30 meters water-equivalent). To control backgrounds, the current preamplifier electronics are located outside the ULBCS shielding. Thus the signal from the detector travels through ~1 meter of cable and is potentially susceptible to high-voltage microdischarge and other sources of electronic noise. Based on initial successful tests, commercial cables and connectors were used for this critical signal path. Subsequent testing across different batches of commercial cables and connectors, however, showed unwanted (but still low) rates of microdischarge noise. To control this noise source, two approaches were pursued: first, to carefully validate cables, connectors, and other commercial components in this critical signal path, making modifications where necessary; second, to develop a custom low-noise, low-background preamplifier that can be integrated with the ULBPC and thus remove most commercial components from the critical signal path. This integrated preamplifier approach is based on the Amptek A250 low-noise charge-integrating preamplifier module. The initial microdischarge signals observed are presented and characterized according to the suspected source. Each of the approaches for mitigation is described, and the results from both are compared with each other and with the original performance seen with commercial cables and connectors.

  12. A Six-Year Study on the Changes in Airborne Pollen Counts and Skin Positivity Rates in Korea: 2008-2013.

    PubMed

    Park, Hye Jung; Lee, Jae-Hyun; Park, Kyung Hee; Kim, Kyu Rang; Han, Mae Ja; Choe, Hosoeng; Oh, Jae-Won; Hong, Chein-Soo

    2016-05-01

    The occurrence of pollen allergy is subject to exposure to pollen, which shows regional and temporal variations. We evaluated the changes in pollen counts and skin positivity rates for 6 years, and explored the correlation between their annual rates of change. We assessed the number of pollen grains collected in Seoul, and retrospectively reviewed the results of 4442 skin-prick tests conducted at the Severance Hospital Allergy-Asthma Clinic from January 1, 2008 to December 31, 2013. For 6 years, the mean monthly total pollen count showed two peaks, one in May and the other in September. Pollen count for grasses also showed the same trend. The pollen counts for trees, grasses, and weeds changed annually, but the changes were not significant. The annual skin positivity rates in response to pollen from grasses and weeds increased significantly over the 6 years. Among trees, the skin positivity rates in response to pollen from walnut, poplar, elm, and alder significantly increased over the 6 years. Further, there was a significant correlation between the annual rate of change in pollen count and the rate of change in skin positivity rate for oak and Japanese hop. The pollen counts and skin positivity rates should be monitored, as they have changed annually. Oak and Japanese hop, which showed a significant correlation between the annual rate of change in pollen count and the rate of change in skin positivity rate over the 6 years, may be considered the major allergens in Korea.

  13. Death rates in HIV-positive antiretroviral-naive patients with CD4 count greater than 350 cells per microL in Europe and North America: a pooled cohort observational study

    PubMed Central

    2011-01-01

    Background It is unclear whether antiretroviral (ART) naive HIV-positive individuals with high CD4 counts have a raised mortality risk compared with the general population, but this is relevant for considering earlier initiation of antiretroviral therapy. Methods Pooling data from 23 European and North American cohorts, we calculated country-, age-, sex-, and year-standardised mortality ratios (SMRs), stratifying by risk group. Included patients had at least one pre-ART CD4 count above 350 cells/mm3. The association between CD4 count and death rate was evaluated using Poisson regression methods. Findings Of 40,830 patients contributing 80,682 person-years of follow up with CD4 count above 350 cells/mm3, 419 (1.0%) died. The SMRs (95% confidence interval) were 1.30 (1.06-1.58) in homosexual men, and 2.94 (2.28-3.73) and 9.37 (8.13-10.75) in the heterosexual and IDU risk groups respectively. CD4 count above 500 cells/mm3 was associated with a lower death rate than 350-499 cells/mm3: adjusted rate ratios (95% confidence intervals) for 500-699 cells/mm3 and above 700 cells/mm3 were 0.77 (0.61-0.95) and 0.66 (0.52-0.85) respectively. Interpretation In HIV-infected ART-naive patients with high CD4 counts, death rates were raised compared with the general population. In homosexual men this was modest, suggesting that a proportion of the increased risk in other groups is due to confounding by other factors. Even in this high CD4 count range, lower CD4 count was associated with raised mortality. PMID:20638118
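
    The SMR machinery used above is simple to sketch: observed deaths divided by the deaths expected from reference-population rates, with a confidence interval. The log-scale interval below is a common approximation, not necessarily the paper's exact method, and the expected-death figure is purely illustrative (the paper standardizes by country, age, sex, and year):

```python
import math

def smr_with_ci(observed, expected, z=1.96):
    """Standardized mortality ratio O/E with an approximate 95% CI
    computed on the log scale (log-normal approximation)."""
    smr = observed / expected
    se_log = 1.0 / math.sqrt(observed)  # SE of log(SMR) under Poisson O
    return smr, smr * math.exp(-z * se_log), smr * math.exp(z * se_log)

# 419 deaths is from the abstract; the expected count here is hypothetical.
print(smr_with_ci(419, 322.0))
```

    An SMR of 1 means mortality matches the general population; the cohort-level SMRs in the abstract (1.30 to 9.37) are all above 1, with the interval widths driven by the observed death counts in each risk group.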

  14. Cosmic ray neutron background reduction using localized coincidence veto neutron counting

    DOEpatents

    Menlove, Howard O.; Bourret, Steven C.; Krick, Merlyn S.

    2002-01-01

    This invention relates to both the apparatus and method for increasing the sensitivity of measuring the amount of radioactive material in waste by reducing the interference caused by cosmic ray generated neutrons. The apparatus includes: (a) a plurality of neutron detectors, each of the detectors including means for generating a pulse in response to the detection of a neutron; and (b) means, coupled to each of the neutron detectors, for counting only some of the pulses from each of the detectors, whether cosmic ray or fission generated. The means for counting includes a means that, after counting one of the pulses, vetoes the counting of additional pulses for a prescribed period of time. The prescribed period of time is between 50 and 200 μs. In the preferred embodiment the prescribed period of time is 128 μs. The veto means can be an electronic circuit which includes a leading edge pulse generator which passes a pulse but blocks any subsequent pulse for a period of between 50 and 200 μs. Alternately, the veto means is a software program which includes means for tagging each of the pulses from each of the detectors for both time and position, means for counting one of the pulses from a particular position, and means for rejecting those of the pulses which originate from the particular position and in a time interval on the order of the neutron die-away time in polyethylene or other shield material. The neutron detectors are grouped in pods, preferably at least 10. The apparatus also includes means for vetoing the counting of coincidence pulses from all of the detectors included in each of the pods which are adjacent to the pod which includes the detector which produced the pulse which was counted.
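
    The veto logic claimed above — count one pulse, then block further counting for a fixed period — can be sketched as a timestamp filter. This is a simplification: the 128 μs default mirrors the preferred embodiment, and the pod/position tagging of the software variant is omitted:

```python
def veto_filter(timestamps_us, veto_us=128.0):
    """Accept a pulse only if it arrives at least veto_us after the last
    ACCEPTED pulse; pulses inside the veto window are discarded."""
    accepted, last = [], None
    for t in sorted(timestamps_us):
        if last is None or t - last >= veto_us:
            accepted.append(t)
            last = t
    return accepted

# A cosmic-ray burst produces closely spaced pulses; only the first survives.
print(veto_filter([0.0, 50.0, 130.0, 300.0]))
```

    Correlated bursts from a cosmic-ray spallation event arrive within the neutron die-away time, so at most one pulse per burst is counted, while uncorrelated fission neutrons at low rates pass largely unaffected.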

  15. Apparatus and method for temperature correction and expanded count rate of inorganic scintillation detectors

    DOEpatents

    Ianakiev, Kiril D [Los Alamos, NM; Hsue, Sin Tao [Santa Fe, NM; Browne, Michael C [Los Alamos, NM; Audia, Jeffrey M [Abiquiu, NM

    2006-07-25

    The present invention includes an apparatus and corresponding method for temperature correction and count rate expansion of inorganic scintillation detectors. A temperature sensor is attached to an inorganic scintillation detector. The inorganic scintillation detector, due to interaction with incident radiation, creates light pulse signals. A photoreceiver converts the light pulse signals to current signals. Temperature correction circuitry uses a fast light component signal, a slow light component signal, and the temperature signal from the temperature sensor to correct the inorganic scintillation detector signal output and expand the count rate.

  16. CASA-Mot technology: how results are affected by the frame rate and counting chamber.

    PubMed

    Bompart, Daznia; García-Molina, Almudena; Valverde, Anthony; Caldeira, Carina; Yániz, Jesús; Núñez de Murga, Manuel; Soler, Carles

    2018-04-04

    For over 30 years, CASA-Mot technology has been used for kinematic analysis of sperm motility in different mammalian species, but insufficient attention has been paid to the technical limitations of commercial computer-aided sperm analysis (CASA) systems. Counting chamber type and frame rate are two of the most important aspects to be taken into account. Counting chambers can be disposable or reusable, with different depths. In human semen analysis, reusable chambers with a depth of 10 µm are the most frequently used, whereas for most farm animal species it is more common to use disposable chambers with a depth of 20 µm. The frame rate was previously limited by the hardware, although changes in the number of images collected could lead to significant variations in some kinematic parameters, mainly in curvilinear velocity (VCL). A frame rate of 60 frames s-1 is widely considered to be the minimum necessary for satisfactory results. However, the frame rate is species-specific and must be defined for each experimental condition. In conclusion, we show that the optimal combination of frame rate and counting chamber type and depth should be defined for each species and experimental condition in order to obtain reliable results.

  17. Multiplicity Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Geist, William H.

    2015-12-01

    This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: 240Pu eff mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
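
    The step from a measured multiplicity distribution to singles, doubles, and triples rates mentioned in the slides rests on factorial moments of the histogram. A minimal sketch, with deadtime and gate-fraction corrections omitted:

```python
from math import comb

def factorial_moments(histogram):
    """Reduced factorial moments m_k = sum_n C(n, k) * P(n) of a measured
    multiplicity histogram, where histogram[n] = number of gates in which
    n neutrons were recorded. Singles, doubles, and triples rates are
    proportional to m_1, m_2, and m_3 respectively."""
    total = float(sum(histogram))
    probs = [h / total for h in histogram]
    def moment(k):
        return sum(comb(n, k) * p for n, p in enumerate(probs))
    return moment(1), moment(2), moment(3)

# Toy histogram: 10 gates with 0 counts, 5 with 1, 2 with 2, 1 with 3.
print(factorial_moments([10, 5, 2, 1]))
```

    In the point model these three moments, together with detector efficiency and die-away time, are inverted to solve for the 240Pu-effective mass, α, and leakage multiplication.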

  18. A Six-Year Study on the Changes in Airborne Pollen Counts and Skin Positivity Rates in Korea: 2008–2013

    PubMed Central

    Park, Hye Jung; Lee, Jae-Hyun; Park, Kyung Hee; Kim, Kyu Rang; Han, Mae Ja; Choe, Hosoeng

    2016-01-01

    Purpose The occurrence of pollen allergy is subject to exposure to pollen, which shows regional and temporal variations. We evaluated the changes in pollen counts and skin positivity rates for 6 years, and explored the correlation between their annual rates of change. Materials and Methods We assessed the number of pollen grains collected in Seoul, and retrospectively reviewed the results of 4442 skin-prick tests conducted at the Severance Hospital Allergy-Asthma Clinic from January 1, 2008 to December 31, 2013. Results For 6 years, the mean monthly total pollen count showed two peaks, one in May and the other in September. Pollen count for grasses also showed the same trend. The pollen counts for trees, grasses, and weeds changed annually, but the changes were not significant. The annual skin positivity rates in response to pollen from grasses and weeds increased significantly over the 6 years. Among trees, the skin positivity rates in response to pollen from walnut, poplar, elm, and alder significantly increased over the 6 years. Further, there was a significant correlation between the annual rate of change in pollen count and the rate of change in skin positivity rate for oak and Japanese hop. Conclusion The pollen counts and skin positivity rates should be monitored, as they have changed annually. Oak and Japanese hop, which showed a significant correlation between the annual rate of change in pollen count and the rate of change in skin positivity rate over the 6 years, may be considered the major allergens in Korea. PMID:26996572

  19. Performance evaluation of the Ingenuity TF PET/CT scanner with a focus on high count-rate conditions

    NASA Astrophysics Data System (ADS)

    Kolthammer, Jeffrey A.; Su, Kuan-Hao; Grover, Anu; Narayanan, Manoj; Jordan, David W.; Muzic, Raymond F.

    2014-07-01

    This study evaluated the positron emission tomography (PET) imaging performance of the Ingenuity TF 128 PET/computed tomography (CT) scanner which has a PET component that was designed to support a wider radioactivity range than is possible with those of Gemini TF PET/CT and Ingenuity TF PET/MR. Spatial resolution, sensitivity, count rate characteristics and image quality were evaluated according to the NEMA NU 2-2007 standard and ACR phantom accreditation procedures; these were supplemented by additional measurements intended to characterize the system under conditions that would be encountered during quantitative cardiac imaging with 82Rb. Image quality was evaluated using a hot spheres phantom, and various contrast recovery and noise measurements were made from replicated images. Timing and energy resolution, dead time, and the linearity of the image activity concentration, were all measured over a wide range of count rates. Spatial resolution (4.8-5.1 mm FWHM), sensitivity (7.3 cps kBq-1), peak noise-equivalent count rate (124 kcps), and peak trues rate (365 kcps) were similar to those of the Gemini TF PET/CT. Contrast recovery was higher with a 2 mm, body-detail reconstruction than with a 4 mm, body reconstruction, although the precision was reduced. The noise equivalent count rate peak was broad (within 10% of peak from 241-609 MBq). The activity measured in phantom images was within 10% of the true activity for count rates up to those observed in 82Rb cardiac PET studies.

  20. A compact 7-cell Si-drift detector module for high-count rate X-ray spectroscopy.

    PubMed

    Hansen, K; Reckleben, C; Diehl, I; Klär, H

    2008-05-01

    A new Si-drift detector module for fast X-ray spectroscopy experiments was developed and realized. The Peltier-cooled module comprises a sensor with 7 × 7 mm2 active area, an integrated circuit for amplification, shaping and detection, storage, and derandomized readout of signal pulses in parallel, and amplifiers for line driving. The compactness and hexagonal shape of the module with a wrench size of 16 mm allow very short distances to the specimen and multi-module arrangements. The power dissipation is 186 mW. At a shaper peaking time of 190 ns and an integration time of 450 ns an electronic rms noise of ~11 electrons was achieved. When operated at 7 °C, FWHM line widths around 260 and 460 eV (Cu-Kα) were obtained at low rates and at sum-count rates of 1.7 MHz, respectively. The peak shift is below 1% for a broad range of count rates. At a 1.7-MHz sum-count rate the throughput loss amounts to 30%.

  1. The NuSTAR Extragalactic Surveys: The Number Counts Of Active Galactic Nuclei And The Resolved Fraction Of The Cosmic X-ray Background

    NASA Technical Reports Server (NTRS)

    Harrison, F. A.; Aird, J.; Civano, F.; Lansbury, G.; Mullaney, J. R.; Ballentyne, D. R.; Alexander, D. M.; Stern, D.; Ajello, M.; Barret, D.

    2016-01-01

    We present the 3-8 keV and 8-24 keV number counts of active galactic nuclei (AGNs) identified in the Nuclear Spectroscopic Telescope Array (NuSTAR) extragalactic surveys. NuSTAR has now resolved 33%-39% of the X-ray background in the 8-24 keV band, directly identifying AGNs with obscuring columns up to approximately 10^25 cm^-2. In the softer 3-8 keV band the number counts are in general agreement with those measured by XMM-Newton and Chandra over the flux range 5 × 10^-15 ≲ S(3-8 keV)/(erg s^-1 cm^-2) ≲ 10^-12 probed by NuSTAR. In the hard 8-24 keV band NuSTAR probes fluxes over the range 2 × 10^-14 ≲ S(8-24 keV)/(erg s^-1 cm^-2) ≲ 10^-12, a factor of approximately 100 fainter than previous measurements. The 8-24 keV number counts match predictions from AGN population synthesis models, directly confirming the existence of a population of obscured and/or hard X-ray sources inferred from the shape of the integrated cosmic X-ray background. The measured NuSTAR counts lie significantly above a simple extrapolation with a Euclidean slope to low flux of the Swift/BAT 15-55 keV number counts measured at higher fluxes (S(15-55 keV) ≳ 10^-11 erg s^-1 cm^-2), reflecting the evolution of the AGN population between the Swift/BAT local (redshift < 0.1) sample and NuSTAR's redshift ≈ 1 sample. CXB (cosmic X-ray background) synthesis models, which account for AGN evolution, lie above the Swift/BAT measurements, suggesting that they do not fully capture the evolution of obscured

  2. Illinois Quality Counts: QRS Profile. The Child Care Quality Rating System (QRS) Assessment

    ERIC Educational Resources Information Center

    Child Trends, 2010

    2010-01-01

    This paper presents a profile of Illinois' Quality Counts prepared as part of the Child Care Quality Rating System (QRS) Assessment Study. The profile consists of several sections and their corresponding descriptions including: (1) Program Information; (2) Rating Details; (3) Quality Indicators for Center-Based Programs; (4) Indicators for Family…

  3. NEMA count-rate evaluation of the first and second generation of the Ecat Exact and Ecat Exact HR family of scanners

    NASA Astrophysics Data System (ADS)

    Eriksson, L.; Wienhard, K.; Eriksson, M.; Casey, M. E.; Knoess, C.; Bruckbauer, T.; Hamill, J.; Mulnix, T.; Vollmar, S.; Bendriem, B.; Heiss, W. D.; Nutt, R.

    2002-06-01

    The first and second generations of the Exact and Exact HR family of scanners have been evaluated in terms of noise equivalent count rate (NEC) and count-rate capabilities. The new National Electrical Manufacturers Association standard was used for the evaluation. In spite of improved electronics and improved count-rate capabilities, the peak NEC was found to be fairly constant between the generations. The results are discussed in terms of the different electronic solutions for the two generations and their implications for system dead time and NEC count-rate capability.

  4. High resolution gamma-ray spectroscopy at high count rates with a prototype High Purity Germanium detector

    NASA Astrophysics Data System (ADS)

    Cooper, R. J.; Amman, M.; Vetter, K.

    2018-04-01

    High-resolution gamma-ray spectrometers are required for applications in nuclear safeguards, emergency response, and fundamental nuclear physics. To overcome one of the shortcomings of conventional High Purity Germanium (HPGe) detectors, we have developed a prototype device capable of achieving high event throughput and high energy resolution at very high count rates. This device, the design of which we have previously reported on, features a planar HPGe crystal with a reduced-capacitance strip electrode geometry. This design is intended to provide good energy resolution at the short shaping or digital filter times that are required for high rate operation and which are enabled by the fast charge collection afforded by the planar geometry crystal. In this work, we report on the initial performance of the system at count rates up to and including two million counts per second.

  5. Modeling zero-modified count and semicontinuous data in health services research Part 1: background and overview.

    PubMed

    Neelon, Brian; O'Malley, A James; Smith, Valerie A

    2016-11-30

    Health services data often contain a high proportion of zeros. In studies examining patient hospitalization rates, for instance, many patients will have no hospitalizations, resulting in a count of zero. When the number of zeros is greater or less than expected under a standard count model, the data are said to be zero modified relative to the standard model. A similar phenomenon arises with semicontinuous data, which are characterized by a spike at zero followed by a continuous distribution with positive support. When analyzing zero-modified count and semicontinuous data, flexible mixture distributions are often needed to accommodate both the excess zeros and the typically skewed distribution of nonzero values. Various models have been introduced over the past three decades to accommodate such data, including hurdle models, zero-inflated models, and two-part semicontinuous models. This tutorial describes recent modeling strategies for zero-modified count and semicontinuous data and highlights their role in health services research studies. Part 1 of the tutorial, presented here, provides a general overview of the topic. Part 2, appearing as a companion piece in this issue of Statistics in Medicine, discusses three case studies illustrating applications of the methods to health services research. Copyright © 2016 John Wiley & Sons, Ltd.
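
    A minimal numeric sketch of one of the mixture models named above, the zero-inflated Poisson, shows how the mixture inflates the zero mass relative to a standard Poisson:

```python
import math

def zip_pmf(y, pi_zero, lam):
    """Zero-inflated Poisson pmf: with probability pi_zero the count is a
    'structural' zero; otherwise it is drawn from Poisson(lam)."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    if y == 0:
        return pi_zero + (1.0 - pi_zero) * poisson
    return (1.0 - pi_zero) * poisson

# With 30% structural zeros, P(Y=0) well exceeds the plain Poisson value.
print(zip_pmf(0, 0.3, 2.0), math.exp(-2.0))
```

    A hurdle model differs in that all zeros come from one process and the positive counts from a zero-truncated distribution; the choice between the two affects interpretation more than fit.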

  6. A 1.5k x 1.5k class photon counting HgCdTe linear avalanche photo-diode array for low background space astronomy in the 1-5 micron infrared

    NASA Astrophysics Data System (ADS)

    Hall, Donald

    Under a current award, NASA NNX 13AC13G "EXTENDING THE ASTRONOMICAL APPLICATION OF PHOTON COUNTING HgCdTe LINEAR AVALANCHE PHOTODIODE ARRAYS TO LOW BACKGROUND SPACE OBSERVATIONS", UH has used Selex SAPHIRA 320 x 256 MOVPE L-APD HgCdTe arrays developed for Adaptive Optics (AO) wavefront (WF) sensing to investigate the potential of this technology for low background space astronomy applications. After suppressing readout integrated circuit (ROIC) glow, we have placed upper limits on gain-normalized dark current of 0.01 e-/sec at up to 8 volts avalanche bias, corresponding to an avalanche gain of 5, and have operated with avalanche gains of up to several hundred at higher bias. We have also demonstrated detection of individual photon events. The proposed investigation would scale the format to 1536 x 1536 at 12 um pitch (the largest achievable in a standard reticule without requiring stitching) while incorporating the reference pixels required at these low dark current levels. The primary objective is to develop, produce and characterize a 1.5k x 1.5k, 12 um pitch MOVPE HgCdTe L-APD array, with nearly 30 times the pixel count of the 320 x 256 SAPHIRA, optimized for low background space astronomy. This will involve: 1) Selex design of a 1.5k x 1.5k, 12 um pitch ROIC optimized for low background operation, silicon wafer fabrication at the German XFab foundry in a 0.35 um 3V3 process, and dicing/test at Selex; 2) provision by GL Scientific of a 3-side close-buttable carrier building from the heritage of the HAWAII xRG family; 3) Selex development and fabrication of 1.5k x 1.5k, 12 um pitch MOVPE HgCdTe L-APD detector arrays optimized for low background applications; 4) hybridization, packaging into a sensor chip assembly (SCA) with initial characterization by Selex; and 5) comprehensive characterization of low background performance, both in the laboratory and at ground-based telescopes, by UH. The ultimate goal is to produce and eventually market a large format array, the L

  7. Background compensation for a radiation level monitor

    DOEpatents

    Keefe, D.J.

    1975-12-01

    Background compensation in a device such as a hand and foot monitor is provided by digital means using a scaler. With no radiation level test initiated, a scaler is down-counted from zero according to the background measured. With a radiation level test initiated, the scaler is up-counted from the previous down-count position according to the radiation emitted from the monitored object, and an alarm is generated if, with the scaler having crossed zero in the positive-going direction, a particular number is exceeded in a specific time period after initiation of the test. If the test is initiated while the scaler is down-counting, the background count from the previous down-count stored in a memory is used as the initial starting point for the up-count.
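
    The patent's up/down-count logic can be sketched in a few lines. This is a simplification: the real hardware counts pulses continuously during fixed windows rather than operating on totals:

```python
def compensated_alarm(background_count, test_count, threshold):
    """Sketch of the digital background-compensation scheme: down-count a
    scaler from zero by the measured background, up-count it during the
    test, and alarm only if the scaler has crossed zero going positive
    and exceeds the alarm threshold."""
    scaler = -background_count  # down-count from zero during idle period
    scaler += test_count        # up-count during the radiation test
    return scaler > 0 and scaler > threshold
```

    Because the background is subtracted in counts rather than in rate, the scheme needs no analog subtraction circuit, and the stored down-count handles tests that start mid-measurement.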

  8. Transforming GSC-II Magnitudes into JWST/FGS Count Rates

    NASA Astrophysics Data System (ADS)

    Holfeltz, Sherie T.; Chayer, P.; Nelan, E. P.

    2010-01-01

    The JWST Fine Guidance Sensor (FGS) will provide the positions of guide stars to the spacecraft attitude control system to facilitate the fine pointing of the Observatory. The FGS is an infrared camera operating in an unfiltered passband from 0.6 to 5.3 microns. The ground system will select guide stars from the Guide Star Catalog II (GSC-II), which is an all-sky catalog with three optical passbands (BJ, RF, IN) derived from photographic plates, and from 2MASS. We present a method for predicting a guide star's FGS photon count rate, which is needed to operate the FGS. The method consists of first deriving equations for transforming the GSC-II optical passbands into J, H, and K for stars that are below the 2MASS faint limiting magnitude, based upon fitting the distribution of brighter stars in color-color diagrams using GSC-II and 2MASS photometry. Next, we convolve the BJ, RF, IN and predicted J, H, and K magnitudes (or 2MASS magnitudes if available) for a given star with the wavelength dependent throughput and sensitivity of the telescope and FGS. To estimate the accuracy of this method for stars that are too faint for 2MASS, we compare the predicted J, H, and K magnitudes for a large sample of stars to data from the United Kingdom Infrared Telescope (UKIRT) Deep Sky Survey (UKIDSS) Large Area Survey (LAS). Using synthetic magnitudes computed from Kurucz models for stars of different spectral types, we show that the method should provide reliable FGS count rates.

  9. The use of noise equivalent count rate and the NEMA phantom for PET image quality evaluation.

    PubMed

    Yang, Xin; Peng, Hao

    2015-03-01

    PET image quality is directly associated with two important parameters among others: count-rate performance and image signal-to-noise ratio (SNR). The framework of noise equivalent count rate (NECR) was developed back in the 1990s and has been widely used since then to evaluate count-rate performance for PET systems. The concept of NECR is not entirely straightforward, however, and among the issues requiring clarification are its original definition, its relationship to image quality, and its consistency among different derivation methods. In particular, we try to answer whether a higher NECR measurement using a standard NEMA phantom actually corresponds to better imaging performance. The paper includes the following topics: 1) revisiting the original analytical model for NECR derivation; 2) validating three methods for NECR calculation based on the NEMA phantom/standard; and 3) studying the spatial dependence of NECR and quantitative relationship between NECR and image SNR. Copyright © 2015 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
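
    The NECR framework the paper revisits is conventionally written NEC = T²/(T + S + kR), with T, S, and R the true, scattered, and random coincidence rates. A one-function sketch:

```python
def necr(trues, scatters, randoms, k=1.0):
    """Noise-equivalent count rate: NEC = T**2 / (T + S + k*R).
    k = 1 is often used for noiseless randoms estimation, k = 2 for
    delayed-window randoms subtraction; conventions vary by author."""
    return trues ** 2 / (trues + scatters + k * randoms)

# Same trues rate, but scatter and randoms halve the effective counts.
print(necr(100.0, 0.0, 0.0), necr(100.0, 50.0, 50.0))
```

    NEC is proportional to the square of image SNR only under idealized assumptions, which is one reason the paper asks whether a higher phantom NECR actually implies better imaging performance.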

  10. Monitoring trends in bird populations: addressing background levels of annual variability in counts

    Treesearch

    Jared Verner; Kathryn L. Purcell; Jennifer G. Turner

    1996-01-01

    Point counting has been widely accepted as a method for monitoring trends in bird populations. Using a rigorously standardized protocol at 210 counting stations at the San Joaquin Experimental Range, Madera Co., California, we have been studying sources of variability in point counts of birds. Vegetation types in the study area have not changed during the 11 years of...

  11. Effective count rates for PET scanners with reduced and extended axial field of view

    NASA Astrophysics Data System (ADS)

    MacDonald, L. R.; Harrison, R. L.; Alessio, A. M.; Hunter, W. C. J.; Lewellen, T. K.; Kinahan, P. E.

    2011-06-01

    We investigated the relationship between noise equivalent count (NEC) and axial field of view (AFOV) for PET scanners with AFOVs ranging from one-half to twice those of current clinical scanners. PET scanners with longer or shorter AFOVs could fulfill different clinical needs depending on exam volumes and site economics. Using previously validated Monte Carlo simulations, we modeled true, scattered and random coincidence counting rates for a PET ring diameter of 88 cm with 2, 4, 6, and 8 rings of detector blocks (AFOV 7.8, 15.5, 23.3, and 31.0 cm). Fully 3D acquisition mode was compared to full collimation (2D) and partial collimation (2.5D) modes. Counting rates were estimated for a 200 cm long version of the 20 cm diameter NEMA count-rate phantom and for an anthropomorphic object based on a patient scan. We estimated the live-time characteristics of the scanner from measured count-rate data and applied that estimate to the simulated results to obtain NEC as a function of object activity. We found NEC increased as a quadratic function of AFOV for 3D mode, and linearly in 2D mode. Partial collimation provided the highest overall NEC on the 2-block system and fully 3D mode provided the highest NEC on the 8-block system for clinically relevant activities. On the 4-, and 6-block systems 3D mode NEC was highest up to ~300 MBq in the anthropomorphic phantom, above which 3D NEC dropped rapidly, and 2.5D NEC was highest. Projected total scan time to achieve NEC-density that matches current clinical practice in a typical oncology exam averaged 9, 15, 24, and 61 min for the 8-, 6-, 4-, and 2-block ring systems, when using optimal collimation. Increasing the AFOV should provide a greater than proportional increase in NEC, potentially benefiting patient throughput-to-cost ratio. Conversely, by using appropriate collimation, a two-ring (7.8 cm AFOV) system could acquire whole-body scans achieving NEC-density levels comparable to current standards within long, but feasible

  12. Palm Beach Quality Counts: QRS Profile. The Child Care Quality Rating System (QRS) Assessment

    ERIC Educational Resources Information Center

    Child Trends, 2010

    2010-01-01

    This paper presents a profile of Palm Beach's Quality Counts prepared as part of the Child Care Quality Rating System (QRS) Assessment Study. The profile consists of several sections and their corresponding descriptions including: (1) Program Information; (2) Rating Details; (3) Quality Indicators for Center-Based Programs; (4) Indicators for…

  13. Miami-Dade Quality Counts: QRS Profile. The Child Care Quality Rating System (QRS) Assessment

    ERIC Educational Resources Information Center

    Child Trends, 2010

    2010-01-01

    This paper presents a profile of Miami-Dade's Quality Counts prepared as part of the Child Care Quality Rating System (QRS) Assessment Study. The profile consists of several sections and their corresponding descriptions including: (1) Program Information; (2) Rating Details; (3) Quality Indicators for Center-Based Programs; (4) Indicators for…

  14. Linear-log counting-rate meter uses transconductance characteristics of a silicon planar transistor

    NASA Technical Reports Server (NTRS)

    Eichholz, J. J.

    1969-01-01

    Counting rate meter compresses a wide range of data values, or decades of current. Silicon planar transistor, operating in the zero collector-base voltage mode, is used as a feedback element in an operational amplifier to obtain the log response.

  15. Spitzer deep and wide legacy mid- and far-infrared number counts and lower limits of cosmic infrared background

    NASA Astrophysics Data System (ADS)

    Béthermin, M.; Dole, H.; Beelen, A.; Aussel, H.

    2010-03-01

    Aims: We aim to place stronger lower limits on the cosmic infrared background (CIB) brightness at 24 μm, 70 μm and 160 μm and measure the extragalactic number counts at these wavelengths in a homogeneous way from various surveys. Methods: Using Spitzer legacy data over 53.6 deg2 of various depths, we build catalogs with the same extraction method at each wavelength. Completeness and photometric accuracy are estimated with Monte-Carlo simulations. Number count uncertainties are estimated with a counts-in-cells moment method to take galaxy clustering into account. Furthermore, we use a stacking analysis to estimate number counts of sources not detected at 70 μm and 160 μm. This method is validated by simulations. The integration of the number counts gives new CIB lower limits. Results: Number counts reach 35 μJy, 3.5 mJy and 40 mJy at 24 μm, 70 μm, and 160 μm, respectively. We reach deeper flux densities of 0.38 mJy at 70 μm and 3.1 mJy at 160 μm with a stacking analysis. We confirm the number count turnover at 24 μm and 70 μm, and observe it for the first time at 160 μm at about 20 mJy, together with a power-law behavior below 10 mJy. These mid- and far-infrared counts: 1) are homogeneously built by combining fields of different depths and sizes, providing a legacy over about three orders of magnitude in flux density; 2) are the deepest to date at 70 μm and 160 μm; 3) agree with previously published results in the common measured flux density range; 4) globally agree with the Lagache et al. (2004) model, except at 160 μm, where the model slightly overestimates the counts around 20 and 200 mJy. Conclusions: These counts are integrated to estimate new CIB firm lower limits of 2.29 (+0.09/-0.09) nW m^-2 sr^-1, 5.4 (+0.4/-0.4) nW m^-2 sr^-1, and 8.9 (+1.1/-1.1) nW m^-2 sr^-1 at 24 μm, 70 μm, and 160 μm, respectively, and extrapolated to give new estimates of the CIB due to galaxies of 2.86 (+0.19/-0.16) nW m^-2 sr^-1, 6.6 (+0.7/-0.6) nW m^-2 sr^-1, and 14.6 (+7.1/-2.9) nW m^-2 sr^-1 at 24 μm, 70 μm, and 160 μm, respectively.

  16. Expected count rate for the Self-Interrogation Neutron Resonance Densitometry measurements of spent nuclear fuel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rossa, Riccardo; Universite libre de Bruxelles, Ecole polytechnique de Bruxelles - Service de Metrologie Nucleaire, CP 165/84, Avenue F.D. Roosevelt, 50 - B1050 Brussels; Borella, Alessandro

    The Self-Interrogation Neutron Resonance Densitometry (SINRD) is a passive neutron technique that aims at a direct quantification of {sup 239}Pu in fuel assemblies by measuring the attenuation of the neutron flux in the energy region close to the 0.3 eV resonance of {sup 239}Pu. The {sup 239}Pu mass is estimated by calculating the SINRD signature, that is, the ratio between the neutron flux integrated over the fast energy region and that around the 0.3 eV resonance region. The SINRD measurement approach considered in this study consists of introducing a small neutron detector in the central guide tube of a PWR 17x17 fuel assembly. In order to measure the neutron flux in the energy regions defined in the SINRD signature, different detector types are used. The response of a bare {sup 238}U fission chamber is considered for the determination of the fast neutron flux, while other thermal-epithermal detectors wrapped in neutron absorbers are envisaged to measure the neutron flux around the resonance region. This paper provides an estimation of the count rate that can be achieved with the detector types proposed for the SINRD measurement. In the first section a set of detectors is evaluated in terms of count rate and sensitivity to the {sup 239}Pu content, in order to identify the optimal measurement configuration for each detector type. Then a study is performed to increase the count rate by increasing the detector size. The study shows that the highest count rate is achieved by using either {sup 3}He or {sup 10}B proportional counters because of the high neutron efficiency of these detectors. However, the calculations indicate that the biggest contribution to the measurement uncertainty is due to the measurement of the fast neutron flux. Finally, similar sensitivity to the {sup 239}Pu content is obtained by using the different detector types for the measurement of the neutron flux close to the resonance region. Therefore, the count rate associated with each

  17. Rain-induced increase in background radiation detected by Radiation Portal Monitors.

    PubMed

    Livesay, R J; Blessinger, C S; Guzzardo, T F; Hausladen, P A

    2014-11-01

    A complete understanding of both the steady state and transient background measured by Radiation Portal Monitors (RPMs) is essential to predictable system performance, as well as maximization of detection sensitivity. To facilitate this understanding, a test bed for the study of natural background in RPMs has been established at the Oak Ridge National Laboratory. This work was performed in support of the Second Line of Defense Program's mission to enhance partner country capability to deter, detect, and interdict the illicit movement of special nuclear material. In the present work, transient increases in gamma-ray counting rates in RPMs due to rain are investigated. The increase in background activity associated with rain, which has been well documented in the field of environmental radioactivity, originates primarily from the wet-deposition of two radioactive daughters of (222)Rn, namely, (214)Pb and (214)Bi. In this study, rainfall rates recorded by a co-located weather station are compared with RPM count rates and high-purity germanium spectra. The data verify that these radionuclides are responsible for the largest environmental background fluctuations in RPMs. Analytical expressions for the detector response function in Poly-Vinyl Toluene have been derived. Effects on system performance and potential mitigation strategies are discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.
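
    The rain-induced transients described above come from wet-deposited {sup 222}Rn daughters; their time profile can be sketched with the Bateman solution for the 214Pb → 214Bi pair (an illustrative two-member decay chain using standard half-lives; this is not the paper's PVT detector response function):

```python
import math

# half-lives of the rain-deposited 222Rn daughters (minutes)
T_PB214, T_BI214 = 26.8, 19.7
L1 = math.log(2) / T_PB214  # 214Pb decay constant (1/min)
L2 = math.log(2) / T_BI214  # 214Bi decay constant (1/min)

def bi214_activity(n_pb0, t):
    """214Bi activity at time t (min) from an initial deposit of n_pb0
    214Pb atoms (no initial 214Bi), via the two-member Bateman solution."""
    n_bi = n_pb0 * L1 / (L2 - L1) * (math.exp(-L1 * t) - math.exp(-L2 * t))
    return L2 * n_bi  # decays per minute

# Activity starts at zero, rises as 214Bi grows in, then decays away
a20 = bi214_activity(1e6, 20.0)
```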

  18. Measures of clustering and heterogeneity in multilevel Poisson regression analyses of rates/count data

    PubMed Central

    Austin, Peter C.; Stryhn, Henrik; Leckie, George; Merlo, Juan

    2017-01-01

    Multilevel data occur frequently in many research areas like health services research and epidemiology. A suitable way to analyze such data is through the use of multilevel regression models. These models incorporate cluster‐specific random effects that allow one to partition the total variation in the outcome into between‐cluster variation and between‐individual variation. The magnitude of the effect of clustering provides a measure of the general contextual effect. When outcomes are binary or time‐to‐event in nature, the general contextual effect can be quantified by measures of heterogeneity like the median odds ratio or the median hazard ratio, respectively, which can be calculated from a multilevel regression model. Outcomes that are integer counts denoting the number of times that an event occurred are common in epidemiological and medical research. The median (incidence) rate ratio in multilevel Poisson regression for counts, which corresponds to the median odds ratio or median hazard ratio for binary or time‐to‐event outcomes, respectively, is relatively unknown and rarely used. The median rate ratio is the median relative change in the rate of the occurrence of the event when comparing identical subjects from 2 randomly selected different clusters that are ordered by rate. We also describe how the variance partition coefficient, which denotes the proportion of the variation in the outcome that is attributable to between‐cluster differences, can be computed with count outcomes. We illustrate the application and interpretation of these measures in a case study analyzing the rate of hospital readmission in patients discharged from hospital with a diagnosis of heart failure. PMID:29114926
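
    The median rate ratio mentioned above has a closed form in a random-intercept Poisson model, analogous to the median odds ratio: MRR = exp(√(2σ²) · Φ⁻¹(0.75)), where σ² is the between-cluster variance on the log-rate scale. A minimal sketch:

```python
import math
from statistics import NormalDist

def median_rate_ratio(sigma2):
    """Median incidence rate ratio for a multilevel Poisson model with
    random-intercept variance sigma2 on the log-rate scale:
    MRR = exp(sqrt(2 * sigma2) * Phi^-1(0.75))."""
    z75 = NormalDist().inv_cdf(0.75)  # third quartile of N(0, 1), ~0.6745
    return math.exp(math.sqrt(2.0 * sigma2) * z75)

# No clustering (sigma2 = 0) gives MRR = 1; sigma2 = 0.5 gives ~1.96
mrr = median_rate_ratio(0.5)
```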

  19. Simulation of Rate-Related (Dead-Time) Losses In Passive Neutron Multiplicity Counting Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, L.G.; Norman, P.I.; Leadbeater, T.W.

    Passive Neutron Multiplicity Counting (PNMC) based on Multiplicity Shift Register (MSR) electronics (a form of time correlation analysis) is a widely used non-destructive assay technique for quantifying spontaneously fissile materials such as Pu. At high event rates, dead-time losses perturb the count rates, with the Singles, Doubles and Triples being increasingly affected. Without correction these perturbations are a major source of inaccuracy in the measured count rates and the assay values derived from them. This paper presents the simulation of dead-time losses and investigates the effect of applying different dead-time models on the observed MSR data. Monte Carlo methods have been used to simulate neutron pulse trains for a variety of source intensities and with ideal detection geometry, providing an event-by-event record of the time distribution of neutron captures within the detection system. The action of the MSR electronics was modelled in software to analyse these pulse trains. Stored pulse trains were perturbed in software to apply the effects of dead-time according to the chosen physical process; for example, the ideal paralysable (extending) and non-paralysable models with an arbitrary dead-time parameter. Results of the simulations demonstrate the change in the observed MSR data when the system dead-time parameter is varied. In addition, the paralysable and non-paralysable models of dead-time are compared. These results form part of a larger study to evaluate existing dead-time corrections and to extend their application to correlated sources. (authors)
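
    The paralysable and non-paralysable dead-time models compared above can be reproduced on a synthetic Poisson pulse train in a few lines (an illustrative sketch, not the authors' MSR simulation; rate, duration, and dead time are arbitrary example values):

```python
import random

def observed_counts(event_times, tau, paralysable):
    """Count events surviving a dead time tau (event_times sorted).

    Non-paralysable: events within tau of the last *recorded* event are
    lost. Paralysable (extending): every event, recorded or not,
    extends the dead period by another tau.
    """
    counted, dead_until = 0, -1.0
    for t in event_times:
        if t >= dead_until:
            counted += 1
            dead_until = t + tau
        elif paralysable:
            dead_until = t + tau  # a lost event still extends dead time
    return counted

# Poisson pulse train: exponential inter-arrival times
random.seed(1)
rate, t_max, tau = 1e6, 0.5, 1e-6  # 1 MHz true rate, 1 us dead time
times, t = [], 0.0
while True:
    t += random.expovariate(rate)
    if t > t_max:
        break
    times.append(t)

n_np = observed_counts(times, tau, paralysable=False)
n_p = observed_counts(times, tau, paralysable=True)
# Theory at n*tau = 1: m = n/(1 + n*tau) (non-paralysable),
#                      m = n*exp(-n*tau) (paralysable)
```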

  20. Waveguide integrated low noise NbTiN nanowire single-photon detectors with milli-Hz dark count rate

    PubMed Central

    Schuck, Carsten; Pernice, Wolfram H. P.; Tang, Hong X.

    2013-01-01

    Superconducting nanowire single-photon detectors are an ideal match for integrated quantum photonic circuits due to their high detection efficiency for telecom wavelength photons. Quantum optical technology also requires single-photon detection with low dark count rate and high timing accuracy. Here we present very low noise superconducting nanowire single-photon detectors based on NbTiN thin films patterned directly on top of Si3N4 waveguides. We systematically investigate a large variety of detector designs and characterize their detection noise performance. Milli-Hz dark count rates are demonstrated over the entire operating range of the nanowire detectors which also feature low timing jitter. The ultra-low dark count rate, in combination with the high detection efficiency inherent to our travelling wave detector geometry, gives rise to a measured noise equivalent power at the 10−20 W/Hz1/2 level. PMID:23714696
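
    The quoted noise equivalent power follows from the dark count rate and detection efficiency via NEP = (hν/η)·√(2D); for milli-Hz dark counts at telecom wavelengths this indeed lands near 10^-20 W/Hz^1/2. A sketch with illustrative numbers (η = 0.5 is an assumed efficiency, not a value from the paper):

```python
import math

H = 6.62607015e-34  # Planck constant (J s)
C = 2.99792458e8    # speed of light (m/s)

def snspd_nep(dark_count_rate_hz, efficiency, wavelength_m=1550e-9):
    """Noise equivalent power of a single-photon detector:
    NEP = (h*nu / eta) * sqrt(2 * D), with dark count rate D and
    detection efficiency eta; telecom wavelength assumed by default."""
    photon_energy = H * C / wavelength_m
    return (photon_energy / efficiency) * math.sqrt(2.0 * dark_count_rate_hz)

# 1 mHz dark counts at 50% efficiency -> on the order of 1e-20 W/Hz^0.5
nep = snspd_nep(1e-3, 0.5)
```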

  1. Clustering method for counting passengers getting in a bus with single camera

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Zhang, Yanning; Shao, Dapei; Li, Ying

    2010-03-01

    Automatic counting of passengers is very important for both business and security applications. We present a single-camera-based vision system that is able to count passengers in a highly crowded situation at the entrance of a traffic bus. The unique characteristics of the proposed system include the following. First, a novel feature-point-tracking and online-clustering-based passenger counting framework, which performs much better than background-modeling- and foreground-blob-tracking-based methods. Second, a simple and highly accurate clustering algorithm is developed that projects the high-dimensional feature point trajectories into a 2-D feature space by their appearance and disappearance times and counts the number of people through online clustering. Finally, all test video sequences in the experiment are captured from a real traffic bus in Shanghai, China. The results show that the system can process two 320×240 video sequences at a frame rate of 25 fps simultaneously, and can count passengers reliably in various difficult scenarios with complex interaction and occlusion among people. The method achieves accuracy rates up to 96.5%.

  2. Fluorescence decay data analysis correcting for detector pulse pile-up at very high count rates

    NASA Astrophysics Data System (ADS)

    Patting, Matthias; Reisch, Paja; Sackrow, Marcus; Dowler, Rhys; Koenig, Marcelle; Wahl, Michael

    2018-03-01

    Using time-correlated single photon counting for the purpose of fluorescence lifetime measurements is usually limited in speed due to pile-up. With modern instrumentation, this limitation can be lifted significantly, but some artifacts due to frequent merging of closely spaced detector pulses (detector pulse pile-up) remain an issue to be addressed. We propose a data analysis method correcting for this type of artifact and the resulting systematic errors. It physically models the photon losses due to detector pulse pile-up and incorporates the loss in the decay fit model employed to obtain fluorescence lifetimes and relative amplitudes of the decay components. Comparison of results with and without this correction shows a significant reduction of systematic errors at count rates approaching the excitation rate. This allows quantitatively accurate fluorescence lifetime imaging at very high frame rates.
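
    A widely used correction for single-photon pile-up in TCSPC is the classical Coates method, which inverts the one-photon-per-cycle bias bin by bin. It is shown here for orientation only; it is not the detector-pulse pile-up model the authors propose:

```python
import math

def coates_correction(histogram, n_excitation):
    """Classical Coates pile-up correction for a TCSPC histogram.

    histogram[i] -- counts recorded in time bin i
    n_excitation -- number of excitation (laser) cycles
    Returns the corrected expected counts per bin, undoing the bias
    from recording at most one photon per excitation cycle.
    """
    corrected, seen = [], 0  # photons already recorded in earlier bins
    for n_i in histogram:
        available = n_excitation - seen  # cycles still able to count here
        p = n_i / available
        corrected.append(-n_excitation * math.log(1.0 - p))
        seen += n_i
    return corrected

# At low count rates the correction is tiny, as expected
corr = coates_correction([10, 5], 10000)
```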

  3. Low-background gamma-ray spectrometry for the international monitoring system

    DOE PAGES

    Greenwood, L. R.; Cantaloub, M. G.; Burnett, J. L.; ...

    2016-12-28

    PNNL has developed two low-background gamma-ray spectrometers in a new shallow underground laboratory, thereby significantly improving its ability to detect low levels of gamma-ray emitting fission or activation products in airborne particulate samples from the IMS (International Monitoring System). Furthermore, the combination of cosmic veto panels, dry nitrogen gas to reduce radon, and low-background shielding results in a reduction of the background count rate by about a factor of 100 compared to detectors operating above ground at our laboratory.

  4. Systematic measurement of fast neutron background fluctuations in an urban area using a mobile detection system

    DOE PAGES

    Iyengar, Anagha; Beach, Matthew; Newby, Robert J.; ...

    2015-11-12

    Neutron background measurements using a mobile trailer-based system were conducted in Knoxville, Tennessee. The 0.5 m2 system consisting of 8 EJ-301 liquid scintillation detectors was used to collect neutron background measurements in order to better understand the systematic background variations that depend solely on the street-level measurement position in a local, downtown area. Data was collected along 5 different streets in the downtown Knoxville area, and the measurements were found to be repeatable. Using 10-min measurements, fractional uncertainty in each measured data point was <2%. Compared with fast neutron background count rates measured away from downtown Knoxville, a reduction in background count rates ranging from 10% to 50% was observed in the downtown area, sometimes varying substantially over distances of tens of meters. These reductions are attributed to the shielding of adjacent buildings, quantified in part here by the angle-of-open-sky metric. The adjacent buildings may serve to shield the cosmic ray neutron flux.

  5. Systematic measurement of fast neutron background fluctuations in an urban area using a mobile detection system

    NASA Astrophysics Data System (ADS)

    Iyengar, A.; Beach, M.; Newby, R. J.; Fabris, L.; Heilbronn, L. H.; Hayward, J. P.

    2015-02-01

    Neutron background measurements using a mobile trailer-based system were conducted in Knoxville, Tennessee, USA. The 0.5 m2 system, consisting of eight EJ-301 liquid scintillation detectors, was used to collect neutron background measurements in order to better understand the systematic variations in background that depend solely on the street-level measurement position in a downtown area. Data was collected along 5 different streets, and the measurements were found to be repeatable. Using 10-min measurements, the fractional uncertainty in each measured data point was <2%. Compared with fast neutron background count rates measured away from downtown Knoxville, a reduction in background count rates ranging from 10% to 50% was observed in the downtown area, sometimes varying substantially over distances of tens of meters. These reductions are attributed to the net shielding of the cosmic ray neutron flux by adjacent buildings. For reference, the building structure as observed at street level is quantified in part here by a measured angle-of-open-sky metric.

  6. The NuSTAR Extragalactic Surveys: The Number Counts of Active Galactic Nuclei and The Resolved Fraction of The Cosmic X-Ray Background

    DOE PAGES

    Harrison, F. A.; Aird, J.; Civano, F.; ...

    2016-11-07

    Here, we present the 3–8 keV and 8–24 keV number counts of active galactic nuclei (AGNs) identified in the Nuclear Spectroscopic Telescope Array (NuSTAR) extragalactic surveys. NuSTAR has now resolved 33%–39% of the X-ray background in the 8–24 keV band, directly identifying AGNs with obscuring columns up to ~10^25 cm^-2. In the softer 3–8 keV band the number counts are in general agreement with those measured by XMM-Newton and Chandra over the flux range 5×10^-15 ≲ S(3–8 keV)/erg s^-1 cm^-2 ≲ 10^-12 probed by NuSTAR. In the hard 8–24 keV band NuSTAR probes fluxes over the range 2×10^-14 ≲ S(8–24 keV)/erg s^-1 cm^-2 ≲ 10^-12, a factor ~100 fainter than previous measurements. The 8–24 keV number counts match predictions from AGN population synthesis models, directly confirming the existence of a population of obscured and/or hard X-ray sources inferred from the shape of the integrated cosmic X-ray background. The measured NuSTAR counts lie significantly above a simple extrapolation, with a Euclidean slope, of the Swift/BAT 15–55 keV number counts measured at higher fluxes (S(15–55 keV) ≳ 10^-11 erg s^-1 cm^-2) down to low flux, reflecting the evolution of the AGN population between the Swift/BAT local (z < 0.1) sample and NuSTAR's z ~ 1 sample. CXB synthesis models, which account for AGN evolution, lie above the Swift/BAT measurements, suggesting that they do not fully capture the evolution of obscured AGNs at low redshifts.

  7. Comparisons of monthly mean cosmic ray counting rates observed from the worldwide network of neutron monitors

    NASA Technical Reports Server (NTRS)

    Ryu, J. Y.; Wada, M.

    1985-01-01

    In order to examine the stability of neutron monitor observations, the monthly average counting rates of each neutron monitor are correlated with those of the Kiel neutron monitor. The regression coefficients thus obtained are compared with the coupling coefficients of the isotropic intensity variation. The results of the comparisons for five-year periods during 1963 to 1982, and for the whole period, are given. A variation spectrum following a single power law with an exponent of -0.75 up to 50 GV is not an unsatisfactory one. More than half of the stations show correlations with coefficients greater than 0.9. Some stations have shifted the level of their mean counting rates by changing instrumental characteristics, which can be adjusted for.

  8. On-Orbit Sky Background Measurements with the FOS

    NASA Technical Reports Server (NTRS)

    Lyons, R. W.; Baity, W. A.; Beaver, E. A.; Cohen, R. D.; Junkkarinen, V. T.; Linsky, J. B.; Bohlin, R. C.

    1993-01-01

    Observations of the sky background obtained with the Faint Object Spectrograph during 1991-1992 are discussed. Sky light can be an important contributor to the observed count rate in several of the instrument configurations, especially when large apertures are used. In general, the sky background is consistent with the pre-launch expectations and showed the expected effects of zodiacal light and diffuse galactic light. In addition to these sources, there is, particularly during the daytime, a highly variable airglow component which includes a number of emission lines. The sky background will have an impact on the reduction and possibly the interpretation of some spectra.

  9. Fast neutron background characterization with the Radiological Multi-sensor Analysis Platform (RadMAP)

    DOE PAGES

    Davis, John R.; Brubaker, Erik; Vetter, Kai

    2017-03-29

    In an effort to characterize the fast neutron radiation background, 16 EJ-309 liquid scintillator cells were installed in the Radiological Multi-sensor Analysis Platform (RadMAP) to collect data in the San Francisco Bay Area. Each fast neutron event was associated with specific weather metrics (pressure, temperature, absolute humidity) and GPS coordinates. Furthermore, the expected exponential dependence of the fast neutron count rate on atmospheric pressure was demonstrated, and event rates were subsequently adjusted given the measured pressure at the time of detection. Pressure-adjusted data were also used to investigate the influence of other environmental conditions on the neutron background rate. Using National Oceanic and Atmospheric Administration (NOAA) coastal area lidar data, an algorithm was implemented to approximate sky-view factors (the total fraction of visible sky) for points along RadMAP's route. The three areas analyzed, San Francisco, Downtown Oakland, and Berkeley, all demonstrated a suppression of the background rate of over 50% across the range of sky-view factors measured. This effect, which is due to the shielding of cosmic-ray-produced neutrons by surrounding buildings, was comparable to the pressure influence, which yielded a 32% suppression in the count rate over the range of pressures measured.
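
    The exponential pressure dependence exploited above leads to a simple barometric adjustment of measured rates (a sketch; the coefficient β is site- and detector-specific, and 0.0072/hPa is only an illustrative magnitude, not RadMAP's fitted value):

```python
import math

def pressure_adjusted_rate(rate, pressure_hpa, ref_pressure_hpa=1013.25,
                           beta=0.0072):
    """Adjust a neutron count rate to a reference pressure.

    Cosmic-ray neutron rates fall roughly exponentially with atmospheric
    pressure, R(P) = R0 * exp(-beta * (P - P0)), so the rate referred to
    P0 is R0 = R * exp(beta * (P - P0)). beta is in 1/hPa.
    """
    return rate * math.exp(beta * (pressure_hpa - ref_pressure_hpa))

# A rate measured at 990 hPa (less overburden, higher raw rate) is
# scaled down when referred to the 1013.25 hPa reference pressure
r0 = pressure_adjusted_rate(100.0, 990.0)
```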

  10. Effects of flicker rate, complexity, and color combinations of Chinese characters and backgrounds on visual search performance with varying flicker types.

    PubMed

    Huang, Kuo-Chen; Lin, Rung-Tai; Wu, Chih-Fu

    2011-08-01

    This study investigated the effects of the number of strokes in Chinese characters, flicker rate, flicker type, and character/background color combination on search performance. 37 participants aged 14 to 18 years were randomly assigned to each flicker-type condition. The search field contained 36 characters arranged in a 6 x 6 matrix. Participants were asked to search for the target characters among the surrounding distractors and count how many target characters were displayed in the search array. Analysis indicated that the character/background color combination significantly affected search times. The color combinations of white/purple and white/green yielded search times greater than those for black/white and black/yellow combinations. A significant effect of flicker type on search time was also identified. Rotating characters facilitated search time, compared with twinkling ones. The number of strokes and the flicker rate also had significant effects on search performance. For flicker rate, the search accuracy for 0.5 Hz was greater than that for 1.0 Hz, and the latter was in turn greater than that for 2.0 Hz. Results are applicable to web advertisement designs containing dynamic characters, in terms of how best to capture readers' attention by various means of dynamic character presentation.

  11. High Resolution Gamma Ray Spectroscopy at MHz Counting Rates With LaBr3 Scintillators for Fusion Plasma Applications

    NASA Astrophysics Data System (ADS)

    Nocente, M.; Tardocchi, M.; Olariu, A.; Olariu, S.; Pereira, R. C.; Chugunov, I. N.; Fernandes, A.; Gin, D. B.; Grosso, G.; Kiptily, V. G.; Neto, A.; Shevelev, A. E.; Silva, M.; Sousa, J.; Gorini, G.

    2013-04-01

    High resolution γ-ray spectroscopy measurements at MHz counting rates were carried out at nuclear accelerators, combining a LaBr3(Ce) detector with dedicated hardware and software solutions based on digitization and off-line analysis. Spectra were measured at counting rates up to 4 MHz, with little or no degradation of the energy resolution, adopting a pile-up rejection algorithm. The reported results represent a step forward towards the final goal of high resolution γ-ray spectroscopy measurements on a burning plasma device.

  12. Changes in Sensitization Rate to Weed Allergens in Children with Increased Weeds Pollen Counts in Seoul Metropolitan Area

    PubMed Central

    Kim, Joo-Hwa; Lee, Ha-Baik; Kim, Seong-Won; Kang, Im-Joo; Kook, Myung-Hee; Kim, Bong-Seong; Park, Kang-Seo; Baek, Hey-Sung; Kim, Kyu-Rang; Choi, Young-Jean

    2012-01-01

    The prevalence of allergic diseases in children has increased for several decades. We evaluated the correlation between pollen count of weeds and their sensitization rate in Seoul, 1997-2009. Airborne particles carrying allergens were collected daily from 3 stations around Seoul. Skin prick tests to pollen were performed on children with allergic diseases. Ragweed pollen gradually increased between 1999 and 2005, decreased after 2005 and plateaued until 2009 (peak counts, 67 in 2003, 145 in 2005 and 83 grains/m3/day in 2007). Japanese hop pollen increased between 2002 and 2009 (peak counts, 212 in 2006 and 492 grains/m3/day in 2009). Sensitization rates to weed pollen, especially ragweed and Japanese hop in children with allergic diseases, increased annually (ragweed, 2.2% in 2000 and 2.8% in 2002; Japanese hop, 1.4% in 2000 and 1.9% in 2002). The age for sensitization to pollen gradually became younger since 2000 (4 to 6 yr of age, 3.5% in 1997 and 6.2% in 2009; 7 to 9 yr of age, 4.2% in 1997 and 6.4% in 2009). In conclusion, sensitization rates for weed pollens increase in Korean children given increasing pollen counts of ragweed and Japanese hop. PMID:22468096

  13. Changes in sensitization rate to weed allergens in children with increased weeds pollen counts in Seoul metropolitan area.

    PubMed

    Kim, Joo-Hwa; Oh, Jae-Won; Lee, Ha-Baik; Kim, Seong-Won; Kang, Im-Joo; Kook, Myung-Hee; Kim, Bong-Seong; Park, Kang-Seo; Baek, Hey-Sung; Kim, Kyu-Rang; Choi, Young-Jean

    2012-04-01

    The prevalence of allergic diseases in children has increased for several decades. We evaluated the correlation between pollen count of weeds and their sensitization rate in Seoul, 1997-2009. Airborne particles carrying allergens were collected daily from 3 stations around Seoul. Skin prick tests to pollen were performed on children with allergic diseases. Ragweed pollen gradually increased between 1999 and 2005, decreased after 2005 and plateaued until 2009 (peak counts, 67 in 2003, 145 in 2005 and 83 grains/m3/day in 2007). Japanese hop pollen increased between 2002 and 2009 (peak counts, 212 in 2006 and 492 grains/m3/day in 2009). Sensitization rates to weed pollen, especially ragweed and Japanese hop in children with allergic diseases, increased annually (ragweed, 2.2% in 2000 and 2.8% in 2002; Japanese hop, 1.4% in 2000 and 1.9% in 2002). The age for sensitization to pollen gradually became younger since 2000 (4 to 6 yr of age, 3.5% in 1997 and 6.2% in 2009; 7 to 9 yr of age, 4.2% in 1997 and 6.4% in 2009). In conclusion, sensitization rates for weed pollens increase in Korean children given increasing pollen counts of ragweed and Japanese hop.

  14. Real-time people counting system using a single video camera

    NASA Astrophysics Data System (ADS)

    Lefloch, Damien; Cheikh, Faouzi A.; Hardeberg, Jon Y.; Gouton, Pierre; Picot-Clemente, Romain

    2008-02-01

    There is growing interest in video-based solutions for people monitoring and counting in business and security applications. Compared to classic sensor-based solutions, video-based ones allow for more versatile functionality and improved performance at lower cost. In this paper, we propose a real-time system for people counting based on a single low-end, non-calibrated video camera. The two main challenges addressed in this paper are robust estimation of the scene background and of the number of real persons in merge-split scenarios. The latter is likely to occur whenever multiple persons move closely, e.g. in shopping centers: several persons may be considered a single person by automatic segmentation algorithms, due to occlusions or shadows, leading to under-counting. Therefore, to account for noise, illumination changes, and static-object changes, background subtraction is performed using an adaptive background model (updated over time based on motion information) and automatic thresholding. Furthermore, post-processing of the segmentation results is performed in the HSV color space to remove shadows. Moving objects are tracked using an adaptive Kalman filter, allowing a robust estimation of the objects' future positions even under heavy occlusion. The system is implemented in Matlab, and gives encouraging results even at high frame rates. Experimental results obtained on the PETS2006 datasets are presented at the end of the paper.
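
    The occlusion-robust tracking step described above can be illustrated with a minimal constant-velocity Kalman filter that simply predicts through missed detections (a 1-D sketch with arbitrary noise settings; the paper's filter is adaptive and operates on 2-D image tracks):

```python
import numpy as np

F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition (position, velocity)
H = np.array([[1.0, 0.0]])               # we observe position only
Q = np.eye(2) * 1e-3                     # process noise covariance
R = np.array([[0.5]])                    # measurement noise covariance

x = np.array([[0.0], [0.0]])             # initial state estimate
P = np.eye(2)                            # initial state covariance

def step(x, P, z):
    # predict
    x = F @ x
    P = F @ P @ F.T + Q
    if z is not None:                    # z is None during occlusion
        y = np.array([[z]]) - H @ x      # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)   # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
    return x, P

# Two missed detections (None) are bridged by prediction alone
for z in [1.0, 2.1, 2.9, None, None, 5.1]:
    x, P = step(x, P, z)
```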

  15. Relationship of long-term highly active antiretroviral therapy on salivary flow rate and CD4 Count among HIV-infected patients.

    PubMed

    Kumar, J Vijay; Baghirath, P Venkat; Naishadham, P Parameswar; Suneetha, Sujai; Suneetha, Lavanya; Sreedevi, P

    2015-01-01

    To determine if long-term highly active antiretroviral therapy (HAART) alters salivary flow rate, and to compare the relation of CD4 count with unstimulated and stimulated whole saliva. A cross-sectional study was performed on 150 individuals divided into three groups: Group I (50 human immunodeficiency virus (HIV)-seropositive patients not on HAART), Group II (50 HIV-infected subjects on HAART for less than 3 years, termed short-term HAART), and Group III (50 HIV-infected subjects on HAART for 3 years or more, termed long-term HAART). The spitting method proposed by Navazesh and Kumar was used for the measurement of unstimulated and stimulated salivary flow rates. The chi-square test and analysis of variance (ANOVA) were used for statistical analysis. The mean CD4 count was 424.78 ± 187.03, 497.82 ± 206.11 and 537.6 ± 264.00 in the respective groups. The majority of patients in all groups had a CD4 count between 401 and 600. Both unstimulated and stimulated whole salivary (UWS and SWS) flow rates in Group I were found to be significantly higher than in Group II (P < 0.05). The difference in unstimulated salivary flow rate between Group II and Group III subjects was also statistically significant (P < 0.05). ANOVA performed between CD4 count and unstimulated and stimulated whole saliva in each group demonstrated a statistically significant relationship in Group II (P < 0.05). No significant relationship was found between CD4 count and stimulated whole saliva in any group. The reduction in CD4 cell counts was significantly associated with the salivary flow rates of HIV-infected individuals on long-term HAART.

  16. Gamma-gamma coincidence performance of LaBr3:Ce scintillation detectors vs HPGe detectors in high count-rate scenarios

    DOE PAGES

    Drescher, A.; Yoho, M.; Landsberger, S.; ...

    2017-01-15

    In this study, a radiation detection system consisting of two cerium-doped lanthanum bromide (LaBr3:Ce) scintillation detectors in a gamma-gamma coincidence configuration has been used to demonstrate the advantages that coincident detection provides relative to a single detector, and the advantages that LaBr3:Ce detectors provide relative to high purity germanium (HPGe) detectors. Signal-to-noise ratios of select photopeak pairs for these detectors have been compared to high-purity germanium (HPGe) detectors in both single and coincident detector configurations in order to quantify the performance of each detector configuration. The efficiency and energy resolution of LaBr3:Ce detectors have been determined and compared to HPGe detectors. Coincident gamma-ray pairs from the radionuclides 152Eu and 133Ba have been identified in a sample that is dominated by 137Cs. Gamma-gamma coincidence successfully reduced the Compton continuum from the large 137Cs peak, revealed several coincident gamma energies characteristic of these nuclides, and improved the signal-to-noise ratio relative to single detector measurements. LaBr3:Ce detectors performed at count rates multiple times higher than can be achieved with HPGe detectors. The standard background spectrum consisting of peaks associated with transitions within the LaBr3:Ce crystal has also been significantly reduced. Finally, it is shown that LaBr3:Ce detectors have the unique capability to perform gamma-gamma coincidence measurements in very high count rate scenarios, which can potentially benefit nuclear safeguards in situ measurements of spent nuclear fuel.

  17. Gamma-gamma coincidence performance of LaBr3:Ce scintillation detectors vs HPGe detectors in high count-rate scenarios

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drescher, A.; Yoho, M.; Landsberger, S.

    In this study, a radiation detection system consisting of two cerium-doped lanthanum bromide (LaBr3:Ce) scintillation detectors in a gamma-gamma coincidence configuration has been used to demonstrate the advantages that coincident detection provides relative to a single detector, and the advantages that LaBr3:Ce detectors provide relative to high purity germanium (HPGe) detectors. Signal-to-noise ratios of select photopeak pairs for these detectors have been compared to high-purity germanium (HPGe) detectors in both single and coincident detector configurations in order to quantify the performance of each detector configuration. The efficiency and energy resolution of LaBr3:Ce detectors have been determined and compared to HPGe detectors. Coincident gamma-ray pairs from the radionuclides 152Eu and 133Ba have been identified in a sample that is dominated by 137Cs. Gamma-gamma coincidence successfully reduced the Compton continuum from the large 137Cs peak, revealed several coincident gamma energies characteristic of these nuclides, and improved the signal-to-noise ratio relative to single detector measurements. LaBr3:Ce detectors performed at count rates multiple times higher than can be achieved with HPGe detectors. The standard background spectrum consisting of peaks associated with transitions within the LaBr3:Ce crystal has also been significantly reduced. Finally, it is shown that LaBr3:Ce detectors have the unique capability to perform gamma-gamma coincidence measurements in very high count rate scenarios, which can potentially benefit nuclear safeguards in situ measurements of spent nuclear fuel.

  18. Pile-up correction algorithm based on successive integration for high count rate medical imaging and radiation spectroscopy

    NASA Astrophysics Data System (ADS)

    Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar

    2018-07-01

    In high count rate radiation spectroscopy and imaging, detector output pulses tend to pile up due to high interaction rate of the particles with the detector. Pile-up effects can lead to a severe distortion of the energy and timing information. Pile-up events are conventionally prevented or rejected by both analog and digital electronics. However, for decreasing the exposure times in medical imaging applications, it is important to maintain the pulses and extract their true information by pile-up correction methods. The single-event reconstruction method is a relatively new model-based approach for recovering the pulses one-by-one using a fitting procedure, for which a fast fitting algorithm is a prerequisite. This article proposes a fast non-iterative algorithm based on successive integration which fits the bi-exponential model to experimental data. After optimizing the method, the energy spectra, energy resolution and peak-to-peak count ratios are calculated for different counting rates using the proposed algorithm as well as the rejection method for comparison. The obtained results prove the effectiveness of the proposed method as a pile-up processing scheme designed for spectroscopic and medical radiation detection applications.
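
    The paper's successive-integration algorithm itself is not reproduced here, but the model-based idea behind single-event reconstruction can be illustrated: once the bi-exponential time constants and arrival times are fixed, the amplitudes of piled-up pulses enter the model linearly and can be recovered without iteration. The time constants, arrival times and amplitudes below are illustrative assumptions.

```python
import numpy as np

def biexp(t, t0, tau_rise=5.0, tau_fall=50.0):
    """Bi-exponential pulse shape starting at t0 (time constants assumed)."""
    return np.where(t >= t0,
                    np.exp(-(t - t0) / tau_fall) - np.exp(-(t - t0) / tau_rise),
                    0.0)

# Two piled-up pulses: amplitudes 1.0 and 0.6, arrivals at samples 0 and 30.
t = np.arange(0, 300, 1.0)
signal = 1.0 * biexp(t, 0.0) + 0.6 * biexp(t, 30.0)

# With shape parameters known, amplitude recovery is a linear least-squares
# problem: one basis column per pulse, no iterative fitting needed.
A = np.column_stack([biexp(t, 0.0), biexp(t, 30.0)])
amps, *_ = np.linalg.lstsq(A, signal, rcond=None)
```

    In the noise-free case the recovered amplitudes match the true ones exactly; with noise, the same linear solve gives the least-squares estimates.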

  19. Can simple mobile phone applications provide reliable counts of respiratory rates in sick infants and children? An initial evaluation of three new applications.

    PubMed

    Black, James; Gerdtz, Marie; Nicholson, Pat; Crellin, Dianne; Browning, Laura; Simpson, Julie; Bell, Lauren; Santamaria, Nick

    2015-05-01

    Respiratory rate is an important sign that is commonly either not recorded or recorded incorrectly. Mobile phone ownership is increasing even in resource-poor settings. Phone applications may improve the accuracy and ease of counting of respiratory rates. The study assessed the reliability and initial users' impressions of four mobile phone respiratory timer approaches, compared to a 60-second count by the same participants. Three mobile applications (applying four different counting approaches plus a standard 60-second count) were created using the Java Mobile Edition and tested on Nokia C1-01 phones. Apart from the 60-second timer application, the others included a counter based on the time for ten breaths, and three based on the time interval between breaths ('Once-per-Breath', in which the user presses for each breath and the application calculates the rate after 10 or 20 breaths, or after 60s). Nursing and physiotherapy students used the applications to count respiratory rates in a set of brief video recordings of children with different respiratory illnesses. Limits of agreement (compared to the same participant's standard 60-second count), intra-class correlation coefficients and standard errors of measurement were calculated to compare the reliability of the four approaches, and a usability questionnaire was completed by the participants. There was considerable variation in the counts, with large components of the variation related to the participants and the videos, as well as the methods. None of the methods was entirely reliable, with no limits of agreement better than -10 to +9 breaths/min. Some of the methods were superior to the others, with ICCs from 0.24 to 0.92. By ICC the Once-per-Breath 60-second count and the Once-per-Breath 20-breath count were the most consistent, better even than the 60-second count by the participants. The 10-breath approaches performed least well. 
Users' initial impressions were positive, with little difference between the
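
    The counting approaches compared above reduce to simple arithmetic on tap timestamps; a minimal sketch (function names are ours, not the applications'):

```python
def rate_from_taps(tap_times_s):
    """Breaths/min from once-per-breath taps: N taps span N-1 breath intervals."""
    n_intervals = len(tap_times_s) - 1
    span = tap_times_s[-1] - tap_times_s[0]
    return 60.0 * n_intervals / span

def rate_from_ten_breaths(duration_s):
    """Breaths/min when ten breaths took duration_s seconds (10-breath timer)."""
    return 600.0 / duration_s
```

    For example, eleven taps spaced 2 s apart give 10 intervals in 20 s, i.e. 30 breaths/min, the same rate a 10-breath timer reading 20 s would report.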

  20. A relationship between salivary flow rates and Candida counts in patients with xerostomia.

    PubMed

    Nadig, Suchetha Devendrappa; Ashwathappa, Deepak Timmasandra; Manjunath, Muniraju; Krishna, Sowmya; Annaji, Araleri Gopalkrishna; Shivaprakash, Praveen Kunigal

    2017-01-01

    Most of the adult population is colonized by Candida in their oral cavity. The process of colonization depends on several factors, including the interaction between Candida and salivary proteins. Therefore, salivary gland hypofunction may alter the oral microbiota and increase the risk for opportunistic infections, such as candidiasis. Hence, it is necessary to evaluate the relationship between salivary flow rates (SFRs) and Candida colony counts in the saliva of patients with xerostomia. This study aims to determine and evaluate the relationship between SFRs and Candida colony-forming units (CFUs) in patients with xerostomia. This was a descriptive study. The study participants were drawn from the patients attending the outpatient department of a private dental college. Fifty patients who reported xerostomia in a questionnaire on the symptoms of xerostomia were selected. Chewing-stimulated whole saliva samples were collected from them and their SFRs were assessed. Saliva samples were inoculated in Sabouraud dextrose agar culture media for 24-48 h, and Candida CFUs were counted. The chi-squared test was used to analyze the data. There was a significant inverse relationship between salivary flow and Candida CFU counts when patients with high colony counts were analyzed (cutoff point of 400 or greater CFU/mL). Females had lower SFRs than males. Most of the patients who had hyposalivation were taking medication for underlying systemic diseases. Candida albicans was the most frequent species. There was a significantly negative correlation between SFRs and Candida CFUs in the patients with xerostomia.

  1. Real Time Coincidence Detection Engine for High Count Rate Timestamp Based PET

    NASA Astrophysics Data System (ADS)

    Tetrault, M.-A.; Oliver, J. F.; Bergeron, M.; Lecomte, R.; Fontaine, R.

    2010-02-01

    Coincidence engines follow two main implementation flows: timestamp-based systems and AND-gate-based systems. The latter have been more widespread in recent years because of their lower cost and high efficiency. However, they are highly dependent on the selected electronic components, they have limited flexibility once assembled, and they are customized to fit a specific scanner's geometry. Timestamp-based systems are gathering more attention lately, especially with high-channel-count fully digital systems. These new systems must however cope with high singles count rates. One option is to record every detected event and postpone coincidence detection offline. For daily-use systems, a real-time engine is preferable because it dramatically reduces data volume and hence image preprocessing time and raw data management. This paper presents the timestamp-based coincidence engine for the LabPET, a small-animal PET scanner with up to 4608 individual readout avalanche photodiode channels. The engine can handle up to 100 million single events per second and has extensive flexibility because it resides in programmable logic devices. It can be adapted for any detector geometry or channel count, can be ported to newer, faster programmable devices, and can have extra modules added to take advantage of scanner-specific features. Finally, the user can select between a full processing mode for imaging protocols and a minimum processing mode to study different approaches for coincidence detection with offline software.
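
    Conceptually, a timestamp-based coincidence engine scans a time-ordered stream of singles for pairs falling within a coincidence window. A software sketch of that core step (the real engine runs in programmable logic; the window value and event format here are assumptions):

```python
def find_coincidences(singles, window):
    """Pair up singles whose timestamps differ by at most `window`.

    singles : iterable of (time, channel) tuples for detected single events
    window  : coincidence window, in the same time units as the timestamps
    Returns a list of ((t1, ch1), (t2, ch2)) prompt coincidence pairs.
    """
    events = sorted(singles)               # time-ordered, as a hardware sorter would emit
    pairs = []
    for i, (t1, ch1) in enumerate(events):
        for t2, ch2 in events[i + 1:]:
            if t2 - t1 > window:
                break                      # stream is sorted: later events are further away
            if ch2 != ch1:                 # ignore same-channel pile-up
                pairs.append(((t1, ch1), (t2, ch2)))
    return pairs

pairs = find_coincidences([(0, 'a'), (3, 'b'), (100, 'a'), (104, 'b'), (300, 'a')],
                          window=5)
```

    Because the stream is sorted, each event is compared only against its near neighbors, which is what makes a real-time implementation tractable at high singles rates.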

  2. Predicting county-level cancer incidence rates and counts in the United States

    PubMed Central

    Yu, Binbing

    2018-01-01

    Many countries, including the United States, publish predicted numbers of cancer incidence and death in current and future years for the whole country. These predictions provide important information on the cancer burden for cancer control planners, policymakers and the general public. Based on evidence from several empirical studies, the joinpoint (segmented-line linear regression) model has been adopted by the American Cancer Society to estimate the number of new cancer cases in the United States and in individual states since 2007. Recently, cancer incidence in smaller geographic regions such as counties and FIPS code regions has become of increasing interest to local policymakers. The natural extension is to directly apply the joinpoint model to county-level cancer incidence data. The direct application has several drawbacks and its performance has not been evaluated. To address these concerns, we developed a spatial random-effects joinpoint model for county-level cancer incidence data. The proposed model was used to predict both cancer incidence rates and counts at the county level. The standard joinpoint model and the proposed method were compared through a validation study. The proposed method outperformed the standard joinpoint model for almost all cancer sites, especially for moderate or rare cancer sites and for counties with small population sizes. As an application, we predicted county-level prostate cancer incidence rates and counts for the year 2011 in Connecticut. PMID:23670947
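
    The segmented-line idea behind the joinpoint model can be illustrated with a minimal one-joinpoint fit by grid search over the breakpoint. Production joinpoint software selects the number of joinpoints by permutation testing and handles rates, weights and random effects, all of which this sketch omits.

```python
import numpy as np

def fit_joinpoint(x, y):
    """Fit y = a + b*x + c*(x - k)_+ over candidate interior joinpoints k.

    Grid-searches k over the interior x values and keeps the breakpoint
    with the smallest residual sum of squares.
    Returns (rss, joinpoint, [intercept, slope1, slope_change]).
    """
    best = None
    for k in x[1:-1]:
        hinge = np.maximum(x - k, 0.0)            # slope change activates after k
        X = np.column_stack([np.ones_like(x), x, hinge])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        rss = float(np.sum((y - X @ beta) ** 2))
        if best is None or rss < best[0]:
            best = (rss, k, beta)
    return best

x = np.arange(10.0)
y = np.where(x <= 4, x, 4 + 3 * (x - 4))          # slope 1, then slope 3 after x = 4
rss, k, beta = fit_joinpoint(x, y)
```

    On this synthetic series the search recovers the true breakpoint at x = 4 with slopes 1 and 1 + 2 = 3.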

  3. Statistical study of muon count rates in different directions, observed at the Brazilian Southern Space Observatory

    NASA Astrophysics Data System (ADS)

    Grams, Guilherme; Schuch, Nelson Jorge; Braga, Carlos Roberto; Purushottam Kane, Rajaram; Echer, Ezequiel; Ronan Coelho Stekel, Tardelli

    Cosmic rays are charged particles, mostly protons, that reach the Earth's magnetosphere from interplanetary space with velocities greater than the solar wind. When they impinge on the atmosphere, they interact with atmospheric constituents and decay into sub-particles, forming an atmospheric shower. Muons are the sub-particles that normally maintain the original direction of the primary cosmic ray. A multi-directional muon detector (MMD) was installed in 2001 and upgraded in 2005, through an international cooperation between Brazil, Japan and the USA, and has operated since then at the Southern Space Observatory - SSO/CRS/CCR/INPE - MCT (29.4° S, 53.8° W, 480 m a.s.l.), São Martinho da Serra, RS, Brazil. The main objective of this work is to present a statistical analysis of the intensity of muons, with energy between 50 and 170 GeV, in different directions, measured by the SSO's multi-directional muon detector. The analysis was performed with data from 2006 and 2007 collected by the SSO's MMD. The MMD consists of two layers of 4x7 detectors with a total observation area of 28 m2. The counting of muons in each directional channel is made by a coincidence of pulse pairs, one from a detector in the upper layer and the other from a detector in the lower layer. The SSO's MMD is equipped with 119 directional channels for muon count rate measurement and is capable of detecting muons incident with zenith angle between 0° and 75.53°. A statistical analysis was made with the MMD muon count rate for all the directional channels. The average and the standard deviation of the muon count rate in each directional component were calculated. The results show lower count rates for the channels with larger zenith angles and higher count rates for those with smaller zenith angles, as expected from the production and propagation of muons in the atmosphere. It is also possible to identify the Störmer cone. The SSO's MMD is also a detector component of the Global Muon Detector Network (GMDN).
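
    The reported zenith-angle dependence is qualitatively consistent with the textbook sea-level attenuation of the muon flux, often approximated as I(θ) ≈ I(0)·cosⁿθ with n ≈ 2. The effective exponent varies with muon energy, so this is only an illustrative model, not the SSO analysis.

```python
import math

def relative_muon_rate(zenith_deg, n=2.0):
    """Textbook sea-level zenith dependence I(theta) ~ I(0) * cos(theta)**n.

    n ~ 2 is the usual low-energy exponent; the true exponent depends on
    muon energy, so treat this as an illustrative assumption only.
    """
    return math.cos(math.radians(zenith_deg)) ** n
```

    Under this model a channel looking 60° from zenith would see roughly a quarter of the vertical rate, matching the observed trend of lower count rates at larger zenith angles.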

  4. A relationship between salivary flow rates and Candida counts in patients with xerostomia

    PubMed Central

    Nadig, Suchetha Devendrappa; Ashwathappa, Deepak Timmasandra; Manjunath, Muniraju; Krishna, Sowmya; Annaji, Araleri Gopalkrishna; Shivaprakash, Praveen Kunigal

    2017-01-01

    Context: Most of the adult population is colonized by Candida in their oral cavity. The process of colonization depends on several factors, including the interaction between Candida and salivary proteins. Therefore, salivary gland hypofunction may alter the oral microbiota and increase the risk for opportunistic infections, such as candidiasis. Hence, it is necessary to evaluate the relationship between salivary flow rates (SFRs) and Candida colony counts in the saliva of patients with xerostomia. Aims: This study aims to determine and evaluate the relationship between SFRs and Candida colony-forming units (CFUs) in patients with xerostomia. Settings and Design: This study was a descriptive study. Subjects and Methods: The study participants were drawn from the patients attending the outpatient department of a private dental college. Fifty patients, who reported xerostomia in a questionnaire on the symptoms of xerostomia, were selected. Chewing-stimulated whole saliva samples were collected from them and their SFRs were assessed. Saliva samples were inoculated in the Sabouraud dextrose agar culture media for 24–48 h, and Candida CFUs were counted. Statistical Analysis Used: The chi-squared test was used to analyze the data. Results: There was a significant inverse relationship between salivary flow and Candida CFU counts when patients with high colony counts were analyzed (cutoff point of 400 or greater CFU/mL). Females had lower SFRs than males. Most of the patients who had hyposalivation were taking medication for underlying systemic diseases. Candida albicans was the most frequent species. Conclusions: There was a significantly negative correlation between SFRs and Candida CFUs in the patients with xerostomia. PMID:28932047

  5. AURORA on MEGSAT 1: a photon counting observatory for the Earth UV night-sky background and Aurora emission

    NASA Astrophysics Data System (ADS)

    Monfardini, A.; Trampus, P.; Stalio, R.; Mahne, N.; Battiston, R.; Menichelli, M.; Mazzinghi, P.

    2001-08-01

    A low-mass, low-cost photon-counting scientific payload has been developed and launched on a commercial microsatellite in order to study the near-UV night-sky background emission with a telescope nicknamed ``Notte'' and the Aurora emission with ``Alba''. AURORA, this is the name of the experiment, will determine, with the ``Notte'' channel, the overall night-side photon background in the 300-400 nm spectral range, together with a particular N2 second-positive (2+) line (λc=337 nm). The ``Alba'' channel, on the other hand, will study the Aurora emissions in four different spectral bands (FWHM=8.4-9.6 nm) centered on: 367 nm (continuum evaluation), 391 nm (N2+ first-negative, 1-), 535 nm (continuum evaluation), 560 nm (OI). The instrument was launched on 26 September 2000 from the Baikonur cosmodrome on a modified SS18 Dnepr-1 ``Satan'' rocket. The satellite orbit is nearly circular (apogee = 648 km, e = 0.0022), and the inclination of the orbital plane is 64.56°. An overview of the techniques adopted is given in this paper.

  6. Comparing Background and Recent Erosion Rates in Degraded Areas of Southeastern Brazil

    NASA Astrophysics Data System (ADS)

    Fernandes, N.; Bierman, P. R.; Sosa-Gonzalez, V.; Rood, D. H.; Fontes, R. L.; Santos, A. C.; Godoy, J. M.; Bhering, S.

    2014-12-01

    Soil erosion is a major problem in northwestern Rio de Janeiro State where, during the last three centuries, major land-use changes took place, associated with the replacement of the original rainforest by agriculture and grazing. The combination of steep hillslopes, erodible soils, sparse vegetation, natural and human-induced fires, as well as downslope ploughing, led to an increase in surface runoff and surface erosion on soil-mantled hillslopes; together, these actions and responses caused a decline in soil productivity. In order to estimate changes in erosion rates over time, we compared erosion rates measured at different spatial and temporal scales, both background (natural) and short-term (human-induced during the last few decades). Background long-term erosion rates were measured using in-situ produced cosmogenic 10Be in the sand-fraction quartz of active river channel sediment in four basins in the northwestern portion of Rio de Janeiro State. In these basins, average annual precipitation varies from 1,200 to 1,300 mm, while drainage areas vary from 15 to 7,200 km2. Short-term erosion rates were measured in one of these basins from fallout 210Pb in soil samples collected along a hillslope transect located in an abandoned agricultural field. In this transect, 190 undisturbed soil samples (three replicates) were collected from the surface to 0.50 m depth (5 cm vertical intervals) in six soil pits. Average background, basin-wide 10Be erosion rates in the area are ~13 m/Myr; over the last decades, time-integrated (210Pb) average hillslope erosion rates are around 1450 m/Myr, with maximum values at the steepest portion of convex hillslopes of about 2000 m/Myr. These results suggest that recent hillslope erosion rates are about 2 orders of magnitude above background rates of sediment generation integrated over many millennia.
This unsustainable rate of soil loss has severely decreased soil productivity eventually leading to the abandonment of farming activities in

  7. Linking reproduction and survival can improve model estimates of vital rates derived from limited time-series counts of pinnipeds and other species.

    PubMed

    Battaile, Brian C; Trites, Andrew W

    2013-01-01

    We propose a method to model the physiological link between somatic survival and reproductive output that reduces the number of parameters that need to be estimated by models designed to determine combinations of birth and death rates that produce historic counts of animal populations. We applied our Reproduction and Somatic Survival Linked (RSSL) method to the population counts of three species of North Pacific pinnipeds (harbor seals, Phoca vitulina richardii (Gray, 1864); northern fur seals, Callorhinus ursinus (L., 1758); and Steller sea lions, Eumetopias jubatus (Schreber, 1776))--and found our model outperformed traditional models when fitting vital rates to common types of limited datasets, such as those from counts of pups and adults. However, our model did not perform as well when these basic counts of animals were augmented with additional observations of ratios of juveniles to total non-pups. In this case, the failure of the ratios to improve model performance may indicate that the relationship between survival and reproduction is redefined or disassociated as populations change over time or that the ratio of juveniles to total non-pups is not a meaningful index of vital rates. Overall, our RSSL models show advantages to linking survival and reproduction within models to estimate the vital rates of pinnipeds and other species that have limited time-series of counts.

  8. Anti-aliasing techniques in photon-counting depth imaging using GHz clock rates

    NASA Astrophysics Data System (ADS)

    Krichel, Nils J.; McCarthy, Aongus; Collins, Robert J.; Buller, Gerald S.

    2010-04-01

    Single-photon detection technologies in conjunction with low laser illumination powers allow for the eye-safe acquisition of time-of-flight range information on non-cooperative target surfaces. We previously presented a photon-counting depth imaging system designed for the rapid acquisition of three-dimensional target models by steering a single scanning pixel across the field angle of interest. To minimise the per-pixel dwelling times required to obtain sufficient photon statistics for accurate distance resolution, periodic illumination at multi-MHz repetition rates was applied. Modern time-correlated single-photon counting (TCSPC) hardware allowed for depth measurements with sub-mm precision. Resolving the absolute target range with a fast periodic signal is only possible at sufficiently short distances: if the round-trip time towards an object is extended beyond the timespan between two trigger pulses, the return signal cannot be assigned to an unambiguous range value. Whereas constructing a precise depth image based on relative results may still be possible, problems emerge for large or unknown pixel-by-pixel separations or in applications with a wide range of possible scene distances. We introduce a technique to avoid range ambiguity effects in time-of-flight depth imaging systems at high average pulse rates. A long pseudo-random bitstream is used to trigger the illuminating laser. A cyclic, fast-Fourier-supported analysis algorithm is used to search for the pattern within return photon events. We demonstrate this approach at base clock rates of up to 2 GHz with varying pattern lengths, allowing for unambiguous distances of several kilometers. Scans at long stand-off distances and of scenes with large pixel-to-pixel range differences are presented. Numerical simulations are performed to investigate the relative merits of the technique.
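
    The cyclic, FFT-supported pattern search can be sketched as a circular cross-correlation between the emitted pseudo-random bitstream and the binned return events: the correlation peak sits at the round-trip delay, unambiguous over the whole pattern length rather than one pulse period. The pattern length, fill factor and delay below are illustrative assumptions (and the noise-free return is idealized).

```python
import numpy as np

rng = np.random.default_rng(7)

# Pseudo-random emission pattern: 1 where a laser pulse fires in that clock bin.
pattern = (rng.random(4096) < 0.1).astype(float)

# Idealized return histogram: the same pattern, circularly delayed by the
# round-trip time (in clock bins).
true_delay = 1234
returns = np.roll(pattern, true_delay)

# Circular cross-correlation via FFT; argmax of the correlation is the delay.
corr = np.fft.ifft(np.fft.fft(returns) * np.conj(np.fft.fft(pattern))).real
estimated_delay = int(np.argmax(corr))
```

    With a random pattern the correlation sidelobes stay far below the main peak, which is why the delay stays identifiable even with sparse, noisy photon returns.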

  9. Low gamma counting for measuring NORM/TENORM with a radon reducing system

    NASA Astrophysics Data System (ADS)

    Paschoa, Anselmo S.

    2001-06-01

    A detection system for counting low levels of gamma radiation was built by upgrading an existing rectangular chamber made of 18 metric tonnes of steel fabricated before World War II. The internal walls, the ceiling, and the floor of the chamber are covered with copper sheets. The new detection system consists of a stainless steel hollow cylinder with variable circular apertures in the cylindrical wall and in the base, to allow introduction of a NaI(Tl) crystal, or alternatively, a HPGe detector in its interior. This counting system is mounted inside the larger chamber, which in turn is located in a subsurface air-conditioned room. The access to the subsurface room is made from a larger entrance room through a tunnel plus a glass anteroom to decrease the air-exchange rate. Both sample and detector are housed inside the stainless steel cylinder. This cylinder is filled with hyper-pure nitrogen gas, before counting a sample, to prevent radon coming into contact with the detector surface. As a consequence, the contribution of the 214Bi photopeaks to the background gamma spectra is minimized. The reduction of the gamma radiation background near the detector facilitates measurement of naturally occurring radioactive materials (NORM), and/or technologically enhanced NORM (TENORM), which are usually at concentration levels only slightly higher than those typically found in the natural radioactive background.

  10. Modeling radon daughter deposition rates for low background detectors

    NASA Astrophysics Data System (ADS)

    Westerdale, S.; Guiseppe, V. E.; Rielage, K.; Elliot, S. R.; Hime, A.

    2009-10-01

    Detectors such as those looking for dark matter and those working to detect neutrinoless double-beta decay require record low levels of background radiation. One major source of background radiation is from radon daughters that decay from airborne radon. In particular, ^222Rn decay products may be deposited on any detector materials that are exposed to environmental radon. Long-lasting daughters, especially ^210Pb, can pose a long-term background radiation source that can interfere with the detectors' measurements by emitting alpha particles into sensitive parts of the detectors. A better understanding of this radon daughter deposition will allow for preventative actions to be taken to minimize the amount of noise from this source. A test stand has therefore been set up to study the impact of various environmental factors on the rate of radon daughter deposition so that a model can be constructed. Results from the test stand and a model of radon daughter deposition will be presented.

  11. Detecting trends in raptor counts: power and type I error rates of various statistical tests

    USGS Publications Warehouse

    Hatfield, J.S.; Gould, W.R.; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.

    1996-01-01

    We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
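
    The simulation design described (exponential count trends with 40% CV log-normal sampling error, tested by regression on the log scale) can be sketched as follows. This is our minimal reimplementation of one of the seven tests, not the authors' code; the t critical value is hard-coded for the default n = 10 years, and autocorrelation is omitted.

```python
import numpy as np

def trend_test_power(trend_pct, n_years=10, cv=0.4, n_sims=2000, seed=1):
    """Fraction of simulations in which a log-scale regression t-test rejects.

    Counts follow an exponential trend of trend_pct %/year with log-normal
    sampling error of coefficient of variation cv.
    """
    rng = np.random.default_rng(seed)
    years = np.arange(n_years, dtype=float)
    sigma = np.sqrt(np.log(1.0 + cv ** 2))        # log-normal sigma for this CV
    slope_true = np.log(1.0 + trend_pct / 100.0)  # trend on the log scale
    sxx = np.sum((years - years.mean()) ** 2)
    rejections = 0
    for _ in range(n_sims):
        logy = slope_true * years + rng.normal(0.0, sigma, n_years)
        b, a = np.polyfit(years, logy, 1)         # slope, intercept
        resid = logy - (a + b * years)
        se = np.sqrt(resid @ resid / (n_years - 2) / sxx)
        if abs(b / se) > 2.306:                   # two-sided t, alpha=0.05, df=8
            rejections += 1
    return rejections / n_sims
```

    With no trend the rejection fraction estimates the type I error rate (nominally 0.05); with a 5%/year trend it estimates power, which grows sharply with series length, mirroring the 10- vs 50-year contrast reported above.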

  12. Pulse shape discrimination of Cs2LiYCl6:Ce3+ detectors at high count rate based on triangular and trapezoidal filters

    NASA Astrophysics Data System (ADS)

    Wen, Xianfei; Enqvist, Andreas

    2017-09-01

    Cs2LiYCl6:Ce3+ (CLYC) detectors have demonstrated the capability to simultaneously detect γ-rays and thermal and fast neutrons with medium energy resolution, reasonable detection efficiency, and substantially high pulse shape discrimination performance. A disadvantage of CLYC detectors is their long scintillation decay times, which cause pulse pile-up at moderate input count rates. Pulse processing algorithms were developed based on triangular and trapezoidal filters to discriminate between neutrons and γ-rays at high count rate. The algorithms were first tested using low-rate data. They exhibit a pulse-shape discrimination performance comparable to that of the charge comparison method at low rate. Then, they were evaluated at high count rate. Neutrons and γ-rays were adequately identified with high throughput at rates of up to 375 kcps. The algorithm developed using the triangular filter exhibits discrimination capability marginally higher than that of the trapezoidal-filter-based algorithm, irrespective of low or high rate. The algorithms exhibit low computational complexity and are executable on an FPGA in real time. They are also suitable for application to other radiation detectors whose pulses pile up at high rate owing to long scintillation decay times.
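
    The trapezoidal filter that such algorithms build on is simply the difference of two boxcar sums separated by a flat-top gap (a flat top of zero gives the triangular filter). A minimal sketch of that shaping step, with `rise` and `flat` as assumed parameter values; the CLYC-specific discrimination logic on top of it is not reproduced here.

```python
import numpy as np

def trapezoidal_filter(x, rise, flat):
    """Trapezoidal shaper: difference of two length-`rise` boxcar sums
    separated by `flat` samples. flat = 0 yields a triangular filter."""
    c = np.cumsum(np.concatenate([[0.0], np.asarray(x, dtype=float)]))
    box = lambda s: c[s + rise] - c[s]            # sum of x[s:s+rise]
    n_out = len(x) - 2 * rise - flat + 1
    return np.array([box(i + rise + flat) - box(i) for i in range(n_out)])
```

    Fed a unit step (an idealized pulse edge), the output ramps up over `rise` samples, holds a flat top proportional to the step height, and ramps back down, which is what makes the peak amplitude a low-noise energy estimate.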

  13. A physics investigation of deadtime losses in neutron counting at low rates with Cf252

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Louise G; Croft, Stephen

    2009-01-01

    252Cf spontaneous fission sources are used for the characterization of neutron counters and the determination of calibration parameters, including both neutron coincidence counting (NCC) and neutron multiplicity deadtime (DT) parameters. Even at low event rates, temporally-correlated neutron counting using 252Cf suffers a deadtime effect, meaning that, in contrast to counting a random neutron source (e.g. AmLi, to a close approximation), DT losses do not vanish in the low-rate limit. This is because neutrons are emitted from spontaneous fission events in time-correlated 'bursts', and are detected over a short period commensurate with their lifetime in the detector (characterized by the system die-away time, τ). Thus, even when detected neutron events from different spontaneous fissions are unlikely to overlap in time, neutron events within the detected 'burst' are subject to intrinsic DT losses. Intrinsic DT losses for dilute Pu will be lower since the multiplicity distribution is softer, but real items also experience self-multiplication, which can increase the 'size' of the bursts. Traditional NCC DT correction methods do not include the intrinsic (within-burst) losses. We have proposed new forms of the traditional NCC Singles and Doubles DT correction factors. In this work, we apply Monte Carlo neutron pulse train analysis to investigate the functional form of the deadtime correction factors for an updating deadtime. Modeling is based on a high-efficiency 3He neutron counter with short die-away time, representing an ideal 3He-based detection system. The physics of deadtime losses at low rates is explored and presented. It is observed that the new forms are applicable and offer more accurate correction than the traditional forms.
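
    The within-burst ('intrinsic') loss mechanism can be illustrated with a toy Monte Carlo pulse-train model: even when fissions are far apart, neutrons from the same burst fall within one another's deadtime. This is a simplified sketch (fixed burst size, exponential die-away, no efficiency or self-multiplication), not the authors' model; all parameter values are assumptions.

```python
import random

def detected_fraction(burst_size, die_away, deadtime, n_fissions=20000, seed=3):
    """Fraction of burst neutrons surviving an updating (paralyzable) deadtime.

    Each fission emits `burst_size` neutrons whose detection times are spread
    exponentially by the detector die-away time. An event is recorded only if
    it arrives at least `deadtime` after the previous event, and every arrival
    (recorded or not) restarts the deadtime. Fissions are assumed far apart,
    so only within-burst losses appear.
    """
    rng = random.Random(seed)
    recorded = total = 0
    for _ in range(n_fissions):
        times = sorted(rng.expovariate(1.0 / die_away) for _ in range(burst_size))
        last = None
        for t in times:
            total += 1
            if last is None or t - last >= deadtime:
                recorded += 1
            last = t   # updating deadtime: every arrival restarts the clock
    return recorded / total
```

    Single-neutron bursts suffer no intrinsic loss regardless of the deadtime, while multi-neutron bursts lose a fixed fraction even in the low-rate limit, which is exactly why these losses do not vanish as the fission rate goes to zero.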

  14. Note: Fully integrated active quenching circuit achieving 100 MHz count rate with custom technology single photon avalanche diodes.

    PubMed

    Acconcia, G; Labanca, I; Rech, I; Gulinatti, A; Ghioni, M

    2017-02-01

    The minimization of Single Photon Avalanche Diodes (SPADs) dead time is a key factor to speed up photon counting and timing measurements. We present a fully integrated Active Quenching Circuit (AQC) able to provide a count rate as high as 100 MHz with custom technology SPAD detectors. The AQC can also operate the new red enhanced SPAD and provide the timing information with a timing jitter Full Width at Half Maximum (FWHM) as low as 160 ps.

  15. Evaluation of two-stage system for neutron measurement aiming at increase in count rate at Japan Atomic Energy Agency-Fusion Neutronics Source

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shinohara, K., E-mail: shinohara.koji@jaea.go.jp; Ochiai, K.; Sukegawa, A.

    In order to increase the count rate capability of a neutron detection system as a whole, we propose a multi-stage neutron detection system. Experiments to test the effectiveness of this concept were carried out at the Fusion Neutronics Source. Comparing four alignment configurations, it was found that the influence of an anterior stage on a posterior stage was negligible for the pulse height distribution. The two-stage system using a 25 mm thick scintillator had about 1.65 times the count rate capability of a single-detector system for d-D neutrons and about 1.8 times the count rate capability for d-T neutrons. The results suggested that the concept of a multi-stage detection system will work in practice.

  16. Using Crater Counts to Constrain Erosion Rates on Mars: Implications for the Global Dust Cycle, Sedimentary Rock Erosion and Organic Matter Preservation

    NASA Astrophysics Data System (ADS)

    Mayer, D. P.; Kite, E. S.

    2016-12-01

    Sandblasting, aeolian infilling, and wind deflation all obliterate impact craters on Mars, complicating the use of crater counts for chronology, particularly on sedimentary rock surfaces. However, crater counts on sedimentary rocks can be exploited to constrain wind erosion rates. Relatively small, shallow craters are preferentially obliterated as a landscape undergoes erosion, so the size-frequency distribution of impact craters in a landscape undergoing steady exhumation will develop a shallower power-law slope than a simple production function. Estimating erosion rates is important for several reasons: (1) Wind erosion is a source of mass for the global dust cycle, so the global dust reservoir will disproportionately sample fast-eroding regions; (2) The pace and pattern of recent wind erosion is a sorely-needed constraint on models of the sculpting of Mars' sedimentary-rock mounds; (3) Near-surface complex organic matter on Mars is destroyed by radiation in <10^8 years, so high rates of surface exhumation are required for preservation of near-surface organic matter. We use crater counts from 18 HiRISE images over sedimentary rock deposits as the basis for estimating erosion rates. Each image was counted by ≥3 analysts and only features agreed on by ≥2 analysts were included in the erosion rate estimation. Erosion rates range from 0.1 to 0.2 μm/yr across all images. These rates represent an upper limit on surface erosion by landscape lowering. At the conference we will discuss the within- and between-image variability of erosion rates and their implications for recent geological processes on Mars.
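    The size-selective obliteration argument can be sketched with a toy Monte Carlo: craters form continuously with a power-law production function, and a crater is erased once erosion since its formation exceeds its depth. All parameter values below are illustrative, not taken from the abstract:

```python
import random

def surviving_craters(a=3.0, erosion_rate=1.0, depth_frac=0.2,
                      t_total=100.0, n=200000, dmin=1.0, dmax=100.0,
                      seed=3):
    """Monte Carlo sketch of size-selective crater obliteration.
    Craters form at uniform random times with dN/dD ∝ D**-a; a crater of
    diameter D (depth = depth_frac * D) is erased once erosion since its
    formation exceeds its depth. All parameter values are illustrative."""
    rng = random.Random(seed)
    survivors = []
    for _ in range(n):
        t = rng.uniform(0.0, t_total)                 # formation time
        u = rng.random()                              # power-law sample
        D = (dmin**(1 - a) + u * (dmax**(1 - a) - dmin**(1 - a)))**(1 / (1 - a))
        if depth_frac * D > erosion_rate * (t_total - t):
            survivors.append(D)
    return survivors

survivors = surviving_craters()
frac_small = sum(1 for d in survivors if d < 2.0) / len(survivors)
# under steady erosion the survival time scales with D, so small craters
# are depleted and the observed size distribution is shallower than the
# production function (here frac_small ~0.5 vs ~0.75 at production)
```

    Because a crater's survival time is proportional to its diameter, the steady-state differential slope is shallower than the production slope by one power, which is the signature exploited to back out erosion rates.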

  17. Progress on the Use of Combined Analog and Photon Counting Detection for Raman Lidar

    NASA Technical Reports Server (NTRS)

    Newsom, Rob; Turner, Dave; Clayton, Marian; Ferrare, Richard

    2008-01-01

    The Atmospheric Radiation Measurement (ARM) program Raman Lidar (CARL) was upgraded in 2004 with a new data system that provides simultaneous measurements of both the photomultiplier analog output voltage and photon counts. The so-called merge value added procedure (VAP) was developed to combine the analog and count-rate signals into a single signal with improved dynamic range. Earlier versions of this VAP tended to cause unacceptably large biases in the water vapor mixing ratio during the daytime as a result of improper matching between the analog and count-rate signals in the presence of elevated solar background levels. We recently identified several problems and tested a modified version of the merge VAP by comparing profiles of water vapor mixing ratio derived from CARL with simultaneous sonde data over a six month period. We show that the modified merge VAP significantly reduces the daytime bias, and results in mean differences that are within approximately 1% for both nighttime and daytime measurements.
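    A toy sketch of the analog/count-rate "gluing" idea (this is not the actual merge VAP; the dead-time model, thresholds, and gain below are invented for illustration): dead-time-correct the photon counts, fit a linear analog-to-rate map over an overlap region where both channels are trusted, then keep counts where they are valid and use scaled analog elsewhere:

```python
def merge_analog_photon(analog, counts, tau=4e-9, lo=1e6, hi=1e7):
    """Glue an analog channel onto a photon-counting channel.
    counts: measured count rates (cps); tau: assumed nonparalyzable
    dead time; [lo, hi]: overlap region (cps) used to fit the map."""
    corrected = [c / (1.0 - c * tau) for c in counts]   # dead-time fix
    pairs = [(a, c) for a, c in zip(analog, corrected) if lo <= c <= hi]
    # least-squares fit: corrected ≈ gain * analog + offset over overlap
    n = len(pairs)
    sa = sum(a for a, _ in pairs)
    sc = sum(c for _, c in pairs)
    saa = sum(a * a for a, _ in pairs)
    sac = sum(a * c for a, c in pairs)
    gain = (n * sac - sa * sc) / (n * saa - sa * sa)
    offset = (sc - gain * sa) / n
    # trust photon counting below hi, scaled analog above it
    return [c if c <= hi else gain * a + offset
            for a, c in zip(analog, corrected)]

tau = 4e-9
true_rates = [2e5, 5e5, 1e6, 2e6, 5e6, 8e6, 2e7, 5e7]
analog = [r / 2.0 for r in true_rates]                # arbitrary gain of 2
counts = [r / (1.0 + r * tau) for r in true_rates]    # what we measure
merged = merge_analog_photon(analog, counts, tau=tau)
```

    Improper matching over the overlap region (e.g., fitting where solar background has already saturated the counting channel) is exactly the kind of error the abstract describes as the source of the daytime bias.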

  18. The projected background for the CUORE experiment

    DOE PAGES

    Alduino, C.; Alfonso, K.; Artusa, D. R.; ...

    2017-08-14

    The Cryogenic Underground Observatory for Rare Events (CUORE) is designed to search for neutrinoless double beta decay of 130Te with an array of 988 TeO2 bolometers operating at temperatures around 10 mK. The experiment is currently being commissioned in Hall A of Laboratori Nazionali del Gran Sasso, Italy. The goal of CUORE is to reach a 90% C.L. exclusion sensitivity on the 130Te decay half-life of 9 × 10^25 years after 5 years of data taking. The main issue to be addressed to accomplish this aim is the rate of background events in the region of interest, which must not be higher than 10^-2 counts/keV/kg/year. We developed a detailed Monte Carlo simulation, based on results from a campaign of material screening, radioassays, and bolometric measurements, to evaluate the expected background. This was used over the years to guide the construction strategies of the experiment and we use it here to project a background model for CUORE. In this paper we report the results of our study and our expectations for the background rate in the energy region where the peak signature of neutrinoless double beta decay of 130Te is expected.

  19. The projected background for the CUORE experiment

    NASA Astrophysics Data System (ADS)

    Alduino, C.; Alfonso, K.; Artusa, D. R.; Avignone, F. T.; Azzolini, O.; Banks, T. I.; Bari, G.; Beeman, J. W.; Bellini, F.; Benato, G.; Bersani, A.; Biassoni, M.; Branca, A.; Brofferio, C.; Bucci, C.; Camacho, A.; Caminata, A.; Canonica, L.; Cao, X. G.; Capelli, S.; Cappelli, L.; Carbone, L.; Cardani, L.; Carniti, P.; Casali, N.; Cassina, L.; Chiesa, D.; Chott, N.; Clemenza, M.; Copello, S.; Cosmelli, C.; Cremonesi, O.; Creswick, R. J.; Cushman, J. S.; D'Addabbo, A.; Dafinei, I.; Davis, C. J.; Dell'Oro, S.; Deninno, M. M.; Di Domizio, S.; Di Vacri, M. L.; Drobizhev, A.; Fang, D. Q.; Faverzani, M.; Fernandes, G.; Ferri, E.; Ferroni, F.; Fiorini, E.; Franceschi, M. A.; Freedman, S. J.; Fujikawa, B. K.; Giachero, A.; Gironi, L.; Giuliani, A.; Gladstone, L.; Gorla, P.; Gotti, C.; Gutierrez, T. D.; Haller, E. E.; Han, K.; Hansen, E.; Heeger, K. M.; Hennings-Yeomans, R.; Hickerson, K. P.; Huang, H. Z.; Kadel, R.; Keppel, G.; Kolomensky, Yu. G.; Leder, A.; Ligi, C.; Lim, K. E.; Ma, Y. G.; Maino, M.; Marini, L.; Martinez, M.; Maruyama, R. H.; Mei, Y.; Moggi, N.; Morganti, S.; Mosteiro, P. J.; Napolitano, T.; Nastasi, M.; Nones, C.; Norman, E. B.; Novati, V.; Nucciotti, A.; O'Donnell, T.; Ouellet, J. L.; Pagliarone, C. E.; Pallavicini, M.; Palmieri, V.; Pattavina, L.; Pavan, M.; Pessina, G.; Pettinacci, V.; Piperno, G.; Pira, C.; Pirro, S.; Pozzi, S.; Previtali, E.; Rosenfeld, C.; Rusconi, C.; Sakai, M.; Sangiorgio, S.; Santone, D.; Schmidt, B.; Schmidt, J.; Scielzo, N. D.; Singh, V.; Sisti, M.; Smith, A. R.; Taffarello, L.; Tenconi, M.; Terranova, F.; Tomei, C.; Trentalange, S.; Vignati, M.; Wagaarachchi, S. L.; Wang, B. S.; Wang, H. W.; Welliver, B.; Wilson, J.; Winslow, L. A.; Wise, T.; Woodcraft, A.; Zanotti, L.; Zhang, G. Q.; Zhu, B. X.; Zimmermann, S.; Zucchelli, S.; Laubenstein, M.

    2017-08-01

    The Cryogenic Underground Observatory for Rare Events (CUORE) is designed to search for neutrinoless double beta decay of ^{130}Te with an array of 988 TeO_2 bolometers operating at temperatures around 10 mK. The experiment is currently being commissioned in Hall A of Laboratori Nazionali del Gran Sasso, Italy. The goal of CUORE is to reach a 90% C.L. exclusion sensitivity on the ^{130}Te decay half-life of 9 × 10^{25} years after 5 years of data taking. The main issue to be addressed to accomplish this aim is the rate of background events in the region of interest, which must not be higher than 10^{-2} counts/keV/kg/year. We developed a detailed Monte Carlo simulation, based on results from a campaign of material screening, radioassays, and bolometric measurements, to evaluate the expected background. This was used over the years to guide the construction strategies of the experiment and we use it here to project a background model for CUORE. In this paper we report the results of our study and our expectations for the background rate in the energy region where the peak signature of neutrinoless double beta decay of ^{130}Te is expected.

  20. Tower counts

    USGS Publications Warehouse

    Woody, Carol Ann; Johnson, D.H.; Shrier, Brianna M.; O'Neal, Jennifer S.; Knutzen, John A.; Augerot, Xanthippe; O'Neal, Thomas A.; Pearsons, Todd N.

    2007-01-01

    Counting towers provide an accurate, low-cost, low-maintenance, low-technology, and easily mobilized escapement estimation program compared to other methods (e.g., weirs, hydroacoustics, mark-recapture, and aerial surveys) (Thompson 1962; Siebel 1967; Cousens et al. 1982; Symons and Waldichuk 1984; Anderson 2000; Alaska Department of Fish and Game 2003). Counting tower data have been found to be consistent with digital video counts (Edwards 2005). Counting towers do not interfere with natural fish migration patterns, nor are fish handled or stressed; however, their use is generally limited to clear rivers that meet specific site selection criteria. The data provided by counting tower sampling allow fishery managers to determine reproductive population size, estimate total return (escapement + catch) and its uncertainty, evaluate population productivity and trends, set harvest rates, determine spawning escapement goals, and forecast future returns (Alaska Department of Fish and Game 1974-2000 and 1975-2004). The number of spawning fish is determined by subtracting subsistence, sport-caught fish, and prespawn mortality from the total estimated escapement. The methods outlined in this protocol for tower counts can be used to provide reasonable estimates (±6%-10%) of reproductive salmon population size and run timing in clear rivers.

  1. Historical data decrease complete blood count reflex blood smear review rates without missing patients with acute leukaemia.

    PubMed

    Rabizadeh, Esther; Pickholtz, Itay; Barak, Mira; Froom, Paul

    2013-08-01

    The availability of historical data decreases blood smear review rates in outpatients, but we are unaware of studies done at referral centres. In the following study, we determined the effect of historical data on peripheral blood smear rates over a 3-month period, and the detection rate of patients with acute leukaemia. All results of complete blood counts (CBCs) tested on three ADVIA 120 analyzers at the regional Rabin Medical Centre, Beilinson Campus over a 3-month period were accessed on a computerised laboratory information system. Over the 3-month period, we determined the proportion of total CBCs and patients meeting criteria for a manual differential count, and the actual number of peripheral blood smears done. Finally, we determined the proportion of 100 consecutive patients with acute leukaemia detected using our criteria, which included limiting reflex testing according to historical data. Over the 3-month period, there were 34,827 tests done in 12,785 patients. Without historical data, our smear rate would have been 24.5%, but with the availability of historical data, the blood smear review rate was 5.6%. The detection rate for cases of acute leukaemia was 100%. We conclude that the availability of previous test results significantly reduces the need for blood smear review without missing any patients with acute leukaemia.

  2. Analysis techniques for background rejection at the Majorana Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray

    2015-06-11

    The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.

  3. Dark-count-less photon-counting x-ray computed tomography system using a YAP-MPPC detector

    NASA Astrophysics Data System (ADS)

    Sato, Eiichi; Sato, Yuich; Abudurexiti, Abulajiang; Hagiwara, Osahiko; Matsukiyo, Hiroshi; Osawa, Akihiro; Enomoto, Toshiyuki; Watanabe, Manabu; Kusachi, Shinya; Sato, Shigehiro; Ogawa, Akira; Onagawa, Jun

    2012-10-01

    A highly sensitive X-ray computed tomography (CT) system is useful for decreasing the absorbed dose to patients, and a dark-count-less photon-counting CT system was developed. X-ray photons are detected using a YAP(Ce) [cerium-doped yttrium aluminum perovskite] single-crystal scintillator and an MPPC (multipixel photon counter). Photocurrents are amplified by a high-speed current-voltage amplifier, and smoothed event pulses from an integrator are sent to a high-speed comparator. Logical pulses produced by the comparator are then counted by a counter card. Tomography is accomplished by repeated linear scans and rotations of the object, and projection curves of the object are obtained from the linear scans. The image contrast of gadolinium medium fell slightly with increasing lower-level voltage (Vl) of the comparator. The dark count rate was 0 cps, and the count rate for the CT was approximately 250 kcps.

  4. The coincidence counting technique for orders of magnitude background reduction in data obtained with the magnetic recoil spectrometer at OMEGA and the NIF.

    PubMed

    Casey, D T; Frenje, J A; Séguin, F H; Li, C K; Rosenberg, M J; Rinderknecht, H; Manuel, M J-E; Gatu Johnson, M; Schaeffer, J C; Frankel, R; Sinenian, N; Childs, R A; Petrasso, R D; Glebov, V Yu; Sangster, T C; Burke, M; Roberts, S

    2011-07-01

    A magnetic recoil spectrometer (MRS) has been built and successfully used at OMEGA for measurements of down-scattered neutrons (DS-n), from which areal densities in both warm-capsule and cryogenic-DT implosions have been inferred. Another MRS is currently being commissioned on the National Ignition Facility (NIF) for diagnosing low-yield tritium-hydrogen-deuterium implosions and high-yield DT implosions. As CR-39 detectors are used in the MRS, the principal sources of background are neutron-induced tracks and intrinsic tracks (defects in the CR-39). The coincidence counting technique was developed to reduce these types of background tracks to the required level for the DS-n measurements at OMEGA and the NIF. Using this technique, it has been demonstrated that the number of background tracks is reduced by a couple of orders of magnitude, which exceeds the requirement for the DS-n measurements at both facilities.
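    The coincidence idea can be sketched generically: a genuine track registers at matching positions on two scans of the detector, while neutron-induced and intrinsic background tracks appear in only one. The interface below is hypothetical, purely to illustrate the rejection logic:

```python
def coincident_tracks(front, back, tol=5.0):
    """Keep only front-surface tracks that have a matching back-surface
    track within tol (position units); unmatched tracks are treated as
    background (neutron-induced or intrinsic) and rejected.
    Hypothetical interface for illustration only."""
    kept = []
    for fx, fy in front:
        if any((fx - bx) ** 2 + (fy - by) ** 2 <= tol ** 2
               for bx, by in back):
            kept.append((fx, fy))
    return kept

# two signal tracks present on both surfaces, plus uncorrelated background
front = [(10.0, 10.0), (50.0, 40.0), (200.0, 120.0)]   # last is background
back = [(11.5, 9.0), (48.8, 41.1), (300.0, 5.0)]       # last is background
signal = coincident_tracks(front, back)
```

    Because the two background populations are spatially uncorrelated, the chance of an accidental match scales with the tolerance area, which is why the technique can suppress background by orders of magnitude.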

  5. AUTOMATIC COUNTING APPARATUS

    DOEpatents

    Howell, W.D.

    1957-08-20

    An apparatus for automatically recording the results of counting operations on trains of electrical pulses is described. The disadvantages of prior devices utilizing the two common methods of obtaining the count rate are overcome by this apparatus: in the case of time-controlled operation, the disclosed system automatically records any information stored by the scaler but not transferred to the printer at the end of the predetermined time-controlled operation and, in the case of count-controlled operation, provision is made to prevent a weak sample from occupying the apparatus for an excessively long period of time.

  6. Investigation of background radiation levels and geologic unit profiles in Durango, Colorado

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Triplett, G.H.; Foutz, W.L.; Lesperance, L.R.

    1989-11-01

    As part of the Uranium Mill Tailings Remedial Action (UMTRA) Project, Oak Ridge National Laboratory (ORNL) has performed radiological surveys on 435 vicinity properties (VPs) in the Durango area. This study was undertaken to establish the background radiation levels and geologic unit profiles in the Durango VP area. During the months of May through June, 1986, extensive radiometric measurements and surface soil samples were collected in the Durango VP area by personnel from ORNL's Grand Junction Office. A majority of the Durango VP surveys were conducted at sites underlain by Quaternary alluvium, older Quaternary gravels, and Cretaceous Lewis and Mancos shales. These four geologic units were selected to be evaluated. The data indicated no formation anomalies and established regional background radiation levels. Durango background radionuclide concentrations in surface soil were determined to be 20.3 {plus minus} 3.4 pCi/g for {sup 40}K, 1.6 {plus minus} 0.5 pCi/g for {sup 226}Ra, and 1.2 {plus minus} 0.3 pCi/g for {sup 232}Th. The Durango background gamma exposure rate was found to be 16.5 {plus minus} 1.3 {mu}R/h. Average gamma spectral count rate measurements for {sup 40}K, {sup 226}Ra and {sup 232}Th were determined to be 553, 150, and 98 counts per minute (cpm), respectively. Geologic unit profiles and Durango background radiation measurements are presented and compared with other areas. 19 refs., 15 figs., 5 tabs.

  7. Film Vetoes for Alpha Background Rejection in Bolometer Detectors

    NASA Astrophysics Data System (ADS)

    Deporzio, Nicholas; Bucci, Carlo; Canonica, Lucia; Divacri, Marialaura; Cuore Collaboration; Absurd Team

    2015-04-01

    This study characterizes the effectiveness of encasing bolometer detectors in scintillator, metal ionization, and more exotic films to veto alpha radiation background. Bolometers are highly susceptible to alpha background, and a successful veto should boost the statistical strength, speed, and signal-background ratio of bolometer particle searches. Plastic scintillator films are cooled to bolometer temperatures and bombarded with 1.4 MeV to 6.0 MeV alpha particles representative of detector conditions. Photomultipliers detect the keV-range scintillation light and produce a veto signal. Also, layered films of a primary metal, dielectric, and secondary metal, such as gold-polyethylene-gold films, are cooled to millikelvin temperatures and biased with 0.1 V to 100 V to produce a current signal when incident 1.4 MeV to 6.0 MeV alpha particles ionize conduction paths through the film. Veto signals are characterized by their effect on bolometer detection of 865 keV target signals. Similar methods are applied to more exotic films. Early results show scintillator films raise the target signal count rate and suppress counts above the target energy by at least a factor of 10. This indicates scintillation vetoes are effective and that metal ionization and other films under study will also be effective.

  8. Characterizing energy dependence and count rate performance of a dual scintillator fiber-optic detector for computed tomography.

    PubMed

    Hoerner, Matthew R; Stepusin, Elliott J; Hyer, Daniel E; Hintenlang, David E

    2015-03-01

    Kilovoltage (kV) x-rays pose a significant challenge for radiation dosimetry. In the kV energy range, even small differences in material composition can result in significant variations in the absorbed energy between soft tissue and the detector. In addition, the use of electronic systems in light detection has demonstrated measurement losses at high photon fluence rates incident on the detector. This study investigated the feasibility of using a novel dual scintillator detector and whether its response to changes in beam energy from scatter and hardening is readily quantified. The detector incorporates a tissue-equivalent plastic scintillator and a gadolinium oxysulfide scintillator, which has a higher sensitivity to scatter x-rays. The detector was constructed by coupling two scintillators: (1) a small cylindrical plastic scintillator, 500 μm in diameter and 2 mm in length, and (2) a 100 micron sheet of gadolinium oxysulfide, 500 μm in diameter, each to a 2 m long optical fiber, which acts as a light guide to transmit scintillation photons from the sensitive element to a photomultiplier tube. Count rate linearity data were obtained from a wide range of exposure rates delivered from a radiological x-ray tube by adjusting the tube current. The data were fitted to a nonparalyzable dead time model to characterize the time response. The true counting rate was related to the reference free-air dose rate measured with a 0.6 cm(3) Radcal(®) thimble chamber as described in AAPM Report No. 111. Secondary electron and photon spectra were evaluated using Monte Carlo techniques to analyze ionization quenching and photon energy-absorption characteristics from free-in-air and in-phantom measurements. The depth/energy dependence of the detector was characterized using a computed tomography dose index QA phantom consisting of nested adult head and body segments. The phantom provided up to 32 cm of acrylic with a compatible 0.6 cm(3) calibrated ionization chamber to measure the
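    The nonparalyzable dead time model used to fit the count-rate linearity data has a simple closed form, m = n/(1 + nτ), which inverts to n = m/(1 − mτ). A minimal sketch (the τ value here is assumed for illustration, not taken from the study):

```python
def measured_rate(n, tau):
    """Nonparalyzable dead-time model: observed rate for true rate n."""
    return n / (1.0 + n * tau)

def true_rate(m, tau):
    """Invert the model to recover the true rate from the measured one."""
    return m / (1.0 - m * tau)

tau = 1e-6          # assumed 1 microsecond dead time (illustrative)
n_true = 200_000.0  # true interaction rate, cps
m = measured_rate(n_true, tau)   # the counter reports a lower rate
n_back = true_rate(m, tau)       # correction recovers the true rate
```

    Fitting τ to linearity data then lets the measured rate be corrected back to the true rate across the detector's operating range, provided mτ stays well below 1.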

  9. Comparison between two time-resolved approaches for prostate cancer diagnosis: high rate imager vs. photon counting system

    NASA Astrophysics Data System (ADS)

    Boutet, J.; Debourdeau, M.; Laidevant, A.; Hervé, L.; Dinten, J.-M.

    2010-02-01

    Finding a way to combine ultrasound and fluorescence optical imaging on an endorectal probe may improve early detection of prostate cancer. A trans-rectal probe adapted to fluorescence diffuse optical tomography measurements was developed by our team. This probe is based on a pulsed NIR laser source, an optical fiber network and a time-resolved detection system. A reconstruction algorithm was used to help locate and quantify fluorescent prostate tumors. In this study, two different kinds of time-resolved detectors are compared: a High Rate Imaging system (HRI) and a photon counting system. The HRI is based on an intensified multichannel plate and a CCD camera. The temporal resolution is obtained through gating of the HRI. Despite a low temporal resolution (300 ps), this system allows simultaneous acquisition of the signal from a large number of detection fibers. In the photon counting setup, 4 photomultipliers are connected to a Time Correlated Single Photon Counting (TCSPC) board, providing a better temporal resolution (0.1 ps) at the expense of a limited number of detection fibers (4). Finally, we show that the limited number of detection fibers in the photon counting setup is sufficient for good localization and dramatically improves the overall acquisition time. The photon counting approach is then validated through the localization of fluorescent inclusions in a prostate-mimicking phantom.

  10. Determination of confidence limits for experiments with low numbers of counts. [Poisson-distributed photon counts from astrophysical sources

    NASA Technical Reports Server (NTRS)

    Kraft, Ralph P.; Burrows, David N.; Nousek, John A.

    1991-01-01

    Two different methods, classical and Bayesian, for determining confidence intervals involving Poisson-distributed data are compared. Particular consideration is given to cases where the number of counts observed is small and is comparable to the mean number of background counts. Reasons for preferring the Bayesian over the classical method are given. Tables of confidence limits calculated by the Bayesian method are provided for quick reference.
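    The Bayesian construction can be sketched numerically: with N observed counts and a known mean background b, a flat prior on the source intensity s ≥ 0 gives a posterior proportional to the Poisson likelihood, which can be normalized and integrated on a grid to read off an upper limit. This is a crude numerical sketch of the approach, not the paper's tabulated method; the counts and background below are invented for illustration:

```python
import math

def bayes_upper_limit(N, b, cl=0.90, smax=50.0, steps=20000):
    """Bayesian upper limit on source counts s, given N observed counts
    and known mean background b, with a flat prior on s >= 0.
    Posterior: p(s | N) ∝ exp(-(s + b)) * (s + b)**N / N!
    (this sketch requires b > 0 so the log is always defined)."""
    ds = smax / steps
    log_fact_N = math.lgamma(N + 1)
    def dens(s):
        return math.exp(-(s + b) + N * math.log(s + b) - log_fact_N)
    grid = [dens(i * ds) for i in range(steps + 1)]
    total = sum(grid) * ds
    acc = 0.0
    for i, g in enumerate(grid):
        acc += g * ds
        if acc / total >= cl:
            return i * ds      # cl of the posterior mass lies below here
    return smax

# 5 counts observed on an expected background of 3 (illustrative numbers)
limit = bayes_upper_limit(N=5, b=3.0)
```

    Because the prior forbids negative source intensity, the limit remains physically sensible even when N is comparable to, or below, the expected background, which is the regime the paper addresses.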

  11. Performance of a GM tube based environmental dose rate monitor operating in the Time-To-Count mode

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zickefoose, J.; Kulkarni, T.; Martinson, T.

    The events at the Fukushima Daiichi power plant in the aftermath of a natural disaster underline the importance of a large array of networked environmental monitors to cover areas around nuclear power plants. These monitors should meet a few basic criteria: a uniform response over a wide range of gamma energies, a uniform response over a wide range of incident angles, and a large dynamic range. Many of these criteria are met if the probe is qualified to the international standard IEC 60532 (Radiation protection instrumentation - Installed dose rate meters, warning assemblies and monitors - X and gamma radiation of energy between 50 keV and 7 MeV), which specifically deals with energy response, angle of incidence, dynamic range, response time, and a number of environmental characteristics. EcoGamma is a dual GM tube environmental gamma radiation monitor designed specifically to meet the requirements of IEC 60532 and operate in the most extreme conditions. EcoGamma utilizes two energy-compensated GM tubes operating with a Time-To-Count (TTC) collection algorithm. The TTC algorithm extends the lifetime and range of a GM tube significantly and allows the dual GM tube probe to achieve linearity over approximately 10 decades of gamma dose rate (from the Sv/hr range to 100 Sv/hr). In the TTC mode of operation, the GM tube is not maintained in a biased condition continuously. This is different from a traditional counting system, where the GM tube is held at a constant bias continuously and the total number of strikes that the tube registers is counted. The traditional approach allows for good sensitivity, but does not lend itself to a long tube lifetime and is susceptible to linearity issues at high count rates. TTC, on the other hand, only biases the tube for short periods of time and in effect measures the time between events, which is statistically representative of the total strike rate. Since the tube is not continually biased, the life of
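    The statistical basis of Time-To-Count can be sketched simply: for a Poisson process at rate r, the waiting time from biasing the tube to the first strike is exponential with mean 1/r, so averaging many short waiting-time measurements recovers the rate without keeping the tube continuously biased. The rate and cycle count below are illustrative, not EcoGamma parameters:

```python
import random

def ttc_rate_estimate(true_rate, n_cycles=20000, seed=7):
    """Time-To-Count sketch: repeatedly 'bias the tube', record the
    waiting time until the first count, and estimate the strike rate
    as 1 / mean(wait). For a Poisson process the waiting time is
    exponential with mean 1/rate, so the mean interval is statistically
    representative of the total strike rate."""
    rng = random.Random(seed)
    waits = [rng.expovariate(true_rate) for _ in range(n_cycles)]
    return n_cycles / sum(waits)

est = ttc_rate_estimate(true_rate=5000.0)
# with 20000 cycles the estimate lands within ~1% of the true 5000 cps
```

    Because each cycle biases the tube only until the first strike, the duty cycle (and hence the aging of the tube) drops at high rates instead of rising, which is what extends the usable dynamic range.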

  12. Reducing background contributions in fluorescence fluctuation time-traces for single-molecule measurements in solution.

    PubMed

    Földes-Papp, Zeno; Liao, Shih-Chu Jeff; You, Tiefeng; Barbieri, Beniamino

    2009-08-01

    We first report on the development of new microscope means that reduce background contributions in fluorescence fluctuation methods: i) an excitation shutter, ii) electronic switches, and iii) early and late time-gating. These elements allow for measuring molecules at low analyte concentrations. We found conditions of early and late time-gating with time-correlated single-photon counting that made the fluorescence signal as bright as possible compared with the fluctuations in the background count rate in a diffraction-limited optical set-up. We measured about a 140-fold increase in the amplitude of autocorrelated fluorescence fluctuations at the lowest analyte concentration of about 15 pM, which gave a signal-to-background advantage of more than two orders of magnitude. The results of this original article pave the way for single-molecule detection in solution and in live cells without immobilization or hydrodynamic/electrokinetic focusing at longer observation times than are currently available.

  13. A scaling relation between merger rate of galaxies and their close pair count

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, C. Y.; Jing, Y. P.; Han, Jiaxin, E-mail: ypjing@sjtu.edu.cn

    We study how to measure the galaxy merger rate from the observed close pair count. Using a high-resolution N-body/SPH cosmological simulation, we find an accurate scaling relation between galaxy pair counts and merger rates down to a stellar mass ratio of about 1:30. The relation explicitly accounts for the dependence on redshift (or time), on pair separation, and on mass of the two galaxies in a pair. With this relation, one can easily obtain the mean merger timescale for a close pair of galaxies. The use of virial masses, instead of the stellar mass, is motivated by the fact that the dynamical friction timescale is mainly determined by the dark matter surrounding central and satellite galaxies. This fact can also minimize the error induced by uncertainties in modeling star formation in the simulation. Since the virial mass can be determined from the well-established relation between the virial masses and the stellar masses in observations, our scaling relation can easily be applied to observations to obtain the merger rate and merger timescale. For major merger pairs (1:1-1:4) of galaxies above a stellar mass of 4 × 10{sup 10} h{sup -1} M{sub ☉} at z = 0.1, it takes about 0.31 Gyr to merge for pairs within a projected distance of 20 h{sup -1} kpc with a stellar mass ratio of 1:1, while the time goes up to 1.6 Gyr for mergers with a stellar mass ratio of 1:4. Our results indicate that a single timescale usually used in the literature is not accurate to describe mergers with a stellar mass ratio spanning even a narrow range from 1:1 to 1:4.

  14. Phase space representation of neutron monitor count rate and atmospheric electric field in relation to solar activity in cycles 21 and 22.

    PubMed

    Silva, H G; Lopes, I

    Heliospheric modulation of galactic cosmic rays links solar cycle activity with the neutron monitor count rate on Earth. A less direct relation holds between neutron monitor count rate and atmospheric electric field because different atmospheric processes, including fluctuations in the ionosphere, are involved. Although a full quantitative model is still lacking, this link is supported by solid statistical evidence. Thus, a connection between solar cycle activity and the atmospheric electric field is expected. To gain a deeper insight into these relations, sunspot area (NOAA, USA), neutron monitor count rate (Climax, Colorado, USA), and atmospheric electric field (Lisbon, Portugal) are presented here in a phase space representation. The period considered covers two solar cycles (21, 22) and extends from 1978 to 1990. Two solar maxima were observed in this dataset, one in 1979 and another in 1989, as well as one solar minimum in 1986. Two main observations of the present study were: (1) similar short-term topological features of the phase space representations of the three variables, (2) a long-term phase space radius synchronization between the solar cycle activity, neutron monitor count rate, and potential gradient (confirmed by absolute correlation values above ~0.8). Finally, the methodology proposed here can be used for obtaining the relations between other atmospheric parameters (e.g., solar radiation) and solar cycle activity.

  15. Regression Analysis of Mixed Recurrent-Event and Panel-Count Data with Additive Rate Models

    PubMed Central

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L.

    2015-01-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007; Zhao et al., 2011). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013). In this paper, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. PMID:25345405

  16. State of the art of D&D Instrumentation Technology: Alpha counting in the presence of high background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickerman, C.E.

    1995-08-01

    Discrimination of alpha activity in the presence of a high radiation background has been identified as an area of concern to be studied for D&D applications. Upon evaluating the range of alpha detection needs for D&D operations, we have expanded this study to address the operational concern of greatly expediting alpha counting of rough surfaces and rubble. Note that the term "rough surfaces" includes a wide range of practical cases, including contaminated equipment and work surfaces. We have developed provisional applications requirements for instrumentation of this type, and we have also generated the scope of a program of instrument evaluation and testing, with emphasis on practical implementation. In order to obtain the full operational benefit of alpha discrimination in the presence of strong beta-gamma radiation background, the detection system must be capable of some form of remote or semi-remote operation in order to reduce operator exposure. We have identified a highly promising technique, the long-range alpha detector (LRAD), for alpha discrimination in the presence of high radiation background. This technique operates on the principle of transporting alpha-ionized air to an ionization detector. A transport time within a few seconds is adequate. Neither the provisional requirements nor the evaluation and testing scope were expressly tailored to force the selection of a LRAD technology, and they could be used as a basis for studies of other promising technologies. However, a technology that remotely detects alpha-ionized air (e.g., LRAD) is a natural fit to the key requirements of rejection of high background at the survey location and operator protection. Also, LRAD appears to be valuable for D&D applications as a means of greatly expediting surface alpha-activity surveys that otherwise would require performing time-consuming scans over surfaces of interest with alpha detector probes, and even more labor-intensive surface wipe surveys.

  17. Kids Count in Indiana: 1996 Data Book.

    ERIC Educational Resources Information Center

    Erickson, Judith B.

    This Kids Count report is the third in a series examining statewide trends in the well-being of Indiana's children. The report combines statistics of special concern in Indiana with 10 national Kids Count well-being indicators: (1) percent low birthweight; (2) infant mortality rate; (3) child death rate; (4) birth rate to unmarried teens ages 15…

  18. Photon-Counting Multikilohertz Microlaser Altimeters for Airborne and Spaceborne Topographic Measurements

    NASA Technical Reports Server (NTRS)

    Degnan, John J.; Smith, David E. (Technical Monitor)

    2000-01-01

    We consider the optimum design of photon-counting microlaser altimeters operating from airborne and spaceborne platforms under both day and night conditions. Extremely compact Q-switched microlaser transmitters produce trains of low energy pulses at multi-kHz rates and can easily generate subnanosecond pulse widths for precise ranging. To guide the design, we have modeled the solar noise background and developed simple algorithms, based on Post-Detection Poisson Filtering (PDPF), to optimally extract the weak altimeter signal from a high noise background during daytime operations. Practical technology issues, such as detector and/or receiver dead times, have also been considered in the analysis. We describe an airborne prototype, being developed under NASA's Instrument Incubator Program, which is designed to operate at a 10 kHz rate from aircraft cruise altitudes up to 12 km with laser pulse energies on the order of a few microjoules. We also analyze a compact and power efficient system designed to operate from Mars orbit at an altitude of 300 km and sample the Martian surface at rates up to 4.3 kHz using a 1 watt laser transmitter and an 18 cm telescope. This yields a Power-Aperture Product of 0.24 W-square meter, corresponding to a value almost 4 times smaller than the Mars Orbiting Laser Altimeter (0.88 W-square meter), yet the sampling rate is roughly 400 times greater (4 kHz vs 10 Hz). Relative to conventional high power laser altimeters, advantages of photon-counting laser altimeters include: (1) a more efficient use of available laser photons, providing up to two orders of magnitude greater surface sampling rates for a given laser power-telescope aperture product; (2) a simultaneous two order of magnitude reduction in the volume, cost, and weight of the telescope system; (3) the unique ability to spatially resolve the source of the surface return in photon-counting mode through the use of pixellated or imaging detectors; and (4) improved vertical and
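
    The Poisson statistics behind such a photon-counting design can be sketched in a few lines. The pulse rate matches the 10 kHz figure above, but the mean signal and noise photoelectron numbers are assumed illustrative values, not the instrument's actual link budget.

```python
import math

pulse_rate = 10_000   # laser fire rate [Hz], as in the airborne prototype
n_signal = 0.5        # assumed mean signal photoelectrons per pulse
n_noise = 0.05        # assumed mean noise photoelectrons per range gate

# For Poisson-distributed photoelectrons, the chance of registering at
# least one count is 1 - P(0) = 1 - exp(-mean).
p_det = 1 - math.exp(-n_signal)
surface_returns = pulse_rate * p_det                  # detected returns per second
false_alarms = pulse_rate * (1 - math.exp(-n_noise))  # noise-triggered gates per second
```

    Even with well under one signal photoelectron per pulse, a multi-kHz fire rate yields thousands of surface returns per second, which is the efficiency argument the abstract makes for photon counting.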

  19. Characterizing energy dependence and count rate performance of a dual scintillator fiber-optic detector for computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoerner, Matthew R., E-mail: mrh5038@ufl.edu; Stepusin, Elliott J.; Hyer, Daniel E.

    Purpose: Kilovoltage (kV) x-rays pose a significant challenge for radiation dosimetry. In the kV energy range, even small differences in material composition can result in significant variations in the absorbed energy between soft tissue and the detector. In addition, the use of electronic systems in light detection has demonstrated measurement losses at high photon fluence rates incident to the detector. This study investigated the feasibility of using a novel dual scintillator detector and whether its response to changes in beam energy from scatter and hardening is readily quantified. The detector incorporates a tissue-equivalent plastic scintillator and a gadolinium oxysulfide scintillator, which has a higher sensitivity to scattered x-rays. Methods: The detector was constructed by coupling two scintillators: (1) a small cylindrical plastic scintillator, 500 μm in diameter and 2 mm in length, and (2) a 100 μm sheet of gadolinium oxysulfide, 500 μm in diameter, each to a 2 m long optical fiber, which acts as a light guide to transmit scintillation photons from the sensitive element to a photomultiplier tube. Count rate linearity data were obtained over a wide range of exposure rates delivered from a radiological x-ray tube by adjusting the tube current. The data were fitted to a nonparalyzable dead time model to characterize the time response. The true counting rate was related to the reference free-air dose rate measured with a 0.6 cm³ Radcal® thimble chamber as described in AAPM Report No. 111. Secondary electron and photon spectra were evaluated using Monte Carlo techniques to analyze ionization quenching and photon energy-absorption characteristics from free-in-air and in-phantom measurements. The depth/energy dependence of the detector was characterized using a computed tomography dose index QA phantom consisting of nested adult head and body segments. The phantom provided up to 32 cm of acrylic with a compatible 0.6 cm³
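
    The nonparalyzable dead time model mentioned above has a simple closed form relating the true and observed count rates. The sketch below assumes an illustrative 2 μs dead time, not the value fitted in the study.

```python
import numpy as np

TAU = 2e-6  # assumed dead time per recorded event [s]; illustrative only

def measured_rate(n, tau=TAU):
    """Nonparalyzable model: observed rate m = n / (1 + n * tau)."""
    return n / (1.0 + n * tau)

def dead_time_corrected(m, tau=TAU):
    """Exact inverse of the model: true rate n = m / (1 - m * tau)."""
    return m / (1.0 - m * tau)

true = np.array([1e4, 1e5, 5e5])      # true interaction rates [counts/s]
observed = measured_rate(true)        # what the counting electronics report
recovered = dead_time_corrected(observed)
```

    The correction is exact for this model, which is why fitting the dead time from count rate linearity data suffices to characterize the time response.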

  20. Low-background germanium radioassay for the MAJORANA Collaboration

    NASA Astrophysics Data System (ADS)

    Trimble, James E., Jr.

    The focus of the MAJORANA Collaboration is the search for nuclear neutrinoless double beta decay. If discovered, this process would prove that the neutrino is its own anti-particle, or a Majorana particle. Being constructed at the Sanford Underground Research Facility, the MAJORANA Demonstrator aims to show that a background rate of 3 counts per region of interest (ROI) per tonne per year in the 4 keV ROI surrounding the 2039-keV Q-value energy of 76Ge is achievable and to demonstrate the technological feasibility of building a tonne-scale Ge-based experiment. Because of the rare nature of this process, detectors in the system must be isolated from ionizing radiation backgrounds as much as possible. This involved building the system with materials containing very low levels of naturally-occurring and anthropogenic radioactive isotopes at a deep underground site. In order to measure the levels of radioactive contamination in some components, the MAJORANA Demonstrator uses a low background counting facility managed by the Experimental Nuclear and Astroparticle Physics (ENAP) group at UNC. The UNC low background counting (LBC) facility is located at the Kimballton Underground Research Facility (KURF) in Ripplemead, VA. The facility was used for a neutron activation analysis of samples of polytetrafluoroethylene (PTFE) and fluorinated ethylene propylene (FEP) tubing intended for use in the Demonstrator. Calculated initial activity limits (90% C.L.) of 238U and 232Th in the 0.002-in PTFE samples were 7.6 ppt and 5.1 ppt, respectively. The same limits in the FEP tubing sample were 150 ppt and 45 ppt, respectively. The UNC LBC was also used to gamma-assay a modified stainless steel flange to be used as a vacuum feedthrough. Trace activities of both 238U and 232Th were found in the sample, but all were orders of magnitude below the acceptable threshold for the MAJORANA experiment.
Also discussed is a proposed next generation ultra-low background system designed

  1. Modeling and simulation of count data.

    PubMed

    Plan, E L

    2014-08-13

    Count data, or numbers of events per time interval, are discrete data arising from repeated time-to-event observations. Their mean count, or piecewise constant event rate, can be evaluated by discrete probability distributions from the Poisson model family. Clinical trial data characterization often involves population count analysis. This tutorial presents the basics and diagnostics of count modeling and simulation in the context of pharmacometrics. Consideration is given to overdispersion, underdispersion, autocorrelation, and inhomogeneity.
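
    The overdispersion mentioned above can be illustrated with a gamma-Poisson (negative binomial) mixture, the standard way to exceed the Poisson variance-equals-mean constraint. The rate and dispersion parameters below are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(42)
n_subjects, rate = 100_000, 3.0   # subjects and mean event count per interval

# Pure Poisson: variance equals the mean, so the dispersion index is ~1.
poisson_counts = rng.poisson(rate, size=n_subjects)

# Gamma-Poisson mixture: a gamma-distributed subject-specific rate gives
# negative binomial counts with variance rate + rate**2 / shape.
shape = 2.0  # smaller shape -> stronger overdispersion
subject_rates = rng.gamma(shape, rate / shape, size=n_subjects)
nb_counts = rng.poisson(subject_rates)

disp_poisson = poisson_counts.var() / poisson_counts.mean()  # ~1.0
disp_nb = nb_counts.var() / nb_counts.mean()                 # ~1 + rate/shape = 2.5
```

    A dispersion index well above 1 in observed counts is the usual diagnostic signal that a plain Poisson model is inadequate.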

  2. Pulse pileup statistics for energy discriminating photon counting x-ray detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Adam S.; Harrison, Daniel; Lobastov, Vladimir

    Purpose: Energy discriminating photon counting x-ray detectors can be subject to a wide range of flux rates if applied in clinical settings. Even when the incident rate is a small fraction of the detector's maximum periodic rate N0, pulse pileup leads to count rate losses and spectral distortion. Although the deterministic effects can be corrected, the detrimental effect of pileup on image noise is not well understood and may limit the performance of photon counting systems. Therefore, the authors devise a method to determine the detector count statistics and imaging performance. Methods: The detector count statistics are derived analytically for an idealized pileup model with delta pulses of a nonparalyzable detector. These statistics are then used to compute the performance (e.g., contrast-to-noise ratio) for both single material and material decomposition contrast detection tasks via the Cramér-Rao lower bound (CRLB) as a function of the detector input count rate. With more realistic unipolar and bipolar pulse pileup models of a nonparalyzable detector, the imaging task performance is determined by Monte Carlo simulations and also approximated by a multinomial method based solely on the mean detected output spectrum. Photon counting performance at different count rates is compared with ideal energy integration, which is unaffected by count rate. Results: The authors found that an ideal photon counting detector with perfect energy resolution outperforms energy integration for our contrast detection tasks, but when the input count rate exceeds 20% of N0, many of these benefits disappear. The benefit with iodine contrast falls rapidly with increased count rate, while water contrast is not as sensitive to count rates. The performance with a delta pulse model is overoptimistic when compared to the more realistic bipolar pulse model. The multinomial approximation predicts imaging performance very close to the prediction from Monte Carlo simulations.
The
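
    The count-rate loss from an idealized nonparalyzable detector can be checked with a short Monte Carlo. The dead time and input rate below are assumed values for illustration, and only count losses are modeled, not the spectral distortion.

```python
import numpy as np

rng = np.random.default_rng(1)
tau = 1e-6       # dead time per recorded event [s] (so N0 = 1/tau = 1e6 cps)
rate_in = 2e5    # incident photon rate [counts/s], i.e. 20% of N0
t_total = 1.0    # simulated exposure [s]

# Poisson arrivals via exponential inter-arrival times.
gaps = rng.exponential(1.0 / rate_in, size=int(rate_in * t_total * 1.2))
arrivals = np.cumsum(gaps)
arrivals = arrivals[arrivals < t_total]

# Nonparalyzable dead time: photons within tau of the last *recorded*
# event are lost and do not extend the dead period.
recorded, last = 0, -np.inf
for t in arrivals:
    if t - last >= tau:
        recorded += 1
        last = t

mc_rate = recorded / t_total
analytic = rate_in / (1.0 + rate_in * tau)   # expected recorded rate
```

    At 20% of N0 the recorded rate already falls about 17% short of the incident rate, which is the regime where the abstract reports the photon counting benefits eroding.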

  3. Single photon counting linear mode avalanche photodiode technologies

    NASA Astrophysics Data System (ADS)

    Williams, George M.; Huntington, Andrew S.

    2011-10-01

    The false count rate of a single-photon-sensitive photoreceiver consisting of a high-gain, low-excess-noise linear-mode InGaAs avalanche photodiode (APD) and a high-bandwidth transimpedance amplifier (TIA) is fit to a statistical model. The peak height distribution of the APD's multiplied dark current is approximated by the weighted sum of McIntyre distributions, each characterizing dark current generated at a different location within the APD's junction. The peak height distribution approximated in this way is convolved with a Gaussian distribution representing the input-referred noise of the TIA to generate the statistical distribution of the uncorrelated sum. The cumulative distribution function (CDF) representing count probability as a function of detection threshold is computed, and the CDF model fit to empirical false count data. It is found that only k=0 McIntyre distributions fit the empirically measured CDF at high detection threshold, and that false count rate drops faster than photon count rate as detection threshold is raised. Once fit to empirical false count data, the model predicts the improvement of the false count rate to be expected from reductions in TIA noise and APD dark current. Improvement by at least three orders of magnitude is thought feasible with further manufacturing development and a capacitive-feedback TIA (CTIA).

  4. Multiparameter linear least-squares fitting to Poisson data one count at a time

    NASA Technical Reports Server (NTRS)

    Wheaton, Wm. A.; Dunklee, Alfred L.; Jacobsen, Allan S.; Ling, James C.; Mahoney, William A.; Radocinski, Robert G.

    1995-01-01

    A standard problem in gamma-ray astronomy data analysis is the decomposition of a set of observed counts, described by Poisson statistics, according to a given multicomponent linear model, with underlying physical count rates or fluxes which are to be estimated from the data. Despite its conceptual simplicity, the linear least-squares (LLSQ) method for solving this problem has generally been limited to situations in which the number n_i of counts in each bin i is not too small, conventionally more than 5-30. It seems to be widely believed that the failure of the LLSQ method for small counts is due to the failure of the Poisson distribution to be even approximately normal for small numbers. The cause is more accurately the strong anticorrelation between the data and the weights w_i in the weighted LLSQ method when sqrt(n_i) instead of sqrt(n̄_i) is used to approximate the uncertainties sigma_i in the data, where n̄_i = E(n_i), the expected value of n_i. We show in an appendix that, avoiding this approximation, the correct equations for the Poisson LLSQ (PLLSQ) problem are actually identical to those for the maximum likelihood estimate using the exact Poisson distribution. We apply the method to solve a problem in high-resolution gamma-ray spectroscopy for the JPL High-Resolution Gamma-Ray Spectrometer flown on HEAO 3. Systematic error in subtracting the strong, highly variable background encountered in the low-energy gamma-ray region can be significantly reduced by closely pairing source and background data in short segments. Significant results can be built up by weighted averaging of the net fluxes obtained from the subtraction of many individual source/background pairs. Extension of the approach to complex situations, with multiple cosmic sources and realistic background parameterizations, requires a means of efficiently fitting to data from single scans in the narrow (approximately 1.2 ke
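
    The data-weight anticorrelation described above is easy to demonstrate for the simplest linear model, a single constant rate. The bin count and true rate below are arbitrary; the zero-count fudge (clipping n_i to 1 in the weights) is one common workaround, shown here only to expose the bias.

```python
import numpy as np

rng = np.random.default_rng(7)
lam_true = 3.0                          # low-count regime
n = rng.poisson(lam_true, size=50_000)  # observed counts per bin

# Weighted LLSQ with sigma_i ~ sqrt(n_i) (zero bins clipped to 1):
# minimizing sum_i w_i * (n_i - lam)**2 with w_i = 1/n_i couples the
# weights to the data and biases the estimate low.
w = 1.0 / np.maximum(n, 1)
lam_llsq = (w * n).sum() / w.sum()

# Poisson maximum likelihood (equivalently PLLSQ with correct weights):
# the plain arithmetic mean, which is unbiased.
lam_ml = n.mean()
```

    With a true rate of 3, the clipped-weight LLSQ estimate lands near 2, while the ML/PLLSQ estimate recovers the rate; the failure is the data-weight anticorrelation, not non-normality.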

  5. Regression analysis of mixed recurrent-event and panel-count data with additive rate models.

    PubMed

    Zhu, Liang; Zhao, Hui; Sun, Jianguo; Leisenring, Wendy; Robison, Leslie L

    2015-03-01

    Event-history studies of recurrent events are often conducted in fields such as demography, epidemiology, medicine, and social sciences (Cook and Lawless, 2007, The Statistical Analysis of Recurrent Events. New York: Springer-Verlag; Zhao et al., 2011, Test 20, 1-42). For such analysis, two types of data have been extensively investigated: recurrent-event data and panel-count data. However, in practice, one may face a third type of data, mixed recurrent-event and panel-count data or mixed event-history data. Such data occur if some study subjects are monitored or observed continuously and thus provide recurrent-event data, while the others are observed only at discrete times and hence give only panel-count data. A more general situation is that each subject is observed continuously over certain time periods but only at discrete times over other time periods. There exists little literature on the analysis of such mixed data except that published by Zhu et al. (2013, Statistics in Medicine 32, 1954-1963). In this article, we consider the regression analysis of mixed data using the additive rate model and develop some estimating equation-based approaches to estimate the regression parameters of interest. Both finite sample and asymptotic properties of the resulting estimators are established, and the numerical studies suggest that the proposed methodology works well for practical situations. The approach is applied to a Childhood Cancer Survivor Study that motivated this study. © 2014, The International Biometric Society.

  6. High quantum efficiency and low dark count rate in multi-layer superconducting nanowire single-photon detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jafari Salim, A., E-mail: ajafaris@uwaterloo.ca; Eftekharian, A.; University of Waterloo, Waterloo, Ontario N2L 3G1

    In this paper, we theoretically show that a multi-layer superconducting nanowire single-photon detector (SNSPD) is capable of approaching the characteristics of an ideal SNSPD in terms of quantum efficiency, dark count rate, and bandwidth. A multi-layer structure improves the performance in two ways. First, the potential barrier for thermally activated vortex crossing, which is the major source of dark counts and of the reduction of the critical current in SNSPDs, is elevated. In a multi-layer SNSPD, a vortex is made of 2D pancake vortices that form a stack. It will be shown that the stack of pancake vortices effectively experiences a larger potential barrier compared to a vortex in a single-layer SNSPD. This leads to an increase in the experimental critical current as well as a significant decrease in the dark count rate. In consequence, an increase in the quantum efficiency for photons of the same energy, or an increase in the sensitivity to photons of lower energy, is achieved. Second, a multi-layer structure improves the efficiency of single-photon absorption by increasing the effective optical thickness without compromising the single-photon sensitivity.

  7. Evaluation of counting methods for oceanic radium-228

    NASA Astrophysics Data System (ADS)

    Orr, James C.

    1988-07-01

    Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting methods, conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of the 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (~30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (~12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation. After allowing for ingrowth and then counting for the 224Ra great-granddaughter, 228Ra could be back-calculated, thereby yielding a method with high efficiency, where no sample processing
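
    The efficiency and background figures quoted above can be folded into a single sensitivity comparison: for background-limited counting, the detectable activity scales roughly as sqrt(background)/efficiency. The 6% alpha spectrometry efficiency below is the midpoint of the quoted 3-9% range.

```python
import math

# Efficiencies and backgrounds as quoted in the text (cpm = counts per minute).
methods = {
    "alpha spectrometry":     {"eff": 0.06,  "bkg_cpm": 0.0015},
    "high-resolution gamma":  {"eff": 0.048, "bkg_cpm": 0.16},
    "beta-gamma coincidence": {"eff": 0.053, "bkg_cpm": 0.0054},
}

# Relative sensitivity: a larger eff / sqrt(background) means a lower
# detection limit for the same counting time.
fom = {name: m["eff"] / math.sqrt(m["bkg_cpm"]) for name, m in methods.items()}
best = max(fom, key=fom.get)
```

    The ranking matches the abstract: alpha spectrometry is the most sensitive, and β-γ coincidence beats high-resolution γ spectrometry despite similar efficiency because of its roughly 30-fold lower background.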

  8. Hawkes process model with a time-dependent background rate and its application to high-frequency financial data.

    PubMed

    Omi, Takahiro; Hirata, Yoshito; Aihara, Kazuyuki

    2017-07-01

    A Hawkes process model with a time-varying background rate is developed for analyzing the high-frequency financial data. In our model, the logarithm of the background rate is modeled by a linear model with a relatively large number of variable-width basis functions, and the parameters are estimated by a Bayesian method. Our model can capture not only the slow time variation, such as in the intraday seasonality, but also the rapid one, which follows a macroeconomic news announcement. By analyzing the tick data of the Nikkei 225 mini, we find that (i) our model is better fitted to the data than the Hawkes models with a constant background rate or a slowly varying background rate, which have been commonly used in the field of quantitative finance; (ii) the improvement in the goodness-of-fit to the data by our model is significant especially for sessions where considerable fluctuation of the background rate is present; and (iii) our model is statistically consistent with the data. The branching ratio, which quantifies the level of the endogeneity of markets, estimated by our model is 0.41, suggesting the relative importance of exogenous factors in the market dynamics. We also demonstrate that it is critically important to appropriately model the time-dependent background rate for the branching ratio estimation.
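
    A Hawkes process with a time-varying background can be simulated by Ogata thinning, since between events the intensity only decays. The background function and kernel parameters below are invented for illustration; the branching ratio is set to 0.4, close to the estimate quoted above.

```python
import numpy as np

def simulate_hawkes(mu, mu_max, alpha, beta, t_end, seed=0):
    """Ogata thinning for lambda(t) = mu(t) + sum_i alpha*beta*exp(-beta*(t - t_i)).
    alpha is the branching ratio (mean offspring per event); mu_max bounds mu(t)."""
    rng = np.random.default_rng(seed)
    events, t, s = [], 0.0, 0.0      # s is the current self-excitation term
    while True:
        lam_bar = mu_max + s         # upper-bounds lambda until the next event
        w = rng.exponential(1.0 / lam_bar)
        s *= np.exp(-beta * w)       # excitation decays over the waiting time
        t += w
        if t > t_end:
            return np.array(events)
        if rng.uniform() * lam_bar <= mu(t) + s:   # accept with prob lambda/lam_bar
            events.append(t)
            s += alpha * beta        # each event adds one exponential kernel

# Slowly varying background, a stand-in for intraday seasonality.
mu = lambda t: 0.5 + 0.4 * np.sin(2 * np.pi * t / 500.0)
ev = simulate_hawkes(mu, mu_max=0.9, alpha=0.4, beta=2.0, t_end=2000.0)
expected = 0.5 * 2000.0 / (1 - 0.4)  # mean background * T / (1 - branching ratio)
```

    Fitting a constant-background Hawkes model to such data can inflate the estimated branching ratio, because background swings masquerade as self-excitation; that is the bias the paper's time-dependent background is designed to remove.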

  9. Hawkes process model with a time-dependent background rate and its application to high-frequency financial data

    NASA Astrophysics Data System (ADS)

    Omi, Takahiro; Hirata, Yoshito; Aihara, Kazuyuki

    2017-07-01

    A Hawkes process model with a time-varying background rate is developed for analyzing the high-frequency financial data. In our model, the logarithm of the background rate is modeled by a linear model with a relatively large number of variable-width basis functions, and the parameters are estimated by a Bayesian method. Our model can capture not only the slow time variation, such as in the intraday seasonality, but also the rapid one, which follows a macroeconomic news announcement. By analyzing the tick data of the Nikkei 225 mini, we find that (i) our model is better fitted to the data than the Hawkes models with a constant background rate or a slowly varying background rate, which have been commonly used in the field of quantitative finance; (ii) the improvement in the goodness-of-fit to the data by our model is significant especially for sessions where considerable fluctuation of the background rate is present; and (iii) our model is statistically consistent with the data. The branching ratio, which quantifies the level of the endogeneity of markets, estimated by our model is 0.41, suggesting the relative importance of exogenous factors in the market dynamics. We also demonstrate that it is critically important to appropriately model the time-dependent background rate for the branching ratio estimation.

  10. Characterization of 176Lu background in LSO-based PET scanners

    NASA Astrophysics Data System (ADS)

    Conti, Maurizio; Eriksson, Lars; Rothfuss, Harold; Sjoeholm, Therese; Townsend, David; Rosenqvist, Göran; Carlier, Thomas

    2017-05-01

    LSO and LYSO are today the most common scintillators used in positron emission tomography. Lutetium contains traces of 176Lu, a radioactive isotope that undergoes β− decay with a cascade of γ photons in coincidence. Therefore, lutetium-based scintillators are characterized by a small natural radiation background. In this paper, we investigate and characterize the 176Lu radiation background via experiments performed on LSO-based PET scanners. LSO background was measured at different energy windows and different time coincidence windows, and by using shields to alter the original spectrum. The effect of radiation background in particularly count-starved applications, such as 90Y imaging, is analysed and discussed. Depending on the size of the PET scanner, between 500 and 1000 total random counts per second and between 3 and 5 total true coincidences per second were measured in standard coincidence mode. The LSO background counts in a Siemens mCT in the standard PET energy and time windows are in general negligible in terms of trues, and are comparable to those measured in a BGO scanner of similar size.

  11. High Broadband Spectral Resolving Transition-Edge Sensors for High Count-Rate Astrophysical Applications

    NASA Technical Reports Server (NTRS)

    Smith, Stephen

    2011-01-01

    We are developing arrays of transition-edge sensor (TES) X-ray detectors optimized for high count-rate solar astronomy applications, where characterizing the high velocity motions of X-ray jets in solar flares is of particular interest. These devices are fabricated on thick Si substrates and consist of 35 × 35 μm² TESs with 4.5 μm thick, 60 μm pitch, electroplated absorbers. We have tested devices fabricated with different geometric stem contact areas with the TES and surrounding substrate area, which allows us to investigate the loss of athermal phonons to the substrate. Results show a correlation between the stem contact area and a non-Gaussian broadening in the spectral line shape consistent with athermal phonon loss. When the contact area is minimized, we have obtained remarkable broadband spectral resolving capabilities of 1.3 ± 0.1 eV at an energy of 1.5 keV, 1.6 ± 0.1 eV at 5.9 keV, and 2.0 ± 0.1 eV at 8 keV. This, coupled with a capability of accommodating hundreds of counts per second per pixel, makes these devices an exciting prospect for future X-ray astronomy applications.

  12. The particle background observed by the X-ray detectors onboard Copernicus

    NASA Technical Reports Server (NTRS)

    Davison, P. J. N.

    1974-01-01

    The design and characteristics of low energy detectors on the Copernicus satellite are described, along with the functions of the sensors in obtaining data on the particle background. The procedure for processing the data obtained by the satellite is examined. The most significant positive deviations are caused by known weak X-ray sources in the field of view. In addition to small systematic effects, occasional random effects, where the count rate increases suddenly and decreases within a few frames, are analyzed.

  13. Noise models for low counting rate coherent diffraction imaging.

    PubMed

    Godard, Pierre; Allain, Marc; Chamard, Virginie; Rodenburg, John

    2012-11-05

    Coherent diffraction imaging (CDI) is a lens-less microscopy method that extracts the complex-valued exit field from intensity measurements alone. It is of particular importance for microscopy imaging with diffraction set-ups where high quality lenses are not available. The inversion scheme allowing the phase retrieval is based on the use of an iterative algorithm. In this work, we address the question of the choice of the iterative process in the case of data corrupted by photon or electron shot noise. Several noise models are presented and further used within two inversion strategies, the ordered subset and the scaled gradient. Based on analytical and numerical analysis together with Monte-Carlo studies, we show that any physical interpretation drawn from a CDI iterative technique requires a detailed understanding of the relationship between the noise model and the inversion method used. We observe that iterative algorithms often implicitly assume a noise model. For low counting rates, each noise model behaves differently. Moreover, the optimization strategy used introduces its own artefacts. Based on this analysis, we develop a hybrid strategy which works efficiently in the absence of an informed initial guess. Our work emphasises issues which should be considered carefully when inverting experimental data.
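
    The claim that an iterative algorithm implicitly assumes a noise model can be made concrete: at low counts, a least-squares (Gaussian) objective and a Poisson likelihood can even rank two candidate models differently for the same pixel. The numbers below are arbitrary.

```python
import numpy as np

def nll_gaussian(model, data):
    """Objective implied by a Gaussian noise assumption (plain least squares)."""
    return 0.5 * (model - data) ** 2

def nll_poisson(model, data):
    """Poisson negative log-likelihood, up to a data-only constant."""
    return model - data * np.log(np.maximum(model, 1e-12))

data = 1.0                   # a single detected count in one pixel
a, b = 0.4, 1.8              # two candidate model intensities for that pixel

gauss_prefers_a = nll_gaussian(a, data) < nll_gaussian(b, data)  # symmetric penalty
pois_prefers_a = nll_poisson(a, data) < nll_poisson(b, data)     # asymmetric at low counts
```

    Least squares favors the numerically closer model a, while the Poisson likelihood favors b because underestimating a one-count pixel is penalized more heavily; at high counts the two objectives agree, which is why the choice only becomes critical in the low counting rate regime studied here.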

  14. A Burst-Mode Photon-Counting Receiver with Automatic Channel Estimation and Bit Rate Detection

    DTIC Science & Technology

    2016-02-24

    communication at data rates up to 10.416 Mb/s over a 30-foot water channel. To the best of our knowledge, this is the first demonstration of burst-mode...obstructions. The receiver is capable of on-the-fly data rate detection and adapts to changing levels of signal and background light. The receiver...receiver. We demonstrate on-the-fly rate detection, channel BER within 0.2 dB of theory across all data rates, and error-free performance within 1.82 dB

  15. Fingerprint Ridge Count: A Polygenic Trait Useful in Classroom Instruction.

    ERIC Educational Resources Information Center

    Mendenhall, Gordon; And Others

    1989-01-01

    Describes the use of the polygenic trait of total fingerprint ridge count in the classroom as a laboratory investigation. Presents information on background of topic, fingerprint patterns which are classified into three major groups, ridge count, the inheritance model, and activities. Includes an example data sheet format for fingerprints. (RT)

  16. Background Characterization for Thermal Ion Release Experiments with 224Ra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kwong, H.; /Stanford U., Phys. Dept.; Rowson, P.

    The Enriched Xenon Observatory for neutrinoless double beta decay uses 136Ba identification as a means for verifying the decay's occurrence in 136Xe. A current challenge is the release of Ba ions from the Ba extraction probe, and one possible solution is to heat the probe to high temperatures to release the ions. The investigation of this method requires a characterization of the alpha decay background in our test apparatus, which uses a 228Th source that produces 224Ra daughters, the ionization energies of which are similar to those of Ba. For this purpose, we ran a background count with our apparatus maintained at a vacuum, and then three counts with the apparatus filled with Xe gas. We were able to match up our alpha spectrum in vacuum with the known decay scheme of 228Th, while the spectrum in xenon gas had too many unresolved ambiguities for an accurate characterization. We also found that the alpha decays occurred at a near-zero rate both in vacuum and in xenon gas, which indicates that the rate was determined by 228Th decays. With these background measurements, we can in the future make a more accurate measurement of the temperature dependency of the ratio of ions to neutral atoms released from the hot surface of the probe, which may lead to a successful method of Ba ion release.

  17. Knowledge of resting heart rate mediates the relationship between intelligence and the heartbeat counting task.

    PubMed

    Murphy, Jennifer; Millgate, Edward; Geary, Hayley; Ichijo, Eri; Coll, Michel-Pierre; Brewer, Rebecca; Catmur, Caroline; Bird, Geoffrey

    2018-03-01

    Evidence suggests that intelligence is positively associated with performance on the heartbeat counting task (HCT). The HCT is often employed as a measure of interoception - the ability to perceive the internal state of one's body - however, its use remains controversial, as performance on the HCT is strongly influenced by knowledge of resting heart rate. This raises the possibility that heart rate knowledge may mediate the previously-observed association between intelligence and HCT performance. Study One demonstrates an association between intelligence and HCT performance (N = 94), and Study Two demonstrates that this relationship is mediated by knowledge of the average resting heart rate (N = 134). These data underscore the need to account for the influence of prior knowledge and beliefs when examining individual differences in cardiac interoceptive accuracy using the HCT. Copyright © 2018 The Author(s). Published by Elsevier B.V. All rights reserved.

  18. Development of a stained cell nuclei counting system

    NASA Astrophysics Data System (ADS)

    Timilsina, Niranjan; Moffatt, Christopher; Okada, Kazunori

    2011-03-01

    This paper presents a novel cell counting system which exploits the Fast Radial Symmetry Transformation (FRST) algorithm [1]. The driving force behind our system is research on neurogenesis in the intact nervous system of Manduca sexta, the Tobacco Hornworm, studied to assess the impact of age, food and environment on neurogenesis. The varying thickness of the intact nervous system in this species often yields images with inhomogeneous background and inconsistencies such as varying illumination, variable contrast, and irregular cell size. For automated counting, such inhomogeneity and inconsistencies must be addressed, which no existing work has done successfully. Thus, our goal is to devise a new cell counting algorithm for images with non-uniform background. Our solution adapts FRST: a computer vision algorithm designed to detect points of interest in circular regions such as human eyes. This algorithm enhances the occurrences of the stained-cell nuclei in 2D digital images and negates the problems caused by their inhomogeneity. Besides FRST, our algorithm employs standard image processing methods, such as mathematical morphology and connected component analysis. We have evaluated the developed cell counting system on fourteen digital images of the Tobacco Hornworm's nervous system collected for this study, with ground-truth cell counts by biology experts. Experimental results show that our system has a minimum error of 1.41% and a mean error of 16.68%, which is at least 44% better than the algorithm without FRST.
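
    The counting stage that follows the symmetry-based enhancement can be illustrated with a plain threshold-plus-connected-components pass. This is a hedged sketch under simplifying assumptions, not the authors' pipeline; the FRST enhancement itself is omitted, and `count_cells` and its parameters are illustrative names:

```python
# Hedged sketch of the counting stage only: threshold the (already
# enhanced) image, then count 4-connected foreground components.
# All names here are illustrative, not from the paper.

def count_cells(image, threshold):
    """Count 4-connected blobs of pixels >= threshold in a 2D image."""
    rows, cols = len(image), len(image[0])
    binary = [[pix >= threshold for pix in row] for row in image]
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if binary[r][c] and not seen[r][c]:
                count += 1                      # new component found
                stack = [(r, c)]                # flood-fill it
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and binary[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
    return count
```

    For a real image the threshold would be derived from the enhanced image rather than passed in by hand; here it is explicit to keep the sketch self-contained.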

  19. Background observations on the SMM high energy monitor at energies greater than 10 MeV

    NASA Technical Reports Server (NTRS)

    Forrest, D. J.

    1989-01-01

    The background rate in any gamma-ray detector on a spacecraft in near-earth orbit is strongly influenced by the primary cosmic ray flux at the spacecraft's position. Although the direct counting of the primary cosmic rays can be rejected by anticoincidence shields, secondary production cannot be. Secondary production of gamma rays and neutrons in the instrument, the spacecraft, and the earth's atmosphere is recorded as background. A 30-day data base of 65.5-second records has been used to show that some of the background rates observed on the Gamma Ray Spectrometer can be ordered to a precision on the order of 1 percent. This ordering is done with only two parameters, namely the cosmic ray vertical cutoff rigidity and the instrument's pointing angle with respect to the earth's center. This result sets limits on any instrumental instability and also on any temporal or spatial changes in the background radiation field.

  20. The absolute counting of red cell-derived microparticles with red cell bead by flow rate based assay.

    PubMed

    Nantakomol, Duangdao; Imwong, Malika; Soontarawirat, Ingfar; Kotjanya, Duangporn; Khakhai, Chulalak; Ohashi, Jun; Nuchnoi, Pornlada

    2009-05-01

    Activation of red blood cells is associated with the formation of red cell-derived microparticles (RMPs). Analysis of circulating RMPs is becoming more refined and clinically useful. The quantitative Trucount tube method is the conventional method used for quantitating RMPs. In this study, we validated a quantitative method called "flow rate based assay using red cell bead (FCB)" to measure circulating RMPs in the peripheral blood of healthy subjects. Citrated blood samples collected from 30 healthy subjects were assayed for RMP counts using double labeling with annexin V-FITC and anti-glycophorin A-PE. The absolute RMP numbers were measured by FCB, and the results were compared with the Trucount or with flow rate based calibration (FR). Statistical correlation and agreement were analyzed using linear regression and Bland-Altman analysis. There was no significant difference in the absolute number of RMPs quantitated by FCB compared with the two reference methods, the Trucount tube and the FR method. The absolute RMP count obtained from the FCB method was highly correlated with those obtained from the Trucount tube (r(2) = 0.98, mean bias 4 cells/microl, limit of agreement [LOA] -20.3 to 28.3 cells/microl) and the FR method (r(2) = 1, mean bias 10.3 cells/microl, and LOA -5.5 to 26.2 cells/microl). This study demonstrates that FCB is suitable and more affordable for RMP quantitation in clinical samples. This method is low cost and interchangeable with the latex bead-based method for generating absolute counts in resource-limited areas. (c) 2008 Clinical Cytometry Society.
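
    The bead-referenced arithmetic underlying counting-bead methods of this kind reduces to a ratio of gated events scaled by the known bead count and sample volume. A generic sketch, with illustrative values and names not taken from this study:

```python
# Generic bead-based absolute-counting arithmetic (illustrative only):
#   count per uL = (gated cell events / bead events) * (beads added / volume in uL)

def absolute_count(cell_events, bead_events, beads_added, volume_ul):
    return (cell_events / bead_events) * (beads_added / volume_ul)
```

    For example, 500 microparticle events against 10,000 bead events, with 50,000 beads spiked into 50 microliters, gives an absolute count of 50 per microliter.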

  1. Analysis of Sample Size, Counting Time, and Plot Size from an Avian Point Count Survey on Hoosier National Forest, Indiana

    Treesearch

    Frank R. Thompson; Monica J. Schwalbach

    1995-01-01

    We report results of a point count survey of breeding birds on Hoosier National Forest in Indiana. We determined sample size requirements to detect differences in means and the effects of count duration and plot size on individual detection rates. Sample size requirements ranged from 100 to >1000 points with Type I and II error rates of <0.1 and 0.2. Sample...

  2. Automated vehicle counting using image processing and machine learning

    NASA Astrophysics Data System (ADS)

    Meany, Sean; Eskew, Edward; Martinez-Castro, Rosana; Jang, Shinae

    2017-04-01

    Vehicle counting is used by the government to improve roadways and the flow of traffic, and by private businesses for purposes such as determining the value of locating a new store in an area. A vehicle count can be performed manually or automatically. Manual counting requires an individual to be on-site and tally the traffic electronically or by hand; however, this can lead to miscounts due to factors such as human error. A common form of automatic counting involves pneumatic tubes, but pneumatic tubes disrupt traffic during installation and removal, and can be damaged by passing vehicles. Vehicle counting can also be performed with a camera at the count site recording video of the traffic, with counting performed manually post-recording or using automatic algorithms. This paper presents a low-cost procedure to perform automatic vehicle counting using remote video cameras with an automatic counting algorithm. The procedure utilizes a Raspberry Pi micro-computer to detect when a car is in a lane and generates an accurate count of vehicle movements. The method uses background subtraction to process the images and a machine learning algorithm to provide the count. This method avoids the fatigue issues encountered in manual video counting and prevents the disruption of roadways that occurs when installing pneumatic tubes.
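
    A minimal sketch of the background-subtraction idea, assuming a per-lane region of interest and a running-average background model; the authors' machine-learning classifier is not reproduced here, and all names and thresholds are illustrative:

```python
# Illustrative sketch (not the paper's implementation): maintain a
# running-average background for a lane region of interest, flag the
# lane as occupied when enough pixels differ from the background, and
# count a vehicle on each empty -> occupied transition.

def count_vehicles(frames, alpha=0.05, diff_thresh=30, occ_frac=0.3):
    """frames: list of 2D lists of pixel intensities for one lane ROI."""
    background = [row[:] for row in frames[0]]   # init from first frame
    count, occupied = 0, False
    npix = len(frames[0]) * len(frames[0][0])
    for frame in frames[1:]:
        fg = 0
        for r, row in enumerate(frame):
            for c, pix in enumerate(row):
                if abs(pix - background[r][c]) > diff_thresh:
                    fg += 1                      # foreground pixel
                else:
                    # adapt the background only where the scene is static
                    background[r][c] = ((1 - alpha) * background[r][c]
                                        + alpha * pix)
        now_occupied = fg > occ_frac * npix
        if now_occupied and not occupied:
            count += 1                           # rising edge: vehicle entered
        occupied = now_occupied
    return count
```

    Updating the background only at static pixels keeps slow illumination changes from being counted while preventing stopped vehicles from being absorbed into the background too quickly.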

  3. Icon flickering, flicker rate, and color combinations of an icon's symbol/background in visual search performance.

    PubMed

    Huang, Kuo-Chen; Chiang, Shu-Ying; Chen, Chen-Fu

    2008-02-01

    The effects of color combinations of an icon's symbol/background and components of flicker and flicker rate on visual search performance on a liquid crystal display screen were investigated with 39 subjects who searched for a target icon in a circular stimulus array (diameter = 20 cm) including one target and 19 distractors. Analysis showed that the icon's symbol/background color significantly affected search time. The search times for icons with black/red and white/blue were significantly shorter than for white/yellow, black/yellow, and black/blue. Flickering of different components of the icon significantly affected the search time. Search time for an icon's border flickering was shorter than for an icon's symbol flickering; search times for flicker rates of 3 and 5 Hz were shorter than for 1 Hz. For the icon's symbol/background color combinations, search error rate for black/blue was greater than for black/red and white/blue combinations, and the error rate for an icon's border flickering was lower than for an icon's symbol flickering. Interactions affected search time and error rate. Results are applicable to the design of graphic user interfaces.

  4. Photon Counting Detectors for the 1.0 - 2.0 Micron Wavelength Range

    NASA Technical Reports Server (NTRS)

    Krainak, Michael A.

    2004-01-01

    We describe results on the development of greater than 200 micron diameter, single-element photon-counting detectors for the 1-2 micron wavelength range. The technical goals include quantum efficiency in the range 10-70%; detector diameter greater than 200 microns; dark count rate below 100 kilocounts per second (cps); and maximum count rate above 10 Mcps.

  5. Point count length and detection of forest neotropical migrant birds

    USGS Publications Warehouse

    Dawson, D.K.; Smith, D.R.; Robbins, C.S.; Ralph, C. John; Sauer, John R.; Droege, Sam

    1995-01-01

    Comparisons of bird abundances among years or among habitats assume that the rates at which birds are detected and counted are constant within species. We use point count data collected in forests of the Mid-Atlantic states to estimate detection probabilities for Neotropical migrant bird species as a function of count length. For some species, significant differences existed among years or observers in both the probability of detecting the species and in the rate at which individuals are counted. We demonstrate the consequence that variability in species' detection probabilities can have on estimates of population change, and discuss ways for reducing this source of bias in point count studies.
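
    The dependence of detection on count length can be illustrated with the textbook assumption of a constant, independent per-minute detection probability; this sketch is not the authors' estimator, and the numbers in the note below are illustrative:

```python
# Hedged sketch (not the paper's model): if a species present at a point
# is detected in any given minute with constant, independent probability
# p, the chance of at least one detection in a t-minute count is
#   P(t) = 1 - (1 - p)^t
# Differences in p across observers or years therefore bias comparisons
# in a duration-dependent way, which is the concern the abstract raises.

def detect_prob(p_per_minute, minutes):
    return 1.0 - (1.0 - p_per_minute) ** minutes
```

    For example, species with per-minute probabilities 0.2 and 0.4 differ by a factor of two in a one-minute count, but by far less in a ten-minute count, because both curves saturate toward 1.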

  6. Building and Activating Students' Background Knowledge: It's What They Already Know That Counts

    ERIC Educational Resources Information Center

    Fisher, Douglas; Frey, Nancy; Lapp, Diane

    2012-01-01

    Students enter the middle grades with varying amounts of background knowledge. Teachers must assess student background knowledge for gaps or misconceptions and then provide instruction to build on that base. This article discusses effective strategies for assessing and developing students' background knowledge so they can become independent…

  7. Approach for counting vehicles in congested traffic flow

    NASA Astrophysics Data System (ADS)

    Tan, Xiaojun; Li, Jun; Liu, Wei

    2005-02-01

    More and more image sensors are used in intelligent transportation systems. In practice, occlusion is always a problem when counting vehicles in congested traffic. This paper presents an approach to this problem. The proposed approach consists of three main procedures. First, a new background subtraction algorithm is performed, with the aim of segmenting moving objects from an illumination-variant background. Second, object tracking is performed using the CONDENSATION algorithm, which avoids the problem of matching vehicles in successive frames. Third, an inspecting procedure is executed to count the vehicles. When a bus first occludes a car and then moves away a few frames later, the car reappears in the scene; the inspecting procedure should find the "new" car and add it as a tracking object.

  8. Improving the counting efficiency in time-correlated single photon counting experiments by dead-time optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peronio, P.; Acconcia, G.; Rech, I.

    Time-Correlated Single Photon Counting (TCSPC) has long been recognized as the most sensitive method for fluorescence lifetime measurements, but it often requires long data acquisition times. This drawback is related to the limited counting capability of the TCSPC technique, due to pile-up and counting loss effects. In recent years, multi-module TCSPC systems have been introduced to overcome this issue. Splitting the light into several detectors connected to independent TCSPC modules proportionally increases the counting capability. Of course, multi-module operation also increases the system cost and can cause space and power supply problems. In this paper, we propose an alternative approach based on a new detector and processing electronics designed to reduce the overall system dead time, thus enabling efficient photon collection at high excitation rate. We present a fast active quenching circuit for single-photon avalanche diodes which features a minimum dead time of 12.4 ns. We also introduce a new Time-to-Amplitude Converter (TAC) able to attain extra-short dead time thanks to the combination of a scalable array of monolithically integrated TACs and a sequential router. The fast TAC (F-TAC) makes it possible to operate the system towards the upper limit of detector count rate capability (∼80 Mcps) with reduced pile-up losses, addressing one of the historic criticisms of TCSPC. Preliminary measurements on the F-TAC are presented and discussed.

  9. High Count-Rate Study of Two TES X-Ray Microcalorimeters With Different Transition Temperatures

    NASA Technical Reports Server (NTRS)

    Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.

    2017-01-01

    We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures T(sub c). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate where the thermal conductance of the microcalorimeter is dependent upon the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different T(sub c) values had very different thermal decay times, approximately one order of magnitude different. In our earlier report, we achieved energy resolutions of 1.6 and 2.3 eV at 6 keV from the lower and higher T(sub c) devices, respectively, using a standard analysis method based on optimal filtering in the low flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution for a fixed throughput, we have developed an analysis scheme based on the so-called event grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays at 1025 Hz per pixel with the higher T(sub c) (faster) device, and 5.8 eV FWHM with 97% throughput with the lower T(sub c) (slower) device at 722 Hz.

  10. High count-rate study of two TES x-ray microcalorimeters with different transition temperatures

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Jun; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chervenak, James A.; Eckart, Megan E.; Finkbeiner, Fred M.; Kelley, Richard L.; Kilbourne, Caroline A.; Porter, Frederick S.; Sadleir, John E.; Smith, Stephen J.; Wassell, Edward J.

    2017-10-01

    We have developed transition-edge sensor (TES) microcalorimeter arrays with high count-rate capability and high energy resolution to carry out x-ray imaging spectroscopy observations of various astronomical sources and the Sun. We have studied the dependence of the energy resolution and throughput (fraction of processed pulses) on the count rate for such microcalorimeters with two different transition temperatures (T c). Devices with both transition temperatures were fabricated within a single microcalorimeter array directly on top of a solid substrate where the thermal conductance of the microcalorimeter is dependent upon the thermal boundary resistance between the TES sensor and the dielectric substrate beneath. Because the thermal boundary resistance is highly temperature dependent, the two types of device with different T c values had very different thermal decay times, approximately one order of magnitude different. In our earlier report, we achieved energy resolutions of 1.6 and 2.3 eV at 6 keV from lower and higher T c devices, respectively, using a standard analysis method based on optimal filtering in the low flux limit. We have now measured the same devices at elevated x-ray fluxes ranging from 50 Hz to 1000 Hz per pixel. In the high flux limit, however, the standard optimal filtering scheme nearly breaks down because of x-ray pile-up. To achieve the highest possible energy resolution for a fixed throughput, we have developed an analysis scheme based on the so-called event grade method. Using the new analysis scheme, we achieved 5.0 eV FWHM with 96% throughput for 6 keV x-rays at 1025 Hz per pixel with the higher T c (faster) device, and 5.8 eV FWHM with 97% throughput with the lower T c (slower) device at 722 Hz.

  11. Study of Rubber Composites with Positron Doppler Broadening Spectroscopy: Consideration of Counting Rate

    NASA Astrophysics Data System (ADS)

    Yang, Chun; Quarles, C. A.

    2007-10-01

    We have used positron Doppler Broadening Spectroscopy (DBS) to investigate the uniformity of rubber-carbon black composite samples. The amount of carbon black added to a rubber sample is characterized by phr, the number of grams of carbon black per hundred grams of rubber. Typical concentrations in rubber tires are 50 phr. It has been shown that the S parameter measured by DBS depends on the phr of the sample, so the variation in carbon black concentration can be easily measured to 0.5 phr. In doing the experiments we observed a dependence of the S parameter on small variations in the counting rate or dead time. By carefully calibrating this dead-time correction we can significantly reduce the experimental run time and thus make a faster determination of the uniformity of extended samples.
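
    The abstract's count-rate dependence is calibrated empirically, but the flavor of a dead-time correction can be illustrated with the classical non-paralyzable model; this is a generic sketch, not the authors' calibration:

```python
# Illustrative only -- the paper calibrates its correction empirically.
# The classical non-paralyzable dead-time model relates the measured
# count rate m to the true rate n through the dead time tau:
#   m = n / (1 + n * tau)   <=>   n = m / (1 - m * tau)

def observed_rate(true_rate, dead_time):
    """Rate a non-paralyzable system records for a given true rate."""
    return true_rate / (1.0 + true_rate * dead_time)

def corrected_rate(measured_rate, dead_time):
    """Invert the model to recover the true rate from the measured one."""
    return measured_rate / (1.0 - measured_rate * dead_time)
```

    At 50 kcps with a 2 microsecond dead time, roughly 9% of events are lost, which is the size of effect such a calibration must remove.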

  12. Power counting to better jet observables

    NASA Astrophysics Data System (ADS)

    Larkoski, Andrew J.; Moult, Ian; Neill, Duff

    2014-12-01

    Optimized jet substructure observables for identifying boosted topologies will play an essential role in maximizing the physics reach of the Large Hadron Collider. Ideally, the design of discriminating variables would be informed by analytic calculations in perturbative QCD. Unfortunately, explicit calculations are often not feasible due to the complexity of the observables used for discrimination, and so many validation studies rely heavily, and solely, on Monte Carlo. In this paper we show how methods based on the parametric power counting of the dynamics of QCD, familiar from effective theory analyses, can be used to design, understand, and make robust predictions for the behavior of jet substructure variables. As a concrete example, we apply power counting for discriminating boosted Z bosons from massive QCD jets using observables formed from the n-point energy correlation functions. We show that power counting alone gives a definite prediction for the observable that optimally separates the background-rich from the signal-rich regions of phase space. Power counting can also be used to understand effects of phase space cuts and the effect of contamination from pile-up, which we discuss. As these arguments rely only on the parametric scaling of QCD, the predictions from power counting must be reproduced by any Monte Carlo, which we verify using Pythia 8 and Herwig++. We also use the example of quark versus gluon discrimination to demonstrate the limits of the power counting technique.

  13. Montana Kids Count 1996 Data Book.

    ERIC Educational Resources Information Center

    Healthy Mothers, Healthy Babies--The Montana Coalition, Helena.

    This 1996 KIDS COUNT data book presents comparative data on child well-being for each county in Montana and for the state as a whole. Data in the county profiles, which comprise the bulk of the report, are grouped into: background facts (demographic, mental health, education, security, and income support information); charts showing changes in…

  14. Low-background Gamma Spectroscopy at Sanford Underground Laboratory

    NASA Astrophysics Data System (ADS)

    Chiller, Christopher; Alanson, Angela; Mei, Dongming

    2014-03-01

    Rare-event physics experiments require the use of material with unprecedented radio-purity. Low-background counting assay capabilities and detectors are critical for determining the sensitivity of the planned ultra-low background experiments. A low-background counting (LBC) facility has been built at the 4850-Level Davis Campus of the Sanford Underground Research Facility to perform screening of material and detector parts. Like many rare-event physics experiments, our LBC uses lead shielding to mitigate background radiation. Corrosion of lead brick shielding in subterranean installations creates radon plate-out potential as well as human risks of ingestible or respirable lead compounds. Our LBC facility employs an exposed lead shield requiring clean, smooth surfaces. A cleaning process of low-activity silica sand blasting and borated paraffin hot-coating preservation was employed to guard against corrosion due to chemical and biological exposures. The resulting lead shield maintains low background contribution integrity while fully encapsulating the lead surface. We report the performance of the current LBC and a plan to develop a large germanium well detector for PMT screening. Support provided by Sd governors research center-CUBED, NSF PHY-0758120 and Sanford Lab.

  15. Kids Count Data Sheet, 2000.

    ERIC Educational Resources Information Center

    Annie E. Casey Foundation, Baltimore, MD.

    Data from the 50 United States are listed for 1997 from Kids Count in an effort to track state-by-state the status of children in the United States and to secure better futures for all children. Data include percent low birth weight babies; infant mortality rate; child death rate; rate of teen deaths by accident, homicide, and suicide; teen birth…

  16. The Significance of an Excess in a Counting Experiment: Assessing the Impact of Systematic Uncertainties and the Case with a Gaussian Background

    NASA Astrophysics Data System (ADS)

    Vianello, Giacomo

    2018-05-01

    Several experiments in high-energy physics and astrophysics can be treated as on/off measurements, where an observation potentially containing a new source or effect (“on” measurement) is contrasted with a background-only observation free of the effect (“off” measurement). In counting experiments, the significance of the new source or effect can be estimated with a widely used formula from Li & Ma, which assumes that both measurements are Poisson random variables. In this paper we study three other cases: (i) the ideal case where the background measurement has no uncertainty, which can be used to study the maximum sensitivity that an instrument can achieve, (ii) the case where the background estimate b in the off measurement has an additional systematic uncertainty, and (iii) the case where b is a Gaussian random variable instead of a Poisson random variable. The latter case applies when b comes from a model fitted on archival or ancillary data, or from the interpolation of a function fitted on data surrounding the candidate new source/effect. Practitioners typically use a formula that is only valid when b is large and when its uncertainty is very small, while we derive a general formula that can be applied in all regimes. We also develop simple methods that can be used to assess how much an estimate of significance is sensitive to systematic uncertainties on the efficiency or on the background. Examples of applications include the detection of short gamma-ray bursts and of new X-ray or γ-ray sources. All the techniques presented in this paper are made available in a Python code that is ready to use.
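
    The Li & Ma significance that the paper takes as its starting point can be written down directly. A sketch of that standard formula only; the paper's extensions to systematic uncertainties and Gaussian backgrounds are not reproduced here:

```python
import math

# Sketch of the widely used Li & Ma significance for an on/off counting
# measurement: n_on and n_off are the observed counts and alpha is the
# ratio of on-source to off-source exposure. The statistic is the square
# root of twice the log-likelihood ratio, signed by whether the on
# measurement exceeds the scaled background expectation.

def li_ma_significance(n_on, n_off, alpha):
    term_on = n_on * math.log(((1 + alpha) / alpha) * (n_on / (n_on + n_off)))
    term_off = n_off * math.log((1 + alpha) * (n_off / (n_on + n_off)))
    s2 = 2.0 * (term_on + term_off)
    sign = 1.0 if n_on > alpha * n_off else -1.0
    return sign * math.sqrt(max(s2, 0.0))
```

    With equal exposures (alpha = 1) and equal counts the significance is zero, as it should be; an excess of 150 on-source counts over a 100-count background gives roughly a 3-sigma result.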

  17. Investigation of FPGA-Based Real-Time Adaptive Digital Pulse Shaping for High-Count-Rate Applications

    NASA Astrophysics Data System (ADS)

    Saxena, Shefali; Hawari, Ayman I.

    2017-07-01

    Digital signal processing techniques have been widely used in radiation spectrometry to provide improved stability and performance, with compact physical size, over traditional analog signal processing. In this paper, field-programmable gate array (FPGA)-based adaptive digital pulse shaping techniques are investigated for real-time signal processing. A National Instruments (NI) 5761 14-bit, 250-MS/s adaptor module is used for digitizing a high-purity germanium (HPGe) detector's preamplifier pulses. Digital pulse processing algorithms are implemented on the NI PXIe-7975R reconfigurable FPGA (Kintex-7) using the LabVIEW FPGA module. Based on the time separation between successive input pulses, the adaptive shaping algorithm selects the optimum shaping parameters (rise time and flattop time of the trapezoid-shaping filter) for each incoming signal. A digital Sallen-Key low-pass filter is implemented to enhance the signal-to-noise ratio and reduce baseline drifting in trapezoid shaping. A recursive trapezoid-shaping filter algorithm is employed for pole-zero compensation of the exponentially decaying (with two decay constants) preamplifier pulses of an HPGe detector. It allows extraction of pulse height information at the beginning of each pulse, thereby reducing pulse pileup and increasing throughput. The algorithms for the RC-CR2 timing filter, baseline restoration, pile-up rejection, and pulse height determination are digitally implemented for radiation spectroscopy. Under high-count-rate conditions, a shorter shaping time is traditionally preferred to achieve high throughput, at the cost of degraded energy resolution. In this paper, experimental results are presented for varying count-rate and pulse shaping conditions. Using adaptive shaping, increased throughput is achieved while preserving the energy resolution observed using the longer shaping times.
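
    The recursive trapezoid shaper such filters build on can be sketched in a few lines. This is the single-decay-constant formulation (after Jordanov-Knoll); the paper's two-decay-constant pole-zero compensation and adaptive parameter selection are more involved, and all names here are illustrative:

```python
import math

# Sketch of a single-decay recursive trapezoid shaper. k is the rise
# time, l >= k sets the flat top (l - k samples), and M = 1/(exp(T/tau) - 1)
# cancels an exponential decay constant tau, all in units of the sampling
# period T. For an ideal pulse v[n] = exp(-n/tau) starting at n = 0, the
# output rises for k samples, stays exactly flat for l - k samples, and
# returns to zero from sample k + l - 1 onward.

def trapezoid_shape(v, k, l, M):
    p, s = 0.0, []
    for i in range(len(v)):
        d = (v[i]
             - (v[i - k] if i >= k else 0.0)
             - (v[i - l] if i >= l else 0.0)
             + (v[i - k - l] if i >= k + l else 0.0))
        p += d                                   # accumulator (pole)
        s.append((s[-1] if s else 0.0) + p + M * d)
    return s
```

    The flat top is what makes the shape ballistic-deficit tolerant, and its length is the knob an adaptive scheme shortens when pulses arrive close together.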

  18. Effects of dilution rates, animal species and instruments on the spectrophotometric determination of sperm counts.

    PubMed

    Rondeau, M; Rouleau, M

    1981-06-01

    Using semen from bull, boar and stallion as well as different spectrophotometers, we established calibration curves relating the optical density of a sperm sample to the sperm count obtained on the hemacytometer. The results show that, for a given spectrophotometer, the calibration curve is not characteristic of the animal species we studied. The differences in size of the spermatozoa are probably too small to account for the anticipated specificity of the calibration curve. Furthermore, the fact that different dilution rates must be used, because of the vastly different concentrations of spermatozoa characteristic of those species, has no effect on the calibration curves, since the effect of the dilution rate is shown to be artefactual. On the other hand, for a given semen, the calibration curve varies depending upon the spectrophotometer used. However, if two instruments have the same characteristics in terms of spectral bandwidth, the calibration curves are not statistically different.

  19. Different binarization processes validated against manual counts of fluorescent bacterial cells.

    PubMed

    Tamminga, Gerrit G; Paulitsch-Fuchs, Astrid H; Jansen, Gijsbert J; Euverink, Gert-Jan W

    2016-09-01

    State-of-the-art software methods (such as fixed-value approaches or statistical approaches) to create a binary image of fluorescent bacterial cells are not as accurate and precise as they should be for counting bacteria and measuring their area. To overcome these bottlenecks, we introduce biological significance to obtain a binary image from a greyscale microscopic image. Using our biological significance approach we are able to automatically count about the same number of cells as an individual researcher would by manual/visual counting. Using the fixed-value or statistical approach to obtain a binary image leads to about 20% fewer cells in automatic counting. In our procedure we included the area measurements of the bacterial cells to determine the right parameters for background subtraction and threshold values. In an iterative process the threshold and background subtraction values were incremented until the number of particles smaller than a typical bacterial cell was less than the number of bacterial cells with a certain area. This research also shows that every image has a specific threshold with respect to the optical system, magnification and staining procedure as well as the exposure time. The biological significance approach shows that automatic counting can be performed with the same accuracy, precision and reproducibility as manual counting. The same approach can be used to count bacterial cells using different optical systems (Leica, Olympus and Navitar), magnification factors (200× and 400×), staining procedures (DNA (Propidium Iodide) and RNA (FISH)) and substrates (polycarbonate filter or glass). Copyright © 2016 Elsevier B.V. All rights reserved.

  20. Hydrophilic-treated plastic plates for wide-range analysis of Giemsa-stained red blood cells and automated Plasmodium infection rate counting.

    PubMed

    Hashimoto, Muneaki; Yatsushiro, Shouki; Yamamura, Shohei; Tanaka, Masato; Sakamoto, Hirokazu; Ido, Yusuke; Kajimoto, Kazuaki; Bando, Mika; Kido, Jun-Ichi; Kataoka, Masatoshi

    2017-08-08

    Malaria is a red blood cell (RBC) infection caused by Plasmodium parasites. To determine RBC infection rate, which is essential for malaria study and diagnosis, microscopic evaluation of Giemsa-stained thin blood smears on glass slides ('Giemsa microscopy') has been performed as the accepted gold standard for over 100 years. However, only a small area of the blood smear provides a monolayer of RBCs suitable for determination of infection rate, which is one of the major reasons for the low parasite detection rate by Giemsa microscopy. In addition, because Giemsa microscopy is exacting and time-consuming, automated counting of infection rates is highly desirable. A method that allows for microscopic examination of Giemsa-stained cells spread in a monolayer on almost the whole surface of hydrophilic-treated cyclic olefin copolymer (COC) plates was established. Because wide-range Giemsa microscopy can be performed on a hydrophilic-treated plate, the method may enable more reliable diagnosis of malaria in patients with low parasitaemia burden. Furthermore, the number of RBCs and parasites stained with a fluorescent nuclear staining dye could be counted automatically with a software tool, without Giemsa staining. As a result, researchers studying malaria may calculate the infection rate easily, rapidly, and accurately even in low parasitaemia. Because the running cost of these methods is very low and they do not involve complicated techniques, the use of hydrophilic COC plates may contribute to improved and more accurate diagnosis and research of malaria.

  1. A Bridge from Optical to Infrared Galaxies: Explaining Local Properties, Predicting Galaxy Counts and the Cosmic Background Radiation

    NASA Astrophysics Data System (ADS)

    Totani, T.; Takeuchi, T. T.

    2001-12-01

    A new model of infrared galaxy counts and the cosmic background radiation (CBR) is developed by extending a model for optical/near-infrared galaxies. Important new characteristics of this model are that the mass-scale dependence of dust extinction is introduced based on the size-luminosity relation of optical galaxies, and that the big-grain dust temperature T_dust is calculated from a physical consideration of energy balance, rather than using the empirical relation between T_dust and total infrared luminosity L_IR found in local galaxies, which has been employed in most previous works. Consequently, the local properties of infrared galaxies, i.e., optical/infrared luminosity ratios, the L_IR-T_dust correlation, and the infrared luminosity function, are outputs predicted by the model, while they have been inputs in a number of previous models. Our model indeed reproduces these local properties reasonably well. We then make predictions for faint infrared counts (at 15, 60, 90, 170, 450, and 850 μm) and the CBR with this model. We find results considerably different from most previous works based on the empirical L_IR-T_dust relation; in particular, the dust temperature of starbursting primordial elliptical galaxies is expected to be very high (40-80 K). This indicates that intense starbursts of forming elliptical galaxies should have occurred at z ~ 2-3, in contrast to previous results that significant starbursts beyond z ~ 1 tend to overproduce the far-infrared (FIR) CBR detected by COBE/FIRAS. On the other hand, our model predicts that the mid-infrared (MIR) flux from warm/nonequilibrium dust is relatively weak in the galaxies making the FIR CBR, and this effect reconciles the prima facie conflict between the upper limit on the MIR CBR from TeV gamma-ray observations and the COBE detections of the FIR CBR. The authors thank the Japan Society for the Promotion of Science for financial support.

  2. A Method Based on Wavelet Transforms for Source Detection in Photon-counting Detector Images. II. Application to ROSAT PSPC Images

    NASA Astrophysics Data System (ADS)

    Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.

    1997-07-01

    We apply our wavelet-based X-ray source detection algorithm, presented in a companion paper, to the specific case of images taken with the ROSAT PSPC detector. Such images are characterized by the presence of detector "ribs," a strongly varying point-spread function, and vignetting, so their analysis provides a challenge for any detection algorithm. First, we apply the algorithm to simulated images of a flat background, as seen with the PSPC, in order to calibrate the number of spurious detections as a function of significance threshold and to ascertain that the spatial distribution of spurious detections is uniform, i.e., unaffected by the ribs; this goal was achieved by using the exposure map in the detection procedure. Then, we analyze simulations of PSPC images with a realistic number of point sources; the results are used to determine the efficiency of source detection and the accuracy of output quantities such as source count rate, size, and position, by comparison with the input source data. It turns out that sources with 10 photons or fewer may be confidently detected near the image center in medium-length (~10^4 s), background-limited PSPC exposures. The positions of sources detected near the image center (off-axis angles < 15') are accurate to within a few arcseconds. Output count rates and sizes are in agreement with the input quantities, within a factor of 2 in 90% of the cases. The errors on position, count rate, and size increase with off-axis angle and for detections of lower significance. We have also checked that the upper limits computed with our method are consistent with the count rates of undetected input sources. Finally, we have tested the algorithm by applying it to various actual PSPC images, among the most challenging for automated detection procedures (crowded fields, extended sources, and nonuniform diffuse emission). The performance of our method on these images is satisfactory and outperforms those of other current X-ray detection algorithms.

  3. WBC count

    MedlinePlus

    Leukocyte count; White blood cell count; White blood cell differential; WBC differential; Infection - WBC count; Cancer - WBC count ... called leukopenia. A count less than 4,500 cells per microliter (4.5 × 10^9/L) is ...

  4. An observational study of spectators’ step counts and reasons for attending a professional golf tournament in Scotland

    PubMed Central

    Murray, Andrew D; Turner, Kieran; Archibald, Daryll; Schiphorst, Chloe; Griffin, Steffan Arthur; Scott, Hilary; Hawkes, Roger; Kelly, Paul; Grant, Liz; Mutrie, Nanette

    2017-01-01

    Background Spectators at several hundred golf tournaments on six continents worldwide may gain health-enhancing physical activity (HEPA) during their time at the event. This study aims to investigate spectators' reasons for attending and to assess spectator physical activity (PA), measured by step count. Methods Spectators at the Paul Lawrie Matchplay event in Scotland (August 2016) were invited to take part in this study. They were asked to complete a brief questionnaire with items to assess (1) demographics, (2) reasons for attendance and (3) baseline PA. In addition, participants were requested to wear a pedometer from time of entry to the venue until exit. Results A total of 339 spectators were recruited to the study, of whom 329 (97.2%) returned step-count data. Spectators took a mean of 11 589 steps (SD 4531). 'Fresh air' (rated median 9 out of 10), followed by 'watching star players', 'exercise/physical activity', 'time with friends and family' and 'atmosphere' (all median 8 out of 10), were rated the most important reasons for attending. Conclusion This study is the first to assess spectator physical activity while watching golf (measured by step count). Obtaining exercise/PA is rated as an important reason for attending a tournament by many golf spectators. Spectating at a golf tournament can provide HEPA: 82.9% of spectators achieved the recommended daily step count while spectating. Further research directly assessing whether spectating may constitute a 'teachable moment' for increasing physical activity beyond the tournament itself is merited. PMID:28761718

  5. Predictions of CD4 lymphocytes’ count in HIV patients from complete blood count

    PubMed Central

    2013-01-01

    Background HIV diagnosis, prognosis and treatment require the T CD4 lymphocyte count from flow cytometry, an expensive technique often not available to people in developing countries. The aim of this work is to apply a previously developed methodology that predicts the T CD4 lymphocyte value from the total white blood cell (WBC) count and lymphocyte count using set theory, with information taken from the Complete Blood Count (CBC). Methods Set theory was used to classify into groups named A, B, C and D the number of leucocytes/mm3, lymphocytes/mm3, and the CD4/μL subpopulation per flow cytometry of 800 HIV-diagnosed patients. Unions between sets A and C, and B and D, were assessed, and the intersection between both unions was described in order to establish the belonging percentage to these sets. Results were classified into eight ranges of 1000 leucocytes/mm3, calculating the belonging percentage of each range with respect to the whole sample. Results The intersection (A ∪ C) ∩ (B ∪ D) showed a prediction effectiveness of 81.44% for the range between 4000 and 4999 leukocytes, 91.89% for the range between 3000 and 3999, and 100% for the range below 3000. Conclusions The usefulness and clinical applicability of a methodology based on set theory were confirmed for predicting the T CD4 lymphocyte value, beginning with the WBC and lymphocyte counts from the CBC. This methodology is new, objective, and has lower costs than flow cytometry, which is currently considered the gold standard. PMID:24034560

  6. Counting-loss correction for X-ray spectroscopy using unit impulse pulse shaping.

    PubMed

    Hong, Xu; Zhou, Jianbin; Ni, Shijun; Ma, Yingjie; Yao, Jianfeng; Zhou, Wei; Liu, Yi; Wang, Min

    2018-03-01

    High-precision measurement of X-ray spectra is affected by the statistical fluctuation of the X-ray beam under low-counting-rate conditions. It is also limited by counting loss resulting from the dead-time of the system and pulse pile-up effects, especially in a high-counting-rate environment. In this paper a detection system based on a FAST-SDD detector and a new kind of unit impulse pulse-shaping method is presented for counting-loss correction in X-ray spectroscopy. The unit impulse pulse shaping is derived by inverse deviation of the pulse from a reset-type preamplifier and a C-R shaper. It is applied to obtain the true incoming rate of the system based on a general fast-slow channel processing model. The pulses in the fast channel are shaped to a unit impulse pulse shape, which has a small width and no undershoot. The counting rate in the fast channel is corrected by evaluating the dead-time of the fast channel before it is used to correct the counting loss in the slow channel.
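The fast-channel correction step can be illustrated with the generic non-paralyzable dead-time model n = m / (1 - m*tau), where m is the measured rate and tau the dead-time. The paper's exact correction procedure is not given in the abstract, so this model, the function names, and the example rates are assumptions.

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Non-paralyzable dead-time correction: n = m / (1 - m*tau).
    A generic textbook model, assumed here for illustration."""
    loss_factor = 1.0 - measured_rate * dead_time
    if loss_factor <= 0:
        raise ValueError("measured rate saturates this dead-time model")
    return measured_rate / loss_factor


def corrected_slow_counts(slow_counts, fast_measured_rate, fast_dead_time):
    """Scale slow-channel counts by the loss factor estimated from the
    fast channel, mirroring the fast-slow scheme described above."""
    n_true = true_rate_nonparalyzable(fast_measured_rate, fast_dead_time)
    return slow_counts * (n_true / fast_measured_rate)
```

For example, a fast channel measuring 90 kcps with a 1 microsecond dead-time implies roughly a 9% counting loss, and the slow-channel spectrum is scaled up by the same factor.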

  7. Photon Counting Using Edge-Detection Algorithm

    NASA Technical Reports Server (NTRS)

    Gin, Jonathan W.; Nguyen, Danh H.; Farr, William H.

    2010-01-01

    New applications such as high-data-rate, photon-starved, free-space optical communications require photon counting at flux rates into gigaphoton-per-second regimes coupled with subnanosecond timing accuracy. Current single-photon detectors that are capable of handling such operating conditions are designed in an array format and produce output pulses that span multiple sample times. In order to discern one pulse from another and not overcount the number of incoming photons, a detection algorithm must be applied to the sampled detector output pulses. As flux rates increase, the ability to implement such a detection algorithm becomes difficult within a digital processor that may reside within a field-programmable gate array (FPGA). Systems have been developed and implemented to both characterize gigahertz-bandwidth single-photon detectors and process photon count signals at rates into gigaphotons per second in order to implement communications links at SCPPM (serial concatenated pulse position modulation) encoded data rates exceeding 100 megabits per second with efficiencies greater than two bits per detected photon. A hardware edge-detection algorithm and corresponding signal combining and deserialization hardware were developed to meet these requirements at sample rates up to 10 GHz. The photon discriminator deserializer hardware board accepts four inputs, which allows for the ability to take inputs from a quad photon-counting detector, to support requirements for optical tracking with a reduced number of hardware components. The four inputs are hardware leading-edge detected independently. After leading-edge detection, the resultant samples are ORed together prior to deserialization. The deserialization is performed to reduce the rate at which data is passed to a digital signal processor, perhaps residing within an FPGA. The hardware implements four separate analog inputs that are connected through RF connectors.
Each analog input is fed to a high-speed 1
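The leading-edge detection and ORing described above can be sketched in software: count only rising threshold crossings, so a pulse spanning several samples is registered once, then OR the per-channel edge trains. A hedged illustration only; the threshold value, sample format, and combination details are assumptions, not the flight hardware's logic.

```python
def leading_edges(samples, threshold):
    """Return sample indices of rising threshold crossings, so a pulse
    spanning multiple samples is counted exactly once."""
    edges = []
    above = False
    for i, s in enumerate(samples):
        crossed = s >= threshold
        if crossed and not above:
            edges.append(i)
        above = crossed
    return edges


def combine_channels(channels, threshold):
    """Edge-detect each input independently, then OR the edge trains
    sample-by-sample, mirroring the hardware combiner described above."""
    length = max(len(ch) for ch in channels)
    combined = [False] * length
    for ch in channels:
        for i in leading_edges(ch, threshold):
            combined[i] = True
    return combined
```

A two-sample-wide pulse thus contributes a single edge, which is the point of applying edge detection before deserialization.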

  8. Investigating the limits of PET/CT imaging at very low true count rates and high random fractions in ion-beam therapy monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurz, Christopher, E-mail: Christopher.Kurz@physik.uni-muenchen.de; Bauer, Julia; Conti, Maurizio

    Purpose: External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β{sup +}-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the used PET/CT device and identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. Methods: The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patientlike activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated listmode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the

  9. Investigating the limits of PET/CT imaging at very low true count rates and high random fractions in ion-beam therapy monitoring.

    PubMed

    Kurz, Christopher; Bauer, Julia; Conti, Maurizio; Guérin, Laura; Eriksson, Lars; Parodi, Katia

    2015-07-01

    External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β(+)-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the used PET/CT device and identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patientlike activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated listmode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the recommended reconstruction scheme

  10. Laboratory productivity and the rate of manual peripheral blood smear review: a College of American Pathologists Q-Probes study of 95,141 complete blood count determinations performed in 263 institutions.

    PubMed

    Novis, David A; Walsh, Molly; Wilkinson, David; St Louis, Mary; Ben-Ezra, Jonathon

    2006-05-01

    Automated laboratory hematology analyzers are capable of performing differential counts on peripheral blood smears with greater precision and more accurate detection of distributional and morphologic abnormalities than manual examinations of blood smears. Manual determinations of blood morphology and leukocyte differential counts are time-consuming and expensive, and may not always be necessary. The frequency with which hematology laboratory workers perform manual screens despite the availability of labor-saving features of automated analyzers is unknown. To determine the normative rates with which manual peripheral blood smear reviews were performed in clinical laboratories, to examine laboratory practices associated with higher or lower manual review rates, and to measure the effects of manual smear review on the efficiency of generating complete blood count (CBC) determinations. From each of 3 traditional shifts per day, participants were asked to select serially 10 automated CBC specimens and to indicate whether manual scans and/or reviews with complete differential counts were performed on blood smears prepared from those specimens. Sampling continued until a total of 60 peripheral smears had been reviewed manually. For each specimen on which a manual review was performed, participants indicated the patient's age, hemoglobin value, white blood cell count, platelet count, and the primary reason why the manual review was performed. Participants also submitted data concerning their institutions' demographic profiles and their laboratories' staffing, work volume, and practices regarding CBC determinations. The rates of manual reviews and estimations of efficiency in performing CBC determinations were obtained from the data. A total of 263 hospitals and independent laboratories, predominantly located in the United States, participated in the College of American Pathologists Q-Probes Program. There were 95,141 CBC determinations examined in this study.

  11. Deep 3 GHz number counts from a P(D) fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Vernstrom, T.; Scott, Douglas; Wall, J. V.; Condon, J. J.; Cotton, W. D.; Fomalont, E. B.; Kellermann, K. I.; Miller, N.; Perley, R. A.

    2014-05-01

    Radio source counts constrain galaxy populations and evolution, as well as the global star formation history. However, there is considerable disagreement among the published 1.4-GHz source counts below 100 μJy. Here, we present a statistical method for estimating the μJy and even sub-μJy source count using new deep wide-band 3-GHz data in the Lockman Hole from the Karl G. Jansky Very Large Array. We analysed the confusion amplitude distribution P(D), which provides a fresh approach in the form of a more robust model, with a comprehensive error analysis. We tested this method on a large-scale simulation, incorporating clustering and finite source sizes. We discuss in detail our statistical methods for fitting using Markov chain Monte Carlo, handling correlations, and systematic errors from the use of wide-band radio interferometric data. We demonstrated that the source count can be constrained down to 50 nJy, a factor of 20 below the rms confusion. We found the differential source count near 10 μJy to have a slope of -1.7, decreasing to about -1.4 at fainter flux densities. At 3 GHz, the rms confusion in an 8-arcsec full width at half-maximum beam is ~1.2 μJy beam^-1, and the radio background temperature is ~14 mK. Our counts are broadly consistent with published evolutionary models. With these results, we were also able to constrain the peak of the Euclidean-normalized differential source count of any possible new radio populations that would contribute to the cosmic radio background down to 50 nJy.
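The idea behind a P(D) analysis, the distribution of map deflections produced by many faint unresolved sources drawn from a power-law count, can be sketched with a toy forward simulation (the real analysis fits such a model to interferometric data with MCMC). All parameter values and names below are illustrative assumptions, not the paper's.

```python
import math
import random


def simulate_pd(n_pix=10000, slope=-1.7, s_min=1e-3, s_max=1.0,
                mean_per_pix=5.0, seed=1):
    """Toy confusion P(D): draw a Poisson number of sources per pixel,
    draw each flux from a power-law dN/dS ~ S**slope by inverse-CDF
    sampling, and sum the fluxes in each pixel. The histogram of the
    returned deflections approximates P(D)."""
    rng = random.Random(seed)
    a = slope + 1.0  # exponent of the integrated count
    deflections = []
    for _ in range(n_pix):
        # Poisson draw via Knuth's method (fine for small means)
        limit, k, p = math.exp(-mean_per_pix), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        n_src = k - 1
        d = 0.0
        for _ in range(n_src):
            u = rng.random()
            d += (s_min**a + u * (s_max**a - s_min**a)) ** (1.0 / a)
        deflections.append(d)
    return deflections
```

Fitting the histogram of such deflections against models with different slopes is what lets the count be constrained well below the per-source detection limit.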

  12. Development of a low background liquid scintillation counter for a shallow underground laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erchinger, Jennifer L.; Aalseth, Craig E.; Bernacki, Bruce E.

    2015-08-20

    Pacific Northwest National Laboratory has recently opened a shallow underground laboratory intended for measurement of low-concentration levels of radioactive isotopes in samples collected from the environment. The development of a low-background liquid scintillation counter is currently underway to further augment the measurement capabilities within this underground laboratory. Liquid scintillation counting is especially useful for measuring charged-particle (e.g., β, α) emitting isotopes with no (or very weak) gamma-ray yields. The combination of high-efficiency detection of charged-particle emission in a liquid scintillation cocktail coupled with the low-background environment of an appropriately designed shield located in a clean underground laboratory provides the opportunity for increased-sensitivity measurements of a range of isotopes. To take advantage of the 35-meter water-equivalent overburden of the underground laboratory, a series of simulations have evaluated the instrumental shield design requirements to assess the possible background rate achievable. This report presents the design and background evaluation for a shallow underground, low-background liquid scintillation counter for sample measurements.

  13. Recursive least squares background prediction of univariate syndromic surveillance data

    PubMed Central

    2009-01-01

    Background Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve the performance of detecting outbreaks by using a background forecasting algorithm based on the adaptive recursive least squares (RLS) method combined with a novel treatment of the day-of-the-week effect. Methods Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which a prediction and detection component of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. Results We present detection results in the form of Receiver Operating Characteristic curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. Conclusion The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies.
This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm from the
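The adaptive RLS background predictor can be sketched as a standard exponentially weighted RLS filter producing one-step-ahead predictions of a daily-count series. This is generic textbook RLS, assumed for illustration; the paper's day-of-week transformation and seven-day-ahead horizon are omitted, and `order`, `lam`, and `delta` are assumed values.

```python
def rls_predict(series, order=7, lam=0.98, delta=100.0):
    """One-step-ahead RLS prediction of a daily-count series.
    `order` taps, forgetting factor `lam`, initial covariance `delta*I`."""
    w = [0.0] * order                      # filter weights
    P = [[delta if i == j else 0.0 for j in range(order)]
         for i in range(order)]            # inverse-correlation estimate
    preds = []
    for t in range(order, len(series)):
        x = series[t - order:t]            # last `order` observations
        y_hat = sum(wi * xi for wi, xi in zip(w, x))
        preds.append(y_hat)
        e = series[t] - y_hat              # a priori prediction error
        # gain k = P x / (lam + x' P x)
        Px = [sum(P[i][j] * x[j] for j in range(order)) for i in range(order)]
        denom = lam + sum(xi * pxi for xi, pxi in zip(x, Px))
        k = [pxi / denom for pxi in Px]
        w = [wi + ki * e for wi, ki in zip(w, k)]
        # P = (P - k x' P) / lam
        xP = [sum(x[i] * P[i][j] for i in range(order)) for j in range(order)]
        P = [[(P[i][j] - k[i] * xP[j]) / lam for j in range(order)]
             for i in range(order)]
    return preds
```

On a stationary series the predictions converge quickly to the background level; in a biosurveillance setting the residuals between observed and predicted counts are what a threshold detector would monitor.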

  14. Simultaneous measurement of tritium and radiocarbon by ultra-low-background proportional counting.

    PubMed

    Mace, Emily; Aalseth, Craig; Alexander, Tom; Back, Henning; Day, Anthony; Hoppe, Eric; Keillor, Martin; Moran, Jim; Overman, Cory; Panisko, Mark; Seifert, Allen

    2017-08-01

    The use of ultra-low-background capabilities at Pacific Northwest National Laboratory provides enhanced sensitivity for measurement of low-activity sources of tritium and radiocarbon using proportional counters. Tritium levels are nearly back to pre-nuclear-test backgrounds (~2-8 TU in rainwater), which can complicate their dual measurement with radiocarbon due to overlap of the beta-decay spectra. We present results of single-isotope proportional counter measurements used to analyze a dual-isotope methane sample synthesized from ~120 mg of H2O and present sensitivity results. Copyright © 2017 Elsevier Ltd. All rights reserved.

  15. Simultaneous measurement of tritium and radiocarbon by ultra-low-background proportional counting

    DOE PAGES

    Mace, Emily; Aalseth, Craig; Alexander, Tom; ...

    2016-12-21

    The use of ultra-low-background capabilities at Pacific Northwest National Laboratory provides enhanced sensitivity for measurement of low-activity sources of tritium and radiocarbon using proportional counters. Tritium levels are nearly back to pre-nuclear-test backgrounds (~2-8 TU in rainwater), which can complicate their dual measurement with radiocarbon due to overlap of the beta-decay spectra. In this paper, we present results of single-isotope proportional counter measurements used to analyze a dual-isotope methane sample synthesized from ~120 mg of H2O and present sensitivity results.

  16. In Orbit Performance of Si Avalanche Photodiode Single Photon Counting Modules in the Geoscience Laser Altimeter System on ICESat

    NASA Technical Reports Server (NTRS)

    Sun, X.; Jester, P. L.; Palm, S. P.; Abshire, J. B.; Spinhime, J. D.; Krainak, M. A.

    2006-01-01

    Si avalanche photodiode (APD) single photon counting modules (SPCMs) are used in the Geoscience Laser Altimeter System (GLAS) on the Ice, Cloud, and land Elevation Satellite (ICESat), currently in orbit measuring Earth surface elevation and atmosphere backscattering. These SPCMs are used to measure cloud and aerosol backscattering of the GLAS laser light at the 532-nm wavelength, with 60-70% quantum efficiencies and maximum count rates up to 15 million counts/s. The performance of the SPCMs has been closely monitored since the ICESat launch on January 12, 2003. There has been no measurable change in the quantum efficiency, as indicated by the average photon count rates in response to the background light from the sunlit earth. The linearity and the afterpulsing seen in the cloud and surface backscattering profiles have been the same as those during ground testing. The detector dark count rates, monitored while the spacecraft was on the dark side of the globe, have increased almost linearly at about 60 counts/s per day due to space radiation damage. The radiation damage appeared to be independent of the device temperature and power states. There was also an abrupt increase in radiation damage during the solar storm of 28-30 October 2003. The observed radiation damage is a factor of two to three lower than expected and sufficiently low to provide useful atmosphere backscattering measurements through the end of the ICESat mission. To date, these SPCMs have been in orbit for more than three years. The accumulated operating time has reached 290 days (7000 hours). These SPCMs have provided unprecedented receiver sensitivity and dynamic range in ICESat atmosphere backscattering measurements.

  17. Observation of fluctuation of gamma-ray count rate accompanying thunderstorm activity and energy spectrum of gamma rays in the atmosphere up to several kilometers altitude from the ground

    NASA Astrophysics Data System (ADS)

    Torii, T.; Sanada, Y.; Watanabe, A.

    2017-12-01

    Near the tops of high mountains and in the coastal areas of the Sea of Japan in winter, bursts of high-energy photons lasting more than 100 seconds have been reported at the occurrence of thunderclouds; 511 keV gamma rays are detected at the same time. We have also launched radiosondes equipped with gamma-ray detectors during thunderstorms and observed fluctuations in the gamma-ray count rate. As a result, we found that the gamma-ray count rate increases significantly near the top of the thundercloud. Therefore, in order to investigate fluctuations in the energy of the gamma rays, we developed a radiosonde-borne radiation detector to observe the low-energy gamma-ray spectrum and its fluctuations. We describe the counting rate and spectral fluctuations of the radiosonde gamma-ray detectors observed aloft in Fukushima prefecture, Japan.

  18. Effects of sampling strategy, detection probability, and independence of counts on the use of point counts

    USGS Publications Warehouse

    Pendleton, G.W.; Ralph, C. John; Sauer, John R.; Droege, Sam

    1995-01-01

    Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation in detection probabilities and lack of independence among sample points can bias estimates and measures of precision. All of these factors should be considered when using point count methods.

  19. Nicotine dependence, "background" and cue-induced craving and smoking in the laboratory.

    PubMed

    Dunbar, Michael S; Shiffman, Saul; Kirchner, Thomas R; Tindle, Hilary A; Scholl, Sarah M

    2014-09-01

    Nicotine dependence has been associated with higher "background" craving and smoking, independent of situational cues. Due in part to conceptual and methodological differences across past studies, the relationship between dependence and cue-reactivity (CR; e.g., cue-induced craving and smoking) remains unclear. 207 daily smokers completed six pictorial CR sessions (smoking, negative affect, positive affect, alcohol, smoking prohibitions, and neutral). Individuals rated craving before (background craving) and after cues, and could smoke following cue exposure. Session videos were coded to assess smoking. Participants completed four nicotine dependence measures. Regression models assessed the relationship of dependence to cue-independent (i.e., pre-cue) and cue-specific (i.e., pre-post cue change for each cue, relative to neutral) craving and smoking (likelihood of smoking, latency to smoke, puff count). Dependence was associated with background craving and smoking, but did not predict change in craving across the entire sample for any cue. Among alcohol drinkers, dependence was associated with greater increases in craving following the alcohol cue. Only one dependence measure (Wisconsin Inventory of Smoking Dependence Motives) was consistently associated with smoking reactivity (higher likelihood of smoking, shorter latency to smoke, greater puff count) in response to cues. While related to cue-independent background craving and smoking, dependence is not strongly associated with laboratory cue-induced craving under conditions of minimal deprivation. Dependence measures that incorporate situational influences on smoking correlate with greater cue-provoked smoking. This may suggest independent roles for CR and traditional dependence as determinants of smoking, and highlights the importance of assessing behavioral CR outcomes. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Antral follicle counts are strongly associated with live-birth rates after assisted reproduction, with superior treatment outcome in women with polycystic ovaries.

    PubMed

    Holte, Jan; Brodin, Thomas; Berglund, Lars; Hadziosmanovic, Nermin; Olovsson, Matts; Bergh, Torbjörn

    2011-09-01

To evaluate the association of antral follicle count (AFC) with in vitro fertilization/intracytoplasmic sperm injection (IVF-ICSI) outcome in a large unselected cohort of patients covering the entire range of AFC. Prospective observational study. University-affiliated private infertility center. 2,092 women undergoing 4,308 IVF-ICSI cycles. AFC analyzed for associations with treatment outcome and statistically adjusted for repeated treatments and age. Pregnancy rate, live-birth rate, and stimulation outcome parameters. The AFC was log-normally distributed. Pregnancy rates and live-birth rates were positively associated with AFC in a log-linear way, leveling out above AFC ∼30. Treatment outcome was superior among women with polycystic ovaries, independent of ovulatory status. The findings remained significant after adjustment for age and number of oocytes retrieved. Pregnancy and live-birth rates are log-linearly related to AFC. Polycystic ovaries, most often excluded from studies on ovarian reserve, fit as one extreme in the spectrum of AFC; a low count constitutes the other extreme, with the lowest ovarian reserve and poor treatment outcome. The findings remained statistically significant after adjustment for the number of oocytes retrieved, suggesting this measure of ovarian reserve comprises information on oocyte quality and not only quantity. Copyright © 2011 American Society for Reproductive Medicine. Published by Elsevier Inc. All rights reserved.

  1. Low Background Signal Readout Electronics for the MAJORANA DEMONSTRATOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guinn, I.; Abgrall, N.; Arnquist, Isaac J.

    2015-03-18

The Majorana Demonstrator (MJD) [1] is an array of p-type point contact (PPC) high-purity germanium (HPGe) detectors intended to search for neutrinoless double beta decay (0νββ decay) in ⁷⁶Ge. MJD will consist of 40 kg of detectors, 30 kg of which will be isotopically enriched to 87% ⁷⁶Ge. The array will consist of 14 strings of four or five detectors placed in two separate cryostats. One of the main goals of the experiment is to demonstrate the feasibility of building a tonne-scale array of detectors to search for 0νββ decay with a much higher sensitivity. This involves achieving backgrounds in the 4 keV region of interest (ROI) around the 2039 keV Q-value of the ββ decay of less than 1 count/ROI-t-y. Because many backgrounds will not directly scale with detector mass, the specific background goal of MJD is less than 3 counts/ROI-t-y.

  2. The faint galaxy contribution to the diffuse extragalactic background light

    NASA Technical Reports Server (NTRS)

    Cole, Shaun; Treyer, Marie-Agnes; Silk, Joseph

    1992-01-01

Models of the faint galaxy contribution to the diffuse extragalactic background light are presented, which are consistent with current data on faint galaxy number counts and redshifts. The autocorrelation function of surface brightness fluctuations in the extragalactic diffuse light is predicted, and the way in which these predictions depend on the cosmological model and assumptions of biasing is determined. It is confirmed that the recent deep infrared number counts are most compatible with a high density universe (Ω0 ≈ 1) and that the steep blue counts then require an extra population of rapidly evolving blue galaxies. The faintest presently detectable galaxies produce an interesting contribution to the extragalactic diffuse light, and still fainter galaxies may also produce a significant contribution. These faint galaxies still only produce a small fraction of the total optical diffuse background light, but on scales of a few arcminutes to a few degrees, they produce a substantial fraction of the fluctuations in the diffuse light.

  3. Pitch Counts in Youth Baseball and Softball: A Historical Review.

    PubMed

    Feeley, Brian T; Schisel, Jessica; Agel, Julie

    2018-07-01

Pitching injuries are getting increased attention in the mass media. Many references are made to pitch counts and the role they play in injury prevention. The original purpose of regulating the pitch count in youth baseball was to reduce injury and fatigue to pitchers. This article reviews the history and development of the pitch count limit in baseball, the effect it has had on injury, and the evidence regarding injury rates in softball windmill pitching. Literature search of PubMed, mass media, and organizational Web sites through June 2015. Pitch count limits and rest recommendations were introduced in 1996 after a survey of 28 orthopedic surgeons and baseball coaches showed injuries to baseball pitchers' arms were believed to be from the number of pitches thrown. Follow-up research led to revised recommendations with more detailed guidelines in 2006. Since that time, data show a relationship between innings pitched and upper extremity injury, but pitch type has not clearly been shown to affect injury rates. Current surveys of coaches and players show that coaches, parents, and athletes often do not adhere to these guidelines. There are no pitch count guidelines currently available in softball. The increase in participation in youth baseball and softball with an emphasis on early sport specialization in youth sports activities suggests that there will continue to be a rise in injury rates among young throwers. The published pitch counts are likely to positively affect injury rates but must be adhered to by athletes, coaches, and parents.

  4. Point Count Length and Detection of Forest Neotropical Migrant Birds

    Treesearch

    Deanna K. Dawson; David R. Smith; Chandler S. Robbins

    1995-01-01

    Comparisons of bird abundances among years or among habitats assume that the rates at which birds are detected and counted are constant within species. We use point count data collected in forests of the Mid-Atlantic states to estimate detection probabilities for Neotropical migrant bird species as a function of count length. For some species, significant differences...

  5. High repetition rate laser-driven MeV ion acceleration at variable background pressures

    NASA Astrophysics Data System (ADS)

    Snyder, Joseph; Ngirmang, Gregory; Orban, Chris; Feister, Scott; Morrison, John; Frische, Kyle; Chowdhury, Enam; Roquemore, W. M.

    2017-10-01

Ultra-intense laser-plasma interactions (LPI) can produce highly energetic photons, electrons, and ions with numerous potential real-world applications. Many of these applications will require repeatable, high-repetition-rate targets that are suitable for LPI experiments. Liquid targets can meet many of these needs, but they typically require higher chamber pressure than is used for many low repetition rate experiments. The effect of background pressure on the LPI has not been thoroughly studied. With this in mind, the Extreme Light group at the Air Force Research Lab has carried out MeV ion and electron acceleration experiments at kHz repetition rate with background pressures ranging from 30 mTorr to >1 Torr using a submicron ethylene glycol liquid sheet target. We present these results and provide two-dimensional particle-in-cell simulation results that offer insight on the thresholds for the efficient acceleration of electrons and ions. This research is supported by the Air Force Office of Scientific Research under LRIR Project 17RQCOR504 under the management of Dr. Riq Parra and Dr. Jean-Luc Cambier. Support was also provided by the DOD HPCMP Internship Program.

  6. Mars sedimentary rock erosion rates constrained using crater counts, with applications to organic-matter preservation and to the global dust cycle

    NASA Astrophysics Data System (ADS)

    Kite, Edwin S.; Mayer, David P.

    2017-04-01

Small-crater counts on Mars light-toned sedimentary rock are often inconsistent with any isochron; these data are usually plotted then ignored. We show (using an 18-HiRISE-image, >10⁴-crater dataset) that these non-isochron crater counts are often well fit by a model where crater production is balanced by crater obliteration via steady exhumation. For these regions, we fit erosion rates. We infer that Mars light-toned sedimentary rocks typically erode at ∼10² nm/yr, when averaged over 10 km² scales and 10⁷-10⁸ yr timescales. Crater-based erosion-rate determination is consistent with independent techniques, but can be applied to nearly all light-toned sedimentary rocks on Mars. Erosion is swift enough that radiolysis cannot destroy complex organic matter at some locations (e.g. paleolake deposits at SW Melas), but radiolysis is a severe problem at other locations (e.g. Oxia Planum). The data suggest that the relief of the Valles Marineris mounds is currently being reduced by wind erosion, and that dust production on Mars < 3 Gya greatly exceeds the modern reservoir of mobile dust.
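
    The production-obliteration balance behind this method can be illustrated with a toy calculation: if craters of diameter D survive until a thickness comparable to their depth (k·D) is exhumed, the equilibrium density is N_eq = b·k·D/E, which can be inverted for the erosion rate E. The depth-to-diameter ratio and all numbers below are illustrative assumptions, not values from the paper.

```python
def erosion_rate_nm_per_yr(crater_density_per_km2, production_rate_per_km2_yr,
                           diameter_m, depth_to_diameter=0.2):
    """Steady-state erosion rate from an equilibrium small-crater count.

    If crater production is balanced by obliteration through steady exhumation,
    craters of diameter D survive until ~ their depth (k * D) of rock is
    removed, so N_eq = b * k * D / E  =>  E = b * k * D / N_eq.
    The ratio k (depth_to_diameter) is a hypothetical placeholder here.
    """
    lifetime_thickness_m = depth_to_diameter * diameter_m
    erosion_m_per_yr = (production_rate_per_km2_yr * lifetime_thickness_m
                        / crater_density_per_km2)
    return erosion_m_per_yr * 1e9  # metres/yr -> nanometres/yr
```

    For example, a (hypothetical) production rate of 1e-6 craters km⁻² yr⁻¹ at D = 50 m balanced against an observed 0.1 craters km⁻² implies an erosion rate of 1e5 nm/yr.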

  7. Recursive least squares background prediction of univariate syndromic surveillance data.

    PubMed

    Najmi, Amir-Homayoon; Burkom, Howard

    2009-01-16

Surveillance of univariate syndromic data as a potential indicator of developing public health conditions has been used extensively. This paper aims to improve the performance of detecting outbreaks by using a background forecasting algorithm based on the adaptive recursive least squares method combined with a novel treatment of the Day of the Week effect. Previous work by the first author has suggested that univariate recursive least squares analysis of syndromic data can be used to characterize the background upon which a prediction and detection component of a biosurveillance system may be built. An adaptive implementation is used to deal with data non-stationarity. In this paper we develop and implement the RLS method for background estimation of univariate data. The distinctly dissimilar distribution of data for different days of the week, however, can affect filter implementations adversely, and so a novel procedure based on linear transformations of the sorted values of the daily counts is introduced. Seven-day-ahead daily predicted counts are used as background estimates. A signal injection procedure is used to examine the integrated algorithm's ability to detect synthetic anomalies in real syndromic time series. We compare the method to a baseline CDC forecasting algorithm known as the W2 method. We present detection results in the form of Receiver Operating Characteristic curve values for four different injected signal-to-noise ratios using 16 sets of syndromic data. We find improvements in the false alarm probabilities when compared to the baseline W2 background forecasts. The current paper introduces a prediction approach for city-level biosurveillance data streams such as time series of outpatient clinic visits and sales of over-the-counter remedies. This approach uses RLS filters modified by a correction for the weekly patterns often seen in these data series, and a threshold detection algorithm from the residuals of the RLS forecasts.
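
    The adaptive RLS background estimator at the core of this approach can be sketched in a few lines. The regressor choice (the previous seven days of counts), forgetting factor, and initialization below are illustrative assumptions, not the paper's exact configuration, and the day-of-week transformation is omitted.

```python
import numpy as np

def rls_predict(y, order=7, lam=0.98, delta=100.0):
    """One-step-ahead recursive least squares forecast of a daily count series.

    y     : 1-D array of daily counts
    order : number of past days used as regressors (7 spans a week)
    lam   : forgetting factor (<1 adapts to non-stationary background)
    delta : initial inverse-correlation scale (illustrative value)
    """
    w = np.zeros(order)                # filter weights
    P = np.eye(order) * delta          # inverse correlation matrix estimate
    preds = np.full(len(y), np.nan)
    for t in range(order, len(y)):
        x = y[t - order:t][::-1].astype(float)  # most recent day first
        preds[t] = w @ x                        # background forecast for day t
        e = y[t] - preds[t]                     # innovation (residual)
        k = P @ x / (lam + x @ P @ x)           # gain vector
        w = w + k * e                           # weight update
        P = (P - np.outer(k, x) @ P) / lam      # inverse-correlation update
    return preds
```

    The residuals y[t] - preds[t] are what a downstream threshold detector would monitor for outbreak alarms.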

  8. Simulation of background from low-level tritium and radon emanation in the KATRIN spectrometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Leiber, B.; Collaboration: KATRIN Collaboration

The KArlsruhe TRItium Neutrino (KATRIN) experiment is a large-scale experiment for the model independent determination of the mass of electron anti-neutrinos with a sensitivity of 200 meV/c². It investigates the kinematics of electrons from tritium beta decay close to the endpoint of the energy spectrum at 18.6 keV. To achieve a good signal to background ratio at the endpoint, a low background rate below 10⁻² counts per second is required. The KATRIN setup thus consists of a high luminosity windowless gaseous tritium source (WGTS), a magnetic electron transport system with differential and cryogenic pumping for tritium retention, and electro-static retarding spectrometers (pre-spectrometer and main spectrometer) for energy analysis, followed by a segmented detector system for counting transmitted beta-electrons. A major source of background comes from magnetically trapped electrons in the main spectrometer (vacuum vessel: 1240 m³, 10⁻¹¹ mbar) produced by nuclear decays in the magnetic flux tube of the spectrometer. Major contributions are expected from short-lived radon isotopes and tritium. Primary electrons, originating from these decays, can be trapped for hours, until having lost almost all their energy through inelastic scattering on residual gas particles. Depending on the initial energy of the primary electron, up to hundreds of low energetic secondary electrons can be produced. Leaving the spectrometer, these electrons will contribute to the background rate. This contribution describes results from simulations for the various background sources. Decays of ²¹⁹Rn, emanating from the main vacuum pump, and tritium from the WGTS that reaches the spectrometers are expected to account for most of the background. As a result of the radon alpha decay, electrons are emitted through various processes, such as shake-off, internal conversion and Auger de-excitation. The corresponding simulations were done using the

  9. An Ultrasensitive Hot-Electron Bolometer for Low-Background SMM Applications

    NASA Technical Reports Server (NTRS)

    Olayaa, David; Wei, Jian; Pereverzev, Sergei; Karasik, Boris S.; Kawamura, Jonathan H.; McGrath, William R.; Sergeev, Andrei V.; Gershenson, Michael E.

    2006-01-01

We are developing a hot-electron superconducting transition-edge sensor (TES) that is capable of counting THz photons and operates at T = 0.3 K. The main driver for this work is moderate resolution spectroscopy (R ≈ 1000) on the future space telescopes with cryogenically cooled (≈ 4 K) mirrors. The detectors for these telescopes must be background-limited with a noise equivalent power (NEP) ≈ 10⁻¹⁹-10⁻²⁰ W/√Hz over the range ν = 0.3-10 THz. Above about 1 THz, the background photon arrival rate is expected to be ≈ 10-100 s⁻¹, and photon counting detectors may be preferable to an integrating type. We fabricated superconducting Ti nanosensors with a volume of ≈ 3×10⁻³ μm³ on a planar substrate and have measured the thermal conductance G to the thermal bath. A very low G = 4×10⁻¹⁴ W/K, measured at 0.3 K, is due to the weak electron-phonon coupling in the material and the thermal isolation provided by superconducting Nb contacts. This low G corresponds to NEP(0.3 K) = 3×10⁻¹⁹ W/√Hz. This Hot-Electron Direct Detector (HEDD) is expected to have sufficient energy resolution for detecting individual photons with ν > 0.3 THz at 0.3 K. With the sensor time constant of a few microseconds, the dynamic range is ≈ 50 dB.
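
    The quoted NEP follows from the measured thermal conductance via the standard thermal-fluctuation-noise relation NEP = sqrt(γ·kB·T²·G). A minimal sketch, assuming the prefactor convention γ = 2 (conventions between 2 and 4 appear in the literature, depending on the thermal-conductance exponent):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def phonon_nep(T, G, gamma=2.0):
    """Thermal-fluctuation-noise NEP of a bolometer: sqrt(gamma * kB * T^2 * G).

    gamma = 2 is an assumed convention, not a value stated in the abstract.
    """
    return math.sqrt(gamma * K_B * T**2 * G)

# With the abstract's measured values T = 0.3 K and G = 4e-14 W/K,
# this evaluates to roughly 3e-19 W per root hertz, matching the quoted NEP.
nep = phonon_nep(0.3, 4e-14)
```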

  10. High-efficiency and low-background multi-segmented proportional gas counter for β-decay spectroscopy

    NASA Astrophysics Data System (ADS)

    Mukai, M.; Hirayama, Y.; Watanabe, Y. X.; Schury, P.; Jung, H. S.; Ahmed, M.; Haba, H.; Ishiyama, H.; Jeong, S. C.; Kakiguchi, Y.; Kimura, S.; Moon, J. Y.; Oyaizu, M.; Ozawa, A.; Park, J. H.; Ueno, H.; Wada, M.; Miyatake, H.

    2018-03-01

A multi-segmented proportional gas counter (MSPGC) with high detection efficiency and low-background event rate has been developed for β-decay spectroscopy. The MSPGC consists of two cylindrically aligned layers of 16 counters (32 counters in total). Each counter has a long active length and small trapezoidal cross-section, and the total solid angle of the 32 counters is 80% of 4π. β-rays are distinguished from the background events including cosmic-rays by analyzing the hit patterns of independent counters. The deduced intrinsic detection efficiency of each counter was almost 100%. The measured background event rate was 0.11 counts per second using the combination of veto counters for cosmic-rays and lead block shields for background γ-rays. The MSPGC was applied to measure the β-decay half-lives of 198Ir and 199mPt. The evaluated half-lives of T1/2 = 9.8(7) s and 12.4(7) s for 198Ir and 199mPt, respectively, were in agreement with previously reported values. The estimated absolute detection efficiency of the MSPGC from GEANT4 simulations was consistent with the evaluated efficiency from the analysis of the β-γ spectroscopy of 199Pt, saturating at approximately 60% for Qβ > 4 MeV.

  11. Radon induced background processes in the KATRIN pre-spectrometer

    NASA Astrophysics Data System (ADS)

    Fränkle, F. M.; Bornschein, L.; Drexlin, G.; Glück, F.; Görhardt, S.; Käfer, W.; Mertens, S.; Wandkowsky, N.; Wolf, J.

    2011-10-01

The KArlsruhe TRItium Neutrino (KATRIN) experiment is a next generation, model independent, large scale tritium β-decay experiment to determine the effective electron anti-neutrino mass by investigating the kinematics of tritium β-decay with a sensitivity of 200 meV/c² using the MAC-E filter technique. In order to reach this sensitivity, a low background level of 10⁻² counts per second (cps) is required. This paper describes how the decay of radon in a MAC-E filter generates background events, based on measurements performed at the KATRIN pre-spectrometer test setup. Radon (Rn) atoms, which emanate from materials inside the vacuum region of the KATRIN spectrometers, are able to penetrate deep into the magnetic flux tube so that the α-decay of Rn contributes to the background. Of particular importance are electrons emitted in processes accompanying the Rn α-decay, such as shake-off, internal conversion of excited levels in the Rn daughter atoms and Auger electrons. While low-energy electrons (<100 eV) directly contribute to the background in the signal region, higher energy electrons can be stored magnetically inside the volume of the spectrometer. Depending on their initial energy, they are able to create thousands of secondary electrons via subsequent ionization processes with residual gas molecules and, since the detector is not able to distinguish these secondary electrons from the signal electrons, an increased background rate over an extended period of time is generated.

  12. Statistical tests to compare motif count exceptionalities

    PubMed Central

    Robin, Stéphane; Schbath, Sophie; Vandewalle, Vincent

    2007-01-01

    Background Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is indeed not sufficient to decide if this motif is significantly more exceptional in one sequence compared to the other one. A statistical test is required. Results We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with a special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion The exact binomial test is particularly adapted for small counts. For large counts, we advise to use the likelihood ratio test which is asymptotic but strongly correlated with the exact binomial test and very simple to use. PMID:17346349
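
    The core idea of the exact binomial test can be sketched directly: conditioning two independent Poisson counts on their total makes the first count binomial, with null success probability set by the expected counts in the two sequences. This is a simplified stdlib sketch that ignores the paper's treatment of overlapping occurrences; e1 and e2 stand for the composition-adjusted expected counts.

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial probability mass function."""
    return comb(n, k) * p**k * (1.0 - p)**(n - k)

def compare_motif_counts(n1, n2, e1, e2):
    """Exact two-sided binomial comparison of one motif in two sequences.

    Under independent Poisson models, conditioning on n1 + n2 makes n1
    binomial with success probability e1 / (e1 + e2) under the null
    hypothesis of equal exceptionality. The two-sided p-value sums the
    probabilities of all outcomes no more likely than the observed one
    (minimum-likelihood rule).
    """
    n = n1 + n2
    p0 = e1 / (e1 + e2)
    p_obs = binom_pmf(n1, n, p0)
    return sum(binom_pmf(k, n, p0) for k in range(n + 1)
               if binom_pmf(k, n, p0) <= p_obs * (1 + 1e-12))
```

    For equal expected counts, 10 vs. 10 observed occurrences gives a p-value near 1, while 30 vs. 5 is highly significant.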

  13. Establishing a gold standard for manual cough counting: video versus digital audio recordings

    PubMed Central

    Smith, Jaclyn A; Earis, John E; Woodcock, Ashley A

    2006-01-01

Background Manual cough counting is time-consuming and laborious; however it is the standard to which automated cough monitoring devices must be compared. We have compared manual cough counting from video recordings with manual cough counting from digital audio recordings. Methods We studied 8 patients with chronic cough, overnight in laboratory conditions (diagnoses were 5 asthma, 1 rhinitis, 1 gastro-oesophageal reflux disease and 1 idiopathic cough). Coughs were recorded simultaneously using a video camera with infrared lighting and digital sound recording. The numbers of coughs in each 8 hour recording were counted manually, by a trained observer, in real time from the video recordings and using audio-editing software from the digital sound recordings. Results The median cough frequency was 17.8 (IQR 5.9–28.7) cough sounds per hour in the video recordings and 17.7 (6.0–29.4) coughs per hour in the digital sound recordings. There was excellent agreement between the video and digital audio cough rates; mean difference of -0.3 coughs per hour (SD ± 0.6), 95% limits of agreement -1.5 to +0.9 coughs per hour. Video recordings had poorer sound quality even in controlled conditions and can only be analysed in real time (8 hours per recording). Digital sound recordings required 2–4 hours of analysis per recording. Conclusion Manual counting of cough sounds from digital audio recordings has excellent agreement with simultaneous video recordings in laboratory conditions. We suggest that ambulatory digital audio recording is therefore ideal for validating future cough monitoring devices, as this can be performed in the patient's own environment. PMID:16887019
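
    The agreement figures quoted above (mean difference with 95% limits) are the standard Bland-Altman quantities, which can be computed as follows. This is a generic sketch, not the authors' analysis code; the limits use mean ± k·SD of the paired differences with k = 2 (≈1.96 for 95%), which reproduces the abstract's -0.3 ± 2×0.6 → -1.5 to +0.9.

```python
import statistics

def limits_of_agreement(rates_a, rates_b, k=2.0):
    """Bland-Altman agreement between two paired rate measurements.

    Returns (mean difference, lower limit, upper limit), where the limits
    are mean +/- k * SD of the per-subject differences.
    """
    diffs = [a - b for a, b in zip(rates_a, rates_b)]
    m = statistics.mean(diffs)
    s = statistics.stdev(diffs)   # sample standard deviation
    return m, m - k * s, m + k * s
```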

  14. A Comparison of Bird Detection Rates Derived from On-Road vs. Off-Road Point Counts in Northern Montana

    Treesearch

    Richard L. Hutto; Sallie J. Hejl; Jeffrey F. Kelly; Sandra M. Pletschet

    1995-01-01

    We conducted a series of 275 paired (on- and off-road) point counts within 4 distinct vegetation cover types in northwestern Montana. Roadside counts generated a bird list that was essentially the same as the list generated from off-road counts within the same vegetation cover type. Species that were restricted to either on- or off-road counts were rare, suggesting...

  15. A New High Channel-Count, High Scan-Rate, Data Acquisition System for the NASA Langley Transonic Dynamics Tunnel

    NASA Technical Reports Server (NTRS)

    Ivanco, Thomas G.; Sekula, Martin K.; Piatak, David J.; Simmons, Scott A.; Babel, Walter C.; Collins, Jesse G.; Ramey, James M.; Heald, Dean M.

    2016-01-01

    A data acquisition system upgrade project, known as AB-DAS, is underway at the NASA Langley Transonic Dynamics Tunnel. AB-DAS will soon serve as the primary data system and will substantially increase the scan-rate capabilities and analog channel count while maintaining other unique aeroelastic and dynamic test capabilities required of the facility. AB-DAS is configurable, adaptable, and enables buffet and aeroacoustic tests by synchronously scanning all analog channels and recording the high scan-rate time history values for each data quantity. AB-DAS is currently available for use as a stand-alone data system with limited capabilities while development continues. This paper describes AB-DAS, the design methodology, and the current features and capabilities. It also outlines the future work and projected capabilities following completion of the data system upgrade project.

  16. Deep Galex Observations of the Coma Cluster: Source Catalog and Galaxy Counts

    NASA Technical Reports Server (NTRS)

    Hammer, D.; Hornschemeier, A. E.; Mobasher, B.; Miller, N.; Smith, R.; Arnouts, S.; Milliard, B.; Jenkins, L.

    2010-01-01

We present a source catalog from deep 26 ks GALEX observations of the Coma cluster in the far-UV (FUV; 1530 Angstroms) and near-UV (NUV; 2310 Angstroms) wavebands. The observed field is centered 0.9 deg. (1.6 Mpc) south-west of the Coma core, and has full optical photometric coverage by SDSS and spectroscopic coverage to r ~ 21. The catalog consists of 9700 galaxies with GALEX and SDSS photometry, including 242 spectroscopically-confirmed Coma member galaxies that range from giant spirals and elliptical galaxies to dwarf irregular and early-type galaxies. The full multi-wavelength catalog (cluster plus background galaxies) is 80% complete to NUV=23 and FUV=23.5, and has a limiting depth at NUV=24.5 and FUV=25.0 which corresponds to a star formation rate of 10⁻³ solar masses yr⁻¹ at the distance of Coma. The GALEX images presented here are very deep and include detections of many resolved cluster members superposed on a dense field of unresolved background galaxies. This required a two-fold approach to generating a source catalog: we used a Bayesian deblending algorithm to measure faint and compact sources (using SDSS coordinates as a position prior), and used the GALEX pipeline catalog for bright and/or extended objects. We performed simulations to assess the importance of systematic effects (e.g. object blends, source confusion, Eddington bias) that influence source detection and photometry when using both methods. The Bayesian deblending method roughly doubles the number of source detections and provides reliable photometry to a few magnitudes deeper than the GALEX pipeline catalog. This method is also free from source confusion over the UV magnitude range studied here; conversely, we estimate that the GALEX pipeline catalogs are confusion limited at NUV ≈ 23 and FUV ≈ 24. We have measured the total UV galaxy counts using our catalog and report a 50% excess of counts across FUV=22-23.5 and NUV=21.5-23 relative to previous GALEX

  17. Tutorial on X-ray photon counting detector characterization.

    PubMed

    Ren, Liqiang; Zheng, Bin; Liu, Hong

    2018-01-01

Recent advances in photon counting detection technology have led to significant research interest in X-ray imaging. As a tutorial-level review, this paper covers a wide range of aspects related to X-ray photon counting detector characterization. The tutorial begins with a detailed description of the working principle and operating modes of a pixelated X-ray photon counting detector with basic architecture and detection mechanism. Currently available methods and techniques for characterizing major aspects including energy response, noise floor, energy resolution, count rate performance (detector efficiency), and charge sharing effect of photon counting detectors are comprehensively reviewed. Other characterization aspects such as point spread function (PSF), line spread function (LSF), contrast transfer function (CTF), modulation transfer function (MTF), noise power spectrum (NPS), detective quantum efficiency (DQE), bias voltage, radiation damage, and polarization effect are also remarked. A cadmium telluride (CdTe) pixelated photon counting detector is employed for part of the characterization demonstration and the results are presented. This review can serve as a tutorial for X-ray imaging researchers and investigators to understand, operate, characterize, and optimize photon counting detectors for a variety of applications.
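
    Count-rate performance of a photon counting detector is commonly characterized against the two classical dead-time models, which relate the observed rate m to the true rate n via the per-event dead time τ. These models are standard detector-physics results, not formulas taken from this tutorial; a minimal sketch:

```python
import math

def observed_rate_nonparalyzable(true_rate, tau):
    """Non-paralyzable dead-time model: m = n / (1 + n * tau).

    Each recorded event blinds the detector for tau; events during the
    dead period are lost but do not extend it.
    """
    return true_rate / (1.0 + true_rate * tau)

def observed_rate_paralyzable(true_rate, tau):
    """Paralyzable dead-time model: m = n * exp(-n * tau).

    Every arriving photon, recorded or not, restarts the dead period,
    so the observed rate rolls over at high flux.
    """
    return true_rate * math.exp(-true_rate * tau)
```

    At n = 1e5 counts/s with τ = 1 μs, the non-paralyzable model predicts about 90,909 observed counts/s and the paralyzable model about 90,484, illustrating the efficiency loss both models capture.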

  18. Gauge backgrounds and zero-mode counting in F-theory

    NASA Astrophysics Data System (ADS)

    Bies, Martin; Mayrhofer, Christoph; Weigand, Timo

    2017-11-01

    Computing the exact spectrum of charged massless matter is a crucial step towards understanding the effective field theory describing F-theory vacua in four dimensions. In this work we further develop a coherent framework to determine the charged massless matter in F-theory compactified on elliptic fourfolds, and demonstrate its application in a concrete example. The gauge background is represented, via duality with M-theory, by algebraic cycles modulo rational equivalence. Intersection theory within the Chow ring allows us to extract coherent sheaves on the base of the elliptic fibration whose cohomology groups encode the charged zero-mode spectrum. The dimensions of these cohomology groups are computed with the help of modern techniques from algebraic geometry, which we implement in the software gap. We exemplify this approach in models with an Abelian and non-Abelian gauge group and observe jumps in the exact massless spectrum as the complex structure moduli are varied. An extended mathematical appendix gives a self-contained introduction to the algebro-geometric concepts underlying our framework.

  19. Triple-Label β Liquid Scintillation Counting

    PubMed Central

    Bukowski, Thomas R.; Moffett, Tyler C.; Revkin, James H.; Ploger, James D.; Bassingthwaighte, James B.

    2010-01-01

    The detection of radioactive compounds by liquid scintillation has revolutionized modern biology, yet few investigators make full use of the power of this technique. Even though multiple isotope counting is considerably more difficult than single isotope counting, many experimental designs would benefit from using more than one isotope. The development of accurate isotope counting techniques enabling the simultaneous use of three β-emitting tracers has facilitated studies in our laboratory using the multiple tracer indicator dilution technique for assessing rates of transmembrane transport and cellular metabolism. The details of sample preparation, and of stabilizing the liquid scintillation spectra of the tracers, are critical to obtaining good accuracy. Reproducibility is enhanced by obtaining detailed efficiency/quench curves for each particular set of tracers and solvent media. The numerical methods for multiple-isotope quantitation depend on avoiding error propagation (inherent to successive subtraction techniques) by using matrix inversion. Experimental data obtained from triple-label β counting illustrate reproducibility and good accuracy even when the relative amounts of different tracers in samples of protein/electrolyte solutions, plasma, and blood are changed. PMID:1514684
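
    The matrix-inversion step described above, which replaces successive subtraction, amounts to solving a small linear system relating window counts to isotope activities through an efficiency matrix. A stdlib sketch using Cramer's rule for the 3x3 case; the efficiency values in the usage example are hypothetical, not the paper's quench-curve data.

```python
def det3(M):
    """Determinant of a 3x3 matrix given as nested lists."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)

def unmix_triple_label(counts, eff):
    """Recover three isotope activities from counts in three energy windows.

    counts[i] = sum_j eff[i][j] * activity[j]; solving the 3x3 system
    directly (here by Cramer's rule) avoids the error propagation inherent
    to successive channel-by-channel subtraction. eff[i][j] is the counting
    efficiency of isotope j in window i, taken from quench curves measured
    for the particular tracer/solvent set.
    """
    D = det3(eff)
    activities = []
    for j in range(3):
        Mj = [row[:] for row in eff]        # copy efficiency matrix
        for i in range(3):
            Mj[i][j] = counts[i]            # replace column j with counts
        activities.append(det3(Mj) / D)
    return activities
```

    For example, with illustrative efficiencies [[0.9, 0.3, 0.1], [0.05, 0.5, 0.3], [0.0, 0.05, 0.4]] and window counts [180, 195, 130], the recovered activities are [100, 200, 300].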

  20. 2D dark-count-rate modeling of PureB single-photon avalanche diodes in a TCAD environment

    NASA Astrophysics Data System (ADS)

    Knežević, Tihomir; Nanver, Lis K.; Suligoj, Tomislav

    2018-02-01

PureB silicon photodiodes have nm-shallow p+n junctions with which photons/electrons with penetration depths of a few nanometer can be detected. PureB Single-Photon Avalanche Diodes (SPADs) were fabricated and analysed by 2D numerical modeling as an extension to TCAD software. The very shallow p+-anode has high perimeter curvature that enhances the electric field. In SPADs, noise is quantified by the dark count rate (DCR), which is a measure of the number of false counts triggered by unwanted processes in the non-illuminated device. Just as for desired events, the probability of a dark count increases with increasing electric field, and the perimeter conditions are critical. In this work, the DCR was studied by two 2D methods of analysis: the "quasi-2D" (Q-2D) method, where vertical 1D cross-sections were assumed for calculating the electron/hole avalanche probabilities, and the "ionization-integral 2D" (II-2D) method, where cross-sections were placed where the maximum ionization integrals were calculated. The Q-2D method gave satisfactory results in structures where the peripheral regions had a small contribution to the DCR, such as in devices with conventional deep-junction guard rings (GRs). Otherwise, the II-2D method proved to be much more precise. The results show that the DCR simulation methods are useful for optimizing the compromise between fill-factor and p-/n-doping profile design in SPAD devices. For the experimentally investigated PureB SPADs, excellent agreement of the measured and simulated DCR was achieved. This shows that although an implicit GR is attractively compact, the very shallow pn-junction gives a risk of having such a low breakdown voltage at the perimeter that the DCR of the device may be negatively impacted.

  1. A Bridge from Optical to Infrared Galaxies: Explaining Local Properties and Predicting Galaxy Counts and the Cosmic Background Radiation

    NASA Astrophysics Data System (ADS)

    Totani, Tomonori; Takeuchi, Tsutomu T.

    2002-05-01

    We give an explanation for the origin of various properties observed in local infrared galaxies and make predictions for galaxy counts and cosmic background radiation (CBR) using a new model extended from that for optical/near-infrared galaxies. Important new characteristics of this study are that (1) mass scale dependence of dust extinction is introduced based on the size-luminosity relation of optical galaxies and that (2) the large-grain dust temperature Tdust is calculated based on a physical consideration for energy balance rather than by using the empirical relation between Tdust and total infrared luminosity LIR found in local galaxies, which has been employed in most previous works. Consequently, the local properties of infrared galaxies, i.e., optical/infrared luminosity ratios, LIR-Tdust correlation, and infrared luminosity function are outputs predicted by the model, while these have been inputs in a number of previous models. Our model indeed reproduces these local properties reasonably well. Then we make predictions for faint infrared counts (in 15, 60, 90, 170, 450, and 850 μm) and CBR using this model. We found results considerably different from those of most previous works based on the empirical LIR-Tdust relation; especially, it is shown that the dust temperature of starbursting primordial elliptical galaxies is expected to be very high (40-80 K), as often seen in starburst galaxies or ultraluminous infrared galaxies in the local and high-z universe. This indicates that intense starbursts of forming elliptical galaxies should have occurred at z~2-3, in contrast to the previous results that significant starbursts beyond z~1 tend to overproduce the far-infrared (FIR) CBR detected by COBE/FIRAS. On the other hand, our model predicts that the mid-infrared (MIR) flux from warm/nonequilibrium dust is relatively weak in such galaxies making FIR CBR, and this effect reconciles the prima facie conflict between the upper limit on MIR CBR from TeV gamma

  2. Retrospective determination of the contamination in the HML's counting chambers.

    PubMed

    Kramer, Gary H; Hauck, Barry; Capello, Kevin; Phan, Quoc

    2008-09-01

The original documentation surrounding the purchase of the Human Monitoring Laboratory's (HML) counting chambers clearly showed that the steel contained low levels of radioactivity, presumably as a result of A-bomb fallout or perhaps of the inadvertent mixing of radioactive sources with scrap steel. Monte Carlo simulations have been combined with experimental measurements to estimate the level of contamination in the steel of the HML's whole body counting chamber. A 24-h empty chamber background count showed the presence of 137Cs and 60Co. The estimated activity of 137Cs in the 51 tons of steel was 2.7 kBq in 2007 (51.3 microBq g(-1) steel), which would have been 8 kBq at the time of manufacture. The 60Co found in the background spectrum is postulated to be contained in the bed-frame. The estimated amount in 2007 was 5 Bq, and its origin is likely contaminated scrap metal that entered the steel production cycle sometime in the past. The estimated activities are 10 to 25 times higher than the estimated minimum detectable activity for this measurement. These amounts have no impact on the usefulness of the whole body counter.
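The back-correction from the 2007 measurement to the activity at manufacture follows from simple radioactive decay. A minimal sketch, assuming a ~47-year interval between manufacture and measurement (the abstract gives only the two activities, not the exact date):

```python
import math

T_HALF_CS137 = 30.05           # 137Cs half-life in years
measured_kBq = 2.7             # activity measured in 2007 (from the abstract)
years_since_manufacture = 47   # assumed interval, consistent with ~8 kBq at manufacture

# Decay-correct the measured activity back to the time of manufacture:
# A0 = A * 2^(t / T_half)
A0 = measured_kBq * 2 ** (years_since_manufacture / T_HALF_CS137)
```

With these numbers A0 comes out near 8 kBq, matching the abstract's stated value at the time of manufacture.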

  3. People counting in classroom based on video surveillance

    NASA Astrophysics Data System (ADS)

    Zhang, Quanbin; Huang, Xiang; Su, Juan

    2014-11-01

Currently, the switches for lights and other electronic devices in classrooms mainly rely on manual control; as a result, many lights stay on while no one, or only a few people, are in the classroom. It is important to change this situation and control the electronic devices intelligently, according to the number and distribution of students in the classroom, so as to reduce the considerable waste of electricity. This paper studies the problem of people counting in classrooms based on video surveillance. Because the camera in the classroom cannot capture the full shape contours of bodies or clear facial features, most classical algorithms, such as pedestrian detection based on HOG (histograms of oriented gradient) features and face detection based on machine learning, fail to obtain satisfactory results. A new dual background updating model based on sparse and low-rank matrix decomposition is proposed in this paper, exploiting the fact that most students in a classroom are nearly stationary, with only occasional body movement. Firstly, the frame difference is combined with the sparse and low-rank matrix decomposition to predict the moving areas, and the background model is updated with different parameters according to the positional relationship between the pixels of the current video frame and the predicted motion regions. Secondly, the regions of moving objects are determined from the updated background using the background subtraction method. Finally, operations including binarization, median filtering, morphology processing, and connected component detection are performed on the regions acquired by background subtraction, in order to reduce the effects of noise and obtain the number of people in the classroom. The experimental results show the validity of the proposed people counting algorithm.
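The dual-rate background update described above can be sketched in a few lines. This is a minimal NumPy illustration of the idea of using different update parameters for motion and non-motion pixels, not the paper's sparse/low-rank solver; the thresholds and learning rates are assumptions:

```python
import numpy as np

def update_background(background, frame, prev_frame, diff_thresh=25.0,
                      alpha_static=0.05, alpha_moving=0.001):
    """Running-average background with two learning rates: pixels flagged
    as moving (large frame difference) are absorbed into the background
    very slowly, so briefly moving people are not learned as background."""
    motion = np.abs(frame - prev_frame) > diff_thresh
    alpha = np.where(motion, alpha_moving, alpha_static)
    return (1 - alpha) * background + alpha * frame

def foreground_mask(background, frame, fg_thresh=30.0):
    """Background subtraction: pixels that differ strongly from the model."""
    return np.abs(frame - background) > fg_thresh

# Tiny synthetic example: one bright "person" pixel appears in an empty scene.
bg = np.zeros((4, 4))
prev = np.zeros((4, 4))
frame = np.zeros((4, 4))
frame[1, 1] = 100.0

bg = update_background(bg, frame, prev)
mask = foreground_mask(bg, frame)
```

In the example, the moving pixel is updated with the slow rate (so the background there barely changes) and is then flagged as foreground by the subtraction step.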

  4. Spatial variability in the pollen count in Sydney, Australia: can one sampling site accurately reflect the pollen count for a region?

    PubMed

    Katelaris, Constance H; Burke, Therese V; Byth, Karen

    2004-08-01

    There is increasing interest in the daily pollen count, with pollen-sensitive individuals using it to determine medication use and researchers relying on it for commencing clinical drug trials and assessing drug efficacy according to allergen exposure. Counts are often expressed qualitatively as low, medium, and high, and often only 1 pollen trap is used for an entire region. To examine the spatial variability in the pollen count in Sydney, Australia, and to compare discrepancies among low-, medium-, and high-count days at 3 sites separated by a maximum of 30 km. Three sites in western Sydney were sampled using Burkard traps. Data from the 3 sites were used to compare vegetation differences, possible effects of some meteorological parameters, and discrepancies among sites in low-, medium-, and high-count days. Total pollen counts during the spring months were 14,382 grains/m3 at Homebush, 11,584 grains/m3 at Eastern Creek, and 9,269 grains/m3 at Nepean. The only significant correlation between differences in meteorological parameters and differences in pollen counts was the Homebush-Nepean differences in rainfall and pollen counts. Comparison between low- and high-count days among the 3 sites revealed a discordance rate of 8% to 17%. For informing the public about pollen counts, the count from 1 trap is a reasonable estimation in a 30-km region; however, the discrepancies among 3 trap sites would have a significant impact on the performance of a clinical trial where enrollment was determined by a low or high count. Therefore, for clinical studies, data collection must be local and applicable to the study population.
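The discordance rate between two sites, as used in the study above, is just the fraction of days on which the sites fall into different qualitative categories. A minimal sketch; the low/high cutoffs here are illustrative assumptions, not the study's actual thresholds:

```python
import numpy as np

def classify(counts, low=30, high=100):
    """Bin daily pollen counts (grains/m^3) into 0=low, 1=medium, 2=high.
    The cutoffs are illustrative placeholders."""
    return np.digitize(np.asarray(counts), [low, high])

def discordance_rate(site_a, site_b):
    """Fraction of days on which two sites fall in different categories."""
    return float(np.mean(classify(site_a) != classify(site_b)))

# Four illustrative days at two hypothetical trap sites.
rate = discordance_rate([10, 50, 150, 200], [10, 120, 150, 20])
```

Here the sites disagree on two of four days, giving a discordance rate of 0.5; the study reports 8% to 17% between real Sydney sites.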

  5. Growth Curve Models for Zero-Inflated Count Data: An Application to Smoking Behavior

    ERIC Educational Resources Information Center

    Liu, Hui; Powers, Daniel A.

    2007-01-01

    This article applies growth curve models to longitudinal count data characterized by an excess of zero counts. We discuss a zero-inflated Poisson regression model for longitudinal data in which the impact of covariates on the initial counts and the rate of change in counts over time is the focus of inference. Basic growth curve models using a…
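The zero-inflated Poisson model underlying the article mixes a point mass at zero (the "structural" zeros, e.g. non-smokers) with an ordinary Poisson count. A minimal sketch of its probability mass function, with illustrative parameter values:

```python
from math import exp, factorial

def zip_pmf(k, lam, pi):
    """Zero-inflated Poisson: with probability pi the count is a structural
    zero; otherwise it is drawn from Poisson(lam).
    P(0) = pi + (1 - pi) e^{-lam};  P(k) = (1 - pi) lam^k e^{-lam} / k!"""
    poisson = lam ** k * exp(-lam) / factorial(k)
    return pi * (k == 0) + (1 - pi) * poisson

# Illustrative parameters: 30% structural zeros, Poisson mean 2.
p0 = zip_pmf(0, 2.0, 0.3)
```

The inflation term only affects k = 0, which is what lets the model fit count data with more zeros than a plain Poisson would predict.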

  6. The effect of blood cell count on coronary flow in patients with coronary slow flow phenomenon

    PubMed Central

    Soylu, Korhan; Gulel, Okan; Yucel, Huriye; Yuksel, Serkan; Aksan, Gokhan; Soylu, Ayşegül İdil; Demircan, Sabri; Yılmaz, Özcan; Sahin, Mahmut

    2014-01-01

    Background and Objective: The coronary slow flow phenomenon (CSFP) is a coronary artery disease with a benign course, but its pathological mechanisms are not yet fully understood. The purpose of this controlled study was to investigate the cellular content of blood in patients diagnosed with CSFP and its relationship with coronary flow rates. Methods: Selective coronary angiographies of 3368 patients were analyzed to assess Thrombolysis in Myocardial Infarction (TIMI) frame count (TFC) values. Seventy-eight of them had CSFP, and their demographic and laboratory findings were compared with 61 patients with normal coronary flow. Results: Patients’ demographic characteristics were similar in both groups. Mean corrected TFC (cTFC) values were significantly elevated in CSFP patients (p<0.001). Furthermore, hematocrit and hemoglobin values, and eosinophil and basophil counts of the CSFP patients were significantly elevated compared to the values obtained in the control group (p=0.005, p=0.047, p=0.001 and p=0.002, respectively). The increase observed in hematocrit and eosinophil levels showed significant correlations with increased TFC values (r=0.288 and r=0.217, respectively). Conclusion: Significant changes have been observed in the cellular composition of blood in patients diagnosed with CSFP as compared to the patients with normal coronary blood flow. The increases in hematocrit levels and in the eosinophil and basophil counts may have direct or indirect effects on the rate of coronary blood flow. PMID:25225502

  7. Background considerations in the analysis of PIXE spectra by Artificial Neural Systems.

    NASA Astrophysics Data System (ADS)

    Correa, R.; Morales, J. R.; Requena, I.; Miranda, J.; Barrera, V. A.

    2016-05-01

    To study the importance of background in PIXE spectra when determining elemental concentrations in atmospheric aerosols with artificial neural systems (ANS), two independently trained ANS were constructed: one taking as input the net number of counts in the peak, and another including the background. Thirty-eight spectra of aerosols collected in Santiago, Chile, were used in the training and validation phases. In both cases the elemental concentration values were similar, owing to the intrinsic characteristic of ANS operating with normalized values of the net and total number of counts under the peaks; this was verified in the analysis of 172 spectra obtained from aerosols collected in Mexico City. Therefore, networks operating in the mode that includes background can reduce time and cost when dealing with a large number of samples.

  8. Improving the limits of detection of low background alpha emission measurements

    NASA Astrophysics Data System (ADS)

    McNally, Brendan D.; Coleman, Stuart; Harris, Jack T.; Warburton, William K.

    2018-01-01

    Alpha particle emission - even at extremely low levels - is a significant issue in the search for rare events (e.g., double beta decay, dark matter detection). Traditional measurement techniques require long counting times to measure low sample rates in the presence of much larger instrumental backgrounds. To address this, a commercially available instrument developed by XIA uses pulse shape analysis to discriminate alpha emissions produced by the sample from those produced by other surfaces of the instrument itself. Experience with this system has uncovered two residual sources of background: cosmogenics and radon emanation from internal components. An R&D program is underway to enhance the system and extend the pulse shape analysis technique further, so that these residual sources can be identified and rejected as well. In this paper, we review the theory of operation and pulse shape analysis techniques used in XIA's alpha counter, and briefly explore data suggesting the origin of the residual background terms. We will then present our approach to enhance the system's ability to identify and reject these terms. Finally, we will describe a prototype system that incorporates our concepts and demonstrates their feasibility.

  9. Photon-Counting Kinetic Inductance Detectors for the Origins Space Telescope

    NASA Astrophysics Data System (ADS)

    Noroozian, Omid

    We propose to develop photon-counting Kinetic Inductance Detectors (KIDs) for the Origins Space Telescope (OST) and any predecessor missions, with the goal of producing background-limited photon-counting sensitivity, and with a preliminary technology demonstration in time to inform the Decadal Survey planning process. The OST, a mid- to far-infrared observatory concept, is being developed as a major NASA mission to be considered by the next Decadal Survey with support from NASA Headquarters. The objective of such a facility is to allow rapid spectroscopic surveys of the high redshift universe at 420-800 μm, using arrays of integrated spectrometers with moderate resolutions (R = λ/Δλ ~ 1000), to create a powerful new data set for exploring galaxy evolution and the growth of structure in the Universe. A second objective of OST is to perform higher resolution (R ~ 10,000-100,000) spectroscopic surveys at 20-300 μm, a uniquely powerful tool for exploring the evolution of protoplanetary disks into fledgling solar systems. Finally, the OST aims to obtain sensitive mid-infrared (5-40 μm) spectroscopy of thermal emission from rocky planets in the habitable zone using the transit method. These OST science objectives are very exciting and represent a well-organized community agreement. However, they are all impossible to reach without new detector technology, and the OST can’t be recommended or approved if suitable detectors do not exist. In all of the above instrument concepts, photon-counting direct detectors are mission-enabling and essential for reaching the sensitivity permitted by the cryogenic Origins Space Telescope and the performance required for its important science programs. Our group has developed an innovative design for an optically-coupled KID that can reach the photon-counting sensitivity required by the ambitious science goals of the OST mission. A KID is a planar microwave resonator patterned from a superconducting thin film, which

  10. The ALMA Spectroscopic Survey in the Hubble Ultra Deep Field: Continuum Number Counts, Resolved 1.2 mm Extragalactic Background, and Properties of the Faintest Dusty Star-forming Galaxies

    NASA Astrophysics Data System (ADS)

    Aravena, M.; Decarli, R.; Walter, F.; Da Cunha, E.; Bauer, F. E.; Carilli, C. L.; Daddi, E.; Elbaz, D.; Ivison, R. J.; Riechers, D. A.; Smail, I.; Swinbank, A. M.; Weiss, A.; Anguita, T.; Assef, R. J.; Bell, E.; Bertoldi, F.; Bacon, R.; Bouwens, R.; Cortes, P.; Cox, P.; Gónzalez-López, J.; Hodge, J.; Ibar, E.; Inami, H.; Infante, L.; Karim, A.; Le Fèvre, O.; Magnelli, B.; Ota, K.; Popping, G.; Sheth, K.; van der Werf, P.; Wagg, J.

    2016-12-01

    We present an analysis of a deep (1σ = 13 μJy) cosmological 1.2 mm continuum map based on ASPECS, the ALMA Spectroscopic Survey in the Hubble Ultra Deep Field. In the 1 arcmin2 covered by ASPECS we detect nine sources at >3.5σ significance at 1.2 mm. Our ALMA-selected sample has a median redshift of z = 1.6 ± 0.4, with only one galaxy detected at z > 2 within the survey area. This value is significantly lower than that found in millimeter samples selected at a higher flux density cutoff and similar frequencies. Most galaxies have specific star formation rates (SFRs) similar to that of main-sequence galaxies at the same epoch, and we find median values of stellar mass and SFR of 4.0 × 10^10 M⊙ and ~40 M⊙ yr^-1, respectively. Using the dust emission as a tracer for the interstellar medium (ISM) mass, we derive depletion times that are typically longer than 300 Myr, and we find molecular gas fractions ranging from ~0.1 to 1.0. As noted by previous studies, these values are lower than those using CO-based ISM estimates by a factor of ~2. The 1 mm number counts (corrected for fidelity and completeness) are in agreement with previous studies that were typically restricted to brighter sources. With our individual detections only, we recover 55% ± 4% of the extragalactic background light (EBL) at 1.2 mm measured by the Planck satellite, and we recover 80% ± 7% of this EBL if we include the bright end of the number counts and additional detections from stacking. The stacked contribution is dominated by galaxies at z ~ 1-2, with stellar masses of (1-3) × 10^10 M⊙. For the first time, we are able to characterize the population of galaxies that dominate the EBL at 1.2 mm.

  11. Research on channel characteristics of differential multi pulse position modulation without background noise

    NASA Astrophysics Data System (ADS)

    Gao, Zhuo; Zhan, Weida; Sun, Quan; Hao, Ziqiang

    2018-04-01

    Differential multi-pulse position modulation (DMPPM) is a new modulation technique offering fast transmission rates, high bandwidth utilization, and a high modulation rate. The study of DMPPM has important scientific value and practical significance. Channel capacity is one of the important indexes for measuring the communication capability of a system, and studying the channel capacity of DMPPM without background noise is the key to analyzing its characteristics. A theoretical model of DMPPM is established, the symbol structure of DMPPM with a guard time slot is analyzed, and the channel capacity expression of DMPPM is derived. Simulations were performed in MATLAB. The curves of unit channel capacity and capacity efficiency at different pulse counts and photon counting rates are analyzed. The results show that DMPPM is more advantageous than multi-pulse position modulation (MPPM) and is better suited to future wireless optical communication systems.
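For intuition about how photon counting rate limits PPM-family capacity, a common idealization treats noiseless photon-counting M-ary PPM as an erasure channel: the symbol is decoded if at least one signal photon is counted, and erased otherwise. This is a standard textbook model chosen here for illustration, not the paper's DMPPM derivation:

```python
from math import exp, log2

def ppm_capacity_per_symbol(M, Ks):
    """Capacity (bits/symbol) of noiseless M-ary PPM under an erasure-channel
    model: detection probability 1 - e^{-Ks} for mean signal count Ks,
    so C = (1 - e^{-Ks}) * log2(M)."""
    return (1 - exp(-Ks)) * log2(M)

# Capacity grows with the photon counting rate and saturates at log2(M).
curve = [ppm_capacity_per_symbol(16, ks) for ks in (0.0, 1.0, 3.0, 50.0)]
```

The saturation at log2(M) bits per symbol mirrors the abstract's point that capacity curves must be examined as a function of photon counting rate.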

  12. Choral Counting

    ERIC Educational Resources Information Center

    Turrou, Angela Chan; Franke, Megan L.; Johnson, Nicholas

    2017-01-01

    The students in Ms. Moscoso's second-grade class gather on the rug after recess, ready for one of their favorite math warm-ups: Choral Counting. Counting is an important part of doing mathematics throughout the school; students count collections (Schwerdtfeger and Chan 2007) and solve problems using a variety of strategies, many of which are…

  13. A technology review of time-of-flight photon counting for advanced remote sensing

    NASA Astrophysics Data System (ADS)

    Lamb, Robert A.

    2010-04-01

    Time correlated single photon counting (TCSPC) has made tremendous progress during the past ten years, enabling improved performance in precision time-of-flight (TOF) rangefinding and lidar. In this review the development and performance of several ranging systems is presented that use TCSPC for accurate ranging and range profiling over distances up to 17 km. A range resolution of a few millimetres is routinely achieved over distances of several kilometres. These systems include single-wavelength devices operating in the visible; multi-wavelength systems covering the visible and near-infrared; the use of electronic gating to reduce in-band solar background; and, most recently, operation at high repetition rates without range aliasing, typically 10 MHz over several kilometres. These systems operate at very low optical power (<100 μW). The technique therefore has potential for eye-safe lidar monitoring of the environment and obvious military, security, and surveillance sensing applications. The review highlights the theoretical principles of photon counting and progress made in developing absolute ranging techniques that enable high repetition rate data acquisition while avoiding range aliasing. Technology trends in TCSPC rangefinding are merging with those of quantum cryptography, and their future application to revolutionary quantum imaging provides diverse and exciting research into secure covert sensing, ultra-low power active imaging, and quantum rangefinding.

  14. Spike Phase Locking in CA1 Pyramidal Neurons depends on Background Conductance and Firing Rate

    PubMed Central

    Broiche, Tilman; Malerba, Paola; Dorval, Alan D.; Borisyuk, Alla; Fernandez, Fernando R.; White, John A.

    2012-01-01

    Oscillatory activity in neuronal networks correlates with different behavioral states throughout the nervous system, and the frequency-response characteristics of individual neurons are believed to be critical for network oscillations. Recent in vivo studies suggest that neurons experience periods of high membrane conductance, and that action potentials are often driven by membrane-potential fluctuations in the living animal. To investigate the frequency-response characteristics of CA1 pyramidal neurons in the presence of high conductance and voltage fluctuations, we performed dynamic-clamp experiments in rat hippocampal brain slices. We drove neurons with noisy stimuli that included a sinusoidal component ranging, in different trials, from 0.1 to 500 Hz. In subsequent data analysis, we determined action potential phase-locking profiles with respect to background conductance, average firing rate, and frequency of the sinusoidal component. We found that background conductance and firing rate qualitatively change the phase-locking profiles of CA1 pyramidal neurons vs. frequency. In particular, higher average spiking rates promoted band-pass profiles, and the high-conductance state promoted phase-locking at frequencies well above what would be predicted from changes in the membrane time constant. Mechanistically, spike-rate adaptation and frequency resonance in the spike-generating mechanism are implicated in shaping the different phase-locking profiles. Our results demonstrate that CA1 pyramidal cells can actively change their synchronization properties in response to global changes in activity associated with different behavioral states. PMID:23055508
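Phase locking of spikes to a sinusoidal stimulus component, as analyzed above, is commonly summarized by the vector strength (the magnitude of the mean resultant of spike phases). The metric choice here is an illustrative assumption; the paper reports phase-locking profiles without specifying this exact statistic:

```python
import numpy as np

def vector_strength(spike_times, freq_hz):
    """Phase locking of spikes to a sinusoid of the given frequency:
    0 = no locking, 1 = all spikes at the same phase."""
    phases = 2 * np.pi * freq_hz * np.asarray(spike_times)
    return float(np.abs(np.mean(np.exp(1j * phases))))

# Spikes exactly once per cycle of an 8 Hz drive -> perfect locking.
vs_locked = vector_strength([0.0, 0.125, 0.25, 0.375], 8.0)
# Spikes spread uniformly over the cycle of a 10 Hz drive -> no locking.
vs_uniform = vector_strength([0.0, 0.025, 0.05, 0.075], 10.0)
```

Computing this statistic across stimulus frequencies, conductance levels, and firing rates would yield phase-locking profiles of the kind the study compares.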

  15. Sperm count as a surrogate endpoint for male fertility control.

    PubMed

    Benda, Norbert; Gerlinger, Christoph

    2007-11-30

    When assessing the effectiveness of a hormonal method of fertility control in men, the classical approach used for the assessment of hormonal contraceptives in women, by estimating the pregnancy rate or using a life-table analysis for the time to pregnancy, is difficult to apply in a clinical development program. The main reasons are the dissociation of the treated unit, i.e. the man, and the observed unit, i.e. his female partner, the high variability in the frequency of male intercourse, the logistical cost, and ethical concerns related to the monitoring of the trial. A reasonable surrogate endpoint for the definitive endpoint, time to pregnancy, is sperm count. In addition to avoiding the problems mentioned, trials that compare different treatments become possible with reasonable sample sizes, and study duration can be shorter. However, current products do not completely suppress sperm production in all men, and sperm count is observed only with measurement error. Complete azoospermia might not be necessary in order to achieve an acceptable failure rate compared with other forms of male fertility control. Therefore, the use of sperm count as a surrogate endpoint must rely on the results of a previous trial in which both the definitive- and surrogate-endpoint results were assessed. The paper discusses different estimation functions of the mean pregnancy rate (corresponding to the cumulative hazard) that are based on the results of the sperm count trial and of a previous trial in which both sperm count and time to pregnancy were assessed, as well as the underlying assumptions. Sample size estimations are given for pregnancy rate estimation with a given precision.

  16. Guidance on the Use of Hand-Held Survey Meters for Radiological Triage: Time-Dependent Detector Count Rates Corresponding to 50, 250, and 500 mSv Effective Dose for Adult Males and Adult Females

    PubMed Central

    Bolch, Wesley E.; Hurtado, Jorge L.; Lee, Choonsik; Manger, Ryan; Hertel, Nolan; Dickerson, William

    2013-01-01

    measurements be conducted in a low-background, and possibly mobile, facility positioned at the triage location. Net count rate data are provided in both tabular and graphical format within a series of eight handbooks available at the CDC website http://emergency.cdc.gov/radiation. PMID:22420020

  17. Guidance on the Use of Hand-Held Survey Meters for radiological Triage: Time-Dependent Detector Count Rates Corresponding to 50, 250, and 500 mSv Effective Dose for Adult Males and Adult Females

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bolch, W.E.; Hurtado, J.L.; Lee, C.

    2012-01-01

    measurements be conducted in a low background, and possibly mobile, facility positioned at the triage location. Net count rate data are provided in both tabular and graphical format within a series of eight handbooks available at the CDC website (http://www.bt.cdc.gov/radiation/clinicians/evaluation).

  18. HgCdTe APD-based linear-mode photon counting components and ladar receivers

    NASA Astrophysics Data System (ADS)

    Jack, Michael; Wehner, Justin; Edwards, John; Chapman, George; Hall, Donald N. B.; Jacobson, Shane M.

    2011-05-01

    Linear mode photon counting (LMPC) provides significant advantages over Geiger mode (GM) photon counting, including the absence of after-pulsing, nanosecond pulse-to-pulse temporal resolution, and robust operation in the presence of high-density obscurants or variable-reflectivity objects. For this reason Raytheon has developed and previously reported on unique linear mode photon counting components and modules based on combining advanced APDs with advanced high-gain circuits. Using HgCdTe APDs enables Poisson-number-preserving photon counting. Key metrics of photon counting technology are the dark count rate and the detection probability. In this paper we report on a performance breakthrough resulting from improvements in design, process, and readout operation, enabling a >10x reduction in dark count rate, to ~10,000 cps, and a >10^4x reduction in surface dark current, enabling long 10 ms integration times. Our analysis of the key dark current contributors suggests that a substantial further reduction in DCR, to ~1/s or less, can be achieved by optimizing wavelength, operating voltage, and temperature.

  19. Blade counting tool with a 3D borescope for turbine applications

    NASA Astrophysics Data System (ADS)

    Harding, Kevin G.; Gu, Jiajun; Tao, Li; Song, Guiju; Han, Jie

    2014-07-01

    Video borescopes are widely used for turbine and aviation engine inspection to guarantee the health of blades and prevent blade failure during running. When the moving components of a turbine engine are inspected with a video borescope, the operator must view every blade in a given stage. The blade counting tool is video interpretation software that runs simultaneously in the background during inspection. It identifies moving turbine blades in a video stream, tracks and counts the blades as they move across the screen. This approach includes blade detection to identify blades in different inspection scenarios and blade tracking to perceive blade movement even in hand-turning engine inspections. The software is able to label each blade by comparing counting results to a known blade count for the engine type and stage. On-screen indications show the borescope user labels for each blade and how many blades have been viewed as the turbine is rotated.

  20. Maryland Kids Count Factbook, 2001.

    ERIC Educational Resources Information Center

    Advocates for Children and Youth, Baltimore, MD.

    This 7th annual Kids Count Factbook provides information on trends in the well-being of children in Maryland and its 24 jurisdictions. The statistical portrait is based on 18 indicators of well-being: (1) low birth-weight infants; (2) infant mortality; (3) early prenatal care; (4) binge drinking; (5) child deaths; (6) child injury rate; (7) grade…

  1. Maryland's Kids Count Factbook 1996.

    ERIC Educational Resources Information Center

    Advocates for Children and Youth, Baltimore, MD.

    This Kids Count report details statewide trends in the well-being of Maryland's children. The statistical portrait is based on 14 indicators of child well being: (1) child poverty; (2) child support; (3) births to teens; (4) low birthweight infants; (5) infant mortality; (6) lead screening; (7) child abuse and neglect; (8) child death rate; (9)…

  2. Kids Count in Delaware, Families Count in Delaware: Fact Book, 2003.

    ERIC Educational Resources Information Center

    Delaware Univ., Newark. Kids Count in Delaware.

    This Kids Count Fact Book is combined with the Families Count Fact Book to provide information on statewide trends affecting children and families in Delaware. The Kids Count and Families Count indicators have been combined into four new categories: health and health behaviors, educational involvement and achievement, family environment and…

  3. High-Rate Data-Capture for an Airborne Lidar System

    NASA Technical Reports Server (NTRS)

    Valett, Susan; Hicks, Edward; Dabney, Philip; Harding, David

    2012-01-01

    A high-rate data system was required to capture the data for an airborne lidar system. A data system was developed that achieved up to 22 million (64-bit) events per second sustained data rate (1408 million bits per second), as well as short bursts (less than 4 s) at higher rates. All hardware used for the system was off the shelf, but carefully selected to achieve these rates. The system was used to capture laser fire, single-photon detection, and GPS data for the Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL). However, the system has applications for other laser altimeter systems (waveform-recording), mass spectroscopy, x-ray radiometry imaging, high-background-rate ranging lidar, and other similar areas where very high-speed data capture is needed. The data capture software was used for the SIMPL instrument, which employs a micropulse, single-photon ranging measurement approach and has 16 data channels. The detected single photons are from two sources: those reflected from the target and solar background photons. The instrument is non-gated, so background photons are acquired for a range window of 13 km and can comprise many times the number of target photons. The highest background rate occurs when the atmosphere is clear, the Sun is high, and the target is a highly reflective surface such as snow. Under these conditions, the total data rate for the 16 channels combined is expected to be approximately 22 million events per second. For each photon detection event, the data capture software reads the relative time of receipt, with respect to a one-per-second absolute time pulse from a GPS receiver, from an event timer card with 0.1-ns precision, and records that information to a RAID (Redundant Array of Independent Disks) storage device. The relative time of laser pulse firings must also be read and recorded with the same precision. Each of the four event timer cards handles the throughput from four of the channels.
For each detection event, a flag is
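The quoted throughput figures are internally consistent, as a quick arithmetic check shows:

```python
# Sanity check of the stated data rate: 22 million 64-bit timestamp
# events per second across the 16 combined channels.
events_per_s = 22_000_000
bits_per_event = 64
rate_mbit_s = events_per_s * bits_per_event / 1e6  # megabits per second
```

This reproduces the 1408 million bits per second stated in the abstract.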

  4. Optimal measurement counting time and statistics in gamma spectrometry analysis: The time balance

    NASA Astrophysics Data System (ADS)

    Joel, Guembou Shouop Cebastien; Penabei, Samafou; Maurice, Ndontchueng Moyo; Gregoire, Chene; Jilbert, Nguelem Mekontso Eric; Didier, Takoukam Serge; Werner, Volker; David, Strivay

    2017-01-01

    The optimal measurement counting time for gamma-ray spectrometry analysis using HPGe detectors was determined in our laboratory by comparing a twelve-hour counting measurement made during the day with a twelve-hour counting measurement made at night. For the same sample, the day spectrum does not fully match the night spectrum; the perturbation is attributed to sunlight. After several investigations the implication became clear: to remove the effects of radiation from outside the system (from the earth, the sun, and the cosmos), the background should be measured for 24, 48 or 72 hours. Likewise, samples should be measured for 24, 48 or 72 hours so that day and night contributions balance and the measurement is effectively purified. A background acquired in winter should also not be used in summer. Depending on the energy of the radionuclide sought, it is clear that the most important steps of a gamma spectrometry measurement are the preparation of the sample and the calibration of the detector.
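The "time balance" of the title is commonly formalized as the optimal split of a fixed total counting time between the gross (sample) and background measurements; a minimal sketch of that standard textbook result (not code from the paper) is:

```python
import math

def optimal_time_split(gross_rate, background_rate, total_time):
    """Split a fixed total counting time between sample and background so
    the variance of the net rate (gross - background) is minimized.
    Standard result: t_gross / t_bkg = sqrt(R_gross / R_bkg)."""
    ratio = math.sqrt(gross_rate / background_rate)
    t_bkg = total_time / (1.0 + ratio)
    t_gross = total_time - t_bkg
    return t_gross, t_bkg

# Example: 9 cps gross, 1 cps background, 24 h total -> a 3:1 time split
t_g, t_b = optimal_time_split(9.0, 1.0, 24.0)
print(t_g, t_b)  # prints 18.0 6.0
```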

  5. Monte Carlo Simulations of Background Spectra in Integral Imager Detectors

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.; Colborn, B. L.; Dietz, K. L.; Ramsey, B. D.; Weisskopf, M. C.

    1998-01-01

    Predictions of the expected gamma-ray backgrounds in the ISGRI (CdTe) and PiCsIT (Csl) detectors on INTEGRAL due to cosmic-ray interactions and the diffuse gamma-ray background have been made using a coupled set of Monte Carlo radiation transport codes (HETC, FLUKA, EGS4, and MORSE) and a detailed, 3-D mass model of the spacecraft and detector assemblies. The simulations include both the prompt background component from induced hadronic and electromagnetic cascades and the delayed component due to emissions from induced radioactivity. Background spectra have been obtained with and without the use of active (BGO) shielding and charged particle rejection to evaluate the effectiveness of anticoincidence counting on background rejection.

  6. Vehicle counting system using real-time video processing

    NASA Astrophysics Data System (ADS)

    Crisóstomo-Romero, Pedro M.

    2006-02-01

    Transit studies are important for planning a road network with optimal vehicular flow. A vehicular count is essential. This article presents a vehicle counting system based on video processing. An advantage of such system is the greater detail than is possible to obtain, like shape, size and speed of vehicles. The system uses a video camera placed above the street to image transit in real-time. The video camera must be placed at least 6 meters above the street level to achieve proper acquisition quality. Fast image processing algorithms and small image dimensions are used to allow real-time processing. Digital filters, mathematical morphology, segmentation and other techniques allow identifying and counting all vehicles in the image sequences. The system was implemented under Linux in a 1.8 GHz Pentium 4 computer. A successful count was obtained with frame rates of 15 frames per second for images of size 240x180 pixels and 24 frames per second for images of size 180x120 pixels, thus being able to count vehicles whose speeds do not exceed 150 km/h.
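A toy version of the pipeline's core steps — background subtraction, thresholding, and counting connected foreground regions — can be sketched in pure Python (the real system applies digital filters and morphology to video frames; the `count_blobs` helper and the 4-connectivity choice here are illustrative assumptions):

```python
def count_blobs(frame, background, threshold):
    """Subtract a background frame, threshold the absolute difference,
    then count connected foreground regions via 4-connectivity flood fill."""
    h, w = len(frame), len(frame[0])
    fg = [[abs(frame[y][x] - background[y][x]) > threshold for x in range(w)]
          for y in range(h)]
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if fg[y][x] and not seen[y][x]:
                count += 1                      # new blob found
                stack = [(y, x)]
                while stack:                    # flood-fill the whole blob
                    cy, cx = stack.pop()
                    if 0 <= cy < h and 0 <= cx < w and fg[cy][cx] and not seen[cy][cx]:
                        seen[cy][cx] = True
                        stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return count

# Two bright "vehicles" on a dark road (synthetic 4x6 grayscale frame)
bg  = [[0] * 6 for _ in range(4)]
img = [[0] * 6 for _ in range(4)]
img[0][0] = img[0][1] = 200          # vehicle 1
img[2][4] = img[3][4] = 180          # vehicle 2
print(count_blobs(img, bg, 50))      # prints 2
```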

  7. Negative Avalanche Feedback Detectors for Photon-Counting Optical Communications

    NASA Technical Reports Server (NTRS)

    Farr, William H.

    2009-01-01

    Negative Avalanche Feedback photon counting detectors with near-infrared spectral sensitivity offer an alternative to conventional Geiger mode avalanche photodiode or phototube detectors for free space communications links at 1 and 1.55 microns. These devices demonstrate linear mode photon counting without requiring any external reset circuitry and may even be operated at room temperature. We have now characterized the detection efficiency, dark count rate, after-pulsing, and single photon jitter for three variants of this new detector class, as well as operated these uniquely simple to use devices in actual photon starved free space optical communications links.

  8. Montana Kids Count Data Book and County Profiles, 1994.

    ERIC Educational Resources Information Center

    Healthy Mothers, Healthy Babies--The Montana Coalition, Helena.

    This Kids Count publication is the first to examine statewide trends in the well-being of Montana's children. The statistical portrait is based on 13 indicators of well-being: (1) low birthweight rate; (2) infant mortality rate; (3) child death rate; (4) teen violent death rate; (5) percent of public school enrollment in Chapter 1 programs; (6)…

  9. The Hawaii SCUBA-2 Lensing Cluster Survey: Number Counts and Submillimeter Flux Ratios

    NASA Astrophysics Data System (ADS)

    Hsu, Li-Yen; Cowie, Lennox L.; Chen, Chian-Chou; Barger, Amy J.; Wang, Wei-Hao

    2016-09-01

    We present deep number counts at 450 and 850 μm using the SCUBA-2 camera on the James Clerk Maxwell Telescope. We combine data for six lensing cluster fields and three blank fields to measure the counts over a wide flux range at each wavelength. Thanks to the lensing magnification, our measurements extend to fluxes fainter than 1 mJy and 0.2 mJy at 450 μm and 850 μm, respectively. Our combined data highly constrain the faint end of the number counts. Integrating our counts shows that the majority of the extragalactic background light (EBL) at each wavelength is contributed by faint sources with L_IR < 10^12 L_⊙, corresponding to luminous infrared galaxies (LIRGs) or normal galaxies. By comparing our result with the 500 μm stacking of K-selected sources from the literature, we conclude that the K-selected LIRGs and normal galaxies still cannot fully account for the EBL that originates from sources with L_IR < 10^12 L_⊙. This suggests that many faint submillimeter galaxies may not be included in the UV star formation history. We also explore the submillimeter flux ratio between the two bands for our 450 μm and 850 μm selected sources. At 850 μm, we find a clear relation between the flux ratio and the observed flux. This relation can be explained by a redshift evolution, where galaxies at higher redshifts have higher luminosities and star formation rates. In contrast, at 450 μm, we do not see a clear relation between the flux ratio and the observed flux.
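Integrating number counts to estimate a background-light contribution can be illustrated with an assumed power-law form for the differential counts (the paper integrates its measured counts; the functional form and parameter values below are hypothetical):

```python
def ebl_from_counts(n0, alpha, s_min, s_max, steps=100_000):
    """Background-light contribution from sources with fluxes in
    [s_min, s_max], for assumed power-law differential counts
    dN/dS = n0 * S**(-alpha), alpha != 2.
    Midpoint-rule numerical integral of S * dN/dS."""
    ds = (s_max - s_min) / steps
    total = 0.0
    for i in range(steps):
        s = s_min + (i + 0.5) * ds
        total += s * n0 * s ** (-alpha) * ds
    return total

# Check the numerical integral against the closed form
# n0 * (s_max**(2-alpha) - s_min**(2-alpha)) / (2 - alpha):
num = ebl_from_counts(1.0, 2.5, 0.2, 10.0)
ana = (10.0 ** -0.5 - 0.2 ** -0.5) / -0.5
print(abs(num - ana) < 1e-3)  # prints True
```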

  11. Avalanche photodiode photon counting receivers for space-borne lidars

    NASA Technical Reports Server (NTRS)

    Sun, Xiaoli; Davidson, Frederic M.

    1991-01-01

    Avalanche photodiodes (APDs) are studied for use as photon counting detectors in spaceborne lidars. Non-breakdown APD photon counters, in which the APDs are biased below the breakdown point, are shown to outperform (1) conventional APD photon counters biased above the breakdown point and (2) APDs in analog mode when the received optical signal is extremely weak. Non-breakdown APD photon counters were shown experimentally to achieve an effective photon counting quantum efficiency of 5.0 percent at lambda = 820 nm with a dead time of 15 ns and a dark count rate of 7000/s, which agreed with the theoretically predicted values. The interarrival times of the counts followed an exponential distribution and the counting statistics appeared to follow a Poisson distribution with no afterpulsing. It is predicted that the effective photon counting quantum efficiency can be improved to 18.7 percent at lambda = 820 nm and 1.46 percent at lambda = 1060 nm with a dead time of a few nanoseconds by using more advanced commercially available electronic components.
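The dead-time and dark-count figures quoted above can be folded into a rate correction; the sketch below assumes the common non-paralyzable dead-time model, which may differ from the authors' exact treatment:

```python
def true_rate(measured_cps, dead_time_s, dark_cps=0.0):
    """Recover the incident photon rate from a measured count rate,
    assuming a non-paralyzable dead time (a standard idealization) and
    a known dark-count rate that is subtracted afterwards."""
    corrected = measured_cps / (1.0 - measured_cps * dead_time_s)
    return corrected - dark_cps

# 1 Mcps measured with the abstract's 15 ns dead time and 7000/s dark counts
print(round(true_rate(1e6, 15e-9, 7000.0)))  # prints 1008228
```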

  12. The Money/Counting Kit. The Prospectus Series, Paper No. 6.

    ERIC Educational Resources Information Center

    Musumeci, Judith

    The Money/Counting Kit for Handicapped Children and Youth frees the teacher from lessons in money and counting concepts and enables a student to learn at his own rate with immediate feedback from activity cards, name cards, thermoformed coin cards (optional), and self-instructional booklets. The activity cards, which may be used individually or…

  13. Background mortality rates for recovering populations of Acropora cytherea in the Chagos Archipelago, central Indian Ocean.

    PubMed

    Pratchett, M S; Pisapia, C; Sheppard, C R C

    2013-05-01

    This study quantified background rates of mortality for Acropora cytherea in the Chagos Archipelago. Despite low levels of anthropogenic disturbance, 27.5% (149/541) of A. cytherea colonies exhibited some level of partial mortality, and 9.0% (49/541) of colonies had recent injuries. A total of 15.3% of the overall surface area of physically intact A. cytherea colonies was dead. Observed mortality was partly attributable to overtopping and/or self-shading among colonies. Low densities of Acanthaster planci were also apparent at some study sites. However, most of the recent mortality recorded was associated with isolated infestations of the coral crab, Cymo melanodactylus. A. cytherea is a relatively fast-growing coral and these levels of mortality may be biologically unimportant. However, few studies have measured background rates of coral mortality, especially in the absence of direct human disturbances. These data are important for assessing the impacts of increasing disturbances, especially in projecting likely recovery. Copyright © 2013. Published by Elsevier Ltd.

  14. On-demand generation of background-free single photons from a solid-state source

    NASA Astrophysics Data System (ADS)

    Schweickert, Lucas; Jöns, Klaus D.; Zeuner, Katharina D.; Covre da Silva, Saimon Filipe; Huang, Huiying; Lettner, Thomas; Reindl, Marcus; Zichi, Julien; Trotta, Rinaldo; Rastelli, Armando; Zwiller, Val

    2018-02-01

    True on-demand high-repetition-rate single-photon sources are highly sought after for quantum information processing applications. However, any coherently driven two-level quantum system suffers from a finite re-excitation probability under pulsed excitation, causing undesirable multi-photon emission. Here, we present a solid-state source of on-demand single photons yielding a raw second-order coherence of g^(2)(0) = (7.5 ± 1.6) × 10^-5 without any background subtraction or data processing. To this date, this is the lowest value of g^(2)(0) reported for any single-photon source, even compared to the previously reported best background-subtracted values. We achieve this result on GaAs/AlGaAs quantum dots embedded in a low-Q planar cavity by employing (i) a two-photon excitation process and (ii) a filtering and detection setup featuring two superconducting single-photon detectors with ultralow dark-count rates of (0.0056 ± 0.0007) s^-1 and (0.017 ± 0.001) s^-1, respectively. Re-excitation processes are dramatically suppressed by (i), while (ii) removes false coincidences, resulting in a negligibly low noise floor.

  15. Effects of Sampling Strategy, Detection Probability, and Independence of Counts on the Use of Point Counts

    Treesearch

    Grey W. Pendleton

    1995-01-01

    Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation...

  16. Multiplicity counting from fission detector signals with time delay effects

    NASA Astrophysics Data System (ADS)

    Nagy, L.; Pázsit, I.; Pál, L.

    2018-03-01

    In recent work, we have developed the theory of using the first three auto- and joint central moments of the currents of up to three fission chambers to extract the singles, doubles and triples count rates of traditional multiplicity counting (Pázsit and Pál, 2016; Pázsit et al., 2016). The objective is to elaborate a method for determining the fissile mass, neutron multiplication, and (α, n) neutron emission rate of an unknown assembly of fissile material from the statistics of the fission chamber signals, analogous to the traditional multiplicity counting methods with detectors in the pulse mode. Such a method would be an alternative to He-3 detector systems, which would be free from the dead time problems that would be encountered in high counting rate applications, for example the assay of spent nuclear fuel. A significant restriction of our previous work was that all neutrons born in a source event (spontaneous fission) were assumed to be detected simultaneously, which is not fulfilled in reality. In the present work, this restriction is eliminated, by assuming an independent, identically distributed random time delay for all neutrons arising from one source event. Expressions are derived for the same auto- and joint central moments of the detector current(s) as in the previous case, expressed with the singles, doubles, and triples (S, D and T) count rates. It is shown that if the time-dispersion of neutron detections is of the same order of magnitude as the detector pulse width, as they typically are in measurements of fast neutrons, the multiplicity rates can still be extracted from the moments of the detector current, although with more involved calibration factors. The presented formulae, and hence also the performance of the proposed method, are tested by both analytical models of the time delay as well as with numerical simulations. Methods are suggested also for the modification of the method for large time delay effects (for thermalised neutrons).
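The method builds on the first three auto- and joint central moments of the sampled chamber current; computing such moments from samples is straightforward (an illustrative helper only, not the paper's estimator, which works with S, D and T rates):

```python
def central_moments(samples):
    """First three central moments (mean, variance, third central moment)
    of a sampled detector current."""
    n = len(samples)
    mean = sum(samples) / n
    m2 = sum((x - mean) ** 2 for x in samples) / n
    m3 = sum((x - mean) ** 3 for x in samples) / n
    return mean, m2, m3

# Symmetric toy signal: zero third moment, so no skew
mean, m2, m3 = central_moments([1.0, 2.0, 3.0, 4.0])
print(mean, m2, m3)  # prints 2.5 1.25 0.0
```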

  17. Linear operating region in the ozone dial photon counting system

    NASA Technical Reports Server (NTRS)

    Andrawis, Madeleine

    1995-01-01

    Ozone is a relatively unstable molecule found in Earth's atmosphere. An ozone molecule is made up of three atoms of oxygen. Depending on where ozone resides, it can protect or harm life on Earth. High in the atmosphere, about 15 miles up, ozone acts as a shield to protect Earth's surface from the sun's harmful ultraviolet radiation. Without this shield, we would be more susceptible to skin cancer, cataracts, and impaired immune systems. Closer to Earth, in the air we breathe, ozone is a harmful pollutant that causes damage to lung tissue and plants. Since the early 1980's, airborne lidar systems have been used for making measurements of ozone. The differential absorption lidar (DIAL) technique is used in the remote measurement of O3. This system allows the O3 to be measured as a function of range in the atmosphere. Two frequency-doubled Nd:YAG lasers are used to pump tunable dye lasers; one dye laser operates at 289 nm, the DIAL on-line wavelength of O3, and the other at 300 nm, the off-line wavelength. The DIAL wavelengths are produced in sequential laser pulses with a time separation of 300 μs. The backscattered laser energy is collected by telescopes and measured using photon counting systems. The photon counting system measures the light signal by making use of the photon nature of light. The output pulse from the Photo-Multiplier Tube (PMT), caused by a photon striking the PMT photo-cathode, is amplified and passed to a pulse height discriminator. The peak value of the pulse is compared to a reference voltage (discrimination level). If the pulse amplitude exceeds the discrimination level, the discriminator generates a standard pulse which is counted by the digital counter. Non-linearity in the system is caused by the overlapping of pulses and the finite response time of the electronics. At low count rates one expects the system to register one event for each output pulse from the PMT corresponding to a photon incident upon the
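In the ideal low-rate case, the discriminator logic described above reduces to counting pulses whose peak exceeds the discrimination level; a minimal sketch (the peak values and threshold are hypothetical):

```python
def discriminate(pulse_peaks, level):
    """Single-level discriminator: emit one standard count for every PMT
    pulse whose peak amplitude exceeds the discrimination level."""
    return sum(1 for peak in pulse_peaks if peak > level)

# Peak amplitudes in mV against a 50 mV discrimination level
print(discriminate([12.0, 75.0, 51.0, 49.9, 120.0], 50.0))  # prints 3
```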

  18. Technology Counts 2007: A Digital Decade

    ERIC Educational Resources Information Center

    Education Week, 2007

    2007-01-01

    "Technology Counts 2007" looks back, and ahead, after a decade of enormous upheaval in the educational technology landscape. This special issue of "Education Week" includes the following articles: (1) A Digital Decade; (2) Getting Up to Speed (Andrew Trotter); (3) E-Rate's Imprint Seen in Schools (Andrew Trotter); (4) Teaching…

  19. The isotropic radio background revisited

    NASA Astrophysics Data System (ADS)

    Fornengo, Nicolao; Lineros, Roberto A.; Regis, Marco; Taoso, Marco

    2014-04-01

    We present an extensive analysis on the determination of the isotropic radio background. We consider six different radio maps, ranging from 22 MHz to 2.3 GHz and covering a large fraction of the sky. The large scale emission is modeled as a linear combination of an isotropic component plus the Galactic synchrotron radiation and thermal bremsstrahlung. Point-like and extended sources are either masked or accounted for by means of a template. We find a robust estimate of the isotropic radio background, with limited scatter among different Galactic models. The level of the isotropic background lies significantly above the contribution obtained by integrating the number counts of observed extragalactic sources. Since the isotropic component dominates at high latitudes, thus making the profile of the total emission flat, a Galactic origin for such excess appears unlikely. We conclude that, unless a systematic offset is present in the maps, and provided that our current understanding of the Galactic synchrotron emission is reasonable, extragalactic sources well below the current experimental threshold seem to account for the majority of the brightness of the extragalactic radio sky.

  20. Correlation of platelet count and acute ST-elevation in myocardial infarction.

    PubMed

    Paul, G K; Sen, B; Bari, M A; Rahman, Z; Jamal, F; Bari, M S; Sazidur, S R

    2010-07-01

    The role of platelets in the pathogenesis of ST-elevation myocardial infarction (STEMI) has been substantiated by studies that demonstrated significant clinical benefits associated with antiplatelet therapy. Initial platelet counts in Acute Myocardial Infarction (AMI) may be a useful adjunct for identifying those patients who may or may not respond to fibrinolytic agents. Patients with acute STEMI have variable platelet counts, and those with higher counts have poorer in-hospital outcomes. There are many predictors of poor outcome in AMI, such as cardiac biomarkers (Troponin I, Troponin T and CK-MB), C-Reactive Protein (CRP) and White Blood Cell (WBC) counts; platelet count on presentation of STEMI is one of them. A higher platelet count is associated with a higher rate of adverse clinical outcomes in STEMI, such as heart failure, arrhythmia, re-infarction and death. Categorization of patients with STEMI on the basis of platelet counts may therefore be helpful for risk stratification and management of these patients.

  1. AzTEC/ASTE 1.1 mm Deep Surveys: Number Counts and Clustering of Millimeter-bright Galaxies

    NASA Astrophysics Data System (ADS)

    Hatsukade, B.; Kohno, K.; Aretxaga, I.; Austermann, J. E.; Ezawa, H.; Hughes, D. H.; Ikarashi, S.; Iono, D.; Kawabe, R.; Matsuo, H.; Matsuura, S.; Nakanishi, K.; Oshima, T.; Perera, T.; Scott, K. S.; Shirahata, M.; Takeuchi, T. T.; Tamura, Y.; Tanaka, K.; Tosaki, T.; Wilson, G. W.; Yun, M. S.

    2010-10-01

    We present number counts and clustering properties of millimeter-bright galaxies uncovered by the AzTEC camera mounted on the Atacama Submillimeter Telescope Experiment (ASTE). We surveyed the AKARI Deep Field South (ADF-S), the Subaru/XMM-Newton Deep Field (SXDF), and the SSA22 fields, each with an area of ~0.25 deg2, at an rms noise level of ~0.4-1.0 mJy. We constructed differential and cumulative number counts, which provide currently the tightest constraints on the faint end. Integrating the best-fit number counts in the ADF-S, we find that the contribution of 1.1 mm sources with fluxes >=1 mJy to the cosmic infrared background (CIB) at 1.1 mm is 12-16%, suggesting that a large fraction of the CIB originates from faint sources whose number counts are not yet constrained. We estimate the cosmic star-formation rate density contributed by 1.1 mm sources with >=1 mJy using the best-fit number counts in the ADF-S and find that it is lower by about a factor of 5-10 compared to those derived from UV/optically-selected galaxies at z~2-3. The average mass of dark halos hosting bright 1.1 mm sources was calculated to be 10^13-10^14 Msolar. Comparison of correlation lengths of 1.1 mm sources with other populations and with a bias evolution model suggests that dark halos hosting bright 1.1 mm sources evolve into present-day cluster systems and that the 1.1 mm sources residing in these dark halos evolve into massive elliptical galaxies located in the centers of clusters.

  2. Assessing Rotation-Invariant Feature Classification for Automated Wildebeest Population Counts.

    PubMed

    Torney, Colin J; Dobson, Andrew P; Borner, Felix; Lloyd-Jones, David J; Moyer, David; Maliti, Honori T; Mwita, Machoke; Fredrick, Howard; Borner, Markus; Hopcraft, J Grant C

    2016-01-01

    Accurate and on-demand animal population counts are the holy grail for wildlife conservation organizations throughout the world because they enable fast and responsive adaptive management policies. While the collection of image data from camera traps, satellites, and manned or unmanned aircraft has advanced significantly, the detection and identification of animals within images remains a major bottleneck since counting is primarily conducted by dedicated enumerators or citizen scientists. Recent developments in the field of computer vision suggest a potential resolution to this issue through the use of rotation-invariant object descriptors combined with machine learning algorithms. Here we implement an algorithm to detect and count wildebeest from aerial images collected in the Serengeti National Park in 2009 as part of the biennial wildebeest count. We find that the per image error rates are greater than, but comparable to, two separate human counts. For the total count, the algorithm is more accurate than both manual counts, suggesting that human counters have a tendency to systematically over or under count images. While the accuracy of the algorithm is not yet at an acceptable level for fully automatic counts, our results show this method is a promising avenue for further research and we highlight specific areas where future research should focus in order to develop fast and accurate enumeration of aerial count data. If combined with a bespoke image collection protocol, this approach may yield a fully automated wildebeest count in the near future.

  3. Ageing & long-term CD4 cell count trends in HIV-positive patients with 5 years or more combination antiretroviral therapy experience

    PubMed Central

    WRIGHT, ST; PETOUMENOS, K; BOYD, M; CARR, A; DOWNING, S; O’CONNOR, CC; GROTOWSKI, M; LAW, MG

    2012-01-01

    Background The aim of this analysis is to describe the long-term changes in CD4 cell counts beyond 5 years of combination antiretroviral therapy (cART). If natural ageing leads to a long-term decline in the immune system via low-grade chronic immune activation/inflammation, then one might expect to see a greater or earlier decline in CD4 counts in older HIV-positive patients with increasing duration of cART. Methods Retrospective and prospective data were examined from long-term virologically stable HIV-positive adults from the Australian HIV Observational Database. We estimated mean CD4 cell count changes following the completion of 5 years of cART using linear mixed models. Results A total of 37,916 CD4 measurements were observed for 892 patients over a combined total of 9,753 patient-years. Older patients (>50 years) at cART initiation had estimated mean (95% confidence interval) changes in CD4 counts, by Year-5 CD4 count strata (<500, 501–750 and >750 cells/μL), of 14 (7 to 21), 3 (−5 to 11) and −6 (−17 to 4) cells/μL/year. None of the estimated rates of change in CD4 cell counts were indicative of long-term declines. Conclusions Our results suggest that duration of cART and increasing age do not result in decreasing mean changes in CD4 cell counts for long-term virologically suppressed patients, indicating that the level of immune recovery achieved during the first 5 years of treatment is sustained through long-term cART. PMID:23036045

  4. Scale-dependent associations of Band-tailed Pigeon counts at mineral sites

    USGS Publications Warehouse

    Overton, Cory T.; Casazza, Michael L.; Coates, Peter S.

    2010-01-01

    The abundance of Band-tailed Pigeons (Patagioenas fasciata monilis) has declined substantially from historic numbers along the Pacific Coast. Identification of patterns and causative factors of this decline are hampered because habitat use data are limited, and temporal and spatial variability patterns associated with population indices are not known. Furthermore, counts are influenced not only by pigeon abundance but also by rate of visitation to mineral sites, which may not be consistent. To address these issues, we conducted mineral site counts during 2001 and 2002 at 20 locations from 4 regions in the Pacific Northwest, including central Oregon and western Washington, USA, and British Columbia, Canada. We developed inference models that consisted of environmental factors and spatial characteristics at multiple spatial scales. Based on information theory, we compared models within a final set that included variables measured at 3 spatial scales (0.03 ha, 3.14 ha, and 7850 ha). Pigeon counts increased from central Oregon through northern Oregon and decreased into British Columbia. After accounting for this spatial pattern, we found that pigeon counts increased 12% ± 2.7 with a 10% increase in the amount of deciduous forested area within 100 m from a mineral site. Also, distance from the mineral site of interest to the nearest known mineral site was positively related to pigeon counts. These findings provide direction for future research focusing on understanding the relationships between indices of relative abundance and complete counts (censuses) of pigeon populations by identifying habitat characteristics that might influence visitation rates. Furthermore, our results suggest that spatial arrangement of mineral sites influences Band-tailed Pigeon counts and the populations which those counts represent.

  5. South Dakota KIDS COUNT Factbook, 1999.

    ERIC Educational Resources Information Center

    Cochran, Carole, Ed.

    This Kids Count fact book examines statewide trends in well-being for South Dakota's children. The statistical portrait is based on 25 indicators in the areas of demographics, health, education, economic status, and safety. The indicators are: (1) population; (2) family profile; (3) poverty thresholds; (4) infant mortality rate; (5) low birth…

  6. Mcps-range photon-counting x-ray computed tomography system

    NASA Astrophysics Data System (ADS)

    Sato, Eiichi; Oda, Yasuyuki; Abudurexiti, Abulajiang; Hagiwara, Osahiko; Enomoto, Toshiyuki; Sugimura, Shigeaki; Endo, Haruyuki; Sato, Shigehiro; Ogawa, Akira; Onagawa, Jun

    2011-10-01

    10 Mcps photon counting was carried out using a detector consisting of a 2.0 mm-thick ZnO (zinc oxide) single-crystal scintillator and an MPPC (multipixel photon counter) module in an X-ray computed tomography (CT) system. The maximum count rate was 10 Mcps (mega counts per second) at a tube voltage of 70 kV and a tube current of 2.0 mA. The photon-counting X-ray CT system consists of an X-ray generator, a turntable, a scan stage, a two-stage controller, the ZnO-MPPC detector, a counter card (CC), and a personal computer (PC). Tomography is accomplished by repeated linear scans and rotations of the object, and projection curves of the object are obtained by linear scans at a velocity of 25 mm/s. The pulses of the event signal from the module are counted by the CC in conjunction with the PC. The exposure time for obtaining a tomogram was 600 s at a scan step of 0.5 mm and a rotation step of 1.0°, and photon-counting CT was accomplished using iodine-based contrast media.

  7. SU-G-IeP4-12: Performance of In-111 Coincident Gamma-Ray Counting: A Monte Carlo Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pahlka, R; Kappadath, S; Mawlawi, O

    2016-06-15

    Purpose: The decay of In-111 results in a non-isotropic gamma-ray cascade, which is normally imaged using a gamma camera. Creating images with a gamma camera using coincident gamma-rays from In-111 has not been previously studied. Our objective was to explore the feasibility of imaging this cascade as coincidence events and to determine the optimal timing resolution and source activity using Monte Carlo simulations. Methods: GEANT4 was used to simulate the decay of the In-111 nucleus and to model the gamma camera. Each photon emission was assigned a timestamp, and the time delay and angular separation for the second gamma-ray in the cascade was consistent with the known intermediate-state half-life of 85 ns. The gamma-rays are transported through a model of a Siemens dual head Symbia “S” gamma camera with a 5/8-inch thick crystal and medium energy collimators. A true coincident event was defined as a single 171 keV gamma-ray followed by a single 245 keV gamma-ray within a specified time window (or vice versa). Several source activities (ranging from 10 uCi to 5 mCi) with and without incorporation of background counts were then simulated. Each simulation was analyzed using varying time windows to assess random events. The noise equivalent count rate (NECR) was computed based on the number of true and random counts for each combination of activity and time window. No scatter events were assumed since sources were simulated in air. Results: As expected, increasing the timing window increased the total number of observed coincidences, albeit at the expense of true coincidences. A timing window range of 200–500 ns maximizes the NECR at clinically-used source activities. The background rate did not significantly alter the maximum NECR. Conclusion: This work suggests coincident measurements of In-111 gamma-ray decay can be performed with commercial gamma cameras at clinically-relevant activities. Work is ongoing to assess useful clinical applications.
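The NECR figure of merit used in the simulation can be sketched directly; the 2τ randoms estimate below is a standard approximation for independent singles streams, not necessarily the authors' exact formula, and all rates in the example are hypothetical:

```python
def randoms_rate(singles1_cps, singles2_cps, window_s):
    """Expected accidental-coincidence rate for two independent singles
    streams with a coincidence window of width window_s (2*tau form)."""
    return 2.0 * window_s * singles1_cps * singles2_cps

def necr(trues_cps, randoms_cps):
    """Noise-equivalent count rate with no scatter term (sources in air):
    NECR = T^2 / (T + R)."""
    return trues_cps ** 2 / (trues_cps + randoms_cps)

r = randoms_rate(10_000.0, 10_000.0, 350e-9)   # 350 ns window
print(round(r), round(necr(5_000.0, r)))       # prints 70 4931
```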

  8. Reprocessing WFC3/IR Exposures Affected by Time-Variable Backgrounds

    NASA Astrophysics Data System (ADS)

    Brammer, G.

    2016-11-01

    The background seen in WFC3/IR observations frequently shows strong time-dependent behavior above the constant flux expected for zodiacal continuum light. This is often caused by an emission line of helium at 1.083 μm excited in the sun-illuminated upper atmosphere, when seen in the filters (F105W, F110W) and grisms (G102, G141) sensitive to the feature. The default behavior of the calwf3 pipeline assumes constant source-plus-background fluxes when it performs up-the-ramp fitting to identify cosmic rays and determine the average count rate within a MULTIACCUM IR exposure. calwf3 provides undesirable results in the presence of strongly variable backgrounds, primarily in the form of elevated and non-Gaussian noise in the FLT products. Here we describe methods to improve the noise properties of the reduced products. In the first, we simply turn off the calwf3 crcorr step, treating the IR detector as if it were a CCD, i.e., accumulating flux and reading it out at the end of the exposure. Next, we artificially flatten the ramps in the IMA products and then allow calwf3 to proceed as normal, fitting the ramp and identifying CRs. Either of these procedures enables recovery of datasets otherwise corrupted beyond repair and has no discernible effects on photometry of sources in deep combined images.
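    The idea behind up-the-ramp fitting and the "flattening" workaround can be sketched in a few lines. This is not calwf3 pipeline code; the read times, rates, and the background profile below are invented for illustration, and the flattening step simply subtracts an assumed per-read background excess before fitting the slope.

    ```python
    # Up-the-ramp fitting: for a non-destructively read IR pixel, the count
    # rate is the least-squares slope of accumulated counts versus time.

    def ramp_slope(times, counts):
        """Ordinary least-squares slope (counts per second)."""
        n = len(times)
        tbar = sum(times) / n
        cbar = sum(counts) / n
        num = sum((t - tbar) * (c - cbar) for t, c in zip(times, counts))
        den = sum((t - tbar) ** 2 for t in times)
        return num / den

    times = [0.0, 25.0, 50.0, 75.0, 100.0]        # read times (s), illustrative
    source_rate = 2.0                              # constant source, e-/s
    variable_bg = [0.0, 10.0, 30.0, 90.0, 200.0]   # time-variable background excess
    ramp = [source_rate * t + b for t, b in zip(times, variable_bg)]

    # "Flattening": remove the per-read background excess so the ramp is
    # linear again, then fit as normal (mimicking the second method above).
    flattened = [c - b for c, b in zip(ramp, variable_bg)]
    rate = ramp_slope(times, flattened)            # recovers the source rate
    ```

    Fitting the unflattened ramp instead would bias the slope and inflate the fit residuals, which is the elevated, non-Gaussian noise described above.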

  9. An accurate derivation of the air dose-rate and the deposition concentration distribution by aerial monitoring in a low level contaminated area

    NASA Astrophysics Data System (ADS)

    Nishizawa, Yukiyasu; Sugita, Takeshi; Sanada, Yukihisa; Torii, Tatsuo

    2015-04-01

    Since 2011, MEXT (Ministry of Education, Culture, Sports, Science and Technology, Japan) has been conducting aerial monitoring to investigate the distribution of radioactive cesium dispersed into the atmosphere after the accident at the Fukushima Dai-ichi Nuclear Power Plant (FDNPP), Tokyo Electric Power Company. Distribution maps of the air dose-rate at 1 m above the ground and the radioactive cesium deposition concentration on the ground are prepared using spectra obtained by aerial monitoring. The radioactive cesium deposition is derived from its dose rate, which is calculated by subtracting the dose rate of the background radiation due to natural radionuclides from the air dose-rate at 1 m above the ground. The first step of the current method of calculating the dose rate due to natural radionuclides is to calculate the ratio of the total count rate to the count rate at energies of 1,400 keV or higher (the BG-Index) in areas where no radioactive cesium is detected. Next, the natural-background air dose rate in a contaminated area is estimated by multiplying the BG-Index by the integrated count rate at 1,400 keV or higher for the area where the radioactive cesium is distributed. In high dose-rate areas, however, the count rate of the 1,365-keV peak of Cs-134, though small, is included in the integrated count rate of 1,400 keV or higher, which could cause an overestimation of the air dose rate of natural radionuclides. We developed a method for accurately evaluating the distribution maps of the natural air dose-rate by excluding the effect of radioactive cesium, even in contaminated areas, and obtained an accurate map of the air dose-rate attributed to the radioactive cesium deposition on the ground. Furthermore, the natural dose-rate distribution throughout Japan has been obtained by this method.
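    The two-step BG-Index procedure described above can be sketched directly. The function names and the count-rate values are illustrative assumptions, not from the paper; only the structure (a ratio calibrated in cesium-free areas, then scaled by the >1,400 keV rate in contaminated areas) follows the text.

    ```python
    # Sketch of the BG-Index method for separating the natural-background
    # component of the air dose rate. All numbers are made up.

    def bg_index(total_rate_clean, rate_above_1400_clean):
        """Ratio of total count rate to the >1,400 keV count rate, measured
        in an area where no radioactive cesium is detected."""
        return total_rate_clean / rate_above_1400_clean

    def natural_component(index, rate_above_1400_contaminated):
        """Scale the >1,400 keV rate of a contaminated area by the BG-Index
        to estimate the natural-background contribution there."""
        return index * rate_above_1400_contaminated

    idx = bg_index(total_rate_clean=500.0, rate_above_1400_clean=25.0)
    natural = natural_component(idx, rate_above_1400_contaminated=30.0)
    ```

    The overestimation problem noted above corresponds to the >1,400 keV rate in the contaminated area being inflated by spill-over from the Cs-134 1,365 keV peak, which raises `natural` spuriously.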

  10. Alabama Kids Count 2002 Data Book.

    ERIC Educational Resources Information Center

    Curtis, Apreill; Bogie, Don

    This Kids Count data book examines statewide trends in the well-being of Alabama's children. The statistical portrait is based on 18 indicators in the areas of child health, education, safety, and security: (1) infant mortality rate; (2) low weight births; (3) child health index; (4) births to unmarried teens; (5) first grade retention; (6) school…

  11. Measurement of Radon-Induced Backgrounds in the NEXT Double Beta Decay Experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Novella, P.; et al.

    The measurement of the internal 222Rn activity in the NEXT-White detector during the so-called Run-II period with 136Xe-depleted xenon is discussed in detail, together with its implications for double beta decay searches in NEXT. The activity is measured through the alpha production rate induced in the fiducial volume by 222Rn and its alpha-emitting progeny. The specific activity is measured to be (37.5 ± 2.3 (stat.) ± 5.9 (syst.)) mBq/m3. Radon-induced electrons have also been characterized from the decay of the 214Bi daughter ions plating out on the cathode of the time projection chamber. From our studies, we conclude that radon-induced backgrounds are sufficiently low to enable a successful NEXT-100 physics program, as the projected rate contribution should not exceed 0.2 counts/yr in the neutrinoless double beta decay sample.

  12. Analysis of overdispersed count data by mixtures of Poisson variables and Poisson processes.

    PubMed

    Hougaard, P; Lee, M L; Whitmore, G A

    1997-12-01

    Count data often show overdispersion compared to the Poisson distribution. Overdispersion is typically modeled by a random effect for the mean, based on the gamma distribution, leading to the negative binomial distribution for the count. This paper considers a larger family of mixture distributions, including the inverse Gaussian mixture distribution, which is demonstrated to give a significantly better fit for a data set on the frequency of epileptic seizures. The same approach can be used to generate counting processes from Poisson processes, where the rate or the time is random. A random rate corresponds to variation between patients, whereas a random time corresponds to variation within patients.
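    The gamma-mixture case mentioned above (the negative binomial) makes the overdispersion explicit in closed form; the paper's inverse Gaussian mixture follows the same pattern with a different mixing distribution. A minimal sketch, using the standard shape/scale parameterization:

    ```python
    # If the Poisson rate lambda is itself random with a Gamma(shape=k,
    # scale=theta) distribution, the marginal count (negative binomial) has
    #   mean     = k * theta
    #   variance = k * theta * (1 + theta)
    # The extra k * theta**2 term beyond the Poisson variance is the
    # between-patient rate variation described in the abstract.

    def nb_mean_var(shape, scale):
        mean = shape * scale
        var = shape * scale * (1.0 + scale)  # Poisson part + mixing part
        return mean, var

    m, v = nb_mean_var(shape=2.0, scale=3.0)  # mean 6.0, variance 24.0 > mean
    ```

    For a plain Poisson count the variance would equal the mean; any positive mixing scale makes the variance strictly larger, which is exactly the overdispersion the abstract models.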

  13. Probing cluster potentials through gravitational lensing of background X-ray sources

    NASA Technical Reports Server (NTRS)

    Refregier, A.; Loeb, A.

    1996-01-01

    The gravitational lensing effect of a foreground galaxy cluster, on the number count statistics of background X-ray sources, was examined. The lensing produces a deficit in the number of resolved sources in a ring close to the critical radius of the cluster. The cluster lens can be used as a natural telescope to study the faint end of the (log N)-(log S) relation for the sources which account for the X-ray background.

  14. Kids Count Report in Nebraska, 2002.

    ERIC Educational Resources Information Center

    Johnston, Janet M.

    This Kids Count report examines statewide trend data on the well-being of Nebraska's children. Section 1 of the report presents U.S. Census data on population trends in Nebraska as well as child poverty rates, and urges Nebraskans to work together to ensure that its youngest citizens have the best start possible. Section 2, the bulk of this…

  15. Nevada Kids Count Data Book, 1997.

    ERIC Educational Resources Information Center

    We Can, Inc., Las Vegas, NV.

    This Kids Count data book is the first to examine statewide indicators of the well-being of Nevada's children. The statistical portrait is based on 15 indicators of child well-being: (1) percent low birth-weight babies; (2) infant mortality rate; (3) percent of children in poverty; (4) percent of children in single-parent families; (5) percent of…

  16. Alabama Kids Count 2001 Data Book.

    ERIC Educational Resources Information Center

    Curtis, Apreill; Bogie, Don

    This Kids Count data book examines statewide trends in well-being for Alabama's children. The statistical portrait is based on 17 indicators in the areas of health, education, safety, and security. The indicators are: (1) infant mortality rate; (2) low weight births; (3) child health index; (4) births to unmarried teens; (5) first grade retention;…

  17. CLARO: an ASIC for high rate single photon counting with multi-anode photomultipliers

    NASA Astrophysics Data System (ADS)

    Baszczyk, M.; Carniti, P.; Cassina, L.; Cotta Ramusino, A.; Dorosz, P.; Fiorini, M.; Gotti, C.; Kucewicz, W.; Malaguti, R.; Pessina, G.

    2017-08-01

    The CLARO is a radiation-hard 8-channel ASIC designed for single photon counting with multi-anode photomultiplier tubes. Each channel outputs a digital pulse when the input signal from the photomultiplier crosses a configurable threshold. The fast return to baseline, typically within 25 ns, and below 50 ns in all conditions, allows counting up to 10^7 hits/s on each channel, with a power consumption of about 1 mW per channel. The ASIC presented here is a much improved version of the first 4-channel prototype. The threshold can be precisely set in a wide range, between 30 ke- (5 fC) and 16 Me- (2.6 pC). The noise of the amplifier with a 10 pF input capacitance is 3.5 ke- (0.6 fC) RMS. All settings are stored in a 128-bit configuration and status register, protected against soft errors with triple modular redundancy. The paper describes the design of the ASIC at transistor level and demonstrates its performance on the test bench.

  18. Patient-dependent count-rate adaptive normalization for PET detector efficiency with delayed-window coincidence events

    NASA Astrophysics Data System (ADS)

    Niu, Xiaofeng; Ye, Hongwei; Xia, Ting; Asma, Evren; Winkler, Mark; Gagnon, Daniel; Wang, Wenli

    2015-07-01

    Quantitative PET imaging is widely used in clinical diagnosis in oncology and neuroimaging. Accurate normalization correction for the efficiency of each line-of-response is essential for accurate quantitative PET image reconstruction. In this paper, we propose a normalization calibration method that uses the delayed-window coincidence events from the scanning phantom or patient. The proposed method can dramatically reduce the ‘ring’ artifacts caused by mismatched system count-rates between the calibration and phantom/patient datasets. Moreover, a modified algorithm for mean detector efficiency estimation is proposed, which can generate crystal efficiency maps with more uniform variance. Both phantom and real patient datasets are used for evaluation. The results show that the proposed method leads to better uniformity in reconstructed images by removing ring artifacts, and to more uniform axial variance profiles, especially around the axial edge slices of the scanner. The proposed method also has the potential to simplify the normalization calibration procedure, since the calibration can be performed using the on-the-fly acquired delayed-window dataset.

  19. Trends in CD4 Count Testing, Retention in Pre-ART Care, and ART Initiation Rates over the First Decade of Expansion of HIV Services in Haiti

    PubMed Central

    Koenig, Serena P.; Bernard, Daphne; Dévieux, Jessy G.; Atwood, Sidney; McNairy, Margaret L.; Severe, Patrice; Marcelin, Adias; Julma, Pierrot; Apollon, Alexandra; Pape, Jean W.

    2016-01-01

    Background: High attrition during the period from HIV testing to antiretroviral therapy (ART) initiation is widely reported. Though treatment guidelines have changed to broaden ART eligibility and services have been widely expanded over the past decade, data on the temporal trends in pre-ART outcomes are limited; such data would be useful to guide future policy decisions. Methods: We evaluated temporal trends and predictors of retention for each step from HIV testing to ART initiation over the past decade at the GHESKIO clinic in Port-au-Prince, Haiti. The 24,925 patients >17 years of age who received a positive HIV test at GHESKIO from March 1, 2003 to February 28, 2013 were included. Patients were followed until they remained in pre-ART care for one year or initiated ART. Results: 24,925 patients (61% female, median age 35 years) were included, and 15,008 (60%) had blood drawn for CD4 count within 12 months of HIV testing; the trend increased over time from 36% in Year 1 to 78% in Year 10 (p<0.0001). Excluding transfers, the proportion of patients who were retained in pre-ART care or initiated ART within the first year after HIV testing was 84%, 82%, 64%, and 64%, for CD4 count strata ≤200, 201 to 350, 351 to 500, and >500 cells/mm3, respectively. The trend increased over time for each CD4 stratum, and in Year 10, 94%, 95%, 79%, and 74% were retained in pre-ART care or initiated ART for each CD4 stratum. Predictors of pre-ART attrition included male gender, low income, and low educational status. Older age and tuberculosis (TB) at HIV testing were associated with retention in care. Conclusions: The proportions of patients completing assessments for ART eligibility, remaining in pre-ART care, and initiating ART have increased over the last decade across all CD4 count strata, particularly among patients with CD4 count ≤350 cells/mm3. However, additional retention efforts are needed for patients with higher CD4 counts. PMID:26901795

  20. Daily step count predicts acute exacerbations in a US cohort with COPD.

    PubMed

    Moy, Marilyn L; Teylan, Merilee; Weston, Nicole A; Gagnon, David R; Garshick, Eric

    2013-01-01

    COPD is characterized by variability in exercise capacity and physical activity (PA), and acute exacerbations (AEs). Little is known about the relationship between daily step count, a direct measure of PA, and the risk of AEs, including hospitalizations. In an observational cohort study of 169 persons with COPD, we directly assessed PA with the StepWatch Activity Monitor, an ankle-worn accelerometer that measures daily step count. We also assessed exercise capacity with the 6-minute walk test (6MWT) and patient-reported PA with the St. George's Respiratory Questionnaire Activity Score (SGRQ-AS). AEs and COPD-related hospitalizations were assessed and validated prospectively over a median of 16 months. Mean daily step count was 5804±3141 steps. Over 209 person-years of observation, there were 263 AEs (incidence rate 1.3±1.6 per person-year) and 116 COPD-related hospitalizations (incidence rate 0.56±1.09 per person-year). Adjusting for FEV1 % predicted and prednisone use for AE in previous year, for each 1000 fewer steps per day walked at baseline, there was an increased rate of AEs (rate ratio 1.07; 95%CI = 1.003-1.15) and COPD-related hospitalizations (rate ratio 1.24; 95%CI = 1.08-1.42). There was a significant linear trend of decreasing daily step count by quartiles and increasing rate ratios for AEs (P = 0.008) and COPD-related hospitalizations (P = 0.003). Each 30-meter decrease in 6MWT distance was associated with an increased rate ratio of 1.07 (95%CI = 1.01-1.14) for AEs and 1.18 (95%CI = 1.07-1.30) for COPD-related hospitalizations. Worsening of SGRQ-AS by 4 points was associated with an increased rate ratio of 1.05 (95%CI = 1.01-1.09) for AEs and 1.10 (95%CI = 1.02-1.17) for COPD-related hospitalizations. Lower daily step count, lower 6MWT distance, and worse SGRQ-AS predict future AEs and COPD-related hospitalizations, independent of pulmonary function and previous AE history. These results support the importance of
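    The reported rate ratios compound multiplicatively under the log-linear (Poisson regression) model implied by "for each 1000 fewer steps per day". A minimal sketch of that arithmetic, using the point estimates from the abstract (the 3,000-step scenario is an illustrative extrapolation, not a result from the study):

    ```python
    # Under a log-linear rate model, a rate ratio per unit of exposure
    # compounds as rr ** units. Point estimates from the abstract:
    #   AEs: 1.07 per 1,000 fewer daily steps
    #   COPD-related hospitalizations: 1.24 per 1,000 fewer daily steps

    def compounded_rr(rr_per_unit, units):
        return rr_per_unit ** units

    # Hypothetical patient walking 3,000 fewer steps/day than a comparator:
    ae_rr_3000_fewer = compounded_rr(1.07, 3)    # about 1.23
    hosp_rr_3000_fewer = compounded_rr(1.24, 3)  # about 1.91
    ```

    Confidence intervals would compound the same way on the log scale; the sketch uses point estimates only.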

  1. Association of Radon Background and Total Background Ionizing Radiation with Alzheimer's Disease Deaths in U.S. States.

    PubMed

    Lehrer, Steven; Rheinstein, Peter H; Rosenzweig, Kenneth E

    2017-01-01

    Exposure of the brain to ionizing radiation might promote the development of Alzheimer's disease (AD). We analyzed AD death rates versus radon background radiation and total background radiation in U.S. states. Total background, radon background, cosmic, and terrestrial background radiation measurements are from Assessment of Variations in Radiation Exposure in the United States and Report No. 160 - Ionizing Radiation Exposure of the Population of the United States. 2013 AD death rates by U.S. state are from the Alzheimer's Association. Radon background ionizing radiation was significantly correlated with AD death rate in 50 states and the District of Columbia (r = 0.467, p = 0.001). Total background ionizing radiation was also significantly correlated with AD death rate in 50 states and the District of Columbia (r = 0.452, p = 0.001). Multivariate linear regression weighted by state population demonstrated that AD death rate was significantly correlated with radon background (β = 0.169, p < 0.001), age (β = 0.231, p < 0.001), hypertension (β = 0.155, p < 0.001), and diabetes (β = 0.353, p < 0.001). Our findings, like those of other studies, suggest that ionizing radiation is a risk factor for AD. Intranasal inhalation of radon gas could subject the rhinencephalon and hippocampus to damaging radiation that initiates AD. The damage would accumulate over time, causing age to be a powerful risk factor.

  2. Automatic vehicle counting system for traffic monitoring

    NASA Astrophysics Data System (ADS)

    Crouzil, Alain; Khoudour, Louahdi; Valiere, Paul; Truong Cong, Dung Nghy

    2016-09-01

    This article presents a vision-based system for road vehicle counting and classification. The system is able to achieve counting with very good accuracy even in difficult scenarios linked to occlusions and/or the presence of shadows. The principle of the system is to use cameras already installed in road networks without any additional calibration procedure. We propose a robust segmentation algorithm that detects foreground pixels corresponding to moving vehicles. First, the approach models each pixel of the background with an adaptive Gaussian distribution. This model is coupled with a motion detection procedure, which allows moving vehicles to be correctly located in space and time. The nature of the trials carried out, including peak periods and various vehicle types, leads to increased occlusions between cars and between cars and trucks. A specific method for severe occlusion detection, based on the notion of solidity, has been developed and tested. Furthermore, the method developed in this work is capable of managing shadows at high resolution. The related algorithm has been tested and compared to a classical method. Experimental results based on four large datasets show that our method can count and classify vehicles in real time with a high level of performance (>98%) under different environmental situations, thus performing better than conventional inductive loop detectors.
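    The per-pixel adaptive Gaussian background model mentioned above can be sketched for a single pixel: keep a running mean and variance, flag a new value as foreground when it deviates by more than k standard deviations, and adapt the model only to background observations. The learning rate `alpha`, threshold `k`, and the sample values are illustrative assumptions, not parameters from the paper.

    ```python
    # One-pixel adaptive Gaussian background model (illustrative sketch).

    class PixelModel:
        def __init__(self, init_value, alpha=0.05, k=2.5, init_var=25.0):
            self.mean = float(init_value)
            self.var = init_var
            self.alpha = alpha
            self.k = k

        def update(self, value):
            """Return True if `value` is foreground, then adapt the model."""
            d = value - self.mean
            foreground = d * d > (self.k ** 2) * self.var
            if not foreground:  # adapt only to background observations
                self.mean += self.alpha * d
                self.var = (1 - self.alpha) * self.var + self.alpha * d * d
            return foreground

    px = PixelModel(init_value=100)
    bg = px.update(102)   # small fluctuation: classified as background
    fg = px.update(180)   # large jump (e.g. a passing vehicle): foreground
    ```

    A full system runs one such model per pixel and couples it with motion detection, as the abstract describes; shadow handling and occlusion reasoning sit on top of this segmentation.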

  3. Comparative Effectiveness of Two Walking Interventions on Participation, Step Counts, and Health.

    PubMed

    Smith-McLallen, Aaron; Heller, Debbie; Vernisi, Kristin; Gulick, Diana; Cruz, Samantha; Snyder, Richard L

    2017-03-01

    To (1) compare the effects of two worksite-based walking interventions on employee participation rates; (2) compare average daily step counts between conditions; and (3) examine the effects of increases in average daily step counts on biometric and psychologic outcomes. We conducted a cluster-randomized trial in which six employer groups were randomly selected and randomly assigned to condition. Four manufacturing worksites and two office-based worksites served as the settings. A total of 474 employees from six employer groups were included. A standard walking program was compared to an enhanced program that included incentives, feedback, competitive challenges, and monthly wellness workshops. Walking was measured by self-reported daily step counts. Survey measures and biometric screenings were administered at baseline and 3, 6, and 9 months after baseline. Analysis used linear mixed models with repeated measures. During 9 months, participants in the enhanced condition averaged 726 more steps per day compared with those in the standard condition (p < .001). A 1000-step increase in average daily steps was associated with significant weight loss for both men (-3.8 lbs.) and women (-2.1 lbs.), and reductions in body mass index (-0.41 men, -0.31 women). Higher step counts were also associated with improvements in mood, having more energy, and higher ratings of overall health. An enhanced walking program significantly increases participation rates and daily step counts, which were associated with weight loss and reductions in body mass index.

  4. Short communication: Repeatability of differential goat bulk milk culture and associations with somatic cell count, total bacterial count, and standard plate count.

    PubMed

    Koop, G; Dik, N; Nielen, M; Lipman, L J A

    2010-06-01

    The aims of this study were to assess how different bacterial groups in bulk milk are related to bulk milk somatic cell count (SCC), bulk milk total bacterial count (TBC), and bulk milk standard plate count (SPC) and to measure the repeatability of bulk milk culturing. On 53 Dutch dairy goat farms, 3 bulk milk samples were collected at intervals of 2 wk. The samples were cultured for SPC, coliform count, and staphylococcal count and for the presence of Staphylococcus aureus. Furthermore, SCC (Fossomatic 5000, Foss, Hillerød, Denmark) and TBC (BactoScan FC 150, Foss) were measured. Staphylococcal count was correlated to SCC (r=0.40), TBC (r=0.51), and SPC (r=0.53). Coliform count was correlated to TBC (r=0.33), but not to any of the other variables. Staphylococcus aureus did not correlate to SCC. The contribution of the staphylococcal count to the SPC was 31%, whereas the coliform count comprised only 1% of the SPC. The agreement of the repeated measurements was low. This study indicates that staphylococci in goat bulk milk are related to SCC and make a significant contribution to SPC. Because of the high variation in bacterial counts, repeated sampling is necessary to draw valid conclusions from bulk milk culturing. 2010 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  5. Kids Count in Delaware, Families Count in Delaware: Fact Book, 2002.

    ERIC Educational Resources Information Center

    Delaware Univ., Newark. Kids Count in Delaware.

    This Kids Count Fact Book is combined with the Families Count Fact Book to provide information on statewide trends affecting children and families in Delaware. The Kids Count statistical profile is based on 11 main indicators of child well-being: (1) births to teens 15-17 years; (2) births to teens 10 to 14 years; (3) low birth weight babies; (4)…

  6. Scatter Fraction, Count Rates, and Noise Equivalent Count Rate of a Single-Bed Position RPC TOF-PET System Assessed by Simulations Following the NEMA NU2-2001 Standards

    NASA Astrophysics Data System (ADS)

    Couceiro, Miguel; Crespo, Paulo; Marques, Rui F.; Fonte, Paulo

    2014-06-01

    Scatter Fraction (SF) and Noise Equivalent Count Rate (NECR) of a 2400 mm wide axial field-of-view Positron Emission Tomography (PET) system based on Resistive Plate Chamber (RPC) detectors with 300 ps Time Of Flight (TOF) resolution were studied by simulation using Geant4. The study followed the NEMA NU2-2001 standards, using the standard 700 mm long phantom and an axially extended one with 1800 mm, modeling the foreseeable use of this PET system. Data were processed based on the actual RPC readout, which requires a 0.2 μs non-paralyzable dead time for timing signals and a paralyzable dead time (τps) for position signals. For NECR, the best coincidence trigger consisted of a multiple time window coincidence sorter retaining single coincidence pairs (involving only two photons) and all possible coincidence pairs obtained from multiple coincidences, keeping only those for which the direct TOF-reconstructed point falls inside a tight region surrounding the phantom. For the 700 mm phantom, the SF was 51.8% and, with τps = 3.0 μs, the peak NECR was 167 kcps at 7.6 kBq/cm3. Using τps = 1.0 μs, the NECR was 349 kcps at 7.6 kBq/cm3, and no peak was found. For the 1800 mm phantom, the SF was slightly higher, and the NECR curves were identical to those obtained with the standard phantom, but shifted to lower activity concentrations. Despite the higher SF, the NECR values obtained allow us to conclude that the proposed scanner is expected to outperform current commercial PET systems.
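    The two dead-time behaviors named above follow standard textbook models: a non-paralyzable channel records m = n / (1 + nτ) for true rate n, while a paralyzable one records m = n·exp(−nτ) and can even lose throughput as the rate rises. A minimal sketch, with an illustrative singles rate (not a value from the paper):

    ```python
    import math

    # Standard dead-time models: measured rate m given true rate n and
    # dead time tau (seconds). Illustrative numbers only.

    def nonparalyzable(n, tau):
        return n / (1.0 + n * tau)

    def paralyzable(n, tau):
        return n * math.exp(-n * tau)

    n = 2.0e5                               # hypothetical true rate, counts/s
    m_timing = nonparalyzable(n, 0.2e-6)    # 0.2 us timing-branch dead time
    m_pos_3us = paralyzable(n, 3.0e-6)      # position branch, tau_ps = 3.0 us
    m_pos_1us = paralyzable(n, 1.0e-6)      # position branch, tau_ps = 1.0 us
    ```

    The paralyzable throughput peaks at n = 1/τ and falls beyond it, which is consistent with the abstract's finding that the 3.0 μs setting produces an NECR peak while the 1.0 μs setting does not within the tested activity range.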

  7. Nutsedge Counts Predict Meloidogyne incognita Juvenile Counts in an Integrated Management System.

    PubMed

    Ou, Zhining; Murray, Leigh; Thomas, Stephen H; Schroeder, Jill; Libbin, James

    2008-06-01

    The southern root-knot nematode (Meloidogyne incognita), yellow nutsedge (Cyperus esculentus) and purple nutsedge (Cyperus rotundus) are important pests in crops grown in the southern US. Management of the individual pests rather than the pest complex is often unsuccessful due to mutually beneficial pest interactions. In an integrated pest management scheme using alfalfa to suppress nutsedges and M. incognita, we evaluated quadratic polynomial regression models for prediction of the number of M. incognita J2 in soil samples as a function of yellow and purple nutsedge plant counts, squares of nutsedge counts and the cross-product between nutsedge counts. In May 2005, purple nutsedge plant count was a significant predictor of M. incognita count. In July and September 2005, counts of both nutsedges and the cross-product were significant predictors. In 2006, the second year of the alfalfa rotation, counts of all three species were reduced. As a likely consequence, the predictive relationship between nutsedges and M. incognita was not significant for May and July. In September 2006, purple nutsedge was a significant predictor of M. incognita. These results lead us to conclude that nutsedge plant counts in a field infested with the M. incognita-nutsedge pest complex can be used as a visual predictor of M. incognita J2 populations, unless the numbers of nutsedge plants and M. incognita are all very low.
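    The quadratic polynomial model described above has a simple functional form: J2 counts predicted from the two nutsedge counts, their squares, and their cross-product. The coefficients below are made up purely for illustration; the paper estimates them by regression on field data.

    ```python
    # Quadratic response surface with a cross-product (interaction) term:
    #   yhat = b0 + b1*x1 + b2*x2 + b3*x1^2 + b4*x2^2 + b5*x1*x2
    # x1 = yellow nutsedge count, x2 = purple nutsedge count.

    def predict_j2(x1, x2, b):
        b0, b1, b2, b3, b4, b5 = b
        return b0 + b1 * x1 + b2 * x2 + b3 * x1 ** 2 + b4 * x2 ** 2 + b5 * x1 * x2

    coeffs = (5.0, 1.2, 2.0, 0.01, 0.02, 0.05)  # hypothetical estimates
    yhat = predict_j2(x1=10, x2=20, b=coeffs)
    ```

    The cross-product term is what lets the model capture the mutually beneficial pest interaction: the effect of one nutsedge count on predicted J2 depends on the level of the other.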

  8. Nutsedge Counts Predict Meloidogyne incognita Juvenile Counts in an Integrated Management System

    PubMed Central

    Ou, Zhining; Murray, Leigh; Thomas, Stephen H.; Schroeder, Jill; Libbin, James

    2008-01-01

    The southern root-knot nematode (Meloidogyne incognita), yellow nutsedge (Cyperus esculentus) and purple nutsedge (Cyperus rotundus) are important pests in crops grown in the southern US. Management of the individual pests rather than the pest complex is often unsuccessful due to mutually beneficial pest interactions. In an integrated pest management scheme using alfalfa to suppress nutsedges and M. incognita, we evaluated quadratic polynomial regression models for prediction of the number of M. incognita J2 in soil samples as a function of yellow and purple nutsedge plant counts, squares of nutsedge counts and the cross-product between nutsedge counts. In May 2005, purple nutsedge plant count was a significant predictor of M. incognita count. In July and September 2005, counts of both nutsedges and the cross-product were significant predictors. In 2006, the second year of the alfalfa rotation, counts of all three species were reduced. As a likely consequence, the predictive relationship between nutsedges and M. incognita was not significant for May and July. In September 2006, purple nutsedge was a significant predictor of M. incognita. These results lead us to conclude that nutsedge plant counts in a field infested with the M. incognita-nutsedge pest complex can be used as a visual predictor of M. incognita J2 populations, unless the numbers of nutsedge plants and M. incognita are all very low. PMID:19259526

  9. Language and counting: Some recent results

    NASA Astrophysics Data System (ADS)

    Bell, Garry

    1990-02-01

    It has long been recognised that the language of mathematics is an important variable in the learning of mathematics, and there has been useful work in isolating and describing the linkage. Steffe and his co-workers at Georgia, for example (Steffe, von Glasersfeld, Richardson and Cobb, 1983), have suggested that young children may construct verbal countable items to count objects which are hidden from their view. Although there has been a surge of research interest in counting and early childhood mathematics, and in cultural differences in mathematics attainment, there has been little work reported on the linkage between culture, as exemplified by language, and initial concepts of numeration. This paper reports on some recent clinical research with kindergarten children of European and Asian background in Australia and America. The research examines the influence that number-naming grammar appears to have on young children's understandings of two-digit numbers and place value. It appears that Transparent Standard Number Word Sequences such as Japanese, Chinese and Vietnamese, which follow the numerical representation pattern by naming tens and units in order ("two tens three"), may be associated with distinctive place value concepts which may support sophisticated mental algorithms.

  10. On the use of positron counting for radio-Assay in nuclear pharmaceutical production.

    PubMed

    Maneuski, D; Giacomelli, F; Lemaire, C; Pimlott, S; Plenevaux, A; Owens, J; O'Shea, V; Luxen, A

    2017-07-01

    Current techniques for the measurement of radioactivity at various points during PET radiopharmaceutical production and R&D are based on the detection of the annihilation gamma rays from the radionuclide in the labelled compound. The detection systems used to measure these gamma rays are usually variations of NaI or CsF scintillation-based systems requiring costly and heavy lead shielding to reduce background noise. These detectors inherently suffer from low detection efficiency, high background noise, and very poor linearity. They are also unable to provide any reasonably useful position information. A novel positron counting technique is proposed for radioactivity assay during radiopharmaceutical manufacturing that overcomes these limitations. Detection of positrons instead of gammas offers an unprecedented level of position resolution of the radiation source (down to sub-mm) thanks to the nature of the positron interaction with matter. Counting capability instead of charge integration in the detector brings the sensitivity down to statistical limits while offering a very high dynamic range and linearity from zero to any arbitrarily high activity. This paper reports a quantitative comparison between conventional detector systems and the proposed positron counting detector. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. The 2-24 μm source counts from the AKARI North Ecliptic Pole survey

    NASA Astrophysics Data System (ADS)

    Murata, K.; Pearson, C. P.; Goto, T.; Kim, S. J.; Matsuhara, H.; Wada, T.

    2014-11-01

    We present herein galaxy number counts in the nine bands in the 2-24 μm range on the basis of the AKARI North Ecliptic Pole (NEP) surveys. The number counts are derived from the NEP-deep and NEP-wide surveys, which cover areas of 0.5 and 5.8 deg2, respectively. To produce reliable number counts, the sources were extracted from recently updated images. Completeness and the difference between observed and intrinsic magnitudes were corrected by Monte Carlo simulation. Stellar counts were subtracted by using the stellar fraction estimated from optical data. The resultant source counts are given down to the 80 per cent completeness limit; 0.18, 0.16, 0.10, 0.05, 0.06, 0.10, 0.15, 0.16 and 0.44 mJy in the 2.4, 3.2, 4.1, 7, 9, 11, 15, 18 and 24 μm bands, respectively. On the bright side of all bands, the count distribution is flat, consistent with the Euclidean universe, while on the faint side, the counts deviate, suggesting that the galaxy population of the distant universe is evolving. These results are generally consistent with previous galaxy counts in similar wavebands. We also compare our counts with evolutionary models and find them in good agreement. By integrating the models down to the 80 per cent completeness limits, we calculate that the AKARI NEP survey resolves 20-50 per cent of the cosmic infrared background, depending on the waveband.
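    The "flat, consistent with the Euclidean universe" statement refers to the standard geometric scaling of source counts: in a static Euclidean universe with uniformly distributed sources, the integral counts go as N(>S) ∝ S^(-1.5), so counts look flat when normalized by S^1.5 (or dN/dS by S^2.5). A minimal sketch of that scaling (pure geometry; normalization arbitrary):

    ```python
    # Euclidean integral source counts: N(>S) is proportional to S**-1.5.
    # Deviations from this slope at faint fluxes signal evolution.

    def euclidean_counts(S, norm=1.0):
        """Number of sources brighter than flux S (arbitrary units)."""
        return norm * S ** -1.5

    # Halving the flux limit twice (factor 4 deeper) should multiply the
    # cumulative counts by 4**1.5 = 8:
    ratio = euclidean_counts(1.0) / euclidean_counts(4.0)
    ```

    Where the observed counts fall below or above this S^(-1.5) prediction at faint fluxes, the abstract infers an evolving galaxy population.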

  12. Novel Photon-Counting Detectors for Free-Space Communication

    NASA Technical Reports Server (NTRS)

    Krainak, M. A.; Yang, G.; Sun, X.; Lu, W.; Merritt, S.; Beck, J.

    2016-01-01

    We present performance data for novel photon-counting detectors for free-space optical communication. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We present and compare dark count, photon-detection efficiency, wavelength response and communication performance data for these detectors. We successfully measured real-time communication performance using both the 2 detected-photon threshold and AND-gate coincidence methods. Use of these methods allows mitigation of dark count, after-pulsing and background noise effects. The HgCdTe APD array routinely demonstrated photon detection efficiencies of greater than 50% across 5 arrays, with one array reaching a maximum PDE of 70%. We performed high-resolution pixel-surface spot scans and measured the junction diameters of its diodes. We found that decreasing the junction diameter from 31 micrometers to 25 micrometers doubled the e-APD gain, from 470 for an array produced in the year 2010 to 1100 on an array delivered to NASA GSFC recently. The mean single-photon SNR was over 12 and the excess noise factor measurements were 1.2-1.3. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output.
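
    The AND-gate coincidence method named above, accepting an event only when both detectors fire within a short window so that uncorrelated dark counts are rejected, can be sketched as follows (the window length and timestamps are illustrative, not taken from the abstract):

    ```python
    def coincidence_count(times_a, times_b, window=1e-9):
        """AND-gate coincidence: count events where detectors A and B
        fire within `window` seconds of each other. Both timestamp
        lists must be sorted in ascending order."""
        hits, j = 0, 0
        for t in times_a:
            # Advance through B's events that are too early to match t.
            while j < len(times_b) and times_b[j] < t - window:
                j += 1
            if j < len(times_b) and abs(times_b[j] - t) <= window:
                hits += 1
        return hits

    # Only the first event pair is coincident; 2.5 s is an uncorrelated
    # (e.g. dark) count on detector B.
    n = coincidence_count([1.0, 2.0, 3.0], [1.0 + 5e-10, 2.5])
    ```

    Because dark counts on the two detectors are statistically independent, the accidental-coincidence rate scales with the product of the two dark rates and the window, which is why the method suppresses background so effectively.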

  13. Disk-based k-mer counting on a PC

    PubMed Central

    2013-01-01

    Background The k-mer counting problem, which is to build the histogram of occurrences of every k-symbol long substring in a given text, is important for many bioinformatics applications. These include developing de Bruijn graph genome assemblers, fast multiple sequence alignment and repeat detection. Results We propose a simple, yet efficient, parallel disk-based algorithm for counting k-mers. Experiments show that it usually offers the fastest solution to the considered problem, while demanding a relatively small amount of memory. In particular, it is capable of counting the statistics for short-read human genome data, in an input gzipped FASTQ file, in less than 40 minutes on a PC with 16 GB of RAM and 6 CPU cores, and for long-read human genome data in less than 70 minutes. On a more powerful machine, using 32 GB of RAM and 32 CPU cores, the tasks are accomplished in less than half the time. No other algorithm for most tested settings of this problem and mammalian-size data can accomplish this task in comparable time. Our solution is also among the memory-frugal ones; most competitive algorithms cannot efficiently work on a PC with 16 GB of memory for such massive data. Conclusions By making use of cheap disk space and exploiting CPU and I/O parallelism we propose a very competitive k-mer counting procedure, called KMC. Our results suggest that judicious resource management may allow solving at least some bioinformatics problems with massive data on a commodity personal computer. PMID:23679007
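
    The core task described here, building a histogram of every k-symbol substring, can be sketched in a few lines. This is an in-memory toy, not KMC's design: KMC's contribution is partitioning k-mers into disk-resident bins so that genome-scale inputs fit in modest RAM.

    ```python
    from collections import Counter

    def kmer_counts(seq: str, k: int) -> Counter:
        """Histogram of occurrences of every k-symbol substring of seq."""
        return Counter(seq[i:i + k] for i in range(len(seq) - k + 1))

    hist = kmer_counts("ACGTACGT", 3)
    # "ACGTACGT" contains six 3-mers; 'ACG' and 'CGT' each occur twice.
    ```

    A disk-based counter applies the same idea per partition: hash each k-mer to a bin file, then load and count one bin at a time, which trades cheap disk space for bounded memory.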

  14. A normalization strategy for comparing tag count data

    PubMed Central

    2012-01-01

    Background High-throughput sequencing, such as ribonucleic acid sequencing (RNA-seq) and chromatin immunoprecipitation sequencing (ChIP-seq) analyses, enables various features of organisms to be compared through tag counts. Recent studies have demonstrated that the normalization step for RNA-seq data is critical for a more accurate subsequent analysis of differential gene expression. Development of a more robust normalization method is desirable for identifying the true difference in tag count data. Results We describe a strategy for normalizing tag count data, focusing on RNA-seq. The key concept is to remove data assigned as potential differentially expressed genes (DEGs) before calculating the normalization factor. Several R packages for identifying DEGs are currently available, and each package uses its own normalization method and gene ranking algorithm. We compared a total of eight package combinations: four R packages (edgeR, DESeq, baySeq, and NBPSeq) with their default normalization settings and with our normalization strategy. Many synthetic datasets under various scenarios were evaluated on the basis of the area under the curve (AUC) as a measure for both sensitivity and specificity. We found that packages using our strategy in the data normalization step overall performed well. This result was also observed for a real experimental dataset. Conclusion Our results showed that the elimination of potential DEGs is essential for more accurate normalization of RNA-seq data. The concept of this normalization strategy can widely be applied to other types of tag count data and to microarray data. PMID:22475125
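
    The key step, removing putative DEGs before computing the normalization factor, might be sketched as below. The fold-change ranking used here is a crude two-sample placeholder for the packages' actual DEG statistics, and the library-size factor stands in for their specific normalization methods:

    ```python
    import numpy as np

    def deg_aware_size_factors(counts: np.ndarray, frac_remove: float = 0.3):
        """counts: genes x 2 matrix of tag counts for two samples.
        Drop the fraction of genes with the most extreme fold change
        (potential DEGs), then normalize on the remaining genes."""
        logfc = np.log2((counts[:, 0] + 1) / (counts[:, 1] + 1))
        n_keep = int(len(logfc) * (1 - frac_remove))
        keep = np.argsort(np.abs(logfc))[:n_keep]
        lib = counts[keep].sum(axis=0).astype(float)
        return lib / lib.mean()

    # Ten genes identical across samples plus two spiked "DEGs":
    demo = np.array([[100, 100]] * 10 + [[1000, 10], [10, 1000]])
    factors = deg_aware_size_factors(demo)
    # Both samples get factor 1.0 once the two spiked genes are dropped.
    ```

    Without the removal step, the two spiked genes would distort the library sizes and bias every other gene's comparison, which is the failure mode the strategy addresses.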

  15. 500-MHz x-ray counting with a Si-APD and a fast-pulse processing system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kishimoto, Shunji; Taniguchi, Takashi; Tanaka, Manobu

    2010-06-23

    We introduce a counting system of up to 500 MHz for synchrotron x-ray high-rate measurements. A silicon avalanche photodiode detector was used in the counting system. The fast-pulse circuit of the amplifier was designed with hybrid ICs in preparation for an ASIC system for a large-scale pixel array detector in the near future. The fast amplifier consists of two cascading emitter-followers using 10-GHz band transistors. A count rate of 3.25x10^8 s^-1 was then achieved using the system for 8-keV x-rays. However, a baseline shift caused by the AC coupling in the amplifier prevented us from observing the maximum count rate of 4.49x10^8 s^-1 determined by the electron-bunch filling of the ring accelerator. We also report that an amplifier with a baseline restorer was tested in order to keep the baseline level at 0 V even at high input rates.

  16. Characterization of an ultraviolet imaging detector with high event rate ROIC (HEROIC) readout

    NASA Astrophysics Data System (ADS)

    Nell, Nicholas; France, Kevin; Harwit, Alex; Bradley, Scott; Franka, Steve; Freymiller, Ed; Ebbets, Dennis

    2016-07-01

    We present characterization results from a photon counting imaging detector consisting of one microchannel plate (MCP) and an array of two readout integrated circuits (ROICs) that record photon position. The ROICs used in the position readout are high event rate ROIC (HEROIC) devices designed to handle event rates up to 1 MHz per pixel, recently developed by the Ball Aerospace and Technologies Corporation in collaboration with the University of Colorado. An opaque cesium iodide (CsI) photocathode, sensitive in the far-ultraviolet (FUV; 122-200 nm), is deposited on the upper surface of the MCP. The detector is characterized in a chamber developed by CU Boulder that is capable of illumination with vacuum-ultraviolet (VUV) monochromatic light and measurement of absolute flux with a calibrated photodiode. Testing includes investigation of the effects of adjusting internal settings of the HEROIC devices, including charge threshold, gain, and amplifier bias. The detector response to high count rates is also tested. We report initial results, including background, uniformity, and quantum detection efficiency (QDE) as a function of wavelength.

  17. Making Hawai'i's Kids Count. Issue Paper Number 3.

    ERIC Educational Resources Information Center

    Hawaii Univ., Manoa. Center on the Family.

    This issue paper from Hawai'i Kids Count addresses the issue of teen pregnancy and birth rates. The paper notes that teen pregnancy and birth rates are declining both nationally and in Hawaii and describes key risk factors associated with having a baby before age 20: (1) early school failure; (2) early behavioral problems; (3) family dysfunction;…

  18. A new approach to counting measurements: Addressing the problems with ISO-11929

    NASA Astrophysics Data System (ADS)

    Klumpp, John; Miller, Guthrie; Poudel, Deepesh

    2018-06-01

    We present an alternative approach to making counting measurements of radioactivity which offers probabilistic interpretations of the measurements. Unlike the approach in the current international standard (ISO-11929), our approach, which uses an assumed prior probability distribution of the true amount in the sample, is able to answer the question of interest for most users of the standard: "what is the probability distribution of the true amount in the sample, given the data?" The final interpretation of the measurement requires information not necessarily available at the measurement stage. However, we provide an analytical formula for what we term the "measurement strength" that depends only on measurement-stage count quantities. We show that, when the sources are rare, the posterior odds that the sample true value exceeds ε are the measurement strength times the prior odds, independently of ε, the prior odds, and the distribution of the calibration coefficient. We recommend that the measurement lab immediately follow up on unusually high samples using an "action threshold" on the measurement strength which is similar to the decision threshold recommended by the current standard. We further recommend that the measurement lab perform large background studies in order to characterize non-constancy of background, including possible time correlation of background.
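
    The odds-update logic can be illustrated with a likelihood-ratio "strength" for Poisson counts. This is only a stand-in for the paper's analytical measurement-strength formula; the background and signal expectations below are invented for the example:

    ```python
    from math import exp, factorial

    def poisson_pmf(k: int, lam: float) -> float:
        return lam ** k * exp(-lam) / factorial(k)

    def strength(n_gross: int, bkg: float, sig: float) -> float:
        """Likelihood ratio of the observed gross count under
        'background + source' vs 'background only' -- an illustrative
        substitute for the paper's measurement strength."""
        return poisson_pmf(n_gross, bkg + sig) / poisson_pmf(n_gross, bkg)

    # posterior odds = measurement strength x prior odds
    prior_odds = 1e-3           # sources are rare
    posterior_odds = strength(12, bkg=5.0, sig=5.0) * prior_odds
    ```

    A count well above the expected background gives strength greater than 1 (evidence for a source), while a count near or below background gives strength below 1, so multiplying by the prior odds shifts belief in the right direction in both cases.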

  19. Development of a Photon Counting System for Differential Lidar Signal Detection

    NASA Technical Reports Server (NTRS)

    Elsayed-Ali, Hani

    1997-01-01

    Photon counting has been chosen as a means to extend the detection range of current airborne DIAL ozone measurements. Lidar backscattered return signals from the on and off-line lasers experience a significant exponential decay. To extract further data from the decaying ozone return signals, photon counting will be used to measure the low light levels, thus extending the detection range. In this application, photon counting will extend signal measurement where the analog return signal is too weak. The current analog measurement range is limited to approximately 25 kilometers from an aircraft flying at 12 kilometers. Photon counting will be able to exceed the current measurement range so as to follow the mid-latitude model of ozone density as a function of height. This report describes the development of a photon counting system. The initial development phase begins with detailed evaluation of individual photomultiplier tubes. The PMT qualities investigated are noise count rates, single electron response peaks, voltage versus gain values, saturation effects, and output signal linearity. These evaluations are followed by analysis of two distinctive tube base gating schemes. The next phase is to construct and operate a photon counting system in a laboratory environment. The laboratory counting simulations are used to determine optimum discriminator setpoints and to continue further evaluations of PMT properties. The final step in the photon counting system evaluation process is the compiling of photon counting measurements on the existing ozone DIAL laser system.

  20. Comparative analysis of dose rates in bricks determined by neutron activation analysis, alpha counting and X-ray fluorescence analysis for the thermoluminescence fine grain dating method

    NASA Astrophysics Data System (ADS)

    Bártová, H.; Kučera, J.; Musílek, L.; Trojek, T.

    2014-11-01

    In order to evaluate the age from the equivalent dose and to obtain an optimized and efficient procedure for thermoluminescence (TL) dating, it is necessary to obtain the values of both the internal and the external dose rates from dated samples and from their environment. The measurements described and compared in this paper refer to bricks from historic buildings and a fine-grain dating method. The external doses are therefore negligible, if the samples are taken from a sufficient depth in the wall. However, both the alpha dose rate and the beta and gamma dose rates must be taken into account in the internal dose. The internal dose rate to fine-grain samples is caused by the concentrations of natural radionuclides 238U, 235U, 232Th and members of their decay chains, and by 40K concentrations. Various methods can be used for determining trace concentrations of these natural radionuclides and their contributions to the dose rate. The dose rate fraction from 238U and 232Th can be calculated, e.g., from the alpha count rate, or from the concentrations of 238U and 232Th, measured by neutron activation analysis (NAA). The dose rate fraction from 40K can be calculated from the concentration of potassium measured, e.g., by X-ray fluorescence analysis (XRF) or by NAA. Alpha counting and XRF are relatively simple and are accessible for an ordinary laboratory. NAA can be considered as a more accurate method, but it is more demanding regarding time and costs, since it needs a nuclear reactor as a neutron source. A comparison of these methods allows us to decide whether the time- and cost-saving simpler techniques introduce uncertainty that is still acceptable.

  1. Kids Count Data Book, 2003: State Profiles of Child Well-Being.

    ERIC Educational Resources Information Center

    O'Hare, William P.

    This Kids Count data book examines national and statewide trends in the well being of the nation's children. Statistical portraits are based on 10 indicators of well being: (1) percent of low birth weight babies; (2) infant mortality rate; (3) child death rate; (4) rate of teen deaths by accident, homicide, and suicide; (5) teen birth rate; (6)…

  2. KIDS COUNT Data Book, 2002: State Profiles of Child Well-Being.

    ERIC Educational Resources Information Center

    O'Hare, William P.

    This KIDS COUNT data book examines national and statewide trends in the well being of the nation's children. Statistical portraits are based on 10 indicators of well being: (1) percent of low birth weight babies; (2) infant mortality rate; (3) child death rate; (4) rate of teen deaths by accident, homicide, and suicide; (5) teen birth rate; (6)…

  3. KIDS COUNT Data Book, 2001: State Profiles of Child Well-Being.

    ERIC Educational Resources Information Center

    Annie E. Casey Foundation, Baltimore, MD.

    This Kids Count report examines national and statewide trends in the well-being of the nation's children. The statistical portrait is based on 10 indicators of well being: (1) percent of low birth weight babies; (2) infant mortality rate; (3) child death rate; (4) rate of teen deaths by accident, homicide and suicide; (5) teen birth rate; (6)…

  4. Background sources at PEP

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lynch, H.; Schwitters, R.F.; Toner, W.T.

    Important sources of background for PEP experiments are studied. Background particles originate from high-energy electrons and positrons which have been lost from stable orbits, gamma-rays emitted by the primary beams through bremsstrahlung in the residual gas, and synchrotron radiation x-rays. The effects of these processes on the beam lifetime are calculated and estimates of background rates at the interaction region are given. Recommendations for the PEP design, aimed at minimizing background, are presented. 7 figs., 4 tabs.

  5. Current automated 3D cell detection methods are not a suitable replacement for manual stereologic cell counting

    PubMed Central

    Schmitz, Christoph; Eastwood, Brian S.; Tappan, Susan J.; Glaser, Jack R.; Peterson, Daniel A.; Hof, Patrick R.

    2014-01-01

    Stereologic cell counting has had a major impact on the field of neuroscience. A major bottleneck in stereologic cell counting is that the user must manually decide whether or not each cell is counted according to three-dimensional (3D) stereologic counting rules by visual inspection within hundreds of microscopic fields-of-view per investigated brain or brain region. Reliance on visual inspection forces stereologic cell counting to be very labor-intensive and time-consuming, and is the main reason why biased, non-stereologic two-dimensional (2D) “cell counting” approaches have remained in widespread use. We present an evaluation of the performance of modern automated cell detection and segmentation algorithms as a potential alternative to the manual approach in stereologic cell counting. The image data used in this study were 3D microscopic images of thick brain tissue sections prepared with a variety of commonly used nuclear and cytoplasmic stains. The evaluation compared the numbers and locations of cells identified unambiguously and counted exhaustively by an expert observer with those found by three automated 3D cell detection algorithms: nuclei segmentation from the FARSIGHT toolkit, nuclei segmentation by 3D multiple level set methods, and the 3D object counter plug-in for ImageJ. Of these methods, FARSIGHT performed best, with true-positive detection rates between 38 and 99% and false-positive rates from 3.6 to 82%. The results demonstrate that the current automated methods suffer from lower detection rates and higher false-positive rates than are acceptable for obtaining valid estimates of cell numbers. Thus, at present, stereologic cell counting with manual decision for object inclusion according to unbiased stereologic counting rules remains the only adequate method for unbiased cell quantification in histologic tissue sections. PMID:24847213

  6. A video-based real-time adaptive vehicle-counting system for urban roads.

    PubMed

    Liu, Fei; Zeng, Zhiyuan; Jiang, Rong

    2017-01-01

    In developing nations, many expanding cities are facing challenges that result from the overwhelming numbers of people and vehicles. Collecting real-time, reliable and precise traffic flow information is crucial for urban traffic management. The main purpose of this paper is to develop an adaptive model that can assess the real-time vehicle counts on urban roads using computer vision technologies. This paper proposes an automatic real-time background update algorithm for vehicle detection and an adaptive pattern for vehicle counting based on the virtual loop and detection line methods. In addition, a new robust detection method is introduced to monitor the real-time traffic congestion state of a road section. A prototype system has been developed and installed on an urban road for testing. The results show that the system is robust, with a real-time counting accuracy exceeding 99% in most field scenarios.
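
    The two building blocks named here, a background update for vehicle detection and a virtual detection line, can be sketched as follows. The exponential running-average update and the threshold are assumed models; the abstract does not specify the paper's exact rules:

    ```python
    import numpy as np

    def update_background(bg: np.ndarray, frame: np.ndarray, alpha: float = 0.05):
        """Exponential running average: a common adaptive background
        model that slowly absorbs lighting changes."""
        return (1 - alpha) * bg + alpha * frame

    def line_occupied(bg: np.ndarray, frame: np.ndarray, line_row: int,
                      thresh: int = 30):
        """Which pixels on the virtual detection line differ from the
        background model by more than `thresh` gray levels."""
        diff = np.abs(frame[line_row].astype(int) - bg[line_row].astype(int))
        return diff > thresh

    bg = np.zeros((4, 4))
    frame = np.zeros((4, 4))
    frame[2, 1] = 255          # a bright "vehicle" pixel on row 2
    mask = line_occupied(bg, frame, line_row=2)
    bg = update_background(bg, frame)
    ```

    In a full counter, a vehicle would be registered on the rising edge of line occupancy across consecutive frames, so that one vehicle crossing the line is counted once rather than once per frame.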

  7. A video-based real-time adaptive vehicle-counting system for urban roads

    PubMed Central

    2017-01-01

    In developing nations, many expanding cities are facing challenges that result from the overwhelming numbers of people and vehicles. Collecting real-time, reliable and precise traffic flow information is crucial for urban traffic management. The main purpose of this paper is to develop an adaptive model that can assess the real-time vehicle counts on urban roads using computer vision technologies. This paper proposes an automatic real-time background update algorithm for vehicle detection and an adaptive pattern for vehicle counting based on the virtual loop and detection line methods. In addition, a new robust detection method is introduced to monitor the real-time traffic congestion state of a road section. A prototype system has been developed and installed on an urban road for testing. The results show that the system is robust, with a real-time counting accuracy exceeding 99% in most field scenarios. PMID:29135984

  8. Pollen count and presentation of angiotensin-converting enzyme inhibitor-associated angioedema.

    PubMed

    Straka, Brittany; Nian, Hui; Sloan, Chantel; Byrd, James Brian; Woodard-Grice, Alencia; Yu, Chang; Stone, Elizabeth; Steven, Gary; Hartert, Tina; Teo, Koon K; Pare, Guillaume; McCarty, Catherine A; Brown, Nancy J

    2013-01-01

    The incidence of angiotensin-converting enzyme (ACE) inhibitor-associated angioedema is increased in patients with seasonal allergies. We tested the hypothesis that patients with ACE inhibitor-associated angioedema present during months when pollen counts are increased. Cohort analysis examined the month of presentation of ACE inhibitor-associated angioedema and pollen counts in the ambulatory and hospital setting. Patients with ACE inhibitor-associated angioedema were ascertained through (1) an observational study of patients presenting to Vanderbilt University Medical Center, (2) patients presenting to the Marshfield Clinic and participating in the Marshfield Clinic Personalized Medicine Research Project, and (3) patients enrolled in The Ongoing Telmisartan Alone and in Combination with Ramipril Global Endpoint Trial (ONTARGET). Measurements include date of presentation of ACE inhibitor-associated angioedema, population exposure to ACE inhibitor by date, and local pollen counts by date. At Vanderbilt, the rate of angioedema was significantly associated with tree pollen months (P = .01 from χ² test). When separate analyses were conducted in patients with a history of seasonal allergies and patients without, the rate of ACE inhibitor-associated angioedema was increased during tree pollen months only in patients with a history of seasonal allergies (P = .002). In Marshfield, the rate of angioedema was significantly associated with ragweed pollen months (P = .025). In ONTARGET, a positive trend was observed between the ACE inhibitor-associated angioedema rate and grass season, although it was not statistically significant (P = .057). Patients with ACE inhibitor-associated angioedema are more likely to present with this adverse drug event during months when pollen counts are increased. Copyright © 2013 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  9. All Our Children: Massachusetts Kids Count 1994.

    ERIC Educational Resources Information Center

    Diamond, Franna, Ed.

    This Kids Count report examines statewide trends from 1990 to 1994 in the well-being of Massachusetts' children. The statistical portrait is based on indicators of well-being in five areas: (1) economic well-being of children and their families, including child poverty rate, family income, job loss, earnings of male high school dropouts and…

  10. KIDS COUNT in Virginia: 1999 Data Book.

    ERIC Educational Resources Information Center

    Action Alliance for Virginia's Children and Youth, Richmond.

    This Kids Count data book examines statewide trends in the well-being of Virginia's children. The statistical portrait is based on five general areas of children's well being: health, safety, education, families, and economic factors. Key indicators in these five areas include: (1) prenatal care rates; (2) low birthweight; (3) child deaths; (4)…

  11. Relationship between platelet count and hemodialysis membranes

    PubMed Central

    Nasr, Rabih; Saifan, Chadi; Barakat, Iskandar; Azzi, Yorg Al; Naboush, Ali; Saad, Marc; Sayegh, Suzanne El

    2013-01-01

    Background One factor associated with poor outcomes in hemodialysis patients is exposure to a foreign membrane. Older membranes are very bioincompatible: they increase complement activation, cause leukocytosis by activating circulating factors that sequester leukocytes in the lungs, and activate platelets. Recently, newer membranes have been developed that were designed to be more biocompatible. We tested whether the different "optiflux" hemodialysis membranes had different effects on platelet levels. Methods Ninety-nine maintenance hemodialysis patients with no known systemic or hematologic diseases affecting their platelets had blood drawn immediately prior to, 90 minutes into, and immediately following their first hemodialysis session of the week. All patients were dialyzed using a Fresenius Medical Care Optiflux polysulfone membrane F160, F180, or F200 (polysulfone synthetic dialyzer membranes, 1.6 m2, 1.8 m2, and 2.0 m2 surface area, respectively, electron beam sterilized). Platelet counts were measured in each sample using a CBC analyzer. Results The average age of the patients was 62.7 years; 36 were female and 63 were male. The mean platelet counts pre, mid, and post dialysis were 193 (standard deviation ±74.86), 191 (standard deviation ±74.67), and 197 (standard deviation ±79.34) thousand/mm3, respectively, with no statistically significant differences. Conclusion Newer membranes have no significant effect on platelet count. This suggests that they are, in fact, more biocompatible than their predecessors and may explain their association with increased survival. PMID:23983482

  12. Inventory count strategies.

    PubMed

    Springer, W H

    1996-02-01

    An important principle of accounting is that asset inventory needs to be correctly valued to ensure that the financial statements of the institution are accurate. Errors in recording the value of ending inventory in one fiscal year result in errors in the published financial statements for that year as well as for the subsequent fiscal year. Therefore, it is important that accurate physical counts be taken periodically. It is equally important that any system being used to generate inventory valuation, reordering or management reports be based on consistently accurate on-hand balances. At the foundation of conducting an accurate physical count of an inventory is a comprehensive understanding of the process, coupled with a written plan. This article presents a guideline for the physical count processes involved in a traditional double-count approach.
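
    The traditional double-count approach can be sketched as a simple reconciliation: two teams count each item independently, matching counts are accepted, and mismatches are routed to a recount list. The item names below are illustrative, not from the article:

    ```python
    def reconcile_double_count(count_a: dict, count_b: dict):
        """Accept items where two independent physical counts agree;
        flag disagreements for a third, deciding recount."""
        accepted, recount = {}, []
        for item, qty in count_a.items():
            if count_b.get(item) == qty:
                accepted[item] = qty
            else:
                recount.append(item)
        return accepted, recount

    accepted, recount = reconcile_double_count(
        {"gauze": 50, "sutures": 12},   # team A's counts
        {"gauze": 50, "sutures": 14},   # team B's counts
    )
    ```

    Because two independent counts rarely make the same error on the same item, agreement is strong evidence the on-hand balance is correct, while a mismatch cheaply localizes where the recount effort should go.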

  13. Fluorescein dye improves microscopic evaluation and counting of demodex in blepharitis with cylindrical dandruff.

    PubMed

    Kheirkhah, Ahmad; Blanco, Gabriela; Casas, Victoria; Tseng, Scheffer C G

    2007-07-01

    To show whether fluorescein dye helps detect and count Demodex embedded in cylindrical dandruff (CD) of epilated eyelashes from patients with blepharitis. Two eyelashes with CD were removed from each lid of 10 consecutive patients with blepharitis and subjected to microscopic examination with and without fluorescein solution to detect and count Demodex mites. Of 80 eyelashes examined, 36 (45%) lashes retained their CD after removal. Before addition of the fluorescein solution, the mean total Demodex count per patient was 14.9 +/- 10 and the mean Demodex count per lash was 3.1 +/- 2.5 and 0.8 +/- 0.7 in epilated eyelashes with and without retained CD, respectively (P < 0.0001). After addition of the fluorescein solution, opaque and compact CD instantly expanded to reveal embedded mites in a yellowish and semitransparent background. As a result, the mean total Demodex count per patient was significantly increased to 20.2 +/- 13.8 (P = 0.003), and the mean count per lash was significantly increased to 4.4 +/- 2.8 and 1 +/- 0.8 in eyelashes with and without retained CD (P < 0.0001 and P = 0.007), respectively. This new method yielded more mites in 8 of 10 patients and allowed mites to be detected in 3 lashes with retained CD and 1 lash without retained CD that had an initial count of zero. Addition of fluorescein solution after mounting further increases the proficiency of detecting and counting mites embedded in CD of epilated eyelashes.

  14. Metal ion levels and lymphocyte counts: ASR hip resurfacing prosthesis vs. standard THA

    PubMed Central

    2013-01-01

    Background and purpose Wear particles from metal-on-metal arthroplasties are under suspicion for adverse effects both locally and systemically, and the DePuy ASR Hip Resurfacing System (RHA) has above-average failure rates. We compared lymphocyte counts in RHA and total hip arthroplasty (THA) and investigated whether cobalt and chromium ions affected the lymphocyte counts. Method In a randomized controlled trial, we followed 19 RHA patients and 19 THA patients. Lymphocyte subsets and chromium and cobalt ion concentrations were measured at baseline, at 8 weeks, at 6 months, and at 1 and 2 years. Results The T-lymphocyte counts for both implant types declined over the 2-year period. This decline was statistically significant for CD3+CD8+ in the THA group, with a regression coefficient of -0.04 × 10^9 cells/year (95% CI: -0.08 to -0.01). Regression analysis indicated a depressive effect of cobalt ions in particular on T-cells, with 2-year whole-blood cobalt regression coefficients for CD3+ of -0.10 (95% CI: -0.16 to -0.04) × 10^9 cells per parts per billion (ppb), for CD3+CD4+ of -0.06 (-0.09 to -0.03) × 10^9 cells/ppb, and for CD3+CD8+ of -0.02 (-0.03 to -0.00) × 10^9 cells/ppb. Interpretation Circulating T-lymphocyte levels may decline after surgery, regardless of implant type. Metal ions, particularly cobalt, may have a general depressive effect on T- and B-lymphocyte levels. Registered with ClinicalTrials.gov under # NCT01113762 PMID:23597114

  15. Quantitative basis for component factors of gas flow proportional counting efficiencies

    NASA Astrophysics Data System (ADS)

    Nichols, Michael C.

    This dissertation investigates the counting efficiency calibration of a gas flow proportional counter with beta-particle emitters in order to (1) determine by measurements and simulation the values of the component factors of beta-particle counting efficiency for a proportional counter, (2) compare the simulation results and measured counting efficiencies, and (3) determine the uncertainty of the simulation and measurements. Monte Carlo simulation results from the MCNP5 code were compared with measured counting efficiencies as a function of sample thickness for 14C, 89Sr, 90Sr, and 90Y. The Monte Carlo model simulated strontium carbonate with areal thicknesses from 0.1 to 35 mg cm-2. The samples were precipitated as strontium carbonate with areal thicknesses from 3 to 33 mg cm-2, mounted on membrane filters, and counted on a low background gas flow proportional counter. The estimated fractional standard deviation was 2-4% (except 6% for 14C) for efficiency measurements of the radionuclides. The Monte Carlo simulations have uncertainties estimated to be 5 to 6 percent for carbon-14 and 2.4 percent for strontium-89, strontium-90, and yttrium-90. The curves of simulated counting efficiency vs. sample areal thickness agreed within 3% with the curves of best fit drawn through the 25-49 measured points for each of the four radionuclides. Contributions from this research include development of uncertainty budgets for the analytical processes; evaluation of alternative methods for determining chemical yield, which is critical to the measurement process; correcting a bias found in the MCNP normalization of the beta-spectrum histograms; clarifying the interpretation of the commonly used ICRU beta-particle spectra for use by MCNP; and evaluation of instrument parameters as applied to the simulation model to obtain estimates of the counting efficiency from simulated pulse height tallies.
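
    The qualitative behavior being calibrated, counting efficiency falling with sample areal thickness because beta particles born deeper in the source are absorbed before escaping, can be shown with a toy Monte Carlo. The attenuation coefficient and particle count below are illustrative assumptions, not values fitted in the dissertation:

    ```python
    import math
    import random

    def simulated_efficiency(areal_mg_cm2: float, mu: float = 0.05,
                             n: int = 20000, seed: int = 1) -> float:
        """Toy MC: betas are born uniformly in depth within a source of
        the given areal thickness (mg cm^-2) and escape with probability
        exp(-mu * path); mu is an illustrative mass-attenuation-style
        coefficient, not a physical constant for any nuclide here."""
        rng = random.Random(seed)
        escaped = sum(
            math.exp(-mu * rng.uniform(0.0, areal_mg_cm2)) > rng.random()
            for _ in range(n)
        )
        return escaped / n

    eff_thin = simulated_efficiency(1.0)    # thin source, little self-absorption
    eff_thick = simulated_efficiency(30.0)  # thick source, strong self-absorption
    ```

    A real calibration like the one described, using MCNP5, additionally transports the full beta spectrum, the filter and counter geometry, and the detector response, which is why its uncertainty budget is far more involved than this sketch.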

  16. Uncertainties in internal gas counting

    NASA Astrophysics Data System (ADS)

    Unterweger, M.; Johansson, L.; Karam, L.; Rodrigues, M.; Yunoki, A.

    2015-06-01

    The uncertainties in internal gas counting will be broken down into counting uncertainties and gas handling uncertainties. Counting statistics, spectrum analysis, and electronic uncertainties will be discussed with respect to the actual counting of the activity. The effects of the gas handling and quantities of counting and sample gases on the uncertainty in the determination of the activity will be included when describing the uncertainties arising in the sample preparation.

  17. County Data Book 1997: Kentucky Kids Count.

    ERIC Educational Resources Information Center

    Kentucky Kids Count Consortium.

    This Kids Count data book examines trends in the well-being of Kentucky's children on a statewide and county basis. An introduction summarizes some of the trends for Kentucky's children in the 1990s. The bulk of the report presents statewide and county data grouped into five categories: (1) poverty rates and programs (persons in poverty; median…

  18. Low background signal readout electronics for the Majorana Demonstrator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guinn, I.; Abgrall, N.; Avignone, F. T.

    The Majorana Demonstrator is a planned 40 kg array of Germanium detectors intended to demonstrate the feasibility of constructing a tonne-scale experiment that will seek neutrinoless double beta decay (0νββ) in 76Ge. Such an experiment would require backgrounds of less than 1 count/tonne-year in the 4 keV region of interest around the 2039 keV Q-value of the ββ decay. Designing low-noise electronics, which must be placed in close proximity to the detectors, presents a challenge to reaching this background target. This paper will discuss the Majorana collaboration's solutions to some of these challenges.

  19. Low background signal readout electronics for the Majorana Demonstrator

    DOE PAGES

    Guinn, I.; Abgrall, N.; Avignone, F. T.; ...

    2015-05-01

    The Majorana Demonstrator is a planned 40 kg array of Germanium detectors intended to demonstrate the feasibility of constructing a tonne-scale experiment that will seek neutrinoless double beta decay (0νββ) in 76Ge. Such an experiment would require backgrounds of less than 1 count/tonne-year in the 4 keV region of interest around the 2039 keV Q-value of the ββ decay. Designing low-noise electronics, which must be placed in close proximity to the detectors, presents a challenge to reaching this background target. This paper will discuss the Majorana collaboration's solutions to some of these challenges.

  20. Low Background Signal Readout Electronics for the MAJORANA DEMONSTRATOR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guinn, I.; Abgrall, N.; Avignone, III, F. T.

    The MAJORANA DEMONSTRATOR is a planned 40 kg array of Germanium detectors intended to demonstrate the feasibility of constructing a tonne-scale experiment that will seek neutrinoless double beta decay (0 nu beta beta) in Ge-76. Such an experiment would require backgrounds of less than 1 count/tonne-year in the 4 keV region of interest around the 2039 keV Q-value of the beta beta decay. Designing low-noise electronics, which must be placed in close proximity to the detectors, presents a challenge to reaching this background target. This paper will discuss the MAJORANA collaboration's solutions to some of these challenges.

  1. Musculoskeletal imaging with a prototype photon-counting detector.

    PubMed

    Gruber, M; Homolka, P; Chmeissani, M; Uffmann, M; Pretterklieber, M; Kainberger, F

    2012-01-01

    To test a digital imaging X-ray device based on the direct capture of X-ray photons with pixel detectors, which are coupled with photon-counting readout electronics. The chip consists of a matrix of 256 × 256 pixels with a pixel pitch of 55 μm. A monolithic image of 11.2 cm × 7 cm was obtained by the consecutive displacement approach. Images of embalmed anatomical specimens of eight human hands were obtained at four different dose levels (skin dose 2.4, 6, 12, 25 μGy) with the new detector, as well as with a flat-panel detector. The overall rating scores for the evaluated anatomical regions ranged from 5.23 at the lowest dose level, 6.32 at approximately 6 μGy, 6.70 at 12 μGy, to 6.99 at the highest dose level with the photon-counting system. The corresponding rating scores for the flat-panel detector were 3.84, 5.39, 6.64, and 7.34. When images obtained at the same dose were compared, the new system outperformed the conventional DR system at the two lowest dose levels. At the higher dose levels, there were no significant differences between the two systems. The photon-counting detector has great potential to obtain musculoskeletal images of excellent quality at very low dose levels.

  2. Evolution of High Tooth Replacement Rates in Sauropod Dinosaurs

    PubMed Central

    Smith, Kathlyn M.; Fisher, Daniel C.; Wilson, Jeffrey A.

    2013-01-01

    Background Tooth replacement rate can be calculated in extinct animals by counting incremental lines of deposition in tooth dentin. Calculating this rate in several taxa allows for the study of the evolution of tooth replacement rate. Sauropod dinosaurs, the largest terrestrial animals that ever evolved, exhibited a diversity of tooth sizes and shapes, but little is known about their tooth replacement rates. Methodology/Principal Findings We present tooth replacement rate, formation time, crown volume, total dentition volume, and enamel thickness for two coexisting but distantly related and morphologically disparate sauropod dinosaurs Camarasaurus and Diplodocus. Individual tooth formation time was determined by counting daily incremental lines in dentin. Tooth replacement rate is calculated as the difference between the number of days recorded in successive replacement teeth. Each tooth family in Camarasaurus has a maximum of three replacement teeth, whereas each Diplodocus tooth family has up to five. Tooth formation times are about 1.7 times longer in Camarasaurus than in Diplodocus (315 vs. 185 days). Average tooth replacement rate in Camarasaurus is about one tooth every 62 days versus about one tooth every 35 days in Diplodocus. Despite slower tooth replacement rates in Camarasaurus, the volumetric rate of Camarasaurus tooth replacement is 10 times faster than in Diplodocus because of its substantially greater tooth volumes. A novel method to estimate replacement rate was developed and applied to several other sauropodomorphs that we were not able to thin section. Conclusions/Significance Differences in tooth replacement rate among sauropodomorphs likely reflect disparate feeding strategies and/or food choices, which would have facilitated the coexistence of these gigantic herbivores in one ecosystem. Early neosauropods are characterized by high tooth replacement rates (despite their large tooth size), and derived titanosaurs and diplodocoids independently

  3. 18F-FDG positron autoradiography with a particle counting silicon pixel detector.

    PubMed

    Russo, P; Lauria, A; Mettivier, G; Montesi, M C; Marotta, M; Aloj, L; Lastoria, S

    2008-11-07

    We report on tests of a room-temperature particle counting silicon pixel detector of the Medipix2 series as the detector unit of a positron autoradiography (AR) system, for samples labelled with the 18F-FDG radiopharmaceutical used in PET studies. The silicon detector (1.98 cm2 sensitive area, 300 μm thick) has high intrinsic resolution (55 μm pitch) and works by counting all hits in a pixel above a certain energy threshold. The present work extends the detector characterization with 18F-FDG of a previous paper. We analysed the system's linearity, dynamic range, sensitivity, background count rate, noise, and its imaging performance on biological samples. Tests have been performed in the laboratory with 18F-FDG drops (37-37,000 Bq initial activity) and ex vivo in a rat injected with 88.8 MBq of 18F-FDG. Particles interacting in the detector volume produced a hit in a cluster of pixels whose mean size was 4.3 pixels/event at an 11 keV threshold and 2.2 pixels/event at a 37 keV threshold. Results show a sensitivity for β+ of 0.377 cps Bq-1, a dynamic range of at least five orders of magnitude and a lower detection limit of 0.0015 Bq mm-2. Real-time 18F-FDG positron AR images have been obtained in 500-1000 s exposure times of thin (10-20 μm) slices of a rat brain and compared with 20 h film autoradiography of adjacent slices. The analysis of the image contrast and signal-to-noise ratio in a rat brain slice indicated that Poisson noise-limited imaging can be approached in short (e.g. 100 s) exposures, with approximately 100 Bq slice activity, and that the silicon pixel detector produced a higher image quality than film-based AR.

  4. Radon Detection and Counting

    NASA Astrophysics Data System (ADS)

    Peterson, David

    2004-11-01

    One of the daughter products of the naturally occurring U 238 decay chain is the colorless, odorless, inert gas radon. The daughter products of the radon, from Po 218 through Po 214, can remain in the lungs after breathing radon that has diffused into the atmosphere. Radon testing of homes before sale or purchase is necessary in many parts of the U.S. Testing can be accomplished by the simple procedure of exposing a canister of activated charcoal to the ambient air. Radon atoms in the air are adsorbed onto the surface of the charcoal, which is then sealed in the canister. Gamma rays of the daughter products of the radon, in particular Pb 214 and Bi 214, can then be detected in a low-background counting system. Radon remediation procedures are encouraged for radon activities in the air greater than 4 pCi/L.
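    The 4 pCi/L action level can be converted to SI units with the definition 1 Ci = 3.7 × 10^10 Bq; a one-line sketch:

```python
# Convert the 4 pCi/L radon action level to SI units.
# 1 Ci = 3.7e10 Bq, so 1 pCi = 0.037 Bq; 1 L = 1e-3 m^3.
PCI_TO_BQ = 0.037
action_level_pci_per_l = 4.0
bq_per_m3 = action_level_pci_per_l * PCI_TO_BQ * 1000.0
print(f"{bq_per_m3:.0f} Bq/m^3")  # roughly 148 Bq per cubic meter
```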

  5. Rad-hard Dual-threshold High-count-rate Silicon Pixel-array Detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Adam

    In this program, a Voxtel-led team demonstrates a full-format (192 x 192, 100-µm pitch, VX-810) high-dynamic-range x-ray photon-counting sensor—the Dual Photon Resolved Energy Acquisition (DUPREA) sensor. Within the Phase II program the following tasks were completed: 1) system analysis and definition of the DUPREA sensor requirements; 2) design, simulation, and fabrication of the full-format VX-810 ROIC design; 3) design, optimization, and fabrication of thick, fully depleted silicon photodiodes optimized for x-ray photon collection; 4) hybridization of the VX-810 ROIC to the photodiode array in the creation of the optically sensitive focal-plane array; 5) development of an evaluation camera; and 6) electrical and optical characterization of the sensor.

  6. Enhanced trigger for the NIFFTE fissionTPC in presence of high-rate alpha backgrounds

    NASA Astrophysics Data System (ADS)

    Bundgaard, Jeremy; Niffte Collaboration

    2015-10-01

    Nuclear physics and nuclear energy communities call for new, high precision measurements to improve existing fission models and design next generation reactors. The Neutron Induced Fission Fragment Tracking experiment (NIFFTE) has developed the fission Time Projection Chamber (fissionTPC) to measure neutron induced fission with unrivaled precision. The fissionTPC is annually deployed to the Weapons Neutron Research facility at Los Alamos Neutron Science Center where it operates with a neutron beam passing axially through the drift volume, irradiating heavy actinide targets to induce fission. The fissionTPC was developed at the Lawrence Livermore National Laboratory's TPC lab, where it measures spontaneous fission from radioactive sources to characterize detector response, improve performance, and evolve the design. To measure 244Cm, we've developed a fission trigger to reduce the data rate from alpha tracks while maintaining a high fission detection efficiency. In beam, alphas from 239Pu are a large background when detecting fission fragments; implementing the fission trigger will greatly reduce this background. The implementation of the cathode fission trigger in the fissionTPC will be presented along with a detailed study of its efficiency.

  7. Multi-Parameter Linear Least-Squares Fitting to Poisson Data One Count at a Time

    NASA Technical Reports Server (NTRS)

    Wheaton, W.; Dunklee, A.; Jacobson, A.; Ling, J.; Mahoney, W.; Radocinski, R.

    1993-01-01

    A standard problem in gamma-ray astronomy data analysis is the decomposition of a set of observed counts, described by Poisson statistics, according to a given multi-component linear model, with underlying physical count rates or fluxes which are to be estimated from the data.
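    A minimal sketch of this kind of fit, reduced to a single source rate with a known response and background (all numbers invented, and Newton iteration standing in for the paper's method):

```python
# Hedged sketch (all numbers assumed): maximum-likelihood estimate of a single
# source rate `a` in the linear Poisson model mu_i = a*f_i + b_i, where f_i is
# a known instrument response and b_i a known background expectation, found by
# Newton iteration on the derivative of the Poisson log-likelihood.
f = [0.2, 0.5, 0.8, 0.5, 0.2]   # response of each channel (assumed)
b = [3.0, 3.0, 3.0, 3.0, 3.0]   # expected background counts per channel
n = [4, 8, 12, 7, 3]            # observed counts per channel

a = 1.0
for _ in range(50):
    # d/da log L = sum_i [ n_i f_i / (a f_i + b_i) - f_i ]
    grad = sum(ni * fi / (a * fi + bi) - fi for ni, fi, bi in zip(n, f, b))
    hess = -sum(ni * fi**2 / (a * fi + bi) ** 2 for ni, fi, bi in zip(n, f, b))
    a -= grad / hess
print(f"maximum-likelihood source rate: {a:.2f}")
```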

  8. Studies of the extreme ultraviolet/soft x-ray background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stern, R.A.

    1978-01-01

    The results of an extensive sky survey of the extreme ultraviolet (EUV)/soft x-ray background are reported. The data were obtained with a focusing telescope designed and calibrated at U.C. Berkeley, which observed EUV sources and the diffuse background as part of the Apollo-Soyuz mission in July 1975. With a primary field-of-view of 2.3 ± 0.1° FWHM and four EUV bandpass filters (16 to 25, 20 to 73, 80 to 108, and 80 to 250 eV), the EUV telescope obtained background data included in the final observational sample for 21 discrete sky locations and 11 large angular scans, as well as for a number of shorter observations. Analysis of the data reveals an intense flux above 80 eV energy, with upper limits to the background intensity given for the lower energy filters (ca. 2 x 10^4 and 6 x 10^2 ph cm^-2 s^-1 sr^-1 eV^-1 at 21 and 45 eV, respectively). The 80 to 108 eV flux agrees within statistical errors with the earlier results of Cash, Malina and Stern (1976): the Apollo-Soyuz average reported intensity is 4.0 ± 1.3 ph cm^-2 s^-1 sr^-1 eV^-1 at ca. 100 eV, or roughly a factor of ten higher than the corresponding 250 eV intensity. The uniformity of the background flux is uncertain due to limitations in the statistical accuracy of the data; upper limits to the point-to-point standard deviation of the background intensity are ΔI/I ≲ 0.8 ± 0.4 (80 to 108 eV) and ≲ 0.4 ± 0.2 (80 to 250 eV). No evidence is found for a correlation between the telescope count rate and earth-based parameters (zenith angle, sun angle, etc.) for E ≳ 80 eV (the lower energy bandpasses are significantly affected by scattered solar radiation). Unlike some previous claims for the soft x-ray background, no simple dependence upon galactic latitude is seen.

  9. Associations between CXCR1 polymorphisms and pathogen-specific incidence rate of clinical mastitis, test-day somatic cell count, and test-day milk yield.

    PubMed

    Verbeke, Joren; Van Poucke, Mario; Peelman, Luc; Piepers, Sofie; De Vliegher, Sarne

    2014-12-01

    The CXCR1 gene plays an important role in the innate immunity of the bovine mammary gland. Associations between single nucleotide polymorphisms (SNP) CXCR1c.735C>G and c.980A>G and udder health have been identified before in small populations. A fluorescent multiprobe PCR assay was designed specifically and validated to genotype both SNP simultaneously in a reliable and cost-effective manner. In total, 3,106 cows from 50 commercial Flemish dairy herds were genotyped using this assay. Associations between genotype and detailed phenotypic data, including pathogen-specific incidence rate of clinical mastitis (IRCM), test-day somatic cell count, and test-day milk yield (MY) were analyzed. Staphylococcus aureus IRCM tended to associate with SNP c.735C>G. Cows with genotype c.735GG had lower Staph. aureus IRCM compared with cows with genotype c.735CC (rate ratio = 0.35, 95% confidence interval = 0.14–0.90). Additionally, a parity-specific association between Staph. aureus IRCM and SNP c.980A>G was detected. Heifers with genotype c.980GG had a lower Staph. aureus IRCM compared with heifers with genotype c.980AG (rate ratio = 0.15, 95% confidence interval = 0.04–0.56). Differences were less pronounced in multiparous cows. Associations between CXCR1 genotype and somatic cell count were not detected. However, MY was associated with SNP c.735C>G. Cows with genotype c.735GG out-produced cows with genotype c.735CC by 0.8 kg of milk/d. Results provide a basis for further research on the relation between CXCR1 polymorphism and pathogen-specific mastitis resistance and MY.

  10. A simulator for airborne laser swath mapping via photon counting

    NASA Astrophysics Data System (ADS)

    Slatton, K. C.; Carter, W. E.; Shrestha, R.

    2005-06-01

    Commercially marketed airborne laser swath mapping (ALSM) instruments currently use laser rangers with sufficient energy per pulse to work with return signals of thousands of photons per shot. The resulting high signal to noise level virtually eliminates spurious range values caused by noise, such as background solar radiation and sensor thermal noise. However, the high signal level approach requires laser repetition rates of hundreds of thousands of pulses per second to obtain contiguous coverage of the terrain at sub-meter spatial resolution, and with currently available technology, affords little scalability for significantly downsizing the hardware, or reducing the costs. A photon-counting ALSM sensor has been designed by the University of Florida and Sigma Space, Inc. for improved topographic mapping with lower power requirements and weight than traditional ALSM sensors. Major elements of the sensor design are presented along with preliminary simulation results. The simulator is being developed so that data phenomenology and target detection potential can be investigated before the system is completed. Early simulations suggest that precise estimates of terrain elevation and target detection will be possible with the sensor design.

  11. Background rates of adverse pregnancy outcomes for assessing the safety of maternal vaccine trials in sub-Saharan Africa.

    PubMed

    Orenstein, Lauren A V; Orenstein, Evan W; Teguete, Ibrahima; Kodio, Mamoudou; Tapia, Milagritos; Sow, Samba O; Levine, Myron M

    2012-01-01

    Maternal immunization has gained traction as a strategy to diminish maternal and young infant mortality attributable to infectious diseases. Background rates of adverse pregnancy outcomes are crucial to interpret results of clinical trials in Sub-Saharan Africa. We developed a mathematical model that calculates a clinical trial's expected number of neonatal and maternal deaths at an interim safety assessment based on the person-time observed during different risk windows. This model was compared to crude multiplication of the maternal mortality ratio and neonatal mortality rate by the number of live births. Systematic reviews of severe acute maternal morbidity (SAMM), low birth weight (LBW), prematurity, and major congenital malformations (MCM) in Sub-Saharan African countries were also performed. Accounting for the person-time observed during different risk periods yields lower, more conservative estimates of expected maternal and neonatal deaths, particularly at an interim safety evaluation soon after a large number of deliveries. Median incidence of SAMM in 16 reports was 40.7 (IQR: 10.6-73.3) per 1,000 total births, and the most common causes were hemorrhage (34%), dystocia (22%), and severe hypertensive disorders of pregnancy (22%). Proportions of liveborn infants who were LBW (median 13.3%, IQR: 9.9-16.4) or premature (median 15.4%, IQR: 10.6-19.1) were similar across geographic region, study design, and institutional setting. The median incidence of MCM per 1,000 live births was 14.4 (IQR: 5.5-17.6), with the musculoskeletal system comprising 30%. Some clinical trials assessing whether maternal immunization can improve pregnancy and young infant outcomes in the developing world have made ethics-based decisions not to use a pure placebo control. Consequently, reliable background rates of adverse pregnancy outcomes are necessary to distinguish between vaccine benefits and safety concerns. 
Local studies that quantify population-based background rates of adverse
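    The person-time adjustment described above can be sketched with invented numbers; the constant-hazard assumption across the risk window is for illustration only:

```python
# Illustrative sketch (assumed numbers): the expected number of neonatal deaths
# at an interim safety review depends on the person-time observed in the risk
# window, not just the number of live births.
neonatal_mortality_rate = 30 / 1000   # deaths per live birth over a 28-day window
births = 400                          # deliveries observed so far
mean_followup_days = 10               # mean follow-up per infant at the review
RISK_WINDOW_DAYS = 28

# Crude estimate: multiply the mortality rate by the number of births.
crude = neonatal_mortality_rate * births

# Person-time-adjusted estimate: scale by the fraction of the risk window
# actually observed (assumes a constant hazard across the 28 days).
adjusted = crude * mean_followup_days / RISK_WINDOW_DAYS
print(f"crude: {crude:.1f}, person-time adjusted: {adjusted:.2f}")
```

    As in the paper, the adjusted figure is lower and more conservative soon after a burst of deliveries.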

  12. Bayesian approach for counting experiment statistics applied to a neutrino point source analysis

    NASA Astrophysics Data System (ADS)

    Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.

    2013-12-01

    In this paper we present a model-independent analysis method following Bayesian statistics to analyse data from a generic counting experiment, and apply it to the search for neutrinos from point sources. We discuss a test statistic defined within a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation compares directly with a frequentist approach and is robust in the case of low-count observations. Furthermore, it also allows one to account for previous upper limits obtained by other analyses via the concept of prior information, without the need for ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube using the same data set.
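    A minimal sketch of a Bayesian upper limit for a counting experiment with known background and a flat signal prior (numbers invented; the paper's actual method handles priors and nuisance parameters more carefully):

```python
import math

# Sketch: Bayesian 90% upper limit on a Poisson signal rate s >= 0 with known
# expected background b, observed count n, and a flat prior on s.
# The numbers below are illustrative, not from the paper.
n, b = 5, 3.2

def posterior_unnorm(s):
    # Posterior density up to normalization: (s + b)^n * exp(-(s + b))
    mu = s + b
    return mu**n * math.exp(-mu)

# Normalize on a grid and find the 90% credible upper limit by
# rectangle-rule integration.
ds = 0.001
grid = [i * ds for i in range(int(30 / ds))]
weights = [posterior_unnorm(s) for s in grid]
total = sum(weights) * ds
cum, s90 = 0.0, None
for s, w in zip(grid, weights):
    cum += w * ds
    if cum >= 0.9 * total:
        s90 = s
        break
print(f"90% upper limit on the signal rate: {s90:.2f}")
```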

  13. Dropout Count Procedural Handbook.

    ERIC Educational Resources Information Center

    Nevada State Dept. of Education, Carson City. Planning, Research and Evaluation Branch.

    This manual outlines the procedure for counting dropouts from the Nevada schools. The State Department of Education instituted a new dropout counting procedure to its student accounting system in January 1988 as part of its response to recommendations of a task force on at-risk youth. The count is taken from each secondary school and includes…

  14. A new approach to counting measurements: Addressing the problems with ISO-11929

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John Allan; Poudel, Deepesh; Miller, Guthrie

    We present an alternative approach to making counting measurements of radioactivity which offers probabilistic interpretations of the measurements. Unlike the approach in the current international standard (ISO-11929), our approach, which uses an assumed prior probability distribution of the true amount in the sample, is able to answer the question of interest for most users of the standard: “what is the probability distribution of the true amount in the sample, given the data?” The final interpretation of the measurement requires information not necessarily available at the measurement stage. However, we provide an analytical formula for what we term the “measurement strength” that depends only on measurement-stage count quantities. Here, we show that, when the sources are rare, the posterior odds that the sample true value exceeds ε are the measurement strength times the prior odds, independently of ε, the prior odds, and the distribution of the calibration coefficient. We recommend that the measurement lab immediately follow up on unusually high samples using an “action threshold” on the measurement strength which is similar to the decision threshold recommended by the current standard. Finally, we further recommend that the measurement lab perform large background studies in order to characterize non-constancy of background, including possible time correlation of background.
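    The multiplicative update can be sketched with a simple Poisson likelihood ratio standing in for the paper's measurement strength; all numbers are invented:

```python
import math

def poisson_pmf(k, mu):
    return mu**k * math.exp(-mu) / math.factorial(k)

# Assumed illustration: gross count k, expected background b, and a nominal
# source contribution s. The "strength" below is a plain likelihood ratio,
# a stand-in for the paper's measurement-strength formula.
k, b, s = 12, 6.0, 6.0
strength = poisson_pmf(k, b + s) / poisson_pmf(k, b)

prior_odds = 1 / 1000        # sources are rare, as the paper assumes
posterior_odds = strength * prior_odds
print(f"strength = {strength:.1f}, posterior odds = {posterior_odds:.4f}")
```

    A strength much larger than 1 flags the sample for follow-up even though the posterior odds may remain small when sources are rare.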

  15. A new approach to counting measurements: Addressing the problems with ISO-11929

    DOE PAGES

    Klumpp, John Allan; Poudel, Deepesh; Miller, Guthrie

    2017-12-23

    We present an alternative approach to making counting measurements of radioactivity which offers probabilistic interpretations of the measurements. Unlike the approach in the current international standard (ISO-11929), our approach, which uses an assumed prior probability distribution of the true amount in the sample, is able to answer the question of interest for most users of the standard: “what is the probability distribution of the true amount in the sample, given the data?” The final interpretation of the measurement requires information not necessarily available at the measurement stage. However, we provide an analytical formula for what we term the “measurement strength” that depends only on measurement-stage count quantities. Here, we show that, when the sources are rare, the posterior odds that the sample true value exceeds ε are the measurement strength times the prior odds, independently of ε, the prior odds, and the distribution of the calibration coefficient. We recommend that the measurement lab immediately follow up on unusually high samples using an “action threshold” on the measurement strength which is similar to the decision threshold recommended by the current standard. Finally, we further recommend that the measurement lab perform large background studies in order to characterize non-constancy of background, including possible time correlation of background.

  16. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.
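    The zero-inflated Poisson distribution that such models extend mixes a point mass at zero with an ordinary Poisson; a minimal sketch with invented parameters:

```python
import math

# Minimal sketch of the zero-inflated Poisson (ZIP) pmf, the kind of
# distribution the marginalized zero-altered approach builds on.
# pi is the extra-zero probability, lam the Poisson rate; values are invented.
def zip_pmf(y, pi, lam):
    pois = lam**y * math.exp(-lam) / math.factorial(y)
    if y == 0:
        return pi + (1 - pi) * pois   # structural zeros plus Poisson zeros
    return (1 - pi) * pois

pi, lam = 0.3, 2.0
# Sanity check: the pmf sums to 1 over the support.
total = sum(zip_pmf(y, pi, lam) for y in range(50))
print(round(total, 6), round(zip_pmf(0, pi, lam), 4))
```

    The excess zeros are visible immediately: the zero probability (about 0.39 here) far exceeds the plain Poisson value exp(-2) ≈ 0.14.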

  17. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  18. Analysis of counting errors in the phase/Doppler particle analyzer

    NASA Technical Reports Server (NTRS)

    Oldenburg, John R.

    1987-01-01

    NASA is investigating the application of the Phase Doppler measurement technique to provide improved drop sizing and liquid water content measurements in icing research. The magnitude of counting errors was analyzed because these errors contribute to inaccurate liquid water content measurements. The Phase Doppler Particle Analyzer counting errors due to data transfer losses and coincidence losses were analyzed for data input rates from 10 samples/sec to 70,000 samples/sec. Coincidence losses were calculated by determining the Poisson probability of having more than one event occurring during the droplet signal time. The magnitude of the coincidence loss can be determined, and for less than a 15 percent loss, corrections can be made. The data transfer losses were estimated for representative data transfer rates. With direct memory access enabled, data transfer losses are less than 5 percent for input rates below 2000 samples/sec. With direct memory access disabled, losses exceeded 20 percent at a rate of 50 samples/sec, preventing accurate number density or mass flux measurements. The data transfer losses of a new signal processor were analyzed and found to be less than 1 percent for rates under 65,000 samples/sec.
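    The coincidence-loss calculation reduces to the Poisson probability of at least one additional arrival during a droplet's signal time; a sketch with invented rate and signal duration:

```python
import math

# Sketch of the coincidence-loss estimate: with mean arrival rate `rate`
# (samples/s) and a droplet signal duration `tau` (s), the probability that
# at least one additional droplet arrives during the signal window is
# 1 - exp(-rate * tau) under Poisson arrivals. Numbers are illustrative.
def coincidence_loss(rate, tau):
    return 1.0 - math.exp(-rate * tau)

rate = 2000.0      # samples per second
tau = 20e-6        # assumed 20-microsecond droplet signal time
loss = coincidence_loss(rate, tau)
print(f"coincidence loss: {loss:.2%}")  # roughly 4%, i.e. correctable (< 15%)
```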

  19. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination.

    PubMed

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have been used in regenerative dentistry most often, with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study shows a method for direct estimation of platelet counts in PRF. To validate this method by determination of the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue-plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rate of platelets and white blood cells was 91.6 and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6 and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix.

  20. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination

    PubMed Central

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have been used in regenerative dentistry most often, with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study shows a method for direct estimation of platelet counts in PRF. To validate this method by determination of the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue-plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rate of platelets and white blood cells was 91.6 and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6 and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix. PMID:29450197

  1. ELLIPTICAL WEIGHTED HOLICs FOR WEAK LENSING SHEAR MEASUREMENT. III. THE EFFECT OF RANDOM COUNT NOISE ON IMAGE MOMENTS IN WEAK LENSING ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Okura, Yuki; Futamase, Toshifumi, E-mail: yuki.okura@nao.ac.jp, E-mail: tof@astr.tohoku.ac.jp

    This is the third paper on the improvement of systematic errors in weak lensing analysis using an elliptical weight function, referred to as E-HOLICs. In previous papers, we succeeded in avoiding errors that depend on the ellipticity of the background image. In this paper, we investigate the systematic error that depends on the signal-to-noise ratio of the background image. We find that the origin of this error is the random count noise that comes from the Poisson noise of sky counts. The random count noise introduces additional moments and a centroid-shift error; the first-order effects cancel in averaging, but the second-order effects do not. We derive formulae that correct this systematic error due to the random count noise in measuring the moments and ellipticity of the background image. The correction formulae obtained are expressed as combinations of complex moments of the image, and thus can correct the systematic errors caused by each object. We test their validity using a simulated image and find that the systematic error becomes less than 1% in the measured ellipticity for objects with an IMCAT significance threshold of ν ≈ 11.7.
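    As an illustration of the quantity these corrections target, the sketch below computes weighted second-order (quadrupole) moments and the resulting ellipticity components of a pixelized image. The function names and the Gaussian weight are illustrative choices, not the E-HOLICs implementation.

```python
# Weighted quadrupole moments Q11, Q22, Q12 of an image, and the
# ellipticity components (e1, e2) built from them. Illustrative only.
import math

def ellipticity(image, xc, yc, sigma_w=3.0):
    """Return (e1, e2) from Gaussian-weighted second moments about (xc, yc)."""
    q11 = q22 = q12 = norm = 0.0
    for y, row in enumerate(image):
        for x, flux in enumerate(row):
            dx, dy = x - xc, y - yc
            w = math.exp(-(dx * dx + dy * dy) / (2.0 * sigma_w ** 2))
            q11 += w * flux * dx * dx
            q22 += w * flux * dy * dy
            q12 += w * flux * dx * dy
            norm += w * flux
    q11, q22, q12 = q11 / norm, q22 / norm, q12 / norm
    e1 = (q11 - q22) / (q11 + q22)
    e2 = 2.0 * q12 / (q11 + q22)
    return e1, e2

# A round Gaussian blob should give e1 ~ e2 ~ 0 (no preferred direction).
img = [[math.exp(-((x - 10) ** 2 + (y - 10) ** 2) / 8.0)
        for x in range(21)] for y in range(21)]
print(ellipticity(img, 10.0, 10.0))
```

    Poisson sky noise biases exactly these moment sums; the paper's formulae remove the residual second-order bias object by object.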

  2. Implementing Graduation Counts: State Progress to Date, 2009

    ERIC Educational Resources Information Center

    Curran, Bridget; Reyna, Ryan

    2009-01-01

    In 2005, all 50 state governors made an unprecedented commitment to voluntarily implement a common, more reliable formula for calculating their states' high school graduation rates by signing the National Governors Association (NGA) Graduation Counts Compact. Four years later, progress is steady. Twenty states now report that they use the Compact…

  3. Kids Count [and] Families Count in Delaware: Fact Book, 1998.

    ERIC Educational Resources Information Center

    Nelson, Carl, Ed.; Wilson, Nancy, Ed.

    This Kids Count report is combined with Families Count, and provides information on statewide trends affecting children and families in Delaware. The first statistical profile is based on 10 main indicators of child well-being: (1) births to teens; (2) low birth weight babies; (3) infant mortality; (4) child deaths; (5) teen deaths; (6) juvenile…

  4. Youth Count: Exploring How KIDS COUNT Grantees Address Youth Issues

    ERIC Educational Resources Information Center

    Wilson-Ahlstrom, Alicia; Gaines, Elizabeth; Ferber, Thaddeus; Yohalem, Nicole

    2005-01-01

    Inspired by the 2004 Kids Count Databook essay, "Moving Youth From Risk to Opportunity," this new report highlights the history of data collection, challenges and innovative strategies of 12 Annie E. Casey Foundation KIDS COUNT grantees in their work to serve the needs of older youth. (Contains 3 figures, 2 tables, and 9 notes.)

  5. Reducing the Child Death Rate. KIDS COUNT Indicator Brief

    ERIC Educational Resources Information Center

    Shore, Rima; Shore, Barbara

    2009-01-01

    In the 20th century's final decades, advances in the prevention and treatment of infectious diseases sharply reduced the child death rate. Despite this progress, the child death rate in the U.S. remains higher than in many other wealthy nations. The under-five mortality rate in the U.S. is almost three times higher than that of Iceland and Sweden…

  6. Detection and Estimation of an Optical Image by Photon-Counting Techniques. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wang, Lily Lee

    1973-01-01

    A statistical description of a photoelectric detector is given. The photosensitive surface of the detector is divided into many small areas, and the moment-generating function of the photon-counting statistic is derived for large time-bandwidth products. Detection of a specified optical image in the presence of background light using a hypothesis test is discussed. The ideal detector, based on the likelihood ratio formed from the numbers of photoelectrons ejected from the many small areas of the photosensitive surface, is studied and compared with the threshold detector and with a simple detector based on the likelihood ratio formed by counting the total number of photoelectrons from a finite area of the surface. The intensity of the image is assumed to be spatially Gaussian against the uniformly distributed background light. A numerical approximation by the method of steepest descent is used, and the reliabilities of the detectors are calculated by digital computer.
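    The likelihood-ratio detector described above can be sketched for the simplest case: independent Poisson counts per detector cell, with mean b_i under the background-only hypothesis H0 and b_i + s_i when the image is present (H1). The cell values below are illustrative, not taken from the thesis.

```python
# Log likelihood ratio for Poisson counts n_i with background means b_i
# and image (signal) means s_i:
#   log Lambda = sum_i [ n_i * ln(1 + s_i/b_i) - s_i ]
# Decide "image present" when log Lambda exceeds a chosen threshold.
import math

def log_likelihood_ratio(counts, background, signal):
    return sum(n * math.log(1.0 + s / b) - s
               for n, b, s in zip(counts, background, signal))

b = [4.0] * 9                         # uniform background mean per cell
s = [0.5, 1, 2, 1, 4, 1, 2, 1, 0.5]   # Gaussian-like image profile
quiet = [4, 3, 5, 4, 4, 5, 3, 4, 4]   # counts consistent with background
bright = [5, 5, 7, 6, 9, 5, 6, 5, 4]  # counts with the image present

print(log_likelihood_ratio(quiet, b, s))   # negative (favors H0)
print(log_likelihood_ratio(bright, b, s))  # positive (favors H1)
```

    The "simple detector" of the thesis corresponds to collapsing all cells into a single total count before forming the ratio, which discards the spatial profile information used here.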

  7. Optimization of simultaneous tritium–radiocarbon internal gas proportional counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bonicalzi, R. M.; Aalseth, C. E.; Day, A. R.

    Specific environmental applications can benefit from dual tritium and radiocarbon measurements in a single compound. Assuming typical environmental levels, it is often the low tritium activity relative to the higher radiocarbon activity that limits the dual measurement. In this paper, we explore the parameter space for a combined tritium and radiocarbon measurement using a methane sample mixed with an argon fill gas in low-background proportional counters of a specific design. We present an optimized methane percentage, detector fill pressure, and analysis energy windows to maximize measurement sensitivity while minimizing count time. The final optimized method uses a 9-atm fill of P35 (35% methane, 65% argon) and a tritium analysis window from 1.5 to 10.3 keV, which stops short of the tritium beta-decay endpoint energy of 18.6 keV. This method optimizes tritium counting efficiency while minimizing radiocarbon beta-decay interference.

  8. Single-photon counting multicolor multiphoton fluorescence microscope.

    PubMed

    Buehler, Christof; Kim, Ki H; Greuter, Urs; Schlumpf, Nick; So, Peter T C

    2005-01-01

    We present a multicolor multiphoton fluorescence microscope with single-photon counting sensitivity. The system integrates a standard multiphoton fluorescence microscope, an optical grating spectrograph operating in the UV-Vis wavelength region, and a 16-anode photomultiplier tube (PMT). The major technical innovation is the development of a multichannel photon counting card (mC-PhCC) for direct signal collection from multi-anode PMTs. The electronic design of the mC-PhCC employs a high-throughput, fully parallel, single-photon counting scheme along with a high-speed electrical or fiber-optical link interface to the data acquisition computer. There is no electronic crosstalk among the detection channels of the mC-PhCC. The collected signal remains linear up to an incident photon rate of 10^8 counts per second. The high-speed data interface offers ample bandwidth for real-time readout: 2-MByte lambda-stacks (16 spectral channels of 256 × 256 pixel images at 12-bit dynamic range) can be transferred at 30 frames per second. The modular design of the mC-PhCC can readily be extended to accommodate PMTs with more anodes; data acquisition from a 64-anode PMT has been verified. As a demonstration of system performance, spectrally resolved images of fluorescent latex spheres and ex vivo human skin are reported. The multicolor multiphoton microscope is suitable for highly sensitive, real-time, spectrally resolved three-dimensional imaging in biomedical applications.

  9. Avian leucocyte counting using the hemocytometer

    USGS Publications Warehouse

    Dein, F.J.; Wilson, A.; Fischer, D.; Langenberg, P.

    1994-01-01

    Automated methods for counting leucocytes in avian blood are not available because of the presence of nucleated erythrocytes and thrombocytes. Therefore, total white blood cell counts are performed by hand using a hemocytometer. The Natt and Herrick and the Unopette methods are the most common stain and diluent preparations for this procedure. Replicate hemocytometer counts using these two methods were performed on blood from four birds of different species. Cells present in each square of the hemocytometer were counted. Counting cells in the corner, side, or center hemocytometer squares produced statistically equivalent results; counting four squares per chamber provided a result similar to that obtained by counting nine squares; and the Unopette method was more precise for hemocytometer counting than was the Natt and Herrick method. The Unopette method is easier to learn and perform but is an indirect process, utilizing the differential count from a stained smear. The Natt and Herrick method is a direct total count, but cell identification is more difficult.
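    For reference, the arithmetic behind a hemocytometer count can be sketched as follows. The chamber volume (0.1 µL per large square of an improved Neubauer chamber) and the dilution factor used below are generic assumptions for illustration, not the exact Natt and Herrick or Unopette protocol values.

```python
# Convert per-square hemocytometer counts to a cell concentration:
#   cells/uL = (mean cells per large square) * dilution / 0.1 uL
def hemocytometer_count(cells_per_square, dilution_factor):
    """cells_per_square: list of counts, one entry per large square."""
    mean_per_square = sum(cells_per_square) / len(cells_per_square)
    return mean_per_square * dilution_factor / 0.1  # cells per microliter

# Counting 4 squares vs all 9 squares of the same chamber should agree
# closely, as the study above found.
nine = [22, 25, 23, 24, 26, 22, 25, 23, 24]
four = nine[:4]
print(hemocytometer_count(nine, dilution_factor=200))
print(hemocytometer_count(four, dilution_factor=200))
```

    The precision comparison in the study is essentially about how much the per-square counts scatter between replicate fills, which this mean-based formula inherits directly.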

  10. How Fred Hoyle Reconciled Radio Source Counts and the Steady State Cosmology

    NASA Astrophysics Data System (ADS)

    Ekers, Ron

    2012-09-01

    In 1969 Fred Hoyle invited me to his Institute of Theoretical Astronomy (IOTA) in Cambridge to work with him on the interpretation of the radio source counts. This was a period of extreme tension with Ryle just across the road using the steep slope of the radio source counts to argue that the radio source population was evolving and Hoyle maintaining that the counts were consistent with the steady state cosmology. Both of these great men had made some correct deductions but they had also both made mistakes. The universe was evolving, but the source counts alone could tell us very little about cosmology. I will try to give some indication of the atmosphere and the issues at the time and look at what we can learn from this saga. I will conclude by briefly summarising the exponential growth of the size of the radio source counts since the early days and ask whether our understanding has grown at the same rate.

  11. Characterization of spectrometric photon-counting X-ray detectors at different pitches

    NASA Astrophysics Data System (ADS)

    Jurdit, M.; Brambilla, A.; Moulin, V.; Ouvrier-Buffet, P.; Radisson, P.; Verger, L.

    2017-09-01

    There is growing interest in energy-sensitive photon-counting detectors for high-flux X-ray imaging. Their potential applications include medical imaging, non-destructive testing, and security. Innovative detectors of this type will need to count individual photons and sort them into selected energy bins, at several million counts per second per mm^2. Cd(Zn)Te detector-grade materials with a thickness of 1.5 to 3 mm and pitches from 800 μm down to 200 μm were assembled onto interposer boards. These devices were tested using in-house-developed, fully digital fast readout electronics. The 16-channel demonstrators, with 256 energy bins, were experimentally characterized by determining spectral resolution, count rate, and charge sharing, which becomes challenging at small pitch. Charge-sharing correction was found to efficiently correct X-ray spectra up to 40 × 10^6 incident photons s^-1 mm^-2.

  12. Implementing Graduation Counts: State Progress to Date, 2010

    ERIC Educational Resources Information Center

    Curran, Bridget; Reyna, Ryan

    2010-01-01

    In 2005, the governors of all 50 states made an unprecedented commitment to voluntarily implement a common, more reliable formula for calculating their state's high school graduation rate by signing the Graduation Counts Compact of the National Governors Association (NGA). Five years later, progress is steady. Twenty-six states say they have…

  13. Estimation of DMFT, Salivary Streptococcus Mutans Count, Flow Rate, pH, and Salivary Total Calcium Content in Pregnant and Non-Pregnant Women: A Prospective Study.

    PubMed

    Kamate, Wasim Ismail; Vibhute, Nupura Aniket; Baad, Rajendra Krishna

    2017-04-01

    Pregnancy, a period from conception till birth, causes changes in the functioning of the human body as a whole and specifically in the oral cavity that may favour the emergence of dental caries. Many studies have shown pregnant women at increased risk for dental caries, however, specific salivary caries risk factors and the particular period of pregnancy at heightened risk for dental caries are yet to be explored and give a scope of further research in this area. The aim of the present study was to assess the severity of dental caries in pregnant women compared to non-pregnant women by evaluating parameters like Decayed, Missing, Filled Teeth (DMFT) index, salivary Streptococcus mutans count, flow rate, pH and total calcium content. A total of 50 first time pregnant women in the first trimester were followed during their second trimester, third trimester and postpartum period for the evaluation of DMFT by World Health Organization (WHO) scoring criteria, salivary flow rate by drooling method, salivary pH by pH meter, salivary total calcium content by bioassay test kit and salivary Streptococcus mutans count by semiautomatic counting of colonies grown on Mitis Salivarius (MS) agar supplemented by 0.2 U/ml of bacitracin and 10% sucrose. The observations of pregnant women were then compared with same parameters evaluated in the 50 non-pregnant women. Paired t-test and Wilcoxon sign rank test were performed to assess the association between the study parameters. Evaluation of different caries risk factors between pregnant and non-pregnant women clearly showed that pregnant women were at a higher risk for dental caries. Comparison of caries risk parameters during the three trimesters and postpartum period showed that the salivary Streptococcus mutans count had significantly increased in the second trimester, third trimester and postpartum period while the mean pH and mean salivary total calcium content decreased in the third trimester and postpartum period. These

  14. Estimation of DMFT, Salivary Streptococcus Mutans Count, Flow Rate, pH, and Salivary Total Calcium Content in Pregnant and Non-Pregnant Women: A Prospective Study

    PubMed Central

    Vibhute, Nupura Aniket; Baad, Rajendra Krishna

    2017-01-01

    Introduction Pregnancy, a period from conception till birth, causes changes in the functioning of the human body as a whole and specifically in the oral cavity that may favour the emergence of dental caries. Many studies have shown pregnant women at increased risk for dental caries, however, specific salivary caries risk factors and the particular period of pregnancy at heightened risk for dental caries are yet to be explored and give a scope of further research in this area. Aim The aim of the present study was to assess the severity of dental caries in pregnant women compared to non-pregnant women by evaluating parameters like Decayed, Missing, Filled Teeth (DMFT) index, salivary Streptococcus mutans count, flow rate, pH and total calcium content. Materials and Methods A total of 50 first time pregnant women in the first trimester were followed during their second trimester, third trimester and postpartum period for the evaluation of DMFT by World Health Organization (WHO) scoring criteria, salivary flow rate by drooling method, salivary pH by pH meter, salivary total calcium content by bioassay test kit and salivary Streptococcus mutans count by semiautomatic counting of colonies grown on Mitis Salivarius (MS) agar supplemented by 0.2U/ml of bacitracin and 10% sucrose. The observations of pregnant women were then compared with same parameters evaluated in the 50 non-pregnant women. Paired t-test and Wilcoxon sign rank test were performed to assess the association between the study parameters. Results Evaluation of different caries risk factors between pregnant and non-pregnant women clearly showed that pregnant women were at a higher risk for dental caries. 
Comparison of caries risk parameters during the three trimesters and postpartum period showed that the salivary Streptococcus mutans count had significantly increased in the second trimester, third trimester and postpartum period while the mean pH and mean salivary total calcium content decreased in the third

  15. Probing Jupiter's Radiation Environment with Juno-UVS

    NASA Astrophysics Data System (ADS)

    Kammer, J.; Gladstone, R.; Greathouse, T. K.; Hue, V.; Versteeg, M. H.; Davis, M. W.; Santos-Costa, D.; Becker, H. N.; Bolton, S. J.; Connerney, J. E. P.; Levin, S.

    2017-12-01

    While primarily designed to observe photon emission from the Jovian aurora, Juno's Ultraviolet Spectrograph (Juno-UVS) has also measured background count rates associated with penetrating high-energy radiation. These background counts are distinguishable from photon events, as they are generally spread evenly across the entire array of the Juno-UVS detector, and as the spacecraft spins, they set a baseline count rate higher than the sky background rate. During eight perijove passes, this background radiation signature has varied significantly on both short (spin-modulated) timescales and longer timescales (minutes to hours). We present comparisons of the Juno-UVS data across each of the eight perijove passes, with a focus on the count rate that can be clearly attributed to radiation effects rather than photon events. Once calibrated to determine the relationship between count rate and penetrating high-energy radiation (e.g., using existing GEANT models), these in situ measurements by Juno-UVS will provide additional constraints to radiation belt models close to the planet.

  16. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    NASA Astrophysics Data System (ADS)

    Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.

    2018-01-01

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization for nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time-correct higher-order count rates (i.e., quads and pents) had not been fully developed. This limitation is overcome by a recent extension of a popular dead time correction method developed by Dytlewski. This extended algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of the development. Dead-time-corrected results can then be used to assay SNM by inverting a set of extended point-model equations which have likewise only recently been formulated. The current paper presents an experimental evaluation of the practical feasibility of the DCF dead time correction algorithm, to demonstrate its performance and applicability in nuclear safeguards applications. To test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high-efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL), and the count rates were extracted up to fifth order and corrected for dead time. To assess the DCF correction, the corrected data are compared with the traditional dead time treatment in INCC. The DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.
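    For background only: the classic non-paralyzable single-channel dead time correction is sketched below. The DCF formalism evaluated above generalizes Dytlewski's moment-based treatment to higher-order (quads and pents) multiplicity rates and is far more involved than this first-order textbook example.

```python
# Non-paralyzable dead time model: a measured rate m with dead time tau
# corresponds to a true rate n = m / (1 - m * tau), valid while m*tau < 1.
def true_rate(measured_rate, dead_time):
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("rate too high for the non-paralyzable model")
    return measured_rate / (1.0 - loss)

# 100 kcps measured with a 1 us dead time -> ~10% of counts were lost.
print(true_rate(100_000.0, 1e-6))  # ~111111 counts/s
```

    Multiplicity counting needs much more than this because dead time distorts the correlated (doubles, triples, ...) moments differently from the singles rate, which is precisely what the Dytlewski and DCF corrections address.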

  17. Detection of bremsstrahlung radiation of 90Sr-90Y for emergency lung counting.

    PubMed

    Ho, A; Hakmana Witharana, S S; Jonkmans, G; Li, L; Surette, R A; Dubeau, J; Dai, X

    2012-09-01

    This study explores the possibility of developing a field-deployable 90Sr detector for rapid lung counting in emergency situations. The detection of the beta-emitters 90Sr and its daughter 90Y inside the human lung via bremsstrahlung radiation was performed using a 3″ × 3″ NaI(Tl) crystal detector and a polyethylene-encapsulated source to emulate human lung tissue. The simulation results show that this method is a viable technique for detecting 90Sr with a minimum detectable activity (MDA) of 1.07 × 10^4 Bq, using a realistic dual-shielded detector system in a 0.25-µGy h^-1 background field for a 100-s scan. The MDA is sufficiently sensitive to meet the requirement for emergency lung counting of a Type S 90Sr intake. The experimental data were verified using Monte Carlo calculations, including an estimate for internal bremsstrahlung, and an optimisation of the detector geometry was performed. Optimisations in background reduction techniques and in the electronic acquisition systems are suggested.

  18. MicroCT with energy-resolved photon-counting detectors

    PubMed Central

    Wang, X; Meier, D; Mikkelsen, S; Maehlum, G E; Wagenaar, D J; Tsui, B M W; Patt, B E; Frey, E C

    2011-01-01

    The goal of this paper was to investigate the benefits that could be realistically achieved on a microCT imaging system with an energy-resolved photon-counting x-ray detector. To this end, we built and evaluated a prototype microCT system based on such a detector. The detector is based on cadmium telluride (CdTe) radiation sensors and application-specific integrated circuit (ASIC) readouts. Each detector pixel can simultaneously count x-ray photons above six energy thresholds, providing the capability for energy-selective x-ray imaging. We tested the spectroscopic performance of the system using polychromatic x-ray radiation and various filtering materials with K-absorption edges. Tomographic images were then acquired of a cylindrical PMMA phantom containing holes filled with various materials. Results were also compared with those acquired using an intensity-integrating x-ray detector and single-energy (i.e. non-energy-selective) CT. This paper describes the functionality and performance of the system, and presents preliminary spectroscopic and tomographic results. The spectroscopic experiments showed that the energy-resolved photon-counting detector was capable of measuring energy spectra from polychromatic sources like a standard x-ray tube, and resolving absorption edges present in the energy range used for imaging. However, the spectral quality was degraded by spectral distortions resulting from degrading factors, including finite energy resolution and charge sharing. We developed a simple charge-sharing model to reproduce these distortions. The tomographic experiments showed that the availability of multiple energy thresholds in the photon-counting detector allowed us to simultaneously measure target-to-background contrasts in different energy ranges. Compared with single-energy CT with an integrating detector, this feature was especially useful to improve differentiation of materials with different attenuation coefficient energy dependences. PMID:21464527

  19. MicroCT with energy-resolved photon-counting detectors.

    PubMed

    Wang, X; Meier, D; Mikkelsen, S; Maehlum, G E; Wagenaar, D J; Tsui, B M W; Patt, B E; Frey, E C

    2011-05-07

    The goal of this paper was to investigate the benefits that could be realistically achieved on a microCT imaging system with an energy-resolved photon-counting x-ray detector. To this end, we built and evaluated a prototype microCT system based on such a detector. The detector is based on cadmium telluride (CdTe) radiation sensors and application-specific integrated circuit (ASIC) readouts. Each detector pixel can simultaneously count x-ray photons above six energy thresholds, providing the capability for energy-selective x-ray imaging. We tested the spectroscopic performance of the system using polychromatic x-ray radiation and various filtering materials with K-absorption edges. Tomographic images were then acquired of a cylindrical PMMA phantom containing holes filled with various materials. Results were also compared with those acquired using an intensity-integrating x-ray detector and single-energy (i.e. non-energy-selective) CT. This paper describes the functionality and performance of the system, and presents preliminary spectroscopic and tomographic results. The spectroscopic experiments showed that the energy-resolved photon-counting detector was capable of measuring energy spectra from polychromatic sources like a standard x-ray tube, and resolving absorption edges present in the energy range used for imaging. However, the spectral quality was degraded by spectral distortions resulting from degrading factors, including finite energy resolution and charge sharing. We developed a simple charge-sharing model to reproduce these distortions. The tomographic experiments showed that the availability of multiple energy thresholds in the photon-counting detector allowed us to simultaneously measure target-to-background contrasts in different energy ranges. Compared with single-energy CT with an integrating detector, this feature was especially useful to improve differentiation of materials with different attenuation coefficient energy dependences.

  20. Isospectral discrete and quantum graphs with the same flip counts and nodal counts

    NASA Astrophysics Data System (ADS)

    Juul, Jonas S.; Joyner, Christopher H.

    2018-06-01

    The existence of non-isomorphic graphs which share the same Laplace spectrum (to be referred to as isospectral graphs) leads naturally to the following question: what additional information is required in order to resolve isospectral graphs? It was suggested by Band, Shapira and Smilansky that this might be achieved by either counting the number of nodal domains or the number of times the eigenfunctions change sign (the so-called flip count) (Band et al 2006 J. Phys. A: Math. Gen. 39 13999–4014; Band and Smilansky 2007 Eur. Phys. J. Spec. Top. 145 171–9). Recent examples of (discrete) isospectral graphs with the same flip count and nodal count have been constructed by Ammann by utilising Godsil–McKay switching (Ammann, private communication). Here, we provide a simple alternative mechanism that produces systematic examples of both discrete and quantum isospectral graphs with the same flip and nodal counts.

  1. Modeling zero-modified count and semicontinuous data in health services research part 2: case studies.

    PubMed

    Neelon, Brian; O'Malley, A James; Smith, Valerie A

    2016-11-30

    This article is the second installment of a two-part tutorial on the analysis of zero-modified count and semicontinuous data. Part 1, which appears as a companion piece in this issue of Statistics in Medicine, provides a general background and overview of the topic, with particular emphasis on applications to health services research. Here, we present three case studies highlighting various approaches for the analysis of zero-modified data. The first case study describes methods for analyzing zero-inflated longitudinal count data. Case study 2 considers the use of hurdle models for the analysis of spatiotemporal count data. The third case study discusses an application of marginalized two-part models to the analysis of semicontinuous health expenditure data. Copyright © 2016 John Wiley & Sons, Ltd.
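    The zero-inflated Poisson (ZIP) model at the heart of the first case study can be written down in a few lines: a point mass at zero with probability pi, mixed with a Poisson(lam) count process. The parameter names below are generic illustrations, not those used in the article.

```python
# ZIP probability mass function:
#   P(K = 0) = pi + (1 - pi) * exp(-lam)
#   P(K = k) = (1 - pi) * exp(-lam) * lam^k / k!   for k >= 1
import math

def zip_pmf(k, pi, lam):
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    return pi * (k == 0) + (1.0 - pi) * pois

# With pi = 0.3 and lam = 2, zeros are far more common than a plain
# Poisson(2) would predict (~0.395 vs ~0.135).
print(zip_pmf(0, 0.3, 2.0))
print(sum(zip_pmf(k, 0.3, 2.0) for k in range(50)))  # sanity check: ~1.0
```

    Hurdle models, used in the second case study, differ in that the zero and positive parts are modeled separately rather than mixed, so all structural and sampling zeros come from one component.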

  2. Counting It Twice.

    ERIC Educational Resources Information Center

    Schattschneider, Doris

    1991-01-01

    Provided are examples from many domains of mathematics that illustrate the Fubini Principle in its discrete version: the value of a summation over a rectangular array is independent of the order of summation. Included are: counting using partitions as in proof by pictures, combinatorial arguments, indirect counting as in the inclusion-exclusion…

  3. The relationship between pollen count levels and prevalence of Japanese cedar pollinosis in Northeast Japan.

    PubMed

    Honda, Kohei; Saito, Hidekazu; Fukui, Naoko; Ito, Eiko; Ishikawa, Kazuo

    2013-09-01

    The prevalence of Japanese cedar (JC) pollinosis in Japanese children is increasing. However, few studies have reported the relationship between pollen count levels and the prevalence of pollinosis. To evaluate the relationship between JC pollen count levels and the prevalence of pollinosis in children, we investigated the sensitization and development of symptoms for JC pollen in two areas of Akita in northeast Japan with contrasting levels of exposure to JC pollen. The study population consisted of 339 elementary school students (10-11 years of age) from the coastal and mountainous areas of Akita in 2005-2006. A questionnaire about symptoms of allergic rhinitis was filled out by the students' parents. A blood sample was taken to determine specific IgE antibodies against five common aeroallergens. The mean pollen count in the mountainous areas was two times higher than that in the coastal areas in 1996-2006. The prevalence rates of nasal allergy symptoms and sensitization for mites were almost the same in both areas. On the other hand, the rates of nasal allergy symptoms and sensitization for JC pollen were significantly higher in the mountainous areas than in the coastal areas. The rate of the development of symptoms among children sensitized for JC pollen was almost the same in both areas. These results suggest that pollen count levels may correlate with the rate of sensitization for JC pollinosis, but may not affect the rate of onset among sensitized children in northeast Japan.

  4. Single Photon Counting Detectors for Low Light Level Imaging Applications

    NASA Astrophysics Data System (ADS)

    Kolb, Kimberly

    2015-10-01

    This dissertation presents the current state-of-the-art of semiconductor-based photon counting detector technologies. HgCdTe linear-mode avalanche photodiodes (LM-APDs), silicon Geiger-mode avalanche photodiodes (GM-APDs), and electron-multiplying CCDs (EMCCDs) are compared via their present and future performance in various astronomy applications. LM-APDs are studied in theory, based on work done at the University of Hawaii. EMCCDs are studied in theory and experimentally, with a device at NASA's Jet Propulsion Lab. The emphasis of the research is on GM-APD imaging arrays, developed at MIT Lincoln Laboratory and tested at the RIT Center for Detectors. The GM-APD research includes a theoretical analysis of SNR and various performance metrics, including dark count rate, afterpulsing, photon detection efficiency, and intrapixel sensitivity. The effects of radiation damage on the GM-APD were also characterized by introducing a cumulative dose of 50 krad(Si) via 60 MeV protons. Extensive development of Monte Carlo simulations and practical observation simulations was completed, including simulated astronomical imaging and adaptive optics wavefront sensing. Based on theoretical models and experimental testing, both the current state-of-the-art performance and projected future performance of each detector are compared for various applications. LM-APD performance is currently not competitive with the other photon counting technologies, so LM-APDs are left out of the application-based comparisons. In the current state-of-the-art, EMCCDs in photon counting mode out-perform GM-APDs for long exposure scenarios, though GM-APDs are better for short exposure scenarios (fast readout) due to clock-induced-charge (CIC) in EMCCDs. In the long term, small improvements in GM-APD dark current will make them superior in both long and short exposure scenarios for extremely low flux. 
The efficiency of GM-APDs will likely always be less than EMCCDs, however, which is particularly disadvantageous for

  5. Partial-Interval Estimation of Count: Uncorrected and Poisson-Corrected Error Levels

    ERIC Educational Resources Information Center

    Yoder, Paul J.; Ledford, Jennifer R.; Harbison, Amy L.; Tapp, Jon T.

    2018-01-01

    A simulation study that used 3,000 computer-generated event streams with known behavior rates, interval durations, and session durations was conducted to test whether the main and interaction effects of true rate and interval duration affect the error level of uncorrected and Poisson-transformed (i.e., "corrected") count as estimated by…

  6. Ultra-fast photon counting with a passive quenching silicon photomultiplier in the charge integration regime

    NASA Astrophysics Data System (ADS)

    Zhang, Guoqing; Lina, Liu

    2018-02-01

    An ultra-fast photon counting method is proposed based on the charge integration of output electrical pulses of passive quenching silicon photomultipliers (SiPMs). The results of the numerical analysis with actual parameters of SiPMs show that the maximum photon counting rate of a state-of-the-art passive quenching SiPM can reach the ~THz level, which is much higher than that of existing photon counting devices. The experimental procedure is proposed based on this method. This photon counting regime of SiPMs is promising in many fields, such as light power detection over a large dynamic range.
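The counting principle is a charge-to-count conversion: each fired microcell contributes roughly one gain's worth of charge, so dividing the integrated charge by gain times the electron charge estimates the number of detected photons. The sketch below uses an illustrative gain and ignores optical crosstalk and afterpulsing, which a real analysis must correct for.

```python
E_CHARGE = 1.602e-19  # elementary charge in coulombs

def photons_from_charge(q_int, gain=1e6, dcr=0.0, t=1.0):
    """Estimate the number of detected photons from the integrated SiPM
    output charge q_int (C): each fired microcell contributes ~gain
    electrons. Expected dark counts (dcr * t) are subtracted; crosstalk
    and afterpulsing are ignored in this sketch."""
    n_fired = q_int / (gain * E_CHARGE)
    return n_fired - dcr * t
```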

  7. ON THE PROGNOSTIC SIGNIFICANCE OF THE ERYTHROCYTE SEDIMENTATION RATE, THE LEUKOCYTE COUNT, THE HEMOGLOBIN VALUE, AND BODY WEIGHT IN IRRADIATED AND NONIRRADIATED CANCER PATIENTS (in German)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, D.

    1962-06-01

    Changes in these parameters were followed in 672 women with genital carcinoma during and after radiotherapy to determine whether any of them could be used to predict the eventual success of the treatment. All of these parameters were found to be of prognostic value in the 394 patients with carcinoma of the uterine cervix of grades I, II, and III. Erythrocyte sedimentation rate (ESR) was initially elevated in these patients, and in those without recurrence, irradiation caused a prompt and progressive drop in ESR. It continued to rise after radiotherapy in those who later showed tumor recurrence. Similar changes in leukocyte count were seen in this group, the counts falling and staying down after successful treatment or rising when the treatment failed. An inverse relation with respect to the hemoglobin level and body weight was seen, both values progressively increasing in cases later shown to be cured and falling in those which were not. These relations did not hold or were of less prognostic value in patients with carcinomas of the body of the uterus, ovary, or vulva. However, in general, a gradual continual fall in ESR and a rapid fall in leukocyte count were favorable signs following irradiation. (BBB)

  8. A semi-automated technique for labeling and counting of apoptosing retinal cells

    PubMed Central

    2014-01-01

    Background Retinal ganglion cell (RGC) loss is one of the earliest and most important cellular changes in glaucoma. The DARC (Detection of Apoptosing Retinal Cells) technology enables in vivo real-time non-invasive imaging of single apoptosing retinal cells in animal models of glaucoma and Alzheimer’s disease. To date, apoptosing RGCs imaged using DARC have been counted manually. This is time-consuming, labour-intensive, vulnerable to bias, and has considerable inter- and intra-operator variability. Results A semi-automated algorithm was developed which enabled automated identification of apoptosing RGCs labeled with fluorescent Annexin-5 on DARC images. Automated analysis included a pre-processing stage involving local-luminance and local-contrast “gain control”, a “blob analysis” step to differentiate between cells, vessels and noise, and a method to exclude non-cell structures using specific combined ‘size’ and ‘aspect’ ratio criteria. Apoptosing retinal cells were counted by 3 masked operators, generating ‘Gold-standard’ mean manual cell counts, and were also counted using the newly developed automated algorithm. Comparison between automated cell counts and the mean manual cell counts on 66 DARC images showed significant correlation between the two methods (Pearson’s correlation coefficient 0.978 (p < 0.001), R squared = 0.956). The Intraclass correlation coefficient was 0.986 (95% CI 0.977-0.991, p < 0.001), and Cronbach’s alpha measure of consistency = 0.986, confirming excellent correlation and consistency. No significant difference (p = 0.922, 95% CI: −5.53 to 6.10) was detected between the cell counts of the two methods. Conclusions The novel automated algorithm enabled accurate quantification of apoptosing RGCs that is highly comparable to manual counting, and appears to minimise operator-bias, whilst being both fast and reproducible. This may prove to be a valuable method of quantifying apoptosing retinal

  9. Complexities of Counting.

    ERIC Educational Resources Information Center

    Stake, Bernadine Evans

    This document focuses on one child's skip counting methods. The pupil, a second grade student at Steuben School, in Kankakee, Illinois, was interviewed as she made several attempts at counting twenty-five poker chips on a circular piece of paper. The interview was part of a larger study of "Children's Conceptions of Number and Numeral,"…

  10. Is Parenting Child's Play? Kids Count in Missouri Report on Adolescent Pregnancy.

    ERIC Educational Resources Information Center

    Citizens for Missouri's Children, St. Louis.

    This Kids Count report presents current information on adolescent pregnancy rates in Missouri. Part 1, "Overview of Adolescent Pregnancy in Missouri," discusses the changing pregnancy, abortion, and birth rates for 15- to 19-year-old adolescents, racial differences in pregnancy risk, regional differences suggesting a link between…

  11. [Prognostic value of absolute monocyte count in chronic lymphocytic leukaemia].

    PubMed

    Szerafin, László; Jakó, János; Riskó, Ferenc

    2015-04-01

    The low peripheral absolute lymphocyte and high monocyte count have been reported to correlate with poor clinical outcome in various lymphomas and other cancers. However, few data are known about the prognostic value of the absolute monocyte count in chronic lymphocytic leukaemia. The aim of the authors was to investigate the impact of the absolute monocyte count measured at the time of diagnosis in patients with chronic lymphocytic leukaemia on the time to treatment and overall survival. Between January 1, 2005 and December 31, 2012, 223 patients with newly-diagnosed chronic lymphocytic leukaemia were included. The rate of patients needing treatment, time to treatment, overall survival and causes of mortality based on Rai stages, CD38, ZAP-70 positivity and absolute monocyte count were analyzed. Therapy was necessary in 21.1%, 57.4%, 88.9%, 88.9% and 100% of patients in Rai stages 0, I, II, III and IV, respectively; in 61.9% and 60.8% of patients exhibiting CD38 and ZAP-70 positivity, respectively; and in 76.9%, 21.2% and 66.2% of patients if the absolute monocyte count was <0.25 G/l, between 0.25-0.75 G/l and >0.75 G/l, respectively. The median time to treatment and the median overall survival were 19.5, 65, and 35.5 months; and 41.5, 65, and 49.5 months according to the three groups of monocyte counts. The relative risk of beginning the therapy was 1.62 (p<0.01) in patients with absolute monocyte count <0.25 G/l or >0.75 G/l, as compared to those with 0.25-0.75 G/l, and the relative risk for overall survival was 2.41 (p<0.01) in patients with absolute monocyte count lower than 0.25 G/l as compared to those with higher than 0.25 G/l. The relative risks remained significant in Rai 0 patients, too. The leading causes of mortality were infections (41.7%) and the chronic lymphocytic leukaemia itself (58.3%) in patients with low monocyte count, while tumours (25.9-35.3%) and other events (48.1 and 11.8%) occurred in patients with medium or high monocyte counts. Patients with low and high monocyte

  12. Effects of lek count protocols on greater sage-grouse population trend estimates

    USGS Publications Warehouse

    Monroe, Adrian; Edmunds, David; Aldridge, Cameron L.

    2016-01-01

    Annual counts of males displaying at lek sites are an important tool for monitoring greater sage-grouse populations (Centrocercus urophasianus), but seasonal and diurnal variation in lek attendance may increase variance and bias of trend analyses. Recommendations for protocols to reduce observation error have called for restricting lek counts to within 30 minutes of sunrise, but this may limit the number of lek counts available for analysis, particularly from years before monitoring was widely standardized. Reducing the temporal window for conducting lek counts also may constrain the ability of agencies to monitor leks efficiently. We used lek count data collected across Wyoming during 1995−2014 to investigate the effect of lek counts conducted between 30 minutes before and 30, 60, or 90 minutes after sunrise on population trend estimates. We also evaluated trends across scales relevant to management, including statewide, within Working Group Areas and Core Areas, and for individual leks. To further evaluate accuracy and precision of trend estimates from lek count protocols, we used simulations based on a lek attendance model and compared simulated and estimated values of annual rate of change in population size (λ) from scenarios of varying numbers of leks, lek count timing, and count frequency (counts/lek/year). We found that restricting analyses to counts conducted within 30 minutes of sunrise generally did not improve precision of population trend estimates, although differences among timings increased as the number of leks and count frequency decreased. Lek attendance declined >30 minutes after sunrise, but simulations indicated that including lek counts conducted up to 90 minutes after sunrise can increase the number of leks monitored compared to trend estimates based on counts conducted within 30 minutes of sunrise. This increase in leks monitored resulted in greater precision of estimates without reducing accuracy. Increasing count

  13. Inter-rater reliability of malaria parasite counts and comparison of methods

    PubMed Central

    2009-01-01

    Background The introduction of artemesinin-based treatment for falciparum malaria has led to a shift away from symptom-based diagnosis. Diagnosis may be achieved by using rapid non-microscopic diagnostic tests (RDTs), of which there are many available. Light microscopy, however, has a central role in parasite identification and quantification and remains the main method of parasite-based diagnosis in clinic and hospital settings and is necessary for monitoring the accuracy of RDTs. The World Health Organization has prepared a proficiency testing panel containing a range of malaria-positive blood samples of known parasitaemia, to be used for the assessment of commercially available malaria RDTs. Different blood film and counting methods may be used for this purpose, which raises questions regarding accuracy and reproducibility. A comparison was made of the established methods for parasitaemia estimation to determine which would give the least inter-rater and inter-method variation. Methods Experienced malaria microscopists counted asexual parasitaemia on different slides using three methods: the thin film method using the total erythrocyte count, the thick film method using the total white cell count and the Earle and Perez method. All the slides were stained using Giemsa pH 7.2. Analysis of variance (ANOVA) models were used to find the inter-rater reliability for the different methods. The paired t-test was used to assess any systematic bias between the two methods, and a regression analysis was used to see if there was a changing bias with parasite count level. Results The thin blood film gave parasite counts around 30% higher than those obtained by the thick film and Earle and Perez methods, but exhibited a loss of sensitivity with low parasitaemia. The thick film and Earle and Perez methods showed little or no bias in counts between the two methods, however, estimated inter-rater reliability was slightly better for the thick film method. 
Conclusion The thin film
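For reference, the thick-film and thin-film counting conventions compared in the study reduce to simple ratios. The assumed WBC density of 8,000 cells/µL is the usual thick-film convention, though laboratories may substitute a measured value; this is a sketch of the conventions, not the study's exact protocol.

```python
def parasites_per_ul_thick(parasites, wbcs_counted, wbc_per_ul=8000):
    """Thick-film estimate: parasites counted against white cells,
    scaled by an assumed WBC density (8,000/uL is the usual convention)."""
    return parasites * wbc_per_ul / wbcs_counted

def parasitaemia_pct_thin(parasites, rbcs_counted):
    """Thin-film estimate: percentage of parasitized erythrocytes."""
    return 100.0 * parasites / rbcs_counted
```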

  14. Repeatability of paired counts.

    PubMed

    Alexander, Neal; Bethony, Jeff; Corrêa-Oliveira, Rodrigo; Rodrigues, Laura C; Hotez, Peter; Brooker, Simon

    2007-08-30

    The Bland and Altman technique is widely used to assess the variation between replicates of a method of clinical measurement. It yields the repeatability, i.e. the value within which 95 per cent of repeat measurements lie. The valid use of the technique requires that the variance is constant over the data range. This is not usually the case for counts of items such as CD4 cells or parasites, nor is the log transformation applicable to zero counts. We investigate the properties of generalized differences based on Box-Cox transformations. For an example, in a data set of hookworm eggs counted by the Kato-Katz method, the square root transformation is found to stabilize the variance. We show how to back-transform the repeatability on the square root scale to the repeatability of the counts themselves, as an increasing function of the square mean root egg count, i.e. the square of the average of square roots. As well as being more easily interpretable, the back-transformed results highlight the dependence of the repeatability on the sample volume used.
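The back-transformation the authors describe can be sketched as follows, assuming the square-root transform has stabilized the variance. It uses the identity a − b = (√a − √b)(√a + √b) to express count-scale repeatability as an increasing function of the square mean root count; the paired counts in the usage below are made-up illustrations.

```python
import statistics

def repeatability_sqrt(pairs):
    """Bland-Altman-style repeatability on the square-root scale:
    1.96 * SD of the differences of sqrt-transformed paired counts."""
    diffs = [a ** 0.5 - b ** 0.5 for a, b in pairs]
    return 1.96 * statistics.stdev(diffs)

def repeatability_counts(r_sqrt, smr):
    """Back-transform to the count scale at square mean root count smr:
    a - b = (sqrt(a) - sqrt(b)) * (sqrt(a) + sqrt(b)) ~ r * 2 * sqrt(smr)."""
    return 2.0 * r_sqrt * smr ** 0.5
```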

  15. Statistical aspects of point count sampling

    USGS Publications Warehouse

    Barker, R.J.; Sauer, J.R.; Ralph, C.J.; Sauer, J.R.; Droege, S.

    1995-01-01

    The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the variability in point counts is caused by the incomplete counting, and this within-count variation can be confounded with ecologically meaningful variation. We recommend caution in the analysis of estimates obtained from point counts. Using our model, we also consider optimal allocation of sampling effort. The critical step in the optimization process is in determining the goals of the study and methods that will be used to meet these goals. By explicitly defining the constraints on sampling and by estimating the relationship between precision and bias of estimators and time spent counting, we can predict the optimal time at a point for each of several monitoring goals. In general, time spent at a point will differ depending on the goals of the study.
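The incomplete-count bias the authors describe can be reproduced with a few lines of simulation, assuming each individual present is detected independently with a fixed probability (a simplification of their model):

```python
import random

def mean_point_count(n_sites, abundance, p_detect, seed=1):
    """Simulate point counts where each of `abundance` birds present is
    detected independently with probability p_detect: the mean count
    estimates abundance * p_detect, not abundance itself."""
    rng = random.Random(seed)
    counts = [sum(rng.random() < p_detect for _ in range(abundance))
              for _ in range(n_sites)]
    return sum(counts) / n_sites
```

With p_detect = 0.5 the mean settles near half the true abundance, which is the confounding of detectability with true variation that the authors warn about.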

  16. 45 CFR 261.34 - Are there any limitations in counting job search and job readiness assistance toward the...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 2 2011-10-01 2011-10-01 false Are there any limitations in counting job search and job readiness assistance toward the participation rates? 261.34 Section 261.34 Public Welfare... Work Activities and How Do They Count? § 261.34 Are there any limitations in counting job search and...

  17. 45 CFR 261.34 - Are there any limitations in counting job search and job readiness assistance toward the...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 2 2013-10-01 2012-10-01 true Are there any limitations in counting job search and job readiness assistance toward the participation rates? 261.34 Section 261.34 Public Welfare... Work Activities and How Do They Count? § 261.34 Are there any limitations in counting job search and...

  18. 45 CFR 261.34 - Are there any limitations in counting job search and job readiness assistance toward the...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 45 Public Welfare 2 2010-10-01 2010-10-01 false Are there any limitations in counting job search and job readiness assistance toward the participation rates? 261.34 Section 261.34 Public Welfare... Work Activities and How Do They Count? § 261.34 Are there any limitations in counting job search and...

  19. 45 CFR 261.34 - Are there any limitations in counting job search and job readiness assistance toward the...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 2 2012-10-01 2012-10-01 false Are there any limitations in counting job search and job readiness assistance toward the participation rates? 261.34 Section 261.34 Public Welfare... Work Activities and How Do They Count? § 261.34 Are there any limitations in counting job search and...

  20. 45 CFR 261.34 - Are there any limitations in counting job search and job readiness assistance toward the...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 2 2014-10-01 2012-10-01 true Are there any limitations in counting job search and job readiness assistance toward the participation rates? 261.34 Section 261.34 Public Welfare... Work Activities and How Do They Count? § 261.34 Are there any limitations in counting job search and...

  1. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE PAGES

    Lockhart, M.; Henzlova, D.; Croft, S.; ...

    2017-09-20

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have themselves only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data is compared to traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.
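The dead time problem these entries address can be illustrated with the textbook non-paralyzable correction for a singles rate; the DCF formalism for doubles and higher-order rates is considerably more involved, so this is only a hedged sketch of the basic idea.

```python
def true_singles_rate(measured, tau):
    """Textbook non-paralyzable dead time correction for a singles
    count rate: n = m / (1 - m * tau), with tau in seconds."""
    if measured * tau >= 1.0:
        raise ValueError("measured rate saturates the counting system")
    return measured / (1.0 - measured * tau)
```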

  2. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, M.; Henzlova, D.; Croft, S.

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have themselves only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data is compared to traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.

  3. A miniaturized 4 K platform for superconducting infrared photon counting detectors

    NASA Astrophysics Data System (ADS)

    Gemmell, Nathan R.; Hills, Matthew; Bradshaw, Tom; Rawlings, Tom; Green, Ben; Heath, Robert M.; Tsimvrakidis, Konstantinos; Dobrovolskiy, Sergiy; Zwiller, Val; Dorenbos, Sander N.; Crook, Martin; Hadfield, Robert H.

    2017-11-01

    We report on a miniaturized platform for superconducting infrared photon counting detectors. We have implemented a fibre-coupled superconducting nanowire single photon detector in a Stirling/Joule-Thomson platform with a base temperature of 4.2 K. We have verified a cooling power of 4 mW at 4.7 K. We report 20% system detection efficiency at 1310 nm wavelength at a dark count rate of 1 kHz. We have carried out compelling application demonstrations in single photon depth metrology and singlet oxygen luminescence detection.

  4. Study of the Residual Background Events in Ground Data from the ASTRO-H SXS Microcalorimeter

    NASA Technical Reports Server (NTRS)

    Kilbourne, Caroline A.; Boyce, Kevin R.; Chiao, M. P.; Eckart, M. E.; Kelley, R. L.; Leutenegger, M. A.; Porter, F. S.; Watanabe, T.; Ishisaki, Y.; Yamada, S.

    2015-01-01

    The measured instrumental background of the XRS calorimeter spectrometer of Suzaku had several sources, including primary cosmic rays and secondary particles interacting with the pixels and with the silicon structure of the array. Prior to the launch of Suzaku, several data sets were taken without x-ray illumination to study the characteristics and timing of background signals produced in the array and anti-coincidence detector. Even though the source of the background in the laboratory was different from that in low-earth orbit (muons and environmental gamma-rays on the ground versus Galactic cosmic-ray (GCR) protons and alpha particles in space), the study of correlations and properties of populations of rare events was useful for establishing the preliminary screening parameters needed for selection of good science data. Sea-level muons are singly charged minimum-ionizing particles, like the GCR protons, and thus were good probes of the effectiveness of screening via the signals from the anti-coincidence detector. Here we present the first analysis of the on-ground background of the SXS calorimeter of Astro-H. On XRS, the background prior to screening was completely dominated by coincident events on many pixels resulting from the temperature pulse arising from each large energy deposition (greater than 200 keV) into the silicon frame around the array. The improved heat-sinking of the SXS array compared with XRS eliminated these thermal disturbances, greatly reducing the measured count rate in the absence of illumination. The removal of these events has made it easier to study the nature of the residual background and to look for additional event populations. We compare the SXS residual background to that measured in equivalent ground data for XRS and discuss these preliminary results.

  5. The Effects of Gamma and Proton Radiation Exposure on Hematopoietic Cell Counts in the Ferret Model

    PubMed Central

    Sanzari, Jenine K.; Wan, X. Steven; Krigsfeld, Gabriel S.; Wroe, Andrew J.; Gridley, Daila S.; Kennedy, Ann R.

    2014-01-01

    Exposure to total-body radiation induces hematological changes, which can compromise the immune response to wounds and infection. Here, the decreases in blood cell counts after acute radiation doses of γ-ray or proton radiation exposure, at the doses and dose-rates expected during a solar particle event (SPE), are reported in the ferret model system. Following the exposure to γ-ray or proton radiation, the ferret peripheral total white blood cell (WBC) and lymphocyte counts decreased whereas neutrophil count increased within 3 hours. At 48 hours after irradiation, the WBC, neutrophil, and lymphocyte counts decreased in a dose-dependent manner but were not significantly affected by the radiation type (γ-rays versus protons) or dose rate (0.5 Gy/minute versus 0.5 Gy/hour). The loss of these blood cells could accompany and contribute to the physiological symptoms of the acute radiation syndrome (ARS). PMID:25356435

  6. Radon exhalation rate and natural radionuclide content in building materials of high background areas of Ramsar, Iran.

    PubMed

    Bavarnegin, E; Fathabadi, N; Vahabi Moghaddam, M; Vasheghani Farahani, M; Moradi, M; Babakhni, A

    2013-03-01

    Radon exhalation rates from building materials used in high background radiation areas (HBRA) of Ramsar were measured using an active radon gas analyzer with an emanation container. Radon exhalation rates from these samples varied from below the lower detection limit up to 384 Bq m(-2) h(-1). The (226)Ra, (232)Th and (40)K contents were also measured using a high resolution HPGe gamma-ray spectrometer system. The activity concentration of (226)Ra, (232)Th and (40)K content varied from below the minimum detection limit up to 86,400 Bq kg(-1), 187 Bq kg(-1) and 1350 Bq kg(-1), respectively. The linear correlation coefficient between radon exhalation rate and radium concentration was 0.90. The result of this survey shows that radon exhalation rate and radium content in some local stones used as basements are extremely high and these samples are main sources of indoor radon emanation as well as external gamma radiation from uranium series. Copyright © 2012 Elsevier Ltd. All rights reserved.

  7. A high dynamic range pulse counting detection system for mass spectrometry.

    PubMed

    Collings, Bruce A; Dima, Martian D; Ivosev, Gordana; Zhong, Feng

    2014-01-30

    A high dynamic range pulse counting system has been developed that demonstrates an ability to operate at up to 2e8 counts per second (cps) on a triple quadrupole mass spectrometer. Previous pulse counting detection systems have typically been limited to about 1e7 cps at the upper end of the system's dynamic range. Modifications to the detection electronics and dead time correction algorithm are described in this paper. A high gain transimpedance amplifier is employed that allows a multi-channel electron multiplier to be operated at a significantly lower bias potential than in previous pulse counting systems. The system utilises a high-energy conversion dynode, a multi-channel electron multiplier, a high gain transimpedance amplifier, non-paralysing detection electronics and a modified dead time correction algorithm. Modification of the dead time correction algorithm is necessary due to a characteristic of the pulse counting electronics. A pulse counting detection system with the capability to count at ion arrival rates of up to 2e8 cps is described. This is shown to provide a linear dynamic range of nearly five orders of magnitude for a sample of alprazolam with concentrations ranging from 0.0006970 ng/mL to 3333 ng/mL while monitoring the m/z 309.1 → m/z 205.2 transition. This represents an upward extension of the detector's linear dynamic range of about two orders of magnitude. A new high dynamic range pulse counting system has been developed demonstrating the ability to operate at up to 2e8 cps on a triple quadrupole mass spectrometer. This provides an upward extension of the detector's linear dynamic range by about two orders of magnitude over previous pulse counting systems. Copyright © 2013 John Wiley & Sons, Ltd.

  8. Blood eosinophil count thresholds and exacerbations in patients with chronic obstructive pulmonary disease.

    PubMed

    Yun, Jeong H; Lamb, Andrew; Chase, Robert; Singh, Dave; Parker, Margaret M; Saferali, Aabida; Vestbo, Jørgen; Tal-Singer, Ruth; Castaldi, Peter J; Silverman, Edwin K; Hersh, Craig P

    2018-06-01

    Eosinophilic airway inflammation in patients with chronic obstructive pulmonary disease (COPD) is associated with exacerbations and responsivity to steroids, suggesting potential shared mechanisms with eosinophilic asthma. However, there is no consistent blood eosinophil count that has been used to define the increased exacerbation risk. We sought to investigate blood eosinophil counts associated with exacerbation risk in patients with COPD. Blood eosinophil counts and exacerbation risk were analyzed in patients with moderate-to-severe COPD by using 2 independent studies of former and current smokers with longitudinal data. The Genetic Epidemiology of COPD (COPDGene) study was analyzed for discovery (n = 1,553), and the Evaluation of COPD Longitudinally to Identify Predictive Surrogate Endpoints (ECLIPSE) study was analyzed for validation (n = 1,895). A subset of the ECLIPSE study subjects were used to assess the stability of blood eosinophil counts over time. COPD exacerbation risk increased with higher eosinophil counts. An eosinophil count threshold of 300 cells/μL or greater showed adjusted incidence rate ratios for exacerbations of 1.32 in the COPDGene study (95% CI, 1.10-1.63). The cutoff of 300 cells/μL or greater was validated for prospective risk of exacerbation in the ECLIPSE study, with adjusted incidence rate ratios of 1.22 (95% CI, 1.06-1.41) using 3-year follow-up data. Stratified analysis confirmed that the increased exacerbation risk associated with an eosinophil count of 300 cells/μL or greater was driven by subjects with a history of frequent exacerbations in both the COPDGene and ECLIPSE studies. Patients with moderate-to-severe COPD and blood eosinophil counts of 300 cells/μL or greater had an increased risk of exacerbations in the COPDGene study, which was prospectively validated in the ECLIPSE study. Copyright © 2018 American Academy of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.

  9. Extended range radiation dose-rate monitor

    DOEpatents

    Valentine, Kenneth H.

    1988-01-01

    An extended range dose-rate monitor is provided which utilizes the pulse pileup phenomenon that occurs in conventional counting systems to alter the dynamic response of the system to extend the dose-rate counting range. The current pulses from a solid-state detector generated by radiation events are amplified and shaped prior to applying the pulses to the input of a comparator. The comparator generates one logic pulse for each input pulse which exceeds the comparator reference threshold. These pulses are integrated and applied to a meter calibrated to indicate the measured dose-rate in response to the integrator output. A portion of the output signal from the integrator is fed back to vary the comparator reference threshold in proportion to the output count rate to extend the sensitive dynamic detection range by delaying the asymptotic approach of the integrator output toward full scale as measured by the meter.

  10. Importance of background rates of disease in assessment of vaccine safety during mass immunisation with pandemic H1N1 influenza vaccines

    PubMed Central

    Black, Steven; Eskola, Juhani; Siegrist, Claire-Anne; Halsey, Neal; MacDonald, Noni; Law, Barbara; Miller, Elizabeth; Andrews, Nick; Stowe, Julia; Salmon, Daniel; Vannice, Kirsten; Izurieta, Hector S; Akhtar, Aysha; Gold, Mike; Oselka, Gabriel; Zuber, Patrick; Pfeifer, Dina; Vellozzi, Claudia

    2010-01-01

    Because of the advent of a new influenza A H1N1 strain, many countries have begun mass immunisation programmes. Awareness of the background rates of possible adverse events will be a crucial part of assessment of possible vaccine safety concerns and will help to separate legitimate safety concerns from events that are temporally associated with but not caused by vaccination. We identified background rates of selected medical events for several countries. Rates of disease events varied by age, sex, method of ascertainment, and geography. Highly visible health conditions, such as Guillain-Barré syndrome, spontaneous abortion, or even death, will occur in coincident temporal association with novel influenza vaccination. On the basis of the reviewed data, if a cohort of 10 million individuals was vaccinated in the UK, 21·5 cases of Guillain-Barré syndrome and 5·75 cases of sudden death would be expected to occur within 6 weeks of vaccination as coincident background cases. In female vaccinees in the USA, 86·3 cases of optic neuritis per 10 million population would be expected within 6 weeks of vaccination. 397 per 1 million vaccinated pregnant women would be predicted to have a spontaneous abortion within 1 day of vaccination. PMID:19880172
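The arithmetic behind such expected-background figures is a simple rate × population × window product. The annual Guillain-Barré incidence used in the usage below (about 1.87 per 100,000) is an assumed illustrative value chosen to be consistent with the abstract's 21.5 cases per 10 million in 6 weeks; it is not a figure quoted by the paper.

```python
def expected_background_cases(annual_rate_per_100k, cohort_size, window_days):
    """Expected coincident cases in a post-vaccination window under a
    constant background incidence and no causal effect of the vaccine."""
    return annual_rate_per_100k / 100_000 * cohort_size * window_days / 365.0
```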

  11. Arkansas Kids Count Data Book 1995: A Portrait of Arkansas' Children.

    ERIC Educational Resources Information Center

    Arkansas Advocates for Children and Families, Little Rock.

    This Kids Count report is the third to examine the well-being of Arkansas' children and the first to provide trend information. The statistical report is based on 10 core indicators of well-being: (1) unemployment rate and per capita personal income; (2) federal and state assistance program participation rates; (3) percent of high school students…

  12. A matrix-inversion method for gamma-source mapping from gamma-count data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adsley, Ian; Burgess, Claire; Bull, Richard K

    In a previous paper it was proposed that a simple matrix inversion method could be used to extract source distributions from gamma-count maps, using simple models to calculate the response matrix. The method was tested using numerically generated count maps. In the present work a 100 kBq 60Co source has been placed on a gridded surface and the count rate measured using a NaI scintillation detector. The resulting map of gamma counts was used as input to the matrix inversion procedure and the source position recovered. A multi-source array was simulated by superposition of several single-source count maps and the source distribution was again recovered using matrix inversion. The measurements were performed for several detector heights. The effects of uncertainties in source-detector distances on the matrix inversion method are also examined. The results from this work give confidence in applying the method to practical problems, such as the segregation of highly active objects amongst fuel-element debris. (authors)
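    The inversion step described above can be sketched numerically. The 1-D grid, the inverse-square response model, and all numbers below are illustrative assumptions, not the paper's setup:

```python
import numpy as np

# Response matrix R[i, j]: counts at detector position i per unit source
# strength in grid cell j, modelled here with a simple inverse-square
# falloff at a fixed detector height (an assumed response model).
def response_matrix(det_x, src_x, height):
    dx = det_x[:, None] - src_x[None, :]
    return 1.0 / (dx**2 + height**2)

src_x = np.linspace(0.0, 4.0, 5)    # source grid cells
det_x = np.linspace(0.0, 4.0, 9)    # detector positions (overdetermined)
R = response_matrix(det_x, src_x, height=1.0)

truth = np.array([0.0, 100.0, 0.0, 40.0, 0.0])  # kBq in each cell
counts = R @ truth                               # noise-free count map

# Recover the source distribution by (least-squares) matrix inversion.
recovered, *_ = np.linalg.lstsq(R, counts, rcond=None)
```

    With noise-free counts and a full-rank response matrix the source distribution is recovered exactly; real count maps add Poisson noise and geometry uncertainty, which is what the paper examines.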

  13. Count-doubling time safety circuit

    DOEpatents

    Rusch, Gordon K.; Keefe, Donald J.; McDowell, William P.

    1981-01-01

    There is provided a nuclear reactor count-factor-increase time monitoring circuit which includes a pulse-type neutron detector, and means for counting the number of detected pulses during specific time periods. Counts are compared and the comparison is utilized to develop a reactor scram signal, if necessary.
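    A software analogue of the comparison logic might look like the following sketch (the ratio threshold and function name are assumptions; the patent implements this in hardware):

```python
# Sketch of count-factor-increase monitoring: flag a scram condition if
# the pulse count in any period reaches `max_ratio` times the count in
# the previous period (e.g. a doubling).
def scram_signal(counts, max_ratio=2.0):
    for prev, curr in zip(counts, counts[1:]):
        if prev > 0 and curr / prev >= max_ratio:
            return True
    return False
```

    For example, `scram_signal([100, 110, 240])` trips because 240 is more than twice 110, while `scram_signal([100, 110, 120])` does not.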

  14. Evaluation of Pulse Counting for the Mars Organic Mass Analyzer (MOMA) Ion Trap Detection Scheme

    NASA Technical Reports Server (NTRS)

    Van Amerom, Friso H.; Short, Tim; Brinckerhoff, William; Mahaffy, Paul; Kleyner, Igor; Cotter, Robert J.; Pinnick, Veronica; Hoffman, Lars; Danell, Ryan M.; Lyness, Eric I.

    2011-01-01

    The Mars Organic Mass Analyzer is being developed at Goddard Space Flight Center to identify organics and possible biological compounds on Mars. In the process of characterizing mass spectrometer size, weight, and power consumption, the use of pulse counting was considered for ion detection. Pulse counting has advantages over analog-mode amplification of the electron multiplier signal. Some advantages are reduced size of electronic components, low power consumption, ability to remotely characterize detector performance, and avoidance of analog circuit noise. The use of pulse counting as a detection method with ion trap instruments is relatively rare. However, with the recent development of high performance electrical components, this detection method is quite suitable and can demonstrate significant advantages over analog methods. Methods A prototype quadrupole ion trap mass spectrometer with an internal electron ionization source was used as a test setup to develop and evaluate the pulse-counting method. The anode signal from the electron multiplier was preamplified. The amplified signal was fed into a fast comparator for pulse-level discrimination. The output of the comparator was fed directly into a Xilinx FPGA development board. Verilog HDL software was written to bin the counts at user-selectable intervals. This system was able to count pulses at rates in the GHz range. The stored ion count number per bin was transferred to custom ion trap control software. Pulse-counting mass spectra were compared with mass spectra obtained using the standard analog-mode ion detection. Preliminary Data Preliminary mass spectra have been obtained for both analog mode and pulse-counting mode under several sets of instrument operating conditions. Comparison of the spectra revealed better peak shapes for pulse-counting mode. Noise levels are as good as, or better than, analog-mode detection noise levels. To artificially force ion pile-up conditions, the ion trap was overfilled
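    The binning step, implemented in Verilog on the FPGA in the record above, can be sketched in Python (the interface and bin width are assumptions):

```python
# Count pulse arrivals (in seconds) into fixed-width time bins,
# mimicking the FPGA's user-selectable binning intervals.
def bin_pulses(arrival_times, bin_width, n_bins):
    counts = [0] * n_bins
    for t in arrival_times:
        i = int(t // bin_width)
        if 0 <= i < n_bins:          # ignore pulses outside the window
            counts[i] += 1
    return counts
```

    For instance, four pulses at 0.1, 0.2, 1.5 and 2.7 s with 1 s bins yield the histogram [2, 1, 1].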

  15. A Multi-Contact, Low Capacitance HPGe Detector for High Rate Gamma Spectroscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cox, Christopher

    2014-12-04

    The detection, identification and non-destructive assay of special nuclear materials and nuclear fission by-products are critically important activities in support of nuclear non-proliferation programs. Both national and international nuclear safeguard agencies recognize that current accounting methods for spent nuclear fuel are inadequate from a safeguards perspective. Radiation detection and analysis by gamma-ray spectroscopy is a key tool in this field, but no instrument exists that can deliver the required performance (energy resolution and detection sensitivity) in the presence of very high background count rates encountered in the nuclear safeguards arena. The work of this project addresses this critical need by developing a unique gamma-ray detector based on high purity germanium that has the previously unachievable property of operating in the 1 million counts-per-second range while achieving state-of-the-art energy resolution necessary to identify and analyze the isotopes of interest. The technical approach was to design and fabricate a germanium detector with multiple segmented electrodes coupled to multi-channel high rate spectroscopy electronics. Dividing the germanium detector’s signal electrode into smaller sections offers two advantages: firstly, the energy resolution of the detector is potentially improved, and secondly, the detector is able to operate at higher count rates. The design challenges included the following: determining the optimum electrode configuration to meet the stringent energy resolution and count rate requirements; determining the electronic noise (and therefore energy resolution) of the completed system after multiple signals are recombined; designing the germanium crystal housing and vacuum cryostat; and customizing electronics to perform the signal recombination function in real time. In this phase I work, commercial off-the-shelf electrostatic modeling software was used to develop the segmented germanium crystal

  16. Expected background in the LZ experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kudryavtsev, Vitaly A.

    2015-08-17

    The LZ experiment, featuring a 7-tonne active liquid xenon target, is aimed at achieving unprecedented sensitivity to WIMPs with the background expected to be dominated by astrophysical neutrinos. To reach this goal, extensive simulations are carried out to accurately calculate the electron recoil and nuclear recoil rates in the detector. Both internal (from target material) and external (from detector components and surrounding environment) backgrounds are considered. A very efficient suppression of background rate is achieved with an outer liquid scintillator veto, liquid xenon skin and fiducialisation. Based on the current measurements of radioactivity of different materials, it is shown that LZ can achieve the reduction of a total background for a WIMP search down to about 2 events in 1000 live days for 5.6 tonne fiducial mass.

  17. Expected background in the LZ experiment

    NASA Astrophysics Data System (ADS)

    Kudryavtsev, Vitaly A.

    2015-08-01

    The LZ experiment, featuring a 7-tonne active liquid xenon target, is aimed at achieving unprecedented sensitivity to WIMPs with the background expected to be dominated by astrophysical neutrinos. To reach this goal, extensive simulations are carried out to accurately calculate the electron recoil and nuclear recoil rates in the detector. Both internal (from target material) and external (from detector components and surrounding environment) backgrounds are considered. A very efficient suppression of background rate is achieved with an outer liquid scintillator veto, liquid xenon skin and fiducialisation. Based on the current measurements of radioactivity of different materials, it is shown that LZ can achieve the reduction of a total background for a WIMP search down to about 2 events in 1000 live days for 5.6 tonne fiducial mass.

  18. Performance Evaluation of High Fluorescence Lymphocyte Count: Comparability to Atypical Lymphocyte Count and Clinical Significance.

    PubMed

    Tantanate, Chaicharoen; Klinbua, Cherdsak

    2018-06-15

    To investigate the association between high-fluorescence lymphocyte cell (HFLC) and atypical lymphocyte (AL) counts, and to determine the clinical significance of HFLC. We compared automated HFLC and microscopic AL counts and analyzed the findings. Patient clinical data for each specimen were reviewed. A total of 320 blood specimens were included. The correlation between HFLC and microscopic AL counts was 0.865 and 0.893 for absolute and percentage counts, respectively. Sensitivity, specificity, and accuracy of HFLC at the cutoff value of 0.1 × 10⁹/L for detection of AL were 0.8, 0.77, and 0.8, respectively. Studied patients were classified into 4 groups: infection, immunological disorders, malignant neoplasms, and others. Patients with infections had the highest HFLC counts; most of those patients (67.7%) had dengue infection. HFLC counts were well correlated with AL counts, with acceptable test characteristics. Applying HFLC flagging may alert laboratory staff to the presence of ALs.

  19. Carica papaya Leaves Juice Significantly Accelerates the Rate of Increase in Platelet Count among Patients with Dengue Fever and Dengue Haemorrhagic Fever

    PubMed Central

    Subenthiran, Soobitha; Choon, Tan Chwee; Cheong, Kee Chee; Thayan, Ravindran; Teck, Mok Boon; Muniandy, Prem Kumar; Afzan, Adlin; Abdullah, Noor Rain; Ismail, Zakiah

    2013-01-01

    The study was conducted to investigate the platelet-increasing property of Carica papaya leaves juice (CPLJ) in patients with dengue fever (DF). An open-labeled randomized controlled trial was carried out on 228 patients with DF and dengue haemorrhagic fever (DHF). Approximately half the patients received the juice for 3 consecutive days while the others remained as controls and received the standard management. Their full blood count was monitored every 8 hours for 48 hours. Gene expression studies were conducted on the ALOX 12 and PTAFR genes. The mean increases in platelet counts were compared in both groups using repeated-measures ANCOVA. There was a significant increase in mean platelet count in the intervention group (P < 0.001) but not in the control group 40 hours after the first dose of CPLJ. Comparison between the intervention and control groups showed that the mean platelet count in the intervention group was significantly higher after 40 and 48 hours of admission (P < 0.01). The ALOX 12 (FC = 15.00) and PTAFR (FC = 13.42) genes were highly expressed among those on the juice. It was concluded that CPLJ does significantly increase the platelet count in patients with DF and DHF. PMID:23662145

  20. Possible correlation between annual gravity change and shallow background seismicity rate at subduction zone by surface load

    NASA Astrophysics Data System (ADS)

    Mitsui, Yuta; Yamada, Kyohei

    2017-12-01

    The Gravity Recovery and Climate Experiment (GRACE) has monitored global gravity changes since 2002. Gravity changes are considered to represent hydrological water mass movements around the surface of the globe, although fault slip of a large earthquake also causes perturbation of gravity. Since surface water movements are expected to affect earthquake occurrences via elastic surface load or pore-fluid pressure increase, correlation between gravity changes and occurrences of small (not large) earthquakes may reflect the effects of surface water movements. In the present study, we focus on earthquakes smaller than magnitude 7.5 and examine the relation between annual gravity changes and earthquake occurrences at worldwide subduction zones. First, we extract amplitudes of annual gravity changes from GRACE data for land. Next, we estimate background seismicity rates in the epidemic-type aftershock sequence model from shallow seismicity data having magnitudes of over 4.5. Then, we perform correlation analysis of the amplitudes of the annual gravity changes and the shallow background seismicity rates, excluding source areas of large earthquakes, and find moderate positive correlation. It implies that annual water movements can activate shallow earthquakes, although the surface load elastostatic stress changes are on the order of or below 1 kPa, as small as a regional case in a previous study. We speculate that periodic stress perturbation is amplified through nonlinear responses of frictional faults.
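    The final correlation step can be sketched as a rank correlation between the two per-region quantities. The numbers below are synthetic; the study correlates GRACE-derived amplitudes with ETAS-estimated background rates:

```python
import numpy as np

# Spearman rank correlation via double argsort (assumes no tied values).
def spearman_rho(x, y):
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Synthetic per-region values: annual gravity-change amplitude vs.
# shallow background seismicity rate.
gravity_amp = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
bg_rate = np.array([0.2, 0.1, 0.5, 0.4, 0.9])
rho = spearman_rho(gravity_amp, bg_rate)
```

    In practice `scipy.stats.spearmanr` handles ties and also returns a p-value; the hand-rolled version above just shows the rank-then-correlate idea.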

  1. Effect of background correction on peak detection and quantification in online comprehensive two-dimensional liquid chromatography using diode array detection.

    PubMed

    Allen, Robert C; John, Mallory G; Rutan, Sarah C; Filgueira, Marcelo R; Carr, Peter W

    2012-09-07

    A singular value decomposition-based background correction (SVD-BC) technique is proposed for the reduction of background contributions in online comprehensive two-dimensional liquid chromatography (LC×LC) data. The SVD-BC technique was compared to simply subtracting a blank chromatogram from a sample chromatogram and to a previously reported background correction technique for one dimensional chromatography, which uses an asymmetric weighted least squares (AWLS) approach. AWLS was the only background correction technique to completely remove the background artifacts from the samples as evaluated by visual inspection. However, the SVD-BC technique greatly reduced or eliminated the background artifacts as well and preserved the peak intensity better than AWLS. The loss in peak intensity by AWLS resulted in lower peak counts at the detection thresholds established using standard samples. However, the SVD-BC technique was found to introduce noise which led to detection of false peaks at the lower detection thresholds. As a result, the AWLS technique gave more precise peak counts than the SVD-BC technique, particularly at the lower detection thresholds. While the AWLS technique resulted in more consistent percent residual standard deviation values, a statistical improvement in peak quantification after background correction was not found regardless of the background correction technique used. Copyright © 2012 Elsevier B.V. All rights reserved.
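    A minimal sketch of the SVD-BC idea, assuming synthetic 1-D signals and a single background component (the paper applies it to LC×LC diode-array data):

```python
import numpy as np

# Build synthetic blank runs that share one drifting-baseline shape.
t = np.linspace(0.0, 1.0, 200)
background = 5.0 * np.exp(-t)
blanks = np.array([(1 + 0.1 * k) * background for k in range(4)])

# SVD of the blank matrix; keep the leading right-singular vector(s)
# as the background basis.
_, _, vt = np.linalg.svd(blanks, full_matrices=False)
B = vt[:1]                                       # background basis (1 x 200)

# Sample = background + one chromatographic peak; correct by removing
# the sample's projection onto the background subspace.
peak = np.exp(-0.5 * ((t - 0.5) / 0.02) ** 2)
sample = background + peak
corrected = sample - (sample @ B.T) @ B
```

    The projection removes the baseline almost entirely while leaving the narrow peak nearly intact, which mirrors the paper's observation that SVD-BC preserves peak intensity well; any overlap between the peak and the background shape costs a little intensity.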

  2. Digital coincidence counting

    NASA Astrophysics Data System (ADS)

    Buckman, S. M.; Ius, D.

    1996-02-01

    This paper reports on the development of a digital coincidence-counting system which comprises a custom-built data acquisition card and associated PC software. The system has been designed to digitise the pulse-trains from two radiation detectors at a rate of 20 MSamples/s with 12-bit resolution. Through hardware compression of the data, the system can continuously record both individual pulse-shapes and the time intervals between pulses. Software-based circuits are used to process the stored pulse trains. These circuits are constructed simply by linking together icons representing various components such as coincidence mixers, time delays, single-channel analysers, deadtimes and scalers. This system enables a pair of pulse trains to be processed repeatedly using any number of different methods. Some preliminary results are presented in order to demonstrate the versatility and efficiency of this new method.

  3. Numerical investigation on the implications of spring temperature and discharge rate with respect to the geothermal background in a fault zone

    NASA Astrophysics Data System (ADS)

    Jiang, Zhenjiao; Xu, Tianfu; Mariethoz, Gregoire

    2018-04-01

    Geothermal springs are some of the most obvious indicators of the existence of high-temperature geothermal resources in the subsurface. However, geothermal springs can also occur in areas of low average subsurface temperatures, which makes it difficult to assess exploitable zones. To address this problem, this study quantitatively analyzes the conditions associated with the formation of geothermal springs in fault zones, and numerically investigates the implications that outflow temperature and discharge rate from geothermal springs have on the geothermal background in the subsurface. It is concluded that the temperature of geothermal springs in fault zones is mainly controlled by the recharge rate from the country rock and the hydraulic conductivity in the fault damage zone. The topography of the fault trace on the land surface also plays an important role in determining the temperature of the thermal springs. In fault zones with a permeability higher than 1 mD and a lateral recharge rate from the country rock higher than 1 m³/day, convection plays a dominant role in the heat transport rather than thermal conduction. The geothermal springs do not necessarily occur in the place having an abnormal geothermal background (with the temperature at a certain depth exceeding the temperature inferred by the global average continental geothermal gradient of 30 °C/km). Assuming a constant temperature (90 °C here, to represent a normal geothermal background in the subsurface at a depth of 3,000 m), the conditions required for the occurrence of geothermal springs were quantitatively determined.

  4. Estimating the Effective System Dead Time Parameter for Correlated Neutron Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, Stephen; Cleveland, Steve; Favalli, Andrea

    We present that neutron time correlation analysis is one of the main technical nuclear safeguards techniques used to verify declarations of, or to independently assay, special nuclear materials. Quantitative information is generally extracted from the neutron-event pulse train, collected from moderated assemblies of 3He proportional counters, in the form of correlated count rates that are derived from event-triggered coincidence gates. These count rates, most commonly referred to as singles, doubles and triples rates etc., when extracted using shift-register autocorrelation logic, are related to the reduced factorial moments of the time correlated clusters of neutrons emerging from the measurement items. Correcting these various rates for dead time losses has received considerable attention recently. The dead time losses for the higher moments in particular, and especially for large mass (high rate and highly multiplying) items, can be significant. Consequently, even in thoughtfully designed systems, accurate dead time treatments are needed if biased mass determinations are to be avoided. In support of this effort, in this paper we discuss a new approach to experimentally estimate the effective system dead time of neutron coincidence counting systems. It involves counting a random neutron source (e.g. AmLi is a good approximation to a source without correlated emission) and relating the second and higher moments of the neutron number distribution recorded in random triggered interrogation coincidence gates to the effective value of dead time parameter. We develop the theoretical basis of the method and apply it to the Oak Ridge Large Volume Active Well Coincidence Counter using sealed AmLi radionuclide neutron sources and standard multiplicity shift register electronics. The method is simple to apply compared to the predominant present approach which involves using a set of 252Cf sources of wide emission rate, it gives excellent precision in a conveniently

  5. Estimating the effective system dead time parameter for correlated neutron counting

    NASA Astrophysics Data System (ADS)

    Croft, Stephen; Cleveland, Steve; Favalli, Andrea; McElroy, Robert D.; Simone, Angela T.

    2017-11-01

    Neutron time correlation analysis is one of the main technical nuclear safeguards techniques used to verify declarations of, or to independently assay, special nuclear materials. Quantitative information is generally extracted from the neutron-event pulse train, collected from moderated assemblies of 3He proportional counters, in the form of correlated count rates that are derived from event-triggered coincidence gates. These count rates, most commonly referred to as singles, doubles and triples rates etc., when extracted using shift-register autocorrelation logic, are related to the reduced factorial moments of the time correlated clusters of neutrons emerging from the measurement items. Correcting these various rates for dead time losses has received considerable attention recently. The dead time losses for the higher moments in particular, and especially for large mass (high rate and highly multiplying) items, can be significant. Consequently, even in thoughtfully designed systems, accurate dead time treatments are needed if biased mass determinations are to be avoided. In support of this effort, in this paper we discuss a new approach to experimentally estimate the effective system dead time of neutron coincidence counting systems. It involves counting a random neutron source (e.g. AmLi is a good approximation to a source without correlated emission) and relating the second and higher moments of the neutron number distribution recorded in random triggered interrogation coincidence gates to the effective value of dead time parameter. We develop the theoretical basis of the method and apply it to the Oak Ridge Large Volume Active Well Coincidence Counter using sealed AmLi radionuclide neutron sources and standard multiplicity shift register electronics. The method is simple to apply compared to the predominant present approach which involves using a set of 252Cf sources of wide emission rate, it gives excellent precision in a conveniently short time, and it

  6. Estimating the Effective System Dead Time Parameter for Correlated Neutron Counting

    DOE PAGES

    Croft, Stephen; Cleveland, Steve; Favalli, Andrea; ...

    2017-04-29

    We present that neutron time correlation analysis is one of the main technical nuclear safeguards techniques used to verify declarations of, or to independently assay, special nuclear materials. Quantitative information is generally extracted from the neutron-event pulse train, collected from moderated assemblies of 3He proportional counters, in the form of correlated count rates that are derived from event-triggered coincidence gates. These count rates, most commonly referred to as singles, doubles and triples rates etc., when extracted using shift-register autocorrelation logic, are related to the reduced factorial moments of the time correlated clusters of neutrons emerging from the measurement items. Correcting these various rates for dead time losses has received considerable attention recently. The dead time losses for the higher moments in particular, and especially for large mass (high rate and highly multiplying) items, can be significant. Consequently, even in thoughtfully designed systems, accurate dead time treatments are needed if biased mass determinations are to be avoided. In support of this effort, in this paper we discuss a new approach to experimentally estimate the effective system dead time of neutron coincidence counting systems. It involves counting a random neutron source (e.g. AmLi is a good approximation to a source without correlated emission) and relating the second and higher moments of the neutron number distribution recorded in random triggered interrogation coincidence gates to the effective value of dead time parameter. We develop the theoretical basis of the method and apply it to the Oak Ridge Large Volume Active Well Coincidence Counter using sealed AmLi radionuclide neutron sources and standard multiplicity shift register electronics. The method is simple to apply compared to the predominant present approach which involves using a set of 252Cf sources of wide emission rate, it gives excellent precision in a conveniently
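    As background for the dead-time losses discussed in the records above: for an idealised non-paralysable counter with effective dead time τ, the observed singles rate m relates to the true rate n by m = n/(1 + nτ). A sketch of this textbook model follows; the paper's actual contribution, estimating τ from the moments of counts in random-triggered gates, is not reproduced here:

```python
# Idealised non-paralysable dead-time model (an assumption; correlated
# multiplicity counting needs the moment-based treatment in the paper).
def observed_rate(n, tau):
    """Observed count rate m for true rate n (cps) and dead time tau (s)."""
    return n / (1.0 + n * tau)

def dead_time_corrected(m, tau):
    """Invert the model: recover the true rate from the observed rate."""
    return m / (1.0 - m * tau)

m = observed_rate(1.0e5, tau=1.0e-6)        # 100 kcps true, 1 us dead time
n_est = dead_time_corrected(m, tau=1.0e-6)  # round-trips back to ~100 kcps
```

    At 100 kcps with a 1 µs dead time roughly 9% of events are lost, which illustrates why uncorrected rates bias mass determinations at high count rates.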

  7. [A multicenter study of correlation between peripheral lymphocyte counts and CD4(+) T cell counts in HIV/AIDS patients].

    PubMed

    Xie, Jing; Qiu, Zhifeng; Han, Yang; Li, Yanling; Song, Xiaojing; Li, Taisheng

    2015-02-01

    To evaluate the accuracy of lymphocyte count as a surrogate for CD4(+) T cell count in treatment-naïve HIV-infected adults. A total of 2 013 HIV-infected patients were screened at 23 sites in China. CD4(+) T cell counts were measured by flow cytometry. Correlation between CD4(+) T cell count and peripheral lymphocyte count was analyzed by the Spearman coefficient. AUCROC was used to evaluate the performance of lymphocyte count as a surrogate for CD4(+) T cell count. The lymphocyte count and CD4(+) T cell count of these 2 013 patients were (1 600 ± 670) × 10(6)/L and (244 ± 148) × 10(6)/L respectively. CD4(+) T cell count was positively correlated with lymphocyte count (r = 0.482, P < 0.000 1). AUCROC of lymphocyte count as a surrogate for CD4(+) T cell counts of <100×10(6)/L, <200×10(6)/L and <350×10(6)/L were 0.790 (95%CI 0.761-0.818, P < 0.000 1), 0.733 (95%CI 0.710-0.755, P < 0.000 1) and 0.732 (95%CI 0.706-0.758, P < 0.000 1) respectively. Lymphocyte count could be considered as a potential surrogate marker for CD4(+) T cell count in HIV/AIDS patients without access to T cell subset testing by flow cytometry.

  8. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P. W.

    2016-01-01

    Background Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. Methods The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an “external” dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. Results The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. The automatic mitosis detection method can perform mitosis counting in an unbiased

  9. Components of the Extragalactic Gamma-Ray Background

    NASA Technical Reports Server (NTRS)

    Stecker, Floyd W.; Venters, Tonia M.

    2011-01-01

    We present new theoretical estimates of the relative contributions of unresolved blazars and star-forming galaxies to the extragalactic gamma-ray background (EGB) and discuss constraints on the contributions from alternative mechanisms such as dark matter annihilation and truly diffuse gamma-ray production. We find that the Fermi source count data do not rule out a scenario in which the EGB is dominated by emission from unresolved blazars, though unresolved star-forming galaxies may also contribute significantly to the background, within order-of-magnitude uncertainties. In addition, we find that the spectrum of the unresolved star-forming galaxy contribution cannot explain the EGB spectrum found by EGRET at energies between 50 and 200 MeV, whereas the spectrum of unresolved flat spectrum radio quasars, when accounting for the energy-dependent effects of source confusion, could be consistent with the combined spectrum of the low-energy EGRET EGB measurements and the Fermi-Large Area Telescope EGB measurements.

  10. Statistical Aspects of Point Count Sampling

    Treesearch

    Richard J. Barker; John R. Sauer

    1995-01-01

    The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the...
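    The core issue, that a raw point count estimates p·N rather than the true abundance N, can be illustrated with a quick simulation (the detection probabilities and counts below are invented for illustration):

```python
import numpy as np

# Each survey detects each of N birds independently with probability p,
# so the count is Binomial(N, p) and its mean is p*N, not N.
rng = np.random.default_rng(1)
N = 50                          # true number of birds present at both sites
p_site_a, p_site_b = 0.6, 0.3   # detection probabilities differ by site
counts_a = rng.binomial(N, p_site_a, size=1000)
counts_b = rng.binomial(N, p_site_b, size=1000)

# Same true abundance, yet mean counts differ by about a factor of two:
# a difference in detectability masquerades as a difference in abundance.
bias_ratio = counts_a.mean() / counts_b.mean()
```

    This is the bias the authors warn about: comparing raw counts across habitats, observers, or years implicitly assumes equal detection probability.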

  11. Photon Counting Energy Dispersive Detector Arrays for X-ray Imaging

    PubMed Central

    Iwanczyk, Jan S.; Nygård, Einar; Meirav, Oded; Arenson, Jerry; Barber, William C.; Hartsough, Neal E.; Malakhov, Nail; Wessel, Jan C.

    2009-01-01

    The development of an innovative detector technology for photon-counting in X-ray imaging is reported. This new generation of detectors, based on pixellated cadmium telluride (CdTe) and cadmium zinc telluride (CZT) detector arrays electrically connected to application specific integrated circuits (ASICs) for readout, will produce fast and highly efficient photon-counting and energy-dispersive X-ray imaging. There are a number of applications that can greatly benefit from these novel imagers including mammography, planar radiography, and computed tomography (CT). Systems based on this new detector technology can provide compositional analysis of tissue through spectroscopic X-ray imaging, significantly improve overall image quality, and may significantly reduce X-ray dose to the patient. A very high X-ray flux is utilized in many of these applications. For example, CT scanners can produce ~100 Mphotons/mm²/s in the unattenuated beam. High flux is required in order to collect sufficient photon statistics in the measurement of the transmitted flux (attenuated beam) during the very short time frame of a CT scan. This high count rate combined with a need for high detection efficiency requires the development of detector structures that can provide a response signal much faster than the transit time of carriers over the whole detector thickness. We have developed CdTe and CZT detector array structures which are 3 mm thick with 16×16 pixels and a 1 mm pixel pitch. These structures, in the two different implementations presented here, utilize either a small pixel effect or a drift phenomenon. An energy resolution of 4.75% at 122 keV has been obtained with a 30 ns peaking time using discrete electronics and a ⁵⁷Co source. An output rate of 6×10⁶ counts per second per individual pixel has been obtained with our ASIC readout electronics and a clinical CT X-ray tube. Additionally, the first clinical CT images, taken with several of our prototype photon-counting and energy

  12. Photon Counting Energy Dispersive Detector Arrays for X-ray Imaging.

    PubMed

    Iwanczyk, Jan S; Nygård, Einar; Meirav, Oded; Arenson, Jerry; Barber, William C; Hartsough, Neal E; Malakhov, Nail; Wessel, Jan C

    2009-01-01

    The development of an innovative detector technology for photon-counting in X-ray imaging is reported. This new generation of detectors, based on pixellated cadmium telluride (CdTe) and cadmium zinc telluride (CZT) detector arrays electrically connected to application specific integrated circuits (ASICs) for readout, will produce fast and highly efficient photon-counting and energy-dispersive X-ray imaging. There are a number of applications that can greatly benefit from these novel imagers including mammography, planar radiography, and computed tomography (CT). Systems based on this new detector technology can provide compositional analysis of tissue through spectroscopic X-ray imaging, significantly improve overall image quality, and may significantly reduce X-ray dose to the patient. A very high X-ray flux is utilized in many of these applications. For example, CT scanners can produce ~100 Mphotons/mm²/s in the unattenuated beam. High flux is required in order to collect sufficient photon statistics in the measurement of the transmitted flux (attenuated beam) during the very short time frame of a CT scan. This high count rate combined with a need for high detection efficiency requires the development of detector structures that can provide a response signal much faster than the transit time of carriers over the whole detector thickness. We have developed CdTe and CZT detector array structures which are 3 mm thick with 16×16 pixels and a 1 mm pixel pitch. These structures, in the two different implementations presented here, utilize either a small pixel effect or a drift phenomenon. An energy resolution of 4.75% at 122 keV has been obtained with a 30 ns peaking time using discrete electronics and a ⁵⁷Co source. An output rate of 6×10⁶ counts per second per individual pixel has been obtained with our ASIC readout electronics and a clinical CT X-ray tube. Additionally, the first clinical CT images, taken with several of our prototype photon-counting and

  13. Improved background rejection in neutrinoless double beta decay experiments using a magnetic field in a high pressure xenon TPC

    NASA Astrophysics Data System (ADS)

    Renner, J.; Cervera, A.; Hernando, J. A.; Imzaylov, A.; Monrabal, F.; Muñoz, J.; Nygren, D.; Gomez-Cadenas, J. J.

    2015-12-01

    We demonstrate that the application of an external magnetic field could lead to an improved background rejection in neutrinoless double-beta (0νββ) decay experiments using a high-pressure xenon (HPXe) TPC. HPXe chambers are capable of imaging electron tracks, a feature that enhances the separation between signal events (the two electrons emitted in the 0νββ decay of 136Xe) and background events, arising chiefly from single electrons of kinetic energy compatible with the end-point of the 0νββ decay (Qββ). Applying an external magnetic field of sufficiently high intensity (in the range of 0.5-1 Tesla for operating pressures in the range of 5-15 atmospheres) causes the electrons to produce helical tracks. Assuming the tracks can be properly reconstructed, the sign of the curvature can be determined at several points along these tracks, and such information can be used to separate signal (0νββ) events, containing two electrons producing a track with two different directions of curvature, from background (single-electron) events, producing a track that should spiral in a single direction. Due to electron multiple scattering, this strategy is not perfectly efficient on an event-by-event basis, but a statistical estimator can be constructed which can be used to reject background events by one order of magnitude at a moderate cost (about 30%) in signal efficiency. Combining this estimator with the excellent energy resolution and topological signature identification characteristic of the HPXe TPC, it is possible to reach a background rate of less than one count per ton-year of exposure. Such a low background rate is an essential feature of the next generation of 0νββ experiments, aiming to fully explore the inverse hierarchy of neutrino masses.
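    The curvature-sign idea can be sketched numerically. The toy code below (an illustration under simplifying assumptions, not the authors' reconstruction; all function names are hypothetical) classifies a 2D track by the fraction of vertices sharing the dominant curvature sign, obtained from the cross product of successive segment vectors:

    ```python
    import numpy as np

    def curvature_signs(points):
        """Sign of the curvature at each interior point of a 2D track,
        via the cross product of successive segment vectors."""
        p = np.asarray(points, dtype=float)
        v1 = p[1:-1] - p[:-2]            # incoming segments
        v2 = p[2:] - p[1:-1]             # outgoing segments
        cross = v1[:, 0] * v2[:, 1] - v1[:, 1] * v2[:, 0]
        return np.sign(cross)

    def single_direction_fraction(points):
        """Fraction of interior points whose curvature sign matches the
        dominant sign: near 1 for a single spiralling electron
        (background-like), lower when the track contains two oppositely
        curving halves (signal-like)."""
        s = curvature_signs(points)
        s = s[s != 0]                    # drop straight/degenerate vertices
        if len(s) == 0:
            return 1.0
        dominant = 1.0 if (s > 0).sum() >= (s < 0).sum() else -1.0
        return float((s == dominant).mean())

    # Toy tracks: one arc (single curvature sign) versus two mirrored
    # arcs joined back-to-back (two curvature signs, signal-like).
    t = np.linspace(0, np.pi / 2, 50)
    arc = np.c_[np.cos(t), np.sin(t)]
    two_arcs = np.vstack([arc, arc[-1] + (arc - arc[0]) * [1, -1]])
    ```

    As the abstract notes, multiple scattering randomizes individual sign measurements, so a real analysis would build a statistical estimator over many vertices rather than a hard per-track cut.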

  14. You can count on the motor cortex: Finger counting habits modulate motor cortex activation evoked by numbers

    PubMed Central

    Tschentscher, Nadja; Hauk, Olaf; Fischer, Martin H.; Pulvermüller, Friedemann

    2012-01-01

    The embodied cognition framework suggests that neural systems for perception and action are engaged during higher cognitive processes. In an event-related fMRI study, we tested this claim for the abstract domain of numerical symbol processing: is the human cortical motor system part of the representation of numbers, and is organization of numerical knowledge influenced by individual finger counting habits? Developmental studies suggest a link between numerals and finger counting habits due to the acquisition of numerical skills through finger counting in childhood. In the present study, digits 1 to 9 and the corresponding number words were presented visually to adults with different finger counting habits, i.e. left- and right-starters who reported that they usually start counting small numbers with their left and right hand, respectively. Despite the absence of overt hand movements, the hemisphere contralateral to the hand used for counting small numbers was activated when small numbers were presented. The correspondence between finger counting habits and hemispheric motor activation is consistent with an intrinsic functional link between finger counting and number processing. PMID:22133748

  15. Total lymphocyte count and subpopulation lymphocyte counts in relation to dietary intake and nutritional status of peritoneal dialysis patients.

    PubMed

    Grzegorzewska, Alicja E; Leander, Magdalena

    2005-01-01

    Dietary deficiency causes abnormalities in circulating lymphocyte counts. For the present paper, we evaluated correlations between total and subpopulation lymphocyte counts (TLC, SLCs) and parameters of nutrition in peritoneal dialysis (PD) patients. Studies were carried out in 55 patients treated with PD for 22.2 +/- 11.4 months. Parameters of nutritional status included total body mass, lean body mass (LBM), body mass index (BMI), and laboratory indices [total protein, albumin, iron, ferritin, and total iron binding capacity (TIBC)]. The SLCs were evaluated using flow cytometry. Positive correlations were seen between TLC and dietary intake of niacin; TLC and CD8 and CD16+56 counts and energy delivered from protein; CD4 count and beta-carotene and monounsaturated fatty acids 17:1 intake; and CD19 count and potassium, copper, vitamin A, and beta-carotene intake. Anorexia negatively influenced CD19 count. Serum albumin showed correlations with CD4 and CD19 counts, and LBM with CD19 count. A higher CD19 count was connected with a higher red blood cell count, hemoglobin, and hematocrit. Correlations were observed between TIBC and TLC and CD3 and CD8 counts, and between serum Fe and TLC and CD3 and CD4 counts. Patients with a higher CD19 count showed a better clinical-laboratory score, especially less weakness. Patients with a higher CD4 count had less expressed insomnia. Quantities of ingested vitamins and minerals influence lymphocyte counts in the peripheral blood of PD patients. Evaluation of TLC and SLCs is helpful in monitoring the effectiveness of nutrition in these patients.

  16. Apollo-Soyuz survey of the extreme-ultraviolet/soft X-ray background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stern, R.; Bowyer, S.

    1979-06-15

    The results of an extensive sky survey of the extreme-ultraviolet (EUV)/soft X-ray background are reported. The data were obtained with a telescope, designed and calibrated at the University of California at Berkeley, which observed EUV sources and the diffuse background as part of the Apollo-Soyuz mission in 1975 July. With a primary field of view of 2.3° ± 0.1° FWHM and four EUV bandpass filters (16-25, 20-73, 80-108, and 80-250 eV), the EUV telescope obtained useful background data for 21 sky points, 11 large angle scans, and an additional group of short observations of both types. Analysis of the data reveals an intense 80-108 eV diffuse flux of 4.0 ± 1.3 photons cm⁻² sr⁻¹ eV⁻¹ (broad-band weighted average). This is roughly a factor of 10 higher than the corresponding 150-280 eV average intensity and confirms the earlier results of Cash, Malina, and Stern. Galactic contributions to the background intensity at still lower energies are most likely masked by large fluxes of geocoronal or interplanetary solar-scattered resonance radiation; however, we derive upper limits to the local galactic background of 2 × 10⁴ and 6 × 10² photons cm⁻² sr⁻¹ eV⁻¹ averaged over the 16-25 eV and 20-73 eV bands, respectively. The uniformity of the background flux is uncertain due to limitations in the statistical accuracy of the data; we discuss probable upper limits to any spatial anisotropy. No evidence is found for a correlation between the telescope count rate and Earth-based parameters (zenith angle, Sun angle, etc.) for E ≳ 80 eV. Unlike some previous claims for the soft X-ray background, no simple dependence upon galactic latitude is seen. Fitting models of thermal emission to the Apollo-Soyuz data yields constraints on model parameters that are consistent, for a limited range of temperatures, with the EUV results of Cash, Malina, and Stern and the soft X-ray data of Burstein et al.

  17. Predicting U.S. tuberculosis case counts through 2020.

    PubMed

    Woodruff, Rachel S Yelk; Winston, Carla A; Miramontes, Roque

    2013-01-01

    In 2010, foreign-born persons accounted for 60% of all tuberculosis (TB) cases in the United States. Understanding which national groups make up the highest proportion of TB cases will assist TB control programs in concentrating limited resources where they can provide the greatest impact on preventing transmission of TB disease. The objective of our study was to predict through 2020 the numbers of U.S. TB cases among U.S.-born, foreign-born and foreign-born persons from selected countries of birth. TB case counts reported through the National Tuberculosis Surveillance System from 2000-2010 were log-transformed, and linear regression was performed to calculate predicted annual case counts and 95% prediction intervals for 2011-2020. Data were analyzed in 2011 before 2011 case counts were known. Decreases were predicted between 2010 observed and 2020 predicted counts for total TB cases (11,182 to 8,117 [95% prediction interval 7,262-9,073]) as well as TB cases among foreign-born persons from Mexico (1,541 to 1,420 [1,066-1,892]), the Philippines (740 to 724 [569-922]), India (578 to 553 [455-672]), Vietnam (532 to 429 [367-502]) and China (364 to 328 [249-433]). TB cases among persons who are U.S.-born and foreign-born were predicted to decline 47% (4,393 to 2,338 [2,113-2,586]) and 6% (6,720 to 6,343 [5,382-7,476]), respectively. Assuming rates of declines observed from 2000-2010 continue until 2020, a widening gap between the numbers of U.S.-born and foreign-born TB cases was predicted. TB case count predictions will help TB control programs identify needs for cultural competency, such as languages and interpreters needed for translating materials or engaging in appropriate community outreach.
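    The forecasting approach described (log-transform the annual counts, fit a straight line, extrapolate) can be sketched as follows. The counts below are synthetic stand-ins generated for illustration, not the real surveillance data:

    ```python
    import numpy as np

    # Synthetic annual case counts (NOT the real surveillance data):
    # a steady ~4%/year decline plus small multiplicative noise.
    rng = np.random.default_rng(0)
    years = np.arange(2000, 2011)
    counts = 16000 * 0.96 ** (years - 2000) * rng.lognormal(0.0, 0.01, years.size)

    # Log-transform, then fit a straight line: log y = a + b*year,
    # i.e. a constant multiplicative change per year.
    b, a = np.polyfit(years, np.log(counts), 1)   # slope first

    # Extrapolate the fitted trend through 2020.
    future = np.arange(2011, 2021)
    predicted = np.exp(a + b * future)

    annual_change = np.exp(b) - 1    # fraction per year (negative = decline)
    ```

    The study additionally reports 95% prediction intervals, which follow from the regression's residual variance; they are omitted here to keep the sketch minimal.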

  18. Ultrafast photon counting applied to resonant scanning STED microscopy.

    PubMed

    Wu, Xundong; Toro, Ligia; Stefani, Enrico; Wu, Yong

    2015-01-01

    To take full advantage of fast resonant scanning in super-resolution stimulated emission depletion (STED) microscopy, we have developed an ultrafast photon counting system based on a multi-gigasample-per-second analogue-to-digital conversion chip that delivers an unprecedented 450 MHz pixel clock (2.2 ns pixel dwell time in each scan). The system achieves a large field of view (∼50 × 50 μm) with fast scanning that reduces photobleaching, and advances time-gated continuous-wave STED technology to use with resonant scanning through hardware-based time-gating. The assembled system provides a superb signal-to-noise ratio and highly linear quantification of light, resulting in superior image quality. The system design also allows great flexibility in processing photon signals to further improve the dynamic range. In conclusion, we have constructed a frontier photon counting image acquisition system with an ultrafast readout rate and excellent counting linearity, capable of realizing resonant-scanning continuous-wave STED microscopy with online time-gated detection. © 2014 The Authors Journal of Microscopy © 2014 Royal Microscopical Society.

  19. Kids Count in Delaware: Fact Book 1999 [and] Families Count in Delaware: Fact Book, 1999.

    ERIC Educational Resources Information Center

    Delaware Univ., Newark. Kids Count in Delaware.

    This Kids Count Fact Book is combined with the Families Count Fact Book to provide information on statewide trends affecting children and families in Delaware. The Kids Count statistical profile is based on 10 main indicators of child well-being: (1) births to teens; (2) low birth weight babies; (3) infant mortality; (4) child deaths; (5) teen…

  20. Comparison of pregnancy rates in pre-treatment male infertility and low total motile sperm count at insemination.

    PubMed

    Xiao, Cheng Wei; Agbo, Chioma; Dahan, Michael H

    2016-01-01

    In intrauterine insemination (IUI), total motile sperm count (TMSC) is an important predictor of pregnancy. However, the clinical significance of a poor TMSC on the day of IUI in a patient with prior normal semen analysis (SA) is unclear. We performed this study to determine if these patients perform as poorly as those who had male factor infertility diagnosed prior to commencing treatment. 147 males with two abnormal SA based on the 2010 World Health Organization criteria underwent 356 IUI with controlled ovarian hyper-stimulation (COH). Their pregnancy rates were compared to 120 males who had abnormal TMSC at the time of 265 IUI with COH, in a retrospective university-based study. The two groups were comparable in female age (p = 0.11), duration of infertility (p = 0.17), previous pregnancies (p = 0.13), female basal serum FSH level (p = 0.54) and number of mature follicles on the day of ovulation trigger (p = 0.27). Despite better semen parameters on the day of IUI in the pre-treatment male factor infertility group (TMSC mean ± SD: 61 ± 30 million vs. 3.5 ± 2 million, p < 0.001), pregnancy rates were much higher in the group with low TMSC on the day of IUI (5 % vs. 17 %, p < 0.001). A patient with a recent (within 6 months) normal pre-treatment SA but low TMSC on the day of IUI likely has a reasonable chance to achieve pregnancy, and does not perform as poorly as subjects previously diagnosed with male factor infertility. More studies should be performed to confirm these findings.

  1. Point counts from clustered populations: Lessons from an experiment with Hawaiian crows

    USGS Publications Warehouse

    Hayward, G.D.; Kepler, C.B.; Scott, J.M.

    1991-01-01

    We designed an experiment to identify factors contributing most to error in counts of Hawaiian Crow or Alala (Corvus hawaiiensis) groups that are detected aurally. Seven observers failed to detect calling Alala on 197 of 361 3-min point counts on four transects extending from cages with captive Alala. A detection curve describing the relation between frequency of flock detection and distance typified the distribution expected in transect or point counts. Failure to detect calling Alala was affected most by distance, observer, and Alala calling frequency. The number of individual Alala calling was not important in detection rate. Estimates of the number of Alala calling (flock size) were biased and imprecise: average difference between number of Alala calling and number heard was 3.24 (± 0.277). Distance, observer, number of Alala calling, and Alala calling frequency all contributed to errors in estimates of group size (P < 0.0001). Multiple regression suggested that number of Alala calling contributed most to errors. These results suggest that well-designed point counts may be used to estimate the number of Alala flocks but cast doubt on attempts to estimate flock size when individuals are counted aurally.

  2. 32-channel single photon counting module for ultrasensitive detection of DNA sequences

    NASA Astrophysics Data System (ADS)

    Gudkov, Georgiy; Dhulla, Vinit; Borodin, Anatoly; Gavrilov, Dmitri; Stepukhovich, Andrey; Tsupryk, Andrey; Gorbovitski, Boris; Gorfinkel, Vera

    2006-10-01

    We continue our work on the design and implementation of multi-channel single photon detection systems for highly sensitive detection of ultra-weak fluorescence signals in high-performance, multi-lane DNA sequencing instruments. A fiberized, 32-channel single photon detection (SPD) module based on the single photon avalanche diode (SPAD) model C30902S-DTC from Perkin Elmer Optoelectronics (PKI) has been designed and implemented. The unavailability of high-performance, large-area SPAD arrays, together with our aim of designing high-performance photon counting systems, drives us to use individual diodes. Slight modifications to our quenching circuit have doubled the linear range of the system from 1 MHz to 2 MHz, the upper limit for these devices, and the maximum saturation count rate has increased to 14 MHz. The detector module comprises a single-board computer (PC-104) that enables data visualization, recording, processing, and transfer. Very low dark counts (300-1000 counts/s), robustness, efficient and simple data collection and processing, ease of connectivity to other applications with similar requirements, and performance comparable to the best commercially available single photon counting module (SPCM from PKI) are some of the features of this system.

  3. Command Decision-Making: Experience Counts

    DTIC Science & Technology

    2005-03-18

    USAWC Strategy Research Project: Command Decision-Making: Experience Counts. Author: Lieutenant Colonel Kelly A. Wolgast, United States Army (Colonel Charles...). Report date: 18 March 2005. Format: Strategy Research Project. Pages: 30.

  4. Negative Binomial Process Count and Mixture Modeling.

    PubMed

    Zhou, Mingyuan; Carin, Lawrence

    2015-02-01

    The seemingly disjoint problems of count and mixture modeling are united under the negative binomial (NB) process. A gamma process is employed to model the rate measure of a Poisson process, whose normalization provides a random probability measure for mixture modeling and whose marginalization leads to an NB process for count modeling. A draw from the NB process consists of a Poisson distributed finite number of distinct atoms, each of which is associated with a logarithmic distributed number of data samples. We reveal relationships between various count- and mixture-modeling distributions and construct a Poisson-logarithmic bivariate distribution that connects the NB and Chinese restaurant table distributions. Fundamental properties of the models are developed, and we derive efficient Bayesian inference. It is shown that with augmentation and normalization, the NB process and gamma-NB process can be reduced to the Dirichlet process and hierarchical Dirichlet process, respectively. These relationships highlight theoretical, structural, and computational advantages of the NB process. A variety of NB processes, including the beta-geometric, beta-NB, marked-beta-NB, marked-gamma-NB and zero-inflated-NB processes, with distinct sharing mechanisms, are also constructed. These models are applied to topic modeling, with connections made to existing algorithms under Poisson factor analysis. Example results show the importance of inferring both the NB dispersion and probability parameters.
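    The gamma-Poisson construction at the heart of the NB process can be checked numerically: drawing a Poisson count whose rate is itself gamma-distributed reproduces the negative binomial's mean and variance. A minimal sketch with illustrative parameter values:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    shape, scale = 3.0, 2.0      # gamma prior on the Poisson rate
    n = 200_000

    # Hierarchical draw: lambda ~ Gamma(shape, scale); x | lambda ~ Poisson(lambda)
    lam = rng.gamma(shape, scale, n)
    x = rng.poisson(lam)

    # Marginally x ~ NB(r = shape, p = scale / (1 + scale)), with
    # mean = r*p/(1-p) and variance = r*p/(1-p)^2.
    p = scale / (1.0 + scale)
    mean_nb = shape * p / (1.0 - p)          # = 6.0
    var_nb = shape * p / (1.0 - p) ** 2      # = 18.0
    ```

    The overdispersion (variance 18 against mean 6) is exactly what distinguishes the NB from a plain Poisson model and is what the dispersion parameter of the abstract controls.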

  5. Metals processing control by counting molten metal droplets

    DOEpatents

    Schlienger, Eric; Robertson, Joanna M.; Melgaard, David; Shelmidine, Gregory J.; Van Den Avyle, James A.

    2000-01-01

    Apparatus and method for controlling metals processing (e.g., ESR) by melting a metal ingot and counting molten metal droplets during melting. An approximate amount of metal in each droplet is determined, and a melt rate is computed therefrom. The impedance of the melting circuit is monitored, such as by calculating the root-mean-square voltage and current of the circuit and dividing the calculated voltage by the calculated current. The impedance signal is analyzed for a trace characteristic of the formation of a molten metal droplet, such as by examining its slew rate, curvature, or a higher moment.
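    The impedance-monitoring step can be sketched as follows. This is an illustration only: the waveform values and the droplet signature (a brief current surge as a droplet shorts part of the gap) are invented for the example, and real droplet detection would use the slew-rate/curvature analysis the patent describes rather than a simple minimum:

    ```python
    import numpy as np

    def rms(x):
        """Root mean square of a sampled waveform."""
        x = np.asarray(x, dtype=float)
        return np.sqrt(np.mean(x ** 2))

    def impedance_trace(voltage, current, window):
        """Circuit impedance per analysis window: RMS voltage divided
        by RMS current, as in the monitoring step described above."""
        n = len(voltage) // window
        z = np.empty(n)
        for k in range(n):
            sl = slice(k * window, (k + 1) * window)
            z[k] = rms(voltage[sl]) / rms(current[sl])
        return z

    # Synthetic 60 Hz melt-circuit signals, 60 kS/s for 1 s; a droplet
    # briefly raises the current, dropping the apparent impedance.
    t = np.linspace(0, 1, 60_000)
    v = 40 * np.sin(2 * np.pi * 60 * t)        # volts
    i_ = 4000 * np.sin(2 * np.pi * 60 * t)     # amperes
    i_[30_000:31_000] *= 1.5                   # invented droplet signature
    z = impedance_trace(v, i_, window=1000)    # nominal 0.01 ohm per window
    ```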

  6. The optimal on-source region size for detections with counting-type telescopes

    NASA Astrophysics Data System (ADS)

    Klepser, S.

    2017-03-01

    Source detection in counting-type experiments such as Cherenkov telescopes often involves the application of the classical Eq. (17) from the paper of Li & Ma (1983) to discrete on- and off-source regions. The on-source region is typically a circular area with radius θ in which the signal is expected to appear with the shape of the instrument point spread function (PSF). This paper addresses the question of what is the θ that maximises the probability of detection for a given PSF width and background event density. In the high count number limit and assuming a Gaussian PSF profile, the optimum is found to be at ζ∞² ≈ 2.51 times the squared PSF width σPSF². While this number is shown to be a good choice in many cases, a dynamic formula for cases of lower count numbers, which favour larger on-source regions, is given. The recipe to get to this parametrisation can also be applied to cases with a non-Gaussian PSF. This result can standardise and simplify analysis procedures, reduce trials and eliminate the need for experience-based ad hoc cut definitions or expensive case-by-case Monte Carlo simulations.
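    Eq. (17) of Li & Ma (1983) and the quoted optimum are straightforward to implement. A short sketch (the example counts in the usage note are arbitrary):

    ```python
    import math

    def li_ma_significance(n_on, n_off, alpha):
        """Eq. (17) of Li & Ma (1983): significance of n_on counts in the
        on-source region, given n_off off-source counts and on/off
        exposure ratio alpha."""
        term_on = n_on * math.log((1 + alpha) / alpha * n_on / (n_on + n_off))
        term_off = n_off * math.log((1 + alpha) * n_off / (n_on + n_off))
        return math.sqrt(2.0 * (term_on + term_off))

    def optimal_theta(sigma_psf, zeta_inf_sq=2.51):
        """High-count-limit optimal on-source radius for a Gaussian PSF:
        theta^2 = zeta_inf^2 * sigma_PSF^2, per the abstract."""
        return math.sqrt(zeta_inf_sq) * sigma_psf
    ```

    For example, `li_ma_significance(130, 400, 0.25)` evaluates 130 on-source counts against an expected background of 100 (about 2.5σ); `optimal_theta(0.1)` gives the recommended on-source radius, ≈1.58 σPSF, for a 0.1° PSF width.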

  7. You can count on the motor cortex: finger counting habits modulate motor cortex activation evoked by numbers.

    PubMed

    Tschentscher, Nadja; Hauk, Olaf; Fischer, Martin H; Pulvermüller, Friedemann

    2012-02-15

    The embodied cognition framework suggests that neural systems for perception and action are engaged during higher cognitive processes. In an event-related fMRI study, we tested this claim for the abstract domain of numerical symbol processing: is the human cortical motor system part of the representation of numbers, and is organization of numerical knowledge influenced by individual finger counting habits? Developmental studies suggest a link between numerals and finger counting habits due to the acquisition of numerical skills through finger counting in childhood. In the present study, digits 1 to 9 and the corresponding number words were presented visually to adults with different finger counting habits, i.e. left- and right-starters who reported that they usually start counting small numbers with their left and right hand, respectively. Despite the absence of overt hand movements, the hemisphere contralateral to the hand used for counting small numbers was activated when small numbers were presented. The correspondence between finger counting habits and hemispheric motor activation is consistent with an intrinsic functional link between finger counting and number processing. Copyright © 2011 Elsevier Inc. All rights reserved.

  8. Performance of 3DOSEM and MAP algorithms for reconstructing low count SPECT acquisitions.

    PubMed

    Grootjans, Willem; Meeuwis, Antoi P W; Slump, Cornelis H; de Geus-Oei, Lioe-Fee; Gotthardt, Martin; Visser, Eric P

    2016-12-01

    Low count single photon emission computed tomography (SPECT) is becoming more important in view of whole-body SPECT and reduction of radiation dose. In this study, we investigated the performance of several 3D ordered subset expectation maximization (3DOSEM) and maximum a posteriori (MAP) algorithms for reconstructing low count SPECT images. Phantom experiments were conducted using the National Electrical Manufacturers Association (NEMA) NU2 image quality (IQ) phantom. The background compartment of the phantom was filled with varying concentrations of pertechnetate and indium chloride, simulating various clinical imaging conditions. Images were acquired using a hybrid SPECT/CT scanner and reconstructed with 3DOSEM and MAP reconstruction algorithms implemented in Siemens Syngo MI.SPECT (Flash3D) and Hermes Hybrid Recon Oncology (Hybrid Recon 3DOSEM and MAP). Image analysis was performed by calculating the contrast recovery coefficient (CRC), percentage background variability (N%), and contrast-to-noise ratio (CNR), defined as the ratio between CRC and N%. Furthermore, image distortion was characterized by calculating the aspect ratio (AR) of ellipses fitted to the hot spheres. Additionally, the performance of these algorithms in reconstructing clinical images was investigated. Images reconstructed with 3DOSEM algorithms demonstrated superior image quality in terms of contrast and resolution recovery when compared to images reconstructed with filtered back-projection (FBP), OSEM, and 2DOSEM. However, the occurrence of correlated noise patterns and image distortions significantly deteriorated the quality of 3DOSEM reconstructed images. The mean AR for the 37, 28, 22, and 17 mm spheres was 1.3, 1.3, 1.6, and 1.7, respectively. The mean N% increased in high and low count Flash3D and Hybrid Recon 3DOSEM from 5.9% and 4.0% to 11.1% and 9.0%, respectively. Similarly, the mean CNR decreased in high and low count Flash3D and Hybrid Recon 3DOSEM from 8.7 and 8.8 to 3.6 and 4.2, respectively.
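    The figures of merit above can be sketched in a few lines. The CRC and N% formulas below follow one common NEMA-style convention (the paper may differ in ROI details), and the ROI values are invented for illustration:

    ```python
    import numpy as np

    def contrast_recovery(hot_mean, bg_mean, true_ratio):
        """Contrast recovery coefficient for a hot sphere (one common
        NEMA-style convention): measured contrast over true contrast."""
        return (hot_mean / bg_mean - 1.0) / (true_ratio - 1.0)

    def background_variability(bg_roi_means):
        """Percentage background variability N%: spread of background
        ROI means relative to their average."""
        bg = np.asarray(bg_roi_means, dtype=float)
        return 100.0 * bg.std(ddof=1) / bg.mean()

    # Invented ROI means: one hot-sphere ROI and five background ROIs,
    # with a true sphere-to-background activity ratio of 4:1.
    hot, bg_rois = 3.2, [1.00, 0.95, 1.05, 0.98, 1.02]
    crc = contrast_recovery(hot, np.mean(bg_rois), true_ratio=4.0)
    n_pct = background_variability(bg_rois)
    cnr = crc / n_pct    # CNR as defined in the abstract: CRC / N%
    ```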

  9. Sensitivity analysis of pulse pileup model parameter in photon counting detectors

    NASA Astrophysics Data System (ADS)

    Shunhavanich, Picha; Pelc, Norbert J.

    2017-03-01

    Photon counting detectors (PCDs) may provide several benefits over energy-integrating detectors (EIDs), including spectral information for tissue characterization and the elimination of electronic noise. PCDs, however, suffer from pulse pileup, which distorts the detected spectrum and degrades the accuracy of material decomposition. Several analytical models have been proposed to address this problem. The performance of these models is dependent on the assumptions used, including the estimated pulse shape, whose parameter values could differ from the actual physical ones. As the incident flux increases and the corrections become more significant, the accuracy of the parameter values becomes more crucial. In this work, the sensitivity to model parameter accuracy is analyzed for the pileup model of Taguchi et al. The spectra distorted by pileup at different count rates are simulated using either the model or Monte Carlo simulations, and the basis material thicknesses are estimated by minimizing the negative log-likelihood with Poisson or multivariate Gaussian distributions. From the simulation results, we find that the accuracy of the deadtime, the height of the pulse's negative tail, and the timing of the end of the pulse are more important than most other parameters, and they matter more with increasing count rate. This result can help facilitate further work on parameter calibration.
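    The flux dependence of pileup losses can be illustrated with the two textbook deadtime models. These are generic models, not the specific pileup model of Taguchi et al. analyzed in the paper, but they show why corrections (and hence parameter accuracy) matter more at high count rate:

    ```python
    import numpy as np

    def nonparalyzable(true_rate, tau):
        """Recorded rate for a nonparalyzable detector with deadtime tau:
        counts arriving during the deadtime are simply lost."""
        return true_rate / (1.0 + true_rate * tau)

    def paralyzable(true_rate, tau):
        """Recorded rate for a paralyzable detector: each photon restarts
        the deadtime, so the output rate rolls over at high flux."""
        return true_rate * np.exp(-true_rate * tau)

    tau = 50e-9                           # 50 ns deadtime (illustrative)
    rates = np.array([1e5, 1e6, 1e7])     # true counts/s per pixel
    loss_np = 1 - nonparalyzable(rates, tau) / rates
    loss_p = 1 - paralyzable(rates, tau) / rates
    ```

    At 10⁷ counts/s with a 50 ns deadtime, the nonparalyzable model already loses a third of the counts, so a small error in the assumed tau translates into a large spectral distortion, consistent with the paper's finding that the deadtime is among the most sensitive parameters.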

  10. Compensated gadolinium-loaded plastic scintillators for thermal neutron detection (and counting)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dumazert, Jonathan; Coulon, Romain; Bertrand, Guillaume H. V.

    2015-07-01

    Plastic scintillator loading with gadolinium-rich organometallic complexes shows a high potential for the deployment of efficient and cost-effective neutron detectors. Due to the low-energy photon and electron signature of thermal neutron capture by gadolinium-155 and gadolinium-157, an alternative treatment to Pulse Shape Discrimination has to be proposed in order to display a trustable count rate. This paper discloses the principle of a compensation method applied to a two-scintillator system: a detection scintillator interacts with photon radiation and is loaded with a gadolinium organometallic compound to become a thermal neutron absorber, while a non-gadolinium-loaded compensation scintillator interacts solely with the photon part of the incident radiation. Following nonlinear smoothing of the counting signals, a hypothesis test determines whether the resulting count rate after photon response compensation falls into statistical fluctuations or provides a robust image of a neutron activity. A laboratory prototype is tested under both photon and neutron irradiations, allowing us to investigate the performance of the overall compensation system in terms of neutron detection, especially with regard to a commercial helium-3 counter. The study reveals satisfactory results in terms of sensitivity and orients future investigation toward promising axes. (authors)
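    The compensation-plus-hypothesis-test idea can be sketched as a simple Poisson difference test. This is a toy version assuming equal photon response in the two channels and omitting the nonlinear smoothing the paper applies; the threshold constant is an illustrative choice, not the paper's:

    ```python
    import math

    def compensated_detection(n_det, n_comp, t, k=3.29):
        """Two-channel difference test: n_det counts from the Gd-loaded
        detection scintillator, n_comp from the compensation scintillator
        (assumed equal photon response), over live time t seconds.
        Returns (net rate, alarm): the alarm flags a neutron signal when
        the net count exceeds k standard deviations of the summed Poisson
        fluctuations (k = 3.29, roughly a 99.9% one-sided level)."""
        net = n_det - n_comp
        sigma = math.sqrt(n_det + n_comp)   # Poisson variances add
        return net / t, net > k * sigma

    # Photon-only field: both channels see nearly the same counts.
    rate, alarm = compensated_detection(10_120, 10_050, t=10.0)
    # Neutron field: captures add counts to the detection channel only.
    rate_n, alarm_n = compensated_detection(11_500, 10_050, t=10.0)
    ```

    In the first case the 70-count excess sits well inside the ~±140-count fluctuation band, so no neutron activity is declared; in the second, the excess clears the threshold.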

  11. A study of cellular counting to determine minimum thresholds for adequacy for liquid-based cervical cytology using a survey and counting protocol.

    PubMed

    Kitchener, Henry C; Gittins, Matthew; Desai, Mina; Smith, John H F; Cook, Gary; Roberts, Chris; Turnbull, Lesley

    2015-03-01

    progressively diluted cervical samples. Laboratory practice varied in terms of threshold of cellular adequacy and of morphological markers of adequacy. While SP laboratories generally used a minimum acceptable cell count (MACC) of 15,000, the MACC employed by TP laboratories varied between 5000 and 15,000. The cell counting study showed that a standard protocol achieved moderate to strong inter-rater reproducibility. Analysis of slide reporting from laboratories revealed that a large proportion of the samples reported as inadequate had cell counts above a threshold of 15,000 for SP, and 5000 and 10,000 for TP. Inter-rater unanimity was greater among more cellular preparations. Dilution studies demonstrated greater detection of abnormalities in slides with counts above the MACC and among slides with more than 25 dyskaryotic cells. Variation in laboratory practice demonstrates a requirement for evidence-based standards for designating a MACC. This study has indicated that a MACC of 15,000 and 5000 for SP and TP, respectively, achieves a balance in terms of maintaining sensitivity and low inadequacy rates. The findings of this study should inform the development of laboratory practice guidelines. The National Institute for Health Research Health Technology Assessment programme.

  12. The prevalence of abnormal leukocyte count, and its predisposing factors, in patients with sickle cell disease in Saudi Arabia.

    PubMed

    Ahmed, Anwar E; Ali, Yosra Z; Al-Suliman, Ahmad M; Albagshi, Jafar M; Al Salamah, Majid; Elsayid, Mohieldin; Alanazi, Wala R; Ahmed, Rayan A; McClish, Donna K; Al-Jahdali, Hamdan

    2017-01-01

    High white blood cell (WBC) count is an indicator of sickle cell disease (SCD) severity; however, there are limited studies on WBC counts in Saudi Arabian patients with SCD. The aim of this study was to estimate the prevalence of abnormal leukocyte count (either low or high) and identify factors associated with high WBC counts in a sample of Saudi patients with SCD. A cross-sectional and retrospective chart review study was carried out on 290 SCD patients who were routinely treated at King Fahad Hospital in Hofuf, Saudi Arabia. An interview was conducted to assess clinical presentations, and we reviewed patient charts to collect data on blood test parameters for the previous 6 months. Almost half (131 [45.2%]) of the sample had abnormal leukocyte counts: low WBC counts in 15 (5.2%) and high in 116 (40%). High WBC counts were associated with shortness of breath (P=0.022), tiredness (P=0.039), swelling in hands/feet (P=0.020), and back pain (P=0.007). The mean hemoglobin was higher in patients with normal WBC counts (P=0.024), while the mean hemoglobin S was high in patients with high WBC counts (P=0.003). After adjustment for potential confounders, predictors of high WBC counts were male gender (adjusted odds ratio [aOR]=3.63), cough (aOR=2.18), low hemoglobin (aOR=0.76), and low heart rate (aOR=0.97). Abnormal leukocyte count was common: approximately five in ten Saudi SCD patients assessed in this sample. Male gender, cough, low hemoglobin, and low heart rate were associated with high WBC count. Strategies targeting high WBC count could prevent disease complications and thus could be beneficial for SCD patients.

  13. Dying dyons don't count

    NASA Astrophysics Data System (ADS)

    Cheng, Miranda C. N.; Verlinde, Erik P.

    2007-09-01

The dyonic 1/4-BPS states in 4D string theory with N = 4 spacetime supersymmetry are counted by a Siegel modular form. The pole structure of the modular form leads to a contour dependence in the counting formula, obscuring its duality invariance. We exhibit the relation between this ambiguity and the (dis-)appearance of bound states of 1/2-BPS configurations. Using this insight we propose a precise moduli-dependent contour prescription for the counting formula. We then show that the degeneracies are duality-invariant and are correctly adjusted at the walls of marginal stability to account for the (dis-)appearance of the two-centered bound states. In particular, for large black holes none of these bound states exists at the attractor point and none of these ambiguous poles contributes to the counting formula. Using this fact we also propose a second, moduli-independent contour which counts the "immortal dyons" that are stable everywhere.

  14. EFFECT OF HAIR COLOR AND SUN SENSITIVITY ON NEVUS COUNTS IN WHITE CHILDREN IN COLORADO

    PubMed Central

    Aalborg, Jenny; Morelli, Joseph G.; Byers, Tim E.; Mokrohisky, Stefan T.; Crane, Lori A.

    2013-01-01

    BACKGROUND It has been widely reported that individuals with a light phenotype (i.e., light hair color, light base skin color, and propensity to burn) have more nevi and are at greater risk for developing skin cancer. No studies have systematically investigated how phenotypic traits may interact in relation to nevus development. OBJECTIVE We sought to systematically examine whether any combinations of phenotype are associated with a greater or lesser risk for nevus development in white children. METHODS In the summer of 2007, 654 children were examined to determine full body nevus counts, skin color by colorimetry, and hair and eye color by comparison to charts. Interviews of parents were conducted to capture sun sensitivity, sun exposure and sun protection practices. RESULTS Among 9-year-old children with sun sensitivity rating type 2 (painful burn/light tan), those with light hair had lower nevus counts than did those with dark hair (p-value for interaction = 0.03). This relationship was independent of eye color, presence of freckling, gender, usual daily sun exposure, sunburn in 2004–2007, sun protection index and waterside vacation sun exposure. The difference in nevus counts was further determined to be specific to small nevi (less than 2 mm) and nevi in intermittently exposed body sites. LIMITATIONS Geographic and genetic differences in other study populations may produce different results. CONCLUSION The standard acceptance that dark phenotype is a marker for low melanoma risk and light phenotype a marker for high risk may need to be reevaluated. In non-Hispanic white children, dark haired individuals who burn readily and then tan slightly are more prone to nevus development, and may therefore be a previously under-recognized high risk group for melanoma. PMID:20584558

  15. Gamma-ray spectroscopy at MHz counting rates with a compact LaBr3 detector and silicon photomultipliers for fusion plasma applications.

    PubMed

    Nocente, M; Rigamonti, D; Perseo, V; Tardocchi, M; Boltruczyk, G; Broslawski, A; Cremona, A; Croci, G; Gosk, M; Kiptily, V; Korolczuk, S; Mazzocco, M; Muraro, A; Strano, E; Zychor, I; Gorini, G

    2016-11-01

Gamma-ray spectroscopy measurements at MHz counting rates have been carried out, for the first time, with a compact spectrometer based on a LaBr3 scintillator and silicon photomultipliers. The instrument, which is also insensitive to magnetic fields, has been developed in view of the upgrade of the gamma-ray camera diagnostic for α particle measurements in deuterium-tritium plasmas of the Joint European Torus. Spectra were measured up to 2.9 MHz with a projected energy resolution of 3%-4% in the 3-5 MeV range, of interest for fast ion physics studies in fusion plasmas. The results reported here pave the way to first time measurements of the confined α particle profile in high power plasmas of the next deuterium-tritium campaign at the Joint European Torus.
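The abstract does not state how the spectrometer handles dead time, but the throughput penalty that motivates MHz-capable instruments can be sketched with the two textbook dead-time models; the 50 ns dead time below is a hypothetical illustration, not a value from the paper:

```python
import math

def observed_rate_nonparalyzable(true_rate_hz: float, dead_time_s: float) -> float:
    """Recorded rate for a non-paralyzable detector: m = n / (1 + n * tau)."""
    return true_rate_hz / (1.0 + true_rate_hz * dead_time_s)

def observed_rate_paralyzable(true_rate_hz: float, dead_time_s: float) -> float:
    """Recorded rate for a paralyzable detector: m = n * exp(-n * tau)."""
    return true_rate_hz * math.exp(-true_rate_hz * dead_time_s)

# At a 2.9 MHz true rate with a hypothetical 50 ns dead time,
# roughly 13% of events are lost in the non-paralyzable case.
n, tau = 2.9e6, 50e-9
loss = 1.0 - observed_rate_nonparalyzable(n, tau) / n
```

Either model shows why count-rate capability, not just energy resolution, is the limiting specification for fusion-plasma spectroscopy.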

  16. Mealtime Insulin Dosing by Carbohydrate Counting in Hospitalized Cardiology Patients: A Retrospective Cohort Study.

    PubMed

    Thurber, Kristina M; Dierkhising, Ross A; Reiland, Sarah A; Pearson, Kristina K; Smith, Steven A; O'Meara, John G

    2016-01-01

    Carbohydrate counting may improve glycemic control in hospitalized cardiology patients by providing individualized insulin doses tailored to meal consumption. The purpose of this study was to compare glycemic outcomes with mealtime insulin dosed by carbohydrate counting versus fixed dosing in the inpatient setting. This single-center retrospective cohort study included 225 adult medical cardiology patients who received mealtime, basal, and correction-scale insulin concurrently for at least 72 h and up to 7 days in the interval March 1, 2010-November 7, 2013. Mealtime insulin was dosed by carbohydrate counting or with fixed doses determined prior to meal intake. An inpatient diabetes consult service was responsible for insulin management. Exclusion criteria included receipt of an insulin infusion. The primary end point compared mean daily postprandial glucose values, whereas secondary end points included comparison of preprandial glucose values and mean daily rates of hypoglycemia. Mean postprandial glucose level on Day 7 was 204 and 183 mg/dL in the carbohydrate counting and fixed mealtime dose groups, respectively (unadjusted P=0.04, adjusted P=0.12). There were no statistical differences between groups on Days 2-6. Greater rates of preprandial hypoglycemia were observed in the carbohydrate counting cohort on Day 5 (8.6% vs. 1.5%, P=0.02), Day 6 (1.7% vs. 0%, P=0.01), and Day 7 (7.1% vs. 0%, P=0.008). No differences in postprandial hypoglycemia were seen. Mealtime insulin dosing by carbohydrate counting was associated with similar glycemic outcomes as fixed mealtime insulin dosing, except for a greater incidence of preprandial hypoglycemia. Additional comparative studies that include hospital outcomes are needed.
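The paper does not publish its dosing parameters, but the arithmetic behind carbohydrate counting is the standard insulin-to-carbohydrate ratio; the 15 g/unit ratio below is purely illustrative:

```python
def mealtime_insulin_units(carbs_g: float, carb_ratio_g_per_unit: float) -> float:
    """Carbohydrate-counting dose: grams of carbohydrate in the meal divided by the
    patient-specific insulin-to-carbohydrate ratio (grams covered by 1 unit)."""
    return carbs_g / carb_ratio_g_per_unit

# A 60 g-carbohydrate meal at an illustrative ratio of 15 g/unit:
dose = mealtime_insulin_units(60.0, 15.0)  # 4.0 units
```

Fixed dosing, by contrast, delivers the same mealtime dose regardless of intake, which is why the two strategies can diverge when meal consumption varies.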

  17. Experimental analysis of the auditory detection process on avian point counts

    USGS Publications Warehouse

    Simons, T.R.; Alldredge, M.W.; Pollock, K.H.; Wettroth, J.M.

    2007-01-01

    We have developed a system for simulating the conditions of avian surveys in which birds are identified by sound. The system uses a laptop computer to control a set of amplified MP3 players placed at known locations around a survey point. The system can realistically simulate a known population of songbirds under a range of factors that affect detection probabilities. The goals of our research are to describe the sources and range of variability affecting point-count estimates and to find applications of sampling theory and methodologies that produce practical improvements in the quality of bird-census data. Initial experiments in an open field showed that, on average, observers tend to undercount birds on unlimited-radius counts, though the proportion of birds counted by individual observers ranged from 81% to 132% of the actual total. In contrast to the unlimited-radius counts, when data were truncated at a 50-m radius around the point, observers overestimated the total population by 17% to 122%. Results also illustrate how detection distances decline and identification errors increase with increasing levels of ambient noise. Overall, the proportion of birds heard by observers decreased by 28 ± 4.7% under breezy conditions, 41 ± 5.2% with the presence of additional background birds, and 42 ± 3.4% with the addition of 10 dB of white noise. These findings illustrate some of the inherent difficulties in interpreting avian abundance estimates based on auditory detections, and why estimates that do not account for variations in detection probability will not withstand critical scrutiny.

  18. Robust Data Detection for the Photon-Counting Free-Space Optical System With Implicit CSI Acquisition and Background Radiation Compensation

    NASA Astrophysics Data System (ADS)

    Song, Tianyu; Kam, Pooi-Yuen

    2016-02-01

Since atmospheric turbulence and pointing errors cause signal intensity fluctuations, and the background radiation surrounding the free-space optical (FSO) receiver contributes an undesired noisy component, the receiver requires accurate channel state information (CSI) and background information to adjust the detection threshold. Most previous studies employed pilot symbols for CSI acquisition, which reduces spectral and energy efficiency, and made the impractical assumption that the background radiation component is perfectly known. In this paper, we develop an efficient and robust sequence receiver that acquires the CSI and the background information implicitly and requires no knowledge of the channel model. It is robust because it automatically estimates the CSI and background component and detects the data sequence accordingly. Its decision metric has a simple form and involves no integrals, and thus can be easily evaluated. A Viterbi-type trellis-search algorithm is adopted to improve the search efficiency, and a selective-store strategy is adopted to overcome a potential error-floor problem as well as to increase memory efficiency. To further simplify the receiver, a decision-feedback symbol-by-symbol receiver is proposed as an approximation of the sequence receiver. By simulations and theoretical analysis, we show that the performance of both the sequence receiver and the symbol-by-symbol receiver approaches that of detection with perfect knowledge of the CSI and background radiation as the length of the window for forming the decision metric increases.
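For intuition on why the detection threshold depends on both the CSI and the background rate, consider the classical fixed-threshold rule for on-off keying with Poisson photon counts; this baseline assumes both rates are known, which is exactly the assumption the paper's receiver removes, and the rates used below are invented:

```python
import math

def ml_threshold(lam_signal: float, lam_background: float) -> float:
    """Maximum-likelihood count threshold for Poisson on-off keying.
    'On' slots have mean lam_signal + lam_background, 'off' slots lam_background;
    the likelihood-ratio test decides 'on' when the count exceeds this value."""
    lam_on = lam_signal + lam_background
    lam_off = lam_background
    return (lam_on - lam_off) / math.log(lam_on / lam_off)

# When turbulence fades the signal (smaller lam_signal), the threshold must move down,
# so a receiver without CSI cannot keep using a fixed threshold:
t_strong = ml_threshold(20.0, 5.0)
t_faded = ml_threshold(8.0, 5.0)
```

The threshold always lies between the off and on means, and it shifts with either rate, which is why both the CSI and the background estimate feed into the decision metric.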

  19. Characterization of a hybrid energy-resolving photon-counting detector

    NASA Astrophysics Data System (ADS)

    Zang, A.; Pelzer, G.; Anton, G.; Ballabriga Sune, R.; Bisello, F.; Campbell, M.; Fauler, A.; Fiederle, M.; Llopart Cudie, X.; Ritter, I.; Tennert, F.; Wölfel, S.; Wong, W. S.; Michel, T.

    2014-03-01

Photon-counting detectors in medical x-ray imaging provide a higher dose efficiency than integrating detectors. Even further possibilities for imaging applications arise if the energy of each counted photon is measured, for example K-edge imaging or optimizing image quality by applying energy weighting factors. In this contribution, we show results of the characterization of the Dosepix detector. This hybrid photon-counting pixel detector allows energy-resolved measurements with a novel concept of energy binning included in the pixel electronics. Based on ideas of the Medipix detector family, it provides three different modes of operation: an integration mode, a photon-counting mode, and an energy-binning mode. In energy-binning mode, it is possible to set 16 energy thresholds in each pixel individually to derive a binned energy spectrum in every pixel in one acquisition. The hybrid setup allows using different sensor materials. For the measurements, 300 μm Si and 1 mm CdTe were used. The detector matrix consists of 16 x 16 square pixels for CdTe (16 x 12 for Si) with a pixel pitch of 220 μm. The Dosepix was originally intended for applications in the field of radiation measurement and is therefore not optimized for medical imaging, but the detector concept itself shows promise as an imaging detector. We present spectra measured in one single pixel as well as in the whole pixel matrix in energy-binning mode with a conventional x-ray tube. In addition, results concerning the count rate linearity for the different sensor materials are shown, as well as measurements regarding energy resolution.

  20. SPERM COUNT DISTRIBUTIONS IN FERTILE MEN

    EPA Science Inventory

    Sperm concentration and count are often used as indicators of environmental impacts on male reproductive health. Existing clinical databases may be biased towards subfertile men with low sperm counts and less is known about expected sperm count distributions in cohorts of fertil...

  1. A quantile count model of water depth constraints on Cape Sable seaside sparrows

    USGS Publications Warehouse

    Cade, B.S.; Dong, Q.

    2008-01-01

1. A quantile regression model for counts of breeding Cape Sable seaside sparrows Ammodramus maritimus mirabilis (L.) as a function of water depth and previous year abundance was developed based on extensive surveys, 1992-2005, in the Florida Everglades. The quantile count model extends linear quantile regression methods to discrete response variables, providing a flexible alternative to discrete parametric distributional models, e.g. Poisson, negative binomial and their zero-inflated counterparts. 2. Estimates from our multiplicative model demonstrated that negative effects of increasing water depth in breeding habitat on sparrow numbers were dependent on recent occupation history. Upper 10th percentiles of counts (one to three sparrows) decreased with increasing water depth from 0 to 30 cm when sites were not occupied in previous years. However, upper 40th percentiles of counts (one to six sparrows) decreased with increasing water depth for sites occupied in previous years. 3. Greatest decreases (-50% to -83%) in upper quantiles of sparrow counts occurred as water depths increased from 0 to 15 cm when previous year counts were 1, but a small proportion of sites (5-10%) held at least one sparrow even as water depths increased to 20 or 30 cm. 4. A zero-inflated Poisson regression model provided estimates of conditional means that also decreased with increasing water depth, but rates of change were lower and decreased with increasing previous year counts compared with the quantile count model. Quantiles computed for the zero-inflated Poisson model enhanced interpretation of this model but showed greater lack of fit for water depths > 0 cm and previous year counts of 1, conditions where the negative effect of water depth was readily apparent and was fitted better by the quantile count model.
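The paper fits regression quantiles to counts; as a much simpler illustration of why upper quantiles (rather than means) capture the depth effect, one can compare empirical upper percentiles of hypothetical count samples from shallow and deep sites (all numbers below are made up):

```python
def empirical_quantile(values, q):
    """Empirical quantile with linear interpolation between order statistics."""
    s = sorted(values)
    if len(s) == 1:
        return s[0]
    pos = q * (len(s) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(s) - 1)
    frac = pos - lo
    return s[lo] * (1 - frac) + s[hi] * frac

# Hypothetical sparrow counts at shallow vs deeper sites:
shallow = [0, 0, 1, 1, 2, 3, 6]
deep = [0, 0, 0, 0, 1, 1, 2]
upper_shallow = empirical_quantile(shallow, 0.9)  # 4.2
upper_deep = empirical_quantile(deep, 0.9)        # 1.4
```

The means of these zero-heavy samples differ far less than their upper quantiles do. The actual quantile count model estimates such quantiles by regression on covariates rather than from raw samples; this sketch only shows the quantile comparison itself.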

  2. The complete blood count and reticulocyte count--are they necessary in the evaluation of acute vasoocclusive sickle-cell crisis?

    PubMed

    Lopez, B L; Griswold, S K; Navek, A; Urbanski, L

    1996-08-01

To assess the usefulness of the complete blood count (CBC) and the reticulocyte count in the evaluation of adult patients with acute vasoocclusive sickle-cell crisis (SCC) presenting to the ED. A 2-part study was performed. Part 1 was a retrospective chart review of patients with a sole ED diagnosis of acute SCC. Part 2 was a prospective evaluation of consecutive patients presenting in SCC. In both parts of the study, patients with coexisting acute disease were excluded. The remaining patients were divided into 2 groups: admitted and released. The mean values for white blood cell (WBC) count, hemoglobin (Hb) level, and reticulocyte count were compared. In Part 2, the change (delta) from the patient's baseline in WBC count, Hb level, and reticulocyte count also was determined. Data were analyzed by 2-tailed Student's t-test. Part 1: There was no difference between the admitted (n = 33) and the released (n = 86) groups in mean WBC count (p = 0.10), Hb level (p = 0.25), or reticulocyte count (p = 0.08). Part 2: There was no difference between the admitted (n = 44) and the released (n = 160) groups in mean Hb level (p = 0.88), reticulocyte count (p = 0.47), delta Hb level (p = 0.88), or delta reticulocyte count (p = 0.76). There was a difference in mean WBC counts (15.8 +/- 4.9 x 10(9)/L admitted vs 12.8 +/- 4.9 x 10(9)/L released, p = 0.003) and delta WBC counts (5.1 +/- 4.6 x 10(9)/L admitted vs 1.8 +/- 4.6 x 10(9)/L released, p < 0.002). Determination of the Hb level and the reticulocyte count does not appear useful in the evaluation of acute SCC in the ED. Admission decisions appear to be associated with elevations in the WBC count. Further study is required to determine the true value of the WBC count in such decisions.
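The group comparisons above use a two-tailed Student's t-test; a minimal pooled-variance version of that statistic, with made-up WBC-like values, looks like this:

```python
import math
from statistics import mean, stdev

def pooled_t_statistic(a, b):
    """Two-sample Student's t statistic with pooled variance; a two-tailed
    test compares |t| against the t distribution with len(a)+len(b)-2 df."""
    na, nb = len(a), len(b)
    sp2 = ((na - 1) * stdev(a) ** 2 + (nb - 1) * stdev(b) ** 2) / (na + nb - 2)
    return (mean(a) - mean(b)) / math.sqrt(sp2 * (1.0 / na + 1.0 / nb))

# Hypothetical WBC counts (x 10^9/L) for admitted vs released patients:
t = pooled_t_statistic([15.0, 16.0, 17.0], [12.0, 13.0, 14.0])  # ≈ 3.67
```

In practice one would obtain the p-value from the t distribution (e.g. via a statistics library); the statistic itself is the part the abstract's comparisons rest on.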

  3. Duffy-Null–Associated Low Neutrophil Counts Influence HIV-1 Susceptibility in High-Risk South African Black Women

    PubMed Central

    Ramsuran, Veron; Kulkarni, Hemant; He, Weijing; Mlisana, Koleka; Wright, Edwina J.; Werner, Lise; Castiblanco, John; Dhanda, Rahul; Le, Tuan; Dolan, Matthew J.; Guan, Weihua; Weiss, Robin A.; Clark, Robert A.; Abdool Karim, Salim S.; Ndung'u, Thumbi

    2011-01-01

Background. The Duffy-null trait and ethnic neutropenia are both highly prevalent in Africa. The influence of pre-seroconversion levels of peripheral blood cell counts (PBCs) on the risk of acquiring human immunodeficiency virus (HIV)–1 infection among Africans is unknown. Methods. The triangular relationship among pre-seroconversion PBC counts, host genotypes, and risk of HIV acquisition was determined in a prospective cohort of black South African high-risk female sex workers. Twenty-seven women had seroconversion during follow-up, and 115 remained HIV negative for 2 years, despite engaging in high-risk activity. Results. Pre-seroconversion neutrophil counts in women who subsequently had seroconversion were significantly lower, whereas platelet counts were higher, compared with those who remained HIV negative. Comprising 27% of the cohort, subjects with pre-seroconversion neutrophil counts of <2500 cells/mm3 had a ∼3-fold greater risk of acquiring HIV infection. In a genome-wide association analysis, an African-specific polymorphism (rs2814778) in the promoter of Duffy Antigen Receptor for Chemokines (DARC −46T > C) was significantly associated with neutrophil counts (P = 7.9 × 10−11). DARC −46C/C results in loss of DARC expression on erythrocytes (Duffy-null) and resistance to Plasmodium vivax malaria, and in our cohort, only subjects with this genotype had pre-seroconversion neutrophil counts of <2500 cells/mm3. The risk of acquiring HIV infection was ∼3-fold greater in those with the trait of Duffy-null–associated low neutrophil counts, compared with all other study participants. Conclusions. Pre-seroconversion neutrophil and platelet counts influence risk of HIV infection. The trait of Duffy-null–associated low neutrophil counts influences HIV susceptibility. Because of the high prevalence of this trait among persons of African ancestry, it may contribute to the dynamics of the HIV epidemic in Africa. PMID:21507922

  4. [Analysis on 2011 quality control results on aerobic plate count of microbiology laboratories in China].

    PubMed

Han, Haihong; Li, Ning; Li, Yepeng; Fu, Ping; Yu, Dongmin; Li, Zhigang; Du, Chunming; Guo, Yunchang

    2015-01-01

To test the aerobic plate count capability of microbiology laboratories, to ensure the accuracy and comparability of quantitative bacterial examination results, and to improve the quality of monitoring. Aerobic plate count piece samples at 4 different concentrations were prepared and labeled I, II, III, and IV. After homogeneity and stability tests, the samples were delivered to the monitoring institutions. The results for samples I, II, and III were log-transformed and evaluated with the Z-score method using the robust average and standard deviation. The results for sample IV were evaluated as "satisfactory" when reported as < 10 CFU/piece or as "not satisfactory" otherwise. The Pearson χ2 test was used to compare rates. 309 monitoring institutions (99.04% of the total) reported their results. 271 institutions reported a satisfactory result, a satisfactory rate of 87.70%. There was no statistical difference among the satisfactory rates for samples I, II, and III, which were 81.52%, 88.30%, and 91.40%, respectively. The satisfactory rate for sample IV was 93.33%. There was no statistical difference in satisfactory rates between provincial and municipal CDCs. The quality control program provided scientific evidence that the aerobic plate count capability of the laboratories meets the requirements of the monitoring tasks.
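The Z-score evaluation can be sketched as follows. The abstract does not specify which robust estimators the program used, so this sketch substitutes the common median/MAD pair, and the log counts are invented:

```python
from statistics import median

def robust_z_scores(log_counts):
    """Z-scores using a robust location (median) and scale (1.4826 * MAD,
    which matches the standard deviation for normally distributed data)."""
    med = median(log_counts)
    mad = median(abs(x - med) for x in log_counts)
    scale = 1.4826 * mad
    return [(x - med) / scale for x in log_counts]

def is_satisfactory(z, limit=2.0):
    """Common proficiency-testing convention: |z| <= 2 is satisfactory."""
    return abs(z) <= limit

# Hypothetical log10 CFU results from five laboratories; the last is an outlier:
zs = robust_z_scores([5.0, 5.1, 4.9, 5.05, 6.0])
```

Robust estimators keep a single outlying laboratory from inflating the scale and masking its own deviation, which is the point of using them for proficiency scoring.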

  5. Effect of font size, italics, and colour count on web usability.

    PubMed

    Bhatia, Sanjiv K; Samal, Ashok; Rajan, Nithin; Kiviniemi, Marc T

    2011-04-01

    Web usability measures the ease of use of a website. This study attempts to find the effect of three factors - font size, italics, and colour count - on web usability. The study was performed using a set of tasks and developing a survey questionnaire. We performed the study using a set of human subjects, selected from the undergraduate students taking courses in psychology. The data computed from the tasks and survey questionnaire were statistically analysed to find if there was any effect of font size, italics, and colour count on the three web usability dimensions. We found that for the student population considered, there was no significant effect of font size on usability. However, the manipulation of italics and colour count did influence some aspects of usability. The subjects performed better for pages with no italics and high italics compared to moderate italics. The subjects rated the pages that contained only one colour higher than the web pages with four or six colours. This research will help web developers better understand the effect of font size, italics, and colour count on web usability in general, and for young adults, in particular.

  6. Effect of font size, italics, and colour count on web usability

    PubMed Central

    Samal, Ashok; Rajan, Nithin; Kiviniemi, Marc T.

    2013-01-01

    Web usability measures the ease of use of a website. This study attempts to find the effect of three factors – font size, italics, and colour count – on web usability. The study was performed using a set of tasks and developing a survey questionnaire. We performed the study using a set of human subjects, selected from the undergraduate students taking courses in psychology. The data computed from the tasks and survey questionnaire were statistically analysed to find if there was any effect of font size, italics, and colour count on the three web usability dimensions. We found that for the student population considered, there was no significant effect of font size on usability. However, the manipulation of italics and colour count did influence some aspects of usability. The subjects performed better for pages with no italics and high italics compared to moderate italics. The subjects rated the pages that contained only one colour higher than the web pages with four or six colours. This research will help web developers better understand the effect of font size, italics, and colour count on web usability in general, and for young adults, in particular. PMID:24358055

  7. Reticulocyte count

    MedlinePlus

Anemia - reticulocyte ... A higher than normal reticulocyte count may indicate: Anemia due to red blood cells being destroyed earlier than normal ( hemolytic anemia ) Bleeding Blood disorder in a fetus or newborn ( ...

  8. Effects of the frame acquisition rate on the sensitivity of gastro-oesophageal reflux scintigraphy

    PubMed Central

    Codreanu, I; Chamroonrat, W; Edwards, K

    2013-01-01

    Objective: To compare the sensitivity of gastro-oesophageal reflux (GOR) scintigraphy at 5-s and 60-s frame acquisition rates. Methods: GOR scintigraphy of 50 subjects (1 month–20 years old, mean 42 months) were analysed concurrently using 5-s and 60-s acquisition frames. Reflux episodes were graded as low if activity was detected in the distal half of the oesophagus and high if activity was detected in its upper half or in the oral cavity. For comparison purposes, detected GOR in any number of 5-s frames corresponding to one 60-s frame was counted as one episode. Results: A total of 679 episodes of GOR to the upper oesophagus were counted using a 5-s acquisition technique. Only 183 of such episodes were detected on 60-s acquisition images. To the lower oesophagus, a total of 1749 GOR episodes were detected using a 5-s acquisition technique and only 1045 episodes using 60-s acquisition frames (these also included the high-level GOR on 5-s frames counted as low level on 60-s acquisition frames). 10 patients had high-level GOR episodes that were detected only using a 5-s acquisition technique, leading to a different diagnosis in these patients. No correlation between the number of reflux episodes and the gastric emptying rates was noted. Conclusion: The 5-s frame acquisition technique is more sensitive than the 60-s frame acquisition technique for detecting both high- and low-level GOR. Advances in knowledge: Brief GOR episodes with a relatively low number of radioactive counts are frequently indistinguishable from intense background activity on 60-s acquisition frames. PMID:23520226
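The binning rule described above (any detections within one 60-s frame count as a single episode) can be sketched directly; the detection sequence below is invented for illustration:

```python
def episodes_5s(detections):
    """Episodes at 5-s resolution: each run of consecutive detected frames counts once."""
    return sum(1 for i, d in enumerate(detections)
               if d and (i == 0 or not detections[i - 1]))

def episodes_60s(detections, frames_per_bin=12):
    """Each 60-s frame (12 consecutive 5-s frames) with any detection counts as one episode."""
    bins = [detections[i:i + frames_per_bin]
            for i in range(0, len(detections), frames_per_bin)]
    return sum(1 for b in bins if any(b))

# Three brief, separate reflux episodes within two minutes:
d = [0] * 24
d[1] = d[5] = d[20] = 1
# episodes_5s(d) -> 3, but episodes_60s(d) -> 2: the two early episodes merge.
```

This merging of brief episodes inside a single 60-s frame is one mechanism behind the lower episode counts at the coarser acquisition rate; the other, noted in the abstract, is that brief low-count episodes become indistinguishable from background in a 60-s frame.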

  9. A cascaded model of spectral distortions due to spectral response effects and pulse pileup effects in a photon-counting x-ray detector for CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

Cammin, Jochen, E-mail: jcammin1@jhmi.edu; Taguchi, Katsuyuki, E-mail: ktaguchi@jhmi.edu; Xu, Jennifer

Purpose: Energy discriminating, photon-counting detectors (PCDs) are an emerging technology for computed tomography (CT) with various potential benefits for clinical CT. The photon energies measured by PCDs can be distorted due to the interactions of a photon with the detector and the interaction of multiple coincident photons. These effects result in distorted recorded x-ray spectra which may lead to artifacts in reconstructed CT images and inaccuracies in tissue identification. Model-based compensation techniques have the potential to account for the distortion effects. This approach requires only a small number of parameters and is applicable to a wide range of spectra and count rates, but it needs an accurate model of the spectral distortions occurring in PCDs. The purpose of this study was to develop a model of those spectral distortions and to evaluate the model using a PCD (model DXMCT-1; DxRay, Inc., Northridge, CA) and various x-ray spectra in a wide range of count rates. Methods: The authors hypothesize that the complex phenomena of spectral distortions can be modeled by: (1) separating them into count-rate independent factors that we call the spectral response effects (SRE), and count-rate dependent factors that we call the pulse pileup effects (PPE), (2) developing separate models for SRE and PPE, and (3) cascading the SRE and PPE models into a combined SRE+PPE model that describes PCD distortions at both low and high count rates. The SRE model describes the probability distribution of the recorded spectrum, with a photo peak and a continuum tail, given the incident photon energy. Model parameters were obtained from calibration measurements with three radioisotopes and then interpolated linearly for other energies. The PPE model used was developed in the authors’ previous work [K. Taguchi et al., “Modeling the performance of a photon counting x-ray detector for CT: Energy response and pulse pileup effects,” Med. Phys. 38(2), 1089–1102

  10. A cascaded model of spectral distortions due to spectral response effects and pulse pileup effects in a photon-counting x-ray detector for CT

    PubMed Central

    Cammin, Jochen; Xu, Jennifer; Barber, William C.; Iwanczyk, Jan S.; Hartsough, Neal E.; Taguchi, Katsuyuki

    2014-01-01

Purpose: Energy discriminating, photon-counting detectors (PCDs) are an emerging technology for computed tomography (CT) with various potential benefits for clinical CT. The photon energies measured by PCDs can be distorted due to the interactions of a photon with the detector and the interaction of multiple coincident photons. These effects result in distorted recorded x-ray spectra which may lead to artifacts in reconstructed CT images and inaccuracies in tissue identification. Model-based compensation techniques have the potential to account for the distortion effects. This approach requires only a small number of parameters and is applicable to a wide range of spectra and count rates, but it needs an accurate model of the spectral distortions occurring in PCDs. The purpose of this study was to develop a model of those spectral distortions and to evaluate the model using a PCD (model DXMCT-1; DxRay, Inc., Northridge, CA) and various x-ray spectra in a wide range of count rates. Methods: The authors hypothesize that the complex phenomena of spectral distortions can be modeled by: (1) separating them into count-rate independent factors that we call the spectral response effects (SRE), and count-rate dependent factors that we call the pulse pileup effects (PPE), (2) developing separate models for SRE and PPE, and (3) cascading the SRE and PPE models into a combined SRE+PPE model that describes PCD distortions at both low and high count rates. The SRE model describes the probability distribution of the recorded spectrum, with a photo peak and a continuum tail, given the incident photon energy. Model parameters were obtained from calibration measurements with three radioisotopes and then interpolated linearly for other energies. The PPE model used was developed in the authors’ previous work [K. Taguchi et al., “Modeling the performance of a photon counting x-ray detector for CT: Energy response and pulse pileup effects,” Med. Phys. 38(2), 1089–1102 (2011)]. The

  11. [Effects of HiLo for two weeks on erythrocyte immune adhesion and leukocyte count of swimmers].

    PubMed

    Zhao, Yong-Cai; Gao, Bing-Hong; Wu, Ge-Lin; Zhang, Jiu-Li

    2012-07-01

To investigate the effects of living high-training low (HiLo) on innate immunity in the blood of elite swimmers. Six female swimmers undertook HiLo for two weeks; erythrocyte adhesion function and leukocyte counts were tested at different times during the training period. The red blood cell C3b receptor rosette rate (RBC-C3bRR) decreased and the red blood cell immune complex rosette rate (RBC-ICR) increased significantly (P < 0.05); both markers returned to baseline 1 week after training. Leukocyte and granulocyte counts decreased significantly (P < 0.05) and recovered 1 week after training. Lymphocyte and monocyte counts decreased without significance during training and did not recover after training. Erythrocyte and granulocyte immunity decreased quickly but lymphocyte and monocyte counts recovered slowly, suggesting the swimmers were adapting to the training.

  12. Sampling and counting genome rearrangement scenarios

    PubMed Central

    2015-01-01

Background Even for moderate-size inputs, there is a tremendous number of optimal rearrangement scenarios, regardless of the model and of which specific question is to be answered. Therefore giving one optimal solution might be misleading and cannot be used for statistical inference. Statistically well-founded methods are necessary to sample uniformly from the solution space; a small number of samples is then sufficient for statistical inference. Contribution In this paper, we give a mini-review of the state of the art in sampling and counting rearrangement scenarios, focusing on the reversal, DCJ and SCJ models. Beyond that, we also give a Gibbs sampler for sampling most parsimonious labelings of evolutionary trees under the SCJ model. The method has been implemented and tested on real-life data. The software package together with example data can be downloaded from http://www.renyi.hu/~miklosi/SCJ-Gibbs/ PMID:26452124

  13. A review on natural background radiation

    PubMed Central

    Shahbazi-Gahrouei, Daryoush; Gholami, Mehrdad; Setayandeh, Samaneh

    2013-01-01

The world is naturally radioactive, and approximately 82% of the human-absorbed radiation dose, which is beyond our control, arises from natural sources such as cosmic and terrestrial radiation and exposure from inhaled or ingested radiation sources. In recent years, several international studies have been carried out, which have reported different values regarding the effect of background radiation on human health. Gamma radiation emitted from natural sources (background radiation) is largely due to primordial radionuclides, mainly the 232Th and 238U series and their decay products, as well as 40K, which exist at trace levels in the earth's crust. Their concentrations in soil, sands, and rocks depend on the local geology of each region of the world. Naturally occurring radioactive materials generally contain terrestrial-origin radionuclides left over since the creation of the earth. In addition, the existence of some springs and quarries increases the dose rate of background radiation in some regions, which are known as high-level background radiation regions. The type of building materials used in houses can also affect the dose rate of background radiation. The present review article considers all of the natural radiation sources, including cosmic, terrestrial, and food radiation. PMID:24223380

  14. Morphological spot counting from stacked images for automated analysis of gene copy numbers by fluorescence in situ hybridization.

    PubMed

    Grigoryan, Artyom M; Dougherty, Edward R; Kononen, Juha; Bubendorf, Lukas; Hostetter, Galen; Kallioniemi, Olli

    2002-01-01

    Fluorescence in situ hybridization (FISH) is a molecular diagnostic technique in which a fluorescent labeled probe hybridizes to a target nucleotide sequence of deoxyribonucleic acid (DNA). Upon excitation, each chromosome containing the target sequence produces a fluorescent signal (spot). Because fluorescent spot counting is tedious and often subjective, automated digital algorithms to count spots are desirable. New technology provides a stack of images on multiple focal planes throughout a tissue sample. Multiple-focal-plane imaging helps overcome the biases and imprecision inherent in single-focal-plane methods. This paper proposes an algorithm for global spot counting in stacked three-dimensional slice FISH images without the necessity of nuclei segmentation. It is designed to work in complex backgrounds, when there are agglomerated nuclei, and in the presence of illumination gradients. It is based on the morphological top-hat transform, which locates intensity spikes on irregular backgrounds. After finding signals in the slice images, the algorithm groups these together to form three-dimensional spots. Filters are employed to separate legitimate spots from fluorescent noise. The algorithm is set in a comprehensive toolbox that provides visualization and analytic facilities. It includes simulation software that allows examination of algorithm performance for various image and algorithm parameter settings, including signal size, signal density, and the number of slices.

  15. A mind you can count on: validating breath counting as a behavioral measure of mindfulness.

    PubMed

    Levinson, Daniel B; Stoll, Eli L; Kindy, Sonam D; Merry, Hillary L; Davidson, Richard J

    2014-01-01

    Mindfulness practice of present moment awareness promises many benefits, but has eluded rigorous behavioral measurement. To date, research has relied on self-reported mindfulness or heterogeneous mindfulness trainings to infer skillful mindfulness practice and its effects. In four independent studies with over 400 total participants, we present the first construct validation of a behavioral measure of mindfulness, breath counting. We found it was reliable, correlated with self-reported mindfulness, differentiated long-term meditators from age-matched controls, and was distinct from sustained attention and working memory measures. In addition, we employed breath counting to test the nomological network of mindfulness. As theorized, we found skill in breath counting associated with more meta-awareness, less mind wandering, better mood, and greater non-attachment (i.e., less attentional capture by distractors formerly paired with reward). We also found in a randomized online training study that 4 weeks of breath counting training improved mindfulness and decreased mind wandering relative to working memory training and no training controls. Together, these findings provide the first evidence for breath counting as a behavioral measure of mindfulness.

  16. Probabilistic techniques for obtaining accurate patient counts in Clinical Data Warehouses

    PubMed Central

    Myers, Risa B.; Herskovic, Jorge R.

    2011-01-01

    Proposal and execution of clinical trials, computation of quality measures, and discovery of correlations between medical phenomena are all applications where an accurate count of patients is needed. However, existing sources of this type of patient information, including Clinical Data Warehouses (CDWs), may be incomplete or inaccurate. This research explores applying probabilistic techniques, supported by the MayBMS probabilistic database, to obtain accurate patient counts from a clinical data warehouse containing synthetic patient data. We present a synthetic CDW and populate it with simulated data using a custom patient data generation engine. We then implement, evaluate, and compare different techniques for obtaining patient counts. We model billing as a test for the presence of a condition. We compute billing’s sensitivity and specificity both by conducting a “Simulated Expert Review,” in which a representative sample of records is reviewed and labeled by experts, and by obtaining the ground truth for every record. We compute the posterior probability of a patient having a condition through a “Bayesian Chain,” using Bayes’ Theorem to calculate the probability of a patient having a condition after each visit. The second method is a “one-shot” approach that computes the probability of a patient having a condition based on whether the patient is ever billed for the condition. Our results demonstrate the utility of probabilistic approaches, which improve on the accuracy of raw counts. In particular, the simulated review paired with a single application of Bayes’ Theorem produces the best results, with an average error rate of 2.1% compared to 43.7% for the straightforward billing counts. Overall, this research demonstrates that Bayesian probabilistic approaches improve patient counts on simulated patient populations. We believe that total patient counts based on billing data are one of the many possible applications of our
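    The per-visit "Bayesian Chain" described in this abstract amounts to repeated application of Bayes' Theorem, treating each billing event as a diagnostic test with known sensitivity and specificity. A minimal sketch follows; the prior, sensitivity, specificity, and visit history below are hypothetical illustration values, not numbers from the paper:

    ```python
    def bayes_update(prior, billed, sens, spec):
        """One application of Bayes' Theorem: P(condition | billing result),
        treating billing as a test with the given sensitivity/specificity."""
        if billed:
            num = sens * prior
            den = sens * prior + (1 - spec) * (1 - prior)
        else:
            num = (1 - sens) * prior
            den = (1 - sens) * prior + spec * (1 - prior)
        return num / den

    # Chain the update over a patient's visit history (hypothetical values).
    p = 0.10                               # prevalence prior
    for billed in [True, True, False]:     # billed, billed, not billed
        p = bayes_update(p, billed, sens=0.85, spec=0.95)
    print(round(p, 3))  # prints 0.835
    ```

    The "one-shot" variant in the abstract would instead apply `bayes_update` once, with `billed` set to whether the patient was ever billed for the condition.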

  17. Particle Count Limits Recommendation for Aviation Fuel

    DTIC Science & Technology

    2015-10-05

    Particle Counter Methodology: particle counts are taken utilizing calibration methodologies and standardized cleanliness code ratings (ISO 11171, ISO …). Recommended cleanliness-code limits (Receipt / Vehicle Fuel Tank / Fuel Injector): Aviation Fuel: DEF (AUST) 5695B 18/16/13; Parker 18/16/13, 14/10/7; Pamas / Parker / Particle Solutions 19/17…12; U.S. DOD 19/17/14/13*. Diesel Fuel: World Wide Fuel Charter 5th 18/16/13; DEF (AUST) 5695B 18/16/13; Caterpillar 18/16/13; Detroit Diesel 18/16/13; MTU

  18. Development of a low-level 39Ar calibration standard – Analysis by absolute gas counting measurements augmented with simulation

    DOE PAGES

    Williams, Richard M.; Aalseth, C. E.; Brandenberger, J. M.; ...

    2017-02-17

    Here, this paper describes the generation of 39Ar, via reactor irradiation of potassium carbonate, followed by quantitative analysis (length-compensated proportional counting) to yield two calibration standards that are respectively 50 and 3 times atmospheric background levels. Measurements were performed in Pacific Northwest National Laboratory's shallow underground counting laboratory studying the effect of gas density on beta-transport; these results are compared with simulation. The total expanded uncertainty of the specific activity for the ~50 × 39Ar in P10 standard is 3.6% (k=2).

  19. Development of a low-level 39Ar calibration standard – Analysis by absolute gas counting measurements augmented with simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Richard M.; Aalseth, C. E.; Brandenberger, J. M.

    Here, this paper describes the generation of 39Ar, via reactor irradiation of potassium carbonate, followed by quantitative analysis (length-compensated proportional counting) to yield two calibration standards that are respectively 50 and 3 times atmospheric background levels. Measurements were performed in Pacific Northwest National Laboratory's shallow underground counting laboratory studying the effect of gas density on beta-transport; these results are compared with simulation. The total expanded uncertainty of the specific activity for the ~50 × 39Ar in P10 standard is 3.6% (k=2).

  20. In-flight calibration of Hitomi Soft X-ray Spectrometer. (1) Background

    NASA Astrophysics Data System (ADS)

    Kilbourne, Caroline A.; Sawada, Makoto; Tsujimoto, Masahiro; Angellini, Lorella; Boyce, Kevin R.; Eckart, Megan E.; Fujimoto, Ryuichi; Ishisaki, Yoshitaka; Kelley, Richard L.; Koyama, Shu; Leutenegger, Maurice A.; Loewenstein, Michael; McCammon, Dan; Mitsuda, Kazuhisa; Nakashima, Shinya; Porter, Frederick S.; Seta, Hiromi; Takei, Yoh; Tashiro, Makoto S.; Terada, Yukikatsu; Yamada, Shinya; Yamasaki, Noriko Y.

    2018-03-01

    The X-Ray Spectrometer (XRS) instrument of Suzaku provided the first measurement of the non-X-ray background (NXB) of an X-ray calorimeter spectrometer, but the data set was limited. The Soft X-ray Spectrometer (SXS) instrument of Hitomi was able to provide a more detailed picture of X-ray calorimeter background, with more than 360 ks of data while pointed at the Earth, and a comparable amount of blank-sky data. These data are important not only for analyzing SXS science data, but also for categorizing the contributions to the NXB in X-ray calorimeters as a class. In this paper, we present the contributions to the SXS NXB, the types and effectiveness of the screening, the interaction of the screening with the broad-band redistribution, and the residual background spectrum as a function of magnetic cut-off rigidity. The orbit-averaged SXS NXB in the range 0.3-12 keV was 4 × 10⁻² counts s⁻¹ cm⁻². This very low background in combination with groundbreaking spectral resolution gave SXS unprecedented sensitivity to weak spectral lines.

  1. Controlling for varying effort in count surveys --an analysis of Christmas Bird Count Data

    USGS Publications Warehouse

    Link, W.A.; Sauer, J.R.

    1999-01-01

    The Christmas Bird Count (CBC) is a valuable source of information about midwinter populations of birds in the continental U.S. and Canada. Analysis of CBC data is complicated by substantial variation among sites and years in the effort expended in counting; this feature of the CBC is common to many other wildlife surveys. Specification of a method for adjusting counts for effort is a matter of some controversy. Here, we present models for longitudinal count surveys with varying effort; these describe the effect of effort as proportional to exp(B · effort^p), where B and p are parameters. For any fixed p, our models are loglinear in the transformed explanatory variable (effort)^p and other covariables. Hence we fit a collection of loglinear models corresponding to a range of values of p, and select the best effort adjustment from among these on the basis of fit statistics. We apply this procedure to data for six bird species in five regions, for the period 1959-1988.
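    The procedure described here can be sketched as follows: for each candidate p, fit a loglinear (Poisson) model with (effort)^p as a covariate, then keep the p with the best fit statistic. This is an illustrative reimplementation on simulated data, not the authors' code; the IRLS fitter, grid, and simulation parameters are my own choices:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulated survey: a year trend plus an effort effect with true p = 0.5.
    n = 200
    effort = rng.uniform(1.0, 20.0, size=n)
    year = rng.integers(0, 30, size=n).astype(float)
    counts = rng.poisson(np.exp(0.02 * year + 0.4 * effort**0.5))

    def fit_poisson_loglinear(X, y, iters=100):
        """Poisson GLM with log link, fit by iteratively reweighted least
        squares; returns (coefficients, deviance)."""
        # Start from a rough log-scale least-squares fit for stability.
        beta = np.linalg.lstsq(X, np.log(y + 1.0), rcond=None)[0]
        for _ in range(iters):
            eta = np.clip(X @ beta, -30, 30)
            mu = np.exp(eta)
            z = eta + (y - mu) / mu          # working response
            beta = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (mu * z))
        mu = np.exp(np.clip(X @ beta, -30, 30))
        dev = 2 * np.sum(np.where(y > 0, y * np.log(y / mu), 0.0) - (y - mu))
        return beta, dev

    # Fit one loglinear model per candidate p; pick p by minimum deviance.
    fits = {}
    for p in np.round(np.linspace(0.1, 1.5, 15), 2):
        X = np.column_stack([np.ones(n), year, effort**p])
        fits[p] = fit_poisson_loglinear(X, counts)[1]
    best_p = min(fits, key=fits.get)
    ```

    In practice one would compare deviances (or AIC) across the grid of p values exactly as the loop above does, with the real survey covariables in place of the simulated ones.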

  2. Background Selection in Partially Selfing Populations

    PubMed Central

    Roze, Denis

    2016-01-01

    Self-fertilizing species often present lower levels of neutral polymorphism than their outcrossing relatives. Indeed, selfing automatically increases the rate of coalescence per generation, but also enhances the effects of background selection and genetic hitchhiking by reducing the efficiency of recombination. Approximations for the effect of background selection in partially selfing populations have been derived previously, assuming tight linkage between deleterious alleles and neutral loci. However, loosely linked deleterious mutations may have important effects on neutral diversity in highly selfing populations. In this article, I use a general method based on multilocus population genetics theory to express the effect of a deleterious allele on diversity at a linked neutral locus in terms of moments of genetic associations between loci. Expressions for these genetic moments at equilibrium are then computed for arbitrary rates of selfing and recombination. An extrapolation of the results to the case where deleterious alleles segregate at multiple loci is checked using individual-based simulations. At high selfing rates, the tight linkage approximation underestimates the effect of background selection in genomes with moderate to high map length; however, another simple approximation can be obtained for this situation and provides accurate predictions as long as the deleterious mutation rate is not too high. PMID:27075726

  3. DC KIDS COUNT e-Databook Indicators

    ERIC Educational Resources Information Center

    DC Action for Children, 2012

    2012-01-01

    This report presents indicators that are included in DC Action for Children's 2012 KIDS COUNT e-databook, their definitions and sources and the rationale for their selection. The indicators for DC KIDS COUNT represent a mix of traditional KIDS COUNT indicators of child well-being, such as the number of children living in poverty, and indicators of…

  4. Nuclear counting filter based on a centered Skellam test and a double exponential smoothing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coulon, Romain; Kondrasovs, Vladimir; Dumazert, Jonathan

    2015-07-01

    Online nuclear counting represents a challenge due to the stochastic nature of radioactivity. The count data have to be filtered in order to provide a precise and accurate estimation of the count rate, with a response time compatible with the application in view. An innovative filter addressing this issue is presented in this paper. It is a nonlinear filter based on a Centered Skellam Test (CST), giving a local maximum-likelihood estimation of the signal under a Poisson distribution assumption. This nonlinear approach makes it possible to smooth the counting signal while maintaining a fast response when abrupt changes in activity occur. The filter has been improved by the implementation of Brown's double Exponential Smoothing (BES). The filter has been validated and compared to other state-of-the-art smoothing filters. The CST-BES filter shows a significant improvement compared to all tested smoothing filters. (authors)
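    The BES stage mentioned here is standard Brown's double exponential smoothing; a generic textbook sketch (not the authors' CST-BES code, and with an arbitrary smoothing constant) looks like this:

    ```python
    import numpy as np

    def brown_des(x, alpha=0.3):
        """Brown's double exponential smoothing: two cascaded exponential
        smoothers, combined as 2*s1 - s2 to correct the lag of a single EMA."""
        s1 = s2 = float(x[0])
        out = np.empty(len(x))
        for i, v in enumerate(x):
            s1 = alpha * v + (1 - alpha) * s1
            s2 = alpha * s1 + (1 - alpha) * s2
            out[i] = 2 * s1 - s2
        return out

    # A steady count rate passes through unchanged; a step is tracked with lag.
    steady = brown_des(np.full(50, 12.0))
    ```

    The CST stage, as described in the abstract, decides when smoothing should be reset: the difference of two Poisson counts follows a Skellam distribution, so an observed jump between successive counts can be tested against the variance expected from counting statistics alone.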

  5. Photon counting photodiode array detector for far ultraviolet (FUV) astronomy

    NASA Technical Reports Server (NTRS)

    Hartig, G. F.; Moos, H. W.; Pembroke, R.; Bowers, C.

    1982-01-01

    A compact, stable, single-stage intensified photodiode array detector designed for photon-counting far-ultraviolet astronomy applications employs a saturable 'C'-type MCP (Galileo S. MCP 25-25) to produce high-gain pulses with a narrowly peaked pulse-height distribution. The P-20 output phosphor exhibits a very short decay time, due to the high current density of the electron pulses. This intensifier is being coupled to a self-scanning linear photodiode array with a fiber-optic input window that allows direct, rigid mechanical coupling with minimal light loss. The array was scanned at a 250 kHz pixel rate. The detector exhibits a more than adequate signal-to-noise ratio for pulse counting and event location. Previously announced in STAR as N82-19118.

  6. The effect of microchannel plate gain depression on PAPA photon counting cameras

    NASA Astrophysics Data System (ADS)

    Sams, Bruce J., III

    1991-03-01

    PAPA (precision analog photon address) cameras are photon-counting imagers which employ microchannel plates (MCPs) for image intensification. They have been used extensively in astronomical speckle imaging. The PAPA camera can produce artifacts when light incident on its MCP is highly concentrated. The effect is exacerbated by adjusting the strobe detection level too low, so that the camera accepts very small MCP pulses. The artifacts can occur even at low total count rates if the image has a highly concentrated bright spot. This paper describes how to optimize PAPA camera electronics, and describes six techniques which can avoid or minimize addressing errors.

  7. All-digital full waveform recording photon counting flash lidar

    NASA Astrophysics Data System (ADS)

    Grund, Christian J.; Harwit, Alex

    2010-08-01

    Current generation analog and photon counting flash lidar approaches suffer from limitations in waveform depth, dynamic range, sensitivity, false alarm rates, optical acceptance angle (f/#), optical and electronic cross talk, and pixel density. To address these issues Ball Aerospace is developing a new approach to flash lidar that employs direct coupling of a photocathode and microchannel plate front end to a high-speed, pipelined, all-digital Read Out Integrated Circuit (ROIC) to achieve photon-counting temporal waveform capture in each pixel on each laser return pulse. A unique characteristic is the absence of performance-limiting analog or mixed-signal components. When implemented in 65 nm CMOS technology, the Ball Intensified Imaging Photon Counting (I2PC) flash lidar FPA technology can record up to 300 photon arrivals in each pixel with 100 ps resolution on each photon return, with up to 6000 range bins in each pixel. The architecture supports near 100% fill factor and fast optical system designs (f/#<1), and array sizes to 3000×3000 pixels. Compared to existing technologies, >60 dB ultimate dynamic-range improvement and >10⁴ reduction in false alarm rates are anticipated, while achieving single-photon range precision better than 1 cm. I2PC significantly extends long-range and low-power hard target imaging capabilities useful for autonomous hazard avoidance (ALHAT), navigation, imaging vibrometry, and inspection applications, and enables scannerless 3D imaging for distributed target applications such as range-resolved atmospheric remote sensing, vegetation canopies, and camouflage penetration from terrestrial, airborne, GEO, and LEO platforms. We discuss the I2PC architecture, development status, anticipated performance advantages, and limitations.

  8. Systemic inflammation in 222,841 healthy employed smokers and nonsmokers: white blood cell count and relationship to spirometry

    PubMed Central

    2012-01-01

    Background Smoking has been linked to low-grade systemic inflammation, a known risk factor for disease. This state is reflected in elevated white blood cell (WBC) count. Objective We analyzed the relationship between WBC count and smoking in healthy men and women across several age ranges who underwent preventive medical check-ups in the workplace. We also analysed the relationship between smoking and lung function. Methods Cross-sectional descriptive study in 163 459 men and 59 382 women aged between 16 and 70 years. Data analysed were smoking status, WBC count, and spirometry readings. Results Total WBC showed higher counts in both male and female smokers, around 1000 to 1300 cells/ml (t test, P < 0.001). Forced expiratory volume in 1 second (FEV1%) was higher in nonsmokers of both sexes between 25 to 54 years (t test, P < 0.001). Analysis of covariance showed a multiple-variable effect of age, sex, smoking status, and body mass index on WBC count. The relationship between WBC count and smoking status was confirmed after the sample was stratified for these variables. Smokers with airway obstruction measured by FEV1% were found to have higher WBC counts than smokers with a normal FEV1% in similar age and BMI groups. Conclusions Smoking increases WBC count and affects lung function. The effects are evident across a wide age range, underlining the importance of initiating preventive measures as soon as an individual begins to smoke. PMID:22613769

  9. Equivalence of truncated count mixture distributions and mixtures of truncated count distributions.

    PubMed

    Böhning, Dankmar; Kuhnert, Ronny

    2006-12-01

    This article is about modeling count data with zero truncation. A parametric count density family is considered. The truncated mixture of densities from this family is different from the mixture of truncated densities from the same family. Whereas the former model is more natural to formulate and to interpret, the latter model is theoretically easier to treat. It is shown that for any mixing distribution leading to a truncated mixture, a (usually different) mixing distribution can be found so that the associated mixture of truncated densities equals the truncated mixture, and vice versa. This implies that the likelihood surfaces for both situations agree, and in this sense both models are equivalent. Zero-truncated count data models are used frequently in the capture-recapture setting to estimate population size, and it can be shown that the two Horvitz-Thompson estimators, associated with the two models, agree. In particular, it is possible to achieve strong results for mixtures of truncated Poisson densities, including reliable, global construction of the unique NPMLE (nonparametric maximum likelihood estimator) of the mixing distribution, implying a unique estimator for the population size. The benefit of these results lies in the fact that it is valid to work with the mixture of truncated count densities, which is less appealing for the practitioner but theoretically easier. Mixtures of truncated count densities form a convex linear model, for which a developed theory exists, including global maximum likelihood theory as well as algorithmic approaches. Once the problem has been solved in this class, it might readily be transformed back to the original problem by means of an explicitly given mapping. Applications of these ideas are given, particularly in the case of the truncated Poisson family.
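    As a concrete instance of the capture-recapture use mentioned in this abstract, the following sketch (my own illustration, not code from the article) estimates population size from zero-truncated Poisson counts: fit λ by maximum likelihood on the observed (nonzero) counts, then apply the Horvitz-Thompson correction for the unobservable zero class:

    ```python
    import numpy as np
    from scipy.optimize import brentq

    rng = np.random.default_rng(1)
    N_true, lam_true = 1000, 1.2
    all_counts = rng.poisson(lam_true, N_true)
    observed = all_counts[all_counts > 0]        # the zero class is never seen
    n, xbar = len(observed), observed.mean()

    # The zero-truncated Poisson MLE solves  lam / (1 - exp(-lam)) = xbar.
    lam_hat = brentq(lambda lam: lam / (1 - np.exp(-lam)) - xbar, 1e-8, 50.0)

    # Horvitz-Thompson estimator: inflate n by the estimated detection prob.
    N_hat = n / (1 - np.exp(-lam_hat))
    ```

    Replacing the single Poisson with a mixture of truncated Poisson densities, as the article discusses, changes the likelihood but leaves the final Horvitz-Thompson step the same in spirit.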

  10. Increased Microerythrocyte Count in Homozygous α+-Thalassaemia Contributes to Protection against Severe Malarial Anaemia

    PubMed Central

    Fowkes, Freya J. I; Allen, Stephen J; Allen, Angela; Alpers, Michael P; Weatherall, David J; Day, Karen P

    2008-01-01

    Background The heritable haemoglobinopathy α+-thalassaemia is caused by the reduced synthesis of α-globin chains that form part of normal adult haemoglobin (Hb). Individuals homozygous for α+-thalassaemia have microcytosis and an increased erythrocyte count. α+-Thalassaemia homozygosity confers considerable protection against severe malaria, including severe malarial anaemia (SMA) (Hb concentration < 50 g/l), but does not influence parasite count. We tested the hypothesis that the erythrocyte indices associated with α+-thalassaemia homozygosity provide a haematological benefit during acute malaria. Methods and Findings Data from children living on the north coast of Papua New Guinea who had participated in a case-control study of the protection afforded by α+-thalassaemia against severe malaria were reanalysed to assess the genotype-specific reduction in erythrocyte count and Hb levels associated with acute malarial disease. We observed a reduction in median erythrocyte count of ∼1.5 × 10¹²/l in all children with acute falciparum malaria relative to values in community children (p < 0.001). We developed a simple mathematical model of the linear relationship between Hb concentration and erythrocyte count. This model predicted that children homozygous for α+-thalassaemia lose less Hb than children of normal genotype for a reduction in erythrocyte count of >1.1 × 10¹²/l as a result of the reduced mean cell Hb in homozygous α+-thalassaemia. In addition, children homozygous for α+-thalassaemia require a 10% greater reduction in erythrocyte count than children of normal genotype (p = 0.02) for Hb concentration to fall to 50 g/l, the cutoff for SMA. We estimated that the haematological profile in children homozygous for α+-thalassaemia reduces the risk of SMA during acute malaria compared to children of normal genotype (relative risk 0.52; 95% confidence interval [CI] 0.24–1.12, p = 0.09). Conclusions The increased erythrocyte count and microcytosis in
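    The linear model referred to in this abstract amounts to Hb ≈ MCH × erythrocyte count (pg × 10¹²/l = g/l), so a lower mean cell Hb means a larger drop in erythrocyte count is needed before Hb crosses the SMA cutoff. The genotype-specific baseline values below are hypothetical round numbers for illustration, not the study's data:

    ```python
    def rbc_drop_to_sma(baseline_rbc, mch_pg, sma_cutoff_gl=50.0):
        """Drop in erythrocyte count (10^12/l) that takes Hb = MCH * RBC
        down to the severe-malarial-anaemia cutoff (Hb = 50 g/l)."""
        return baseline_rbc - sma_cutoff_gl / mch_pg

    normal = rbc_drop_to_sma(baseline_rbc=4.5, mch_pg=28.0)  # normal genotype
    thal = rbc_drop_to_sma(baseline_rbc=5.2, mch_pg=22.0)    # homozygous a+-thal
    # With these illustrative values the thalassaemic profile tolerates the
    # larger erythrocyte-count drop before reaching the SMA threshold.
    ```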

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erchinger, J. L.; Orrell, John L.; Aalseth, C. E.

    The Ultra-Low Background Liquid Scintillation Counter developed by Pacific Northwest National Laboratory will expand the application of liquid scintillation counting by enabling lower detection limits and smaller sample volumes. By reducing the overall count rate of the background environment approximately 2 orders of magnitude below that of commercially available systems, backgrounds on the order of tens of counts per day over an energy range of ~3–3600 keV can be realized. Finally, initial test results of the ULB LSC show promising results for ultra-low background detection with liquid scintillation counting.

  12. Energy response calibration of photon-counting detectors using x-ray fluorescence: a feasibility study.

    PubMed

    Cho, H-M; Ding, H; Ziemer, B P; Molloi, S

    2014-12-07

    Accurate energy calibration is critical for the application of energy-resolved photon-counting detectors in spectral imaging. The aim of this study is to investigate the feasibility of energy response calibration and characterization of a photon-counting detector using x-ray fluorescence. A comprehensive Monte Carlo simulation study was performed using Geant4 Application for Tomographic Emission (GATE) to investigate the optimal technique for x-ray fluorescence calibration. Simulations were conducted using a 100 kVp tungsten-anode spectrum with 2.7 mm Al filter for a single pixel cadmium telluride (CdTe) detector with a 3 × 3 mm² detection area. The angular dependence of x-ray fluorescence and scatter background was investigated by varying the detection angle from 20° to 170° with respect to the beam direction. The effects of the detector material, shape, and size on the recorded x-ray fluorescence were investigated. The fluorescent material size effect was considered with and without the container for the fluorescent material. In order to provide validation for the simulation result, the angular dependence of x-ray fluorescence from five fluorescent materials was experimentally measured using a spectrometer. Finally, eleven of the fluorescent materials were used for energy calibration of a CZT-based photon-counting detector. The optimal detection angle was determined to be approximately at 120° with respect to the beam direction, which showed the highest fluorescence to scatter ratio (FSR) with a weak dependence on the fluorescent material size. The feasibility of x-ray fluorescence for energy calibration of photon-counting detectors in the diagnostic x-ray energy range was verified by successfully calibrating the energy response of a CZT-based photon-counting detector. The results of this study can be used as a guideline to implement the x-ray fluorescence calibration method for photon-counting detectors in a typical imaging laboratory.
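    The calibration step itself reduces to mapping the measured peak positions of known fluorescence lines (in detector threshold/channel units) to their tabulated energies, typically with a linear fit. A minimal sketch; the channel values below are hypothetical, and the Kα energies are approximate tabulated values, not data from this study:

    ```python
    import numpy as np

    # Approximate K-alpha fluorescence energies (keV) for a few target
    # materials (Cu, Mo, Ag, Ba, Tb), paired with hypothetical measured
    # peak positions in detector channel units.
    energies_kev = np.array([8.05, 17.48, 22.16, 32.19, 44.48])
    channels = np.array([41.0, 88.5, 112.0, 162.3, 223.9])

    # Linear energy response E = gain * channel + offset, by least squares.
    gain, offset = np.polyfit(channels, energies_kev, 1)

    def channel_to_kev(ch):
        """Convert a detector channel/threshold value to energy in keV."""
        return gain * ch + offset
    ```

    With more than two lines available, the residuals of this fit also expose any nonlinearity in the detector's energy response.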

  13. Energy response calibration of photon-counting detectors using X-ray fluorescence: a feasibility study

    PubMed Central

    Cho, H-M; Ding, H; Ziemer, BP; Molloi, S

    2014-01-01

    Accurate energy calibration is critical for the application of energy-resolved photon-counting detectors in spectral imaging. The aim of this study is to investigate the feasibility of energy response calibration and characterization of a photon-counting detector using X-ray fluorescence. A comprehensive Monte Carlo simulation study was performed using Geant4 Application for Tomographic Emission (GATE) to investigate the optimal technique for X-ray fluorescence calibration. Simulations were conducted using a 100 kVp tungsten-anode spectra with 2.7 mm Al filter for a single pixel cadmium telluride (CdTe) detector with 3 × 3 mm2 in detection area. The angular dependence of X-ray fluorescence and scatter background was investigated by varying the detection angle from 20° to 170° with respect to the beam direction. The effects of the detector material, shape, and size on the recorded X-ray fluorescence were investigated. The fluorescent material size effect was considered with and without the container for the fluorescent material. In order to provide validation for the simulation result, the angular dependence of X-ray fluorescence from five fluorescent materials was experimentally measured using a spectrometer. Finally, eleven of the fluorescent materials were used for energy calibration of a CZT-based photon-counting detector. The optimal detection angle was determined to be approximately at 120° with respect to the beam direction, which showed the highest fluorescence to scatter ratio (FSR) with a weak dependence on the fluorescent material size. The feasibility of X-ray fluorescence for energy calibration of photon-counting detectors in the diagnostic X-ray energy range was verified by successfully calibrating the energy response of a CZT-based photon-counting detector. The results of this study can be used as a guideline to implement the X-ray fluorescence calibration method for photon-counting detectors in a typical imaging laboratory. PMID:25369288

  14. Characterization of photon-counting multislit breast tomosynthesis.

    PubMed

    Berggren, Karl; Cederström, Björn; Lundqvist, Mats; Fredenberg, Erik

    2018-02-01

    It has been shown that breast tomosynthesis may improve sensitivity and specificity compared to two-dimensional mammography, resulting in an increased detection rate of cancers or lowered call-back rates. The purpose of this study is to characterize a spectral photon-counting multislit breast tomosynthesis system that is able to do single-scan spectral imaging with multiple collimated x-ray beams. The system differs in many aspects from conventional tomosynthesis using energy-integrating flat-panel detectors. The investigated system was a prototype consisting of a dual-threshold photon-counting detector with 21 collimated line detectors scanning across the compressed breast. A review of the system is done in terms of detector, acquisition geometry, and reconstruction methods. Three reconstruction methods were used: simple back-projection, filtered back-projection, and an iterative algebraic reconstruction technique. The image quality was evaluated by measuring the modulation transfer function (MTF), normalized noise power spectrum, detective quantum efficiency (DQE), and artifact spread function (ASF) on reconstructed spectral tomosynthesis images for a total-energy bin (defined by a low-energy threshold calibrated to remove electronic noise) and for a high-energy bin (with a threshold calibrated to split the spectrum in roughly equal parts). Acquisition was performed using a 29 kVp W/Al x-ray spectrum at a 0.24 mGy exposure. The difference in MTF between the two energy bins was negligible, that is, there was no energy dependence on resolution. The MTF dropped to 50% at 1.5 lp/mm to 2.3 lp/mm in the scan direction and 2.4 lp/mm to 3.3 lp/mm in the slit direction, depending on the reconstruction method. The full width at half maximum of the ASF was found to range from 13.8 mm to 18.0 mm for the different reconstruction methods. The zero-frequency DQE of the system was found to be 0.72. The fraction of counts in the high-energy bin was measured to be 59% of the

  15. P2 and behavioral effects of stroke count in Chinese characters: Evidence for an analytic and attentional view.

    PubMed

    Yang, Shasha; Zhang, Shunmei; Wang, Quanhong

    2016-08-15

    The inconsistent stroke-count effect in Chinese character recognition has resulted in an intense debate between the analytic and holistic views of character processing. The length effects of English words on behavioral responses and event-related potentials (ERPs) are similarly inconclusive. In this study, we identified any behavioral and ERP stroke-count effects when orthographic neighborhood sizes are balanced across three stroke counts. A delayed character-matching task was conducted while ERPs were recorded. The behavioral data indicated that both response latency and error rate increased with increasing stroke count. The ERP data showed higher P2 but lower N2 amplitudes in the large count than in the medium count condition. A higher P2 can reflect increased attentional load and reduced attentional resource for processing each stroke because of the additional strokes in the large count condition. The behavioral and ERP effects of stroke count provide evidence for the analytic view of character processing but also provide evidence against the holistic view. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  16. Photon-counting detector arrays based on microchannel array plates [for image enhancement]

    NASA Technical Reports Server (NTRS)

    Timothy, J. G.

    1975-01-01

    The recent development of the channel electron multiplier (CEM) and its miniaturization into the microchannel array plate (MCP) offer the possibility of fully combining the advantages of photographic and photoelectric detection systems. The MCP has an image-intensifying capability and the potential to be developed to yield signal outputs superior to those of conventional photomultipliers. In particular, the MCP has a photon-counting capability with a negligible dark-count rate. Furthermore, the MCP can operate stably and efficiently at extreme-ultraviolet and soft X-ray wavelengths in a windowless configuration, or can be integrated with a photocathode in a sealed tube for use at ultraviolet and visible wavelengths. The operation of one- and two-dimensional photon-counting detector arrays based on the MCP at extreme-ultraviolet wavelengths is described, and the design of sealed arrays for use at ultraviolet and visible wavelengths is briefly discussed.

  17. Ultrafast time measurements by time-correlated single photon counting coupled with superconducting single photon detector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shcheslavskiy, V., E-mail: vis@becker-hickl.de; Becker, W.; Morozov, P.

    Time resolution is, besides quantum efficiency and dark count rate, one of the main characteristics of single-photon detectors. We demonstrate here an ultrafast time-correlated single photon counting (TCSPC) setup consisting of a newly developed single photon counting board, the SPC-150NX, and a superconducting NbN single photon detector with a sensitive area of 7 × 7 μm. The combination delivers a record instrument response function with a full width at half maximum of 17.8 ps and a system quantum efficiency of ∼15% at a wavelength of 1560 nm. A calculation of the root mean square value of the timing jitter for channels with counts of more than 1% of the peak value yielded about 7.6 ps. The setup also shows good timing stability of the detector–TCSPC board combination.
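    The jitter figure quoted above, an rms taken only over channels holding more than 1% of the peak count, can be sketched on a synthetic instrument response function. The Gaussian IRF shape, bin width, and peak height below are illustrative assumptions of this example, not the measured SPC-150NX data:

```python
import math

def irf_stats(counts, bin_ps):
    """FWHM and rms spread of an IRF histogram.

    The rms uses only bins above 1% of the peak count, mirroring
    the thresholding described in the abstract."""
    peak = max(counts)
    half = peak / 2.0
    # crude FWHM: number of bins at or above half maximum
    fwhm = sum(1 for c in counts if c >= half) * bin_ps
    # rms (standard deviation) over bins above the 1% threshold
    sel = [(i, c) for i, c in enumerate(counts) if c > 0.01 * peak]
    n = sum(c for _, c in sel)
    mean = sum(i * c for i, c in sel) / n
    var = sum(c * (i - mean) ** 2 for i, c in sel) / n
    return fwhm, math.sqrt(var) * bin_ps

# synthetic Gaussian IRF: sigma chosen so FWHM ~ 17.8 ps
bin_ps = 0.4                      # 0.4 ps per bin (illustrative)
sigma_bins = 7.56 / bin_ps
counts = [int(1e5 * math.exp(-0.5 * ((i - 500) / sigma_bins) ** 2))
          for i in range(1000)]
fwhm, rms = irf_stats(counts, bin_ps)
print(round(fwhm, 1), round(rms, 1))
```

    Note that the rms over the thresholded region comes out slightly below the Gaussian sigma, since the 1% cut truncates the tails.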

  18. Bunch mode specific rate corrections for PILATUS3 detectors

    DOE PAGES

    Trueb, P.; Dejoie, C.; Kobas, M.; ...

    2015-04-09

    PILATUS X-ray detectors are in operation at many synchrotron beamlines around the world. This article reports on the characterization of the new PILATUS3 detector generation at high count rates. As for all counting detectors, the measured intensities have to be corrected for the dead-time of the counting mechanism at high photon fluxes. The large number of different bunch modes at these synchrotrons as well as the wide range of detector settings presents a challenge for providing accurate corrections. To avoid the intricate measurement of the count rate behaviour for every bunch mode, a Monte Carlo simulation of the counting mechanism has been implemented, which is able to predict the corrections for arbitrary bunch modes and a wide range of detector settings. This article compares the simulated results with experimental data acquired at different synchrotrons. It is found that the usage of bunch mode specific corrections based on this simulation improves the accuracy of the measured intensities by up to 40% for high photon rates and highly structured bunch modes. For less structured bunch modes, the instant retrigger technology of PILATUS3 detectors substantially reduces the dependency of the rate correction on the bunch mode. The acquired data also demonstrate that the instant retrigger technology allows for data acquisition at up to 15 million photons per second per pixel.
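    The dead-time behaviour being corrected can be illustrated with a minimal Monte Carlo of a counting pixel. The 120 ns dead time and the non-paralyzable counting model are assumptions of this sketch; they stand in for the much more detailed, bunch-mode-aware simulation described in the article:

```python
import random

def counted_rate(true_mcps, dead_s=120e-9, t_obs=0.01, seed=1):
    """Monte Carlo of a non-paralyzable counter: a photon arriving
    within dead_s of the last *registered* photon is lost."""
    rng = random.Random(seed)
    rate = true_mcps * 1e6            # photons per second
    t, last, n = 0.0, -1.0, 0
    while True:
        t += rng.expovariate(rate)    # Poisson inter-arrival times
        if t > t_obs:
            break
        if t - last >= dead_s:
            n += 1
            last = t
    return n / t_obs / 1e6            # registered Mcounts/s

for r in (1, 5, 10):
    m = counted_rate(r)
    ideal = r / (1 + r * 1e6 * 120e-9)   # analytic non-paralyzable model
    print(r, round(m, 2), round(ideal, 2))
```

    The simulated rates track the analytic non-paralyzable formula m = n/(1 + nτ); a structured bunch fill replaces the Poisson arrivals with clustered ones, which is exactly why bunch-mode-specific corrections are needed.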

  19. Kids Count in Delaware: Fact Book, 2000-2001 [and] Families Count in Delaware: Fact Book, 2000-2001.

    ERIC Educational Resources Information Center

    Delaware Univ., Newark. Kids Count in Delaware.

    This Kids Count Fact Book is combined with the Families Count Fact Book to provide information on statewide trends affecting children and families in Delaware. The Kids Count statistical profile is based on 11 main indicators of child well-being: (1) births to teens 15 to 17 years; (2) births to teens 15 to 19 years; (3) low birth weight babies;…

  20. White blood cell counting system

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The design, fabrication, and tests of a prototype white blood cell counting system for use in the Skylab IMSS are presented. The counting system consists of a sample collection subsystem, a sample dilution and fluid containment subsystem, and a cell counter. Preliminary test results show that the sample collection and dilution subsystems are functional and fulfill design goals. Results for the fluid containment subsystem show that the handling bags cause counting errors due to: (1) adsorption of cells to the walls of the container, and (2) inadequate cleaning of the plastic bag material before fabrication. It was recommended that another bag material be selected.

  1. A photon-counting photodiode array detector for far ultraviolet (FUV) astronomy

    NASA Technical Reports Server (NTRS)

    Hartig, G. F.; Moos, H. W.; Pembroke, R.; Bowers, C.

    1982-01-01

    A compact, stable, single-stage intensified photodiode array detector designed for photon-counting far-ultraviolet astronomy applications employs a saturable 'C'-type MCP (Galileo S. MCP 25-25) to produce high-gain pulses with a narrowly peaked pulse height distribution. The P-20 output phosphor exhibits a very short decay time, due to the high current density of the electron pulses. This intensifier is being coupled to a self-scanning linear photodiode array which has a fiber optic input window that allows direct, rigid mechanical coupling with minimal light loss. The array was scanned at a 250 kHz pixel rate. The detector exhibits a more than adequate signal-to-noise ratio for pulse counting and event location.

  2. Database crime to crime match rate calculation.

    PubMed

    Buckleton, John; Bright, Jo-Anne; Walsh, Simon J

    2009-06-01

    Guidance exists on how to count matches between samples in a crime sample database, but we were unable to locate a definition of how to estimate a match rate. We propose a method that does not proceed from the match-counting definition but has a strong logical basis.

  3. Far-Ultraviolet Number Counts of Field Galaxies

    NASA Technical Reports Server (NTRS)

    Voyer, Elysse N.; Gardner, Jonathan P.; Teplitz, Harry I.; Siana, Brian D.; deMello, Duilia F.

    2010-01-01

    The number counts of far-ultraviolet (FUV) galaxies as a function of magnitude provide a direct statistical measure of the density and evolution of star-forming galaxies. We report the results of measurements of the rest-frame FUV number counts computed from data of several fields, including the Hubble Ultra Deep Field, the Hubble Deep Field North, and the GOODS-North and -South fields. These data were obtained with the Solar Blind Channel of the Advanced Camera for Surveys on the Hubble Space Telescope. The number counts cover an AB magnitude range of 20-29 over a total area of 15.9 arcmin². We show that the number counts are lower than those in previous studies using smaller areas. The differences in the counts are likely the result of cosmic variance; our new data cover more area and more lines of sight than the previous studies. The slope of our number counts connects well with local FUV counts, and they show good agreement with recent semi-analytical models based on dark matter "merger trees".
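    As a toy illustration of how differential number counts of this kind are tabulated (galaxies per magnitude per square degree over a surveyed area), assuming a simple uniform binning scheme and a made-up magnitude list:

```python
def number_counts(mags, area_arcmin2, m_lo=20.0, m_hi=29.0, dm=1.0):
    """Differential number counts: galaxies per mag per deg^2,
    binned uniformly between m_lo and m_hi."""
    area_deg2 = area_arcmin2 / 3600.0
    n_bins = int((m_hi - m_lo) / dm)
    counts = [0] * n_bins
    for m in mags:
        i = int((m - m_lo) / dm)
        if 0 <= i < n_bins:
            counts[i] += 1
    return [c / dm / area_deg2 for c in counts]

# hypothetical mini-catalogue over the quoted 15.9 arcmin^2 area
mags = [23.2, 23.7, 24.1, 24.4, 24.8, 25.3]
nc = number_counts(mags, 15.9)
print([round(v, 1) for v in nc])
```

    With only a handful of sources per bin, Poisson errors dominate; this is why the survey's small total area makes cosmic variance a plausible explanation for count differences between fields.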

  4. Increasing point-count duration increases standard error

    USGS Publications Warehouse

    Smith, W.P.; Twedt, D.J.; Hamel, P.B.; Ford, R.P.; Wiedenfeld, D.A.; Cooper, R.J.

    1998-01-01

    We examined data from point counts of varying duration in bottomland forests of west Tennessee and the Mississippi Alluvial Valley to determine if counting interval influenced sampling efficiency. Estimates of standard error increased as point count duration increased both for cumulative number of individuals and species in both locations. Although point counts appear to yield data with standard errors proportional to means, a square root transformation of the data may stabilize the variance. Using long (>10 min) point counts may reduce sample size and increase sampling error, both of which diminish statistical power and thereby the ability to detect meaningful changes in avian populations.
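    The square-root transformation suggested above can be checked with a quick simulation: for Poisson-distributed counts the variance of the raw counts grows with the mean, while the variance of the square-rooted counts stays near 1/4. The Knuth sampler and sample sizes below are choices of this stdlib-only sketch, not of the study:

```python
import math, random, statistics

def poisson(lam, rng):
    """Knuth's Poisson sampler (stdlib-only)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= L:
            return k
        k += 1

rng = random.Random(42)
raw_vars, sqrt_vars = [], []
for lam in (4, 16, 64):
    xs = [poisson(lam, rng) for _ in range(20000)]
    raw_vars.append(statistics.pvariance(xs))
    sqrt_vars.append(statistics.pvariance([math.sqrt(x) for x in xs]))
    print(lam, round(raw_vars[-1], 1), round(sqrt_vars[-1], 2))
```

    The raw variance scales roughly as the mean (the proportionality the abstract notes), while the transformed variance is approximately constant, which is the sense in which the square root stabilizes the variance.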

  5. Extension of the Dytlewski-style dead time correction formalism for neutron multiplicity counting to any order

    NASA Astrophysics Data System (ADS)

    Croft, Stephen; Favalli, Andrea

    2017-10-01

    Neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates (singles, doubles, and triples) are measured and, in combination with a simple analytical point-model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, and even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.

  6. Extension of the Dytlewski-style dead time correction formalism for neutron multiplicity counting to any order

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, Stephen; Favalli, Andrea

    Here, neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates (singles, doubles, and triples) are measured and, in combination with a simple analytical point-model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, and even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.

  7. Extension of the Dytlewski-style dead time correction formalism for neutron multiplicity counting to any order

    DOE PAGES

    Croft, Stephen; Favalli, Andrea

    2017-07-16

    Here, neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates (singles, doubles, and triples) are measured and, in combination with a simple analytical point-model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, and even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.

  8. Fully integrated sub 100ps photon counting platform

    NASA Astrophysics Data System (ADS)

    Buckley, S. J.; Bellis, S. J.; Rosinger, P.; Jackson, J. C.

    2007-02-01

    Current state-of-the-art high-resolution counting modules, specifically designed for high timing resolution applications, are largely based on a computer card format. This has tended to result in a costly solution that is restricted to the computer it resides in. We describe a four-channel timing module that interfaces to a computer via a USB port and operates with a resolution of less than 100 picoseconds. The core of the design is an advanced field programmable gate array (FPGA) interfacing to a precision time interval measurement module, a mass memory block, and a high-speed USB 2.0 serial data port. The FPGA design allows the module to operate in a number of modes, supporting both continuous recording of photon events (time-tagging) and repetitive time binning. In time-tag mode the system reports, for each photon event, the high-resolution time along with the chronological time (macro time) and the channel ID. The time-tags are uploaded in real time to a host computer via a high-speed USB port, allowing continuous storage to computer memory of up to 4 million photons per second. In time-bin mode, binning is carried out at count rates of up to 10 million photons per second. Each curve resides in a block of 128,000 time-bins, each with a resolution programmable down to less than 100 picoseconds. Each bin has a limit of 65,535 hits, allowing autonomous curve recording until a bin reaches the maximum count or the system is commanded to halt. Due to the large memory storage, several curves/experiments can be stored in the system prior to uploading to the host computer for analysis. This makes the module ideal for integration into high timing resolution applications such as laser ranging and fluorescence lifetime imaging using techniques such as time correlated single photon counting (TCSPC).
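    Time-bin mode as described, with 100 ps bins and a 65,535-hit ceiling per bin, can be sketched as a simple folding of time-tags into a saturating histogram. The record format and picosecond units here are hypothetical; the hardware does this in FPGA logic rather than software:

```python
def bin_time_tags(tags_ps, bin_ps=100, n_bins=128_000, cap=65535):
    """Fold high-resolution photon time-tags (picoseconds) into a
    binned curve, saturating each bin at the 16-bit limit as in the
    module's time-bin mode."""
    curve = [0] * n_bins
    for t in tags_ps:
        idx = int(t // bin_ps)
        if 0 <= idx < n_bins and curve[idx] < cap:
            curve[idx] += 1
    return curve

# toy usage: three tags land in the first 100 ps bin, one in the next
curve = bin_time_tags([10, 40, 90, 120])
print(curve[0], curve[1])  # → 3 1
```

    The saturation check is what lets a curve record autonomously until some bin hits the maximum count, as the abstract describes.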

  9. 25 CFR 81.21 - Counting of ballots.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... STATUTE § 81.21 Counting of ballots. All duly cast ballots are to be counted. Even though it will not be... counted for purposes of determining whether the required percentage of voters have cast their ballots in... of votes cast. ...

  10. High Speed Large Format Photon Counting Microchannel Plate Imaging Sensors

    NASA Astrophysics Data System (ADS)

    Siegmund, O.; Ertley, C.; Vallerga, J.; Craven, C.; Popecki, M.; O'Mahony, A.; Minot, M.

    The development of a new class of microchannel plate technology, using atomic layer deposition (ALD) techniques applied to a borosilicate microcapillary array, is enabling the implementation of larger, more stable detectors for astronomy and remote sensing. Sealed tubes with MCPs with SuperGenII, bialkali, GaAs and GaN photocathodes have been developed to cover a wide range of optical/UV sensing applications. Formats of 18 mm and 25 mm circular, and 50 mm (Planacon) and 20 cm square, have been constructed for uses from nighttime remote reconnaissance and biological single-molecule fluorescence lifetime imaging microscopy to large-area focal plane imagers for astronomy, neutron detection, and ring imaging Cherenkov detection. The large focal plane areas were previously unattainable, but the new developments in the construction of ALD microchannel plates allow implementation of formats of 20 cm or more. Continuing developments in ALD microchannel plates offer improved overall sealed tube lifetime and gain stability, and furthermore show reduced levels of radiation-induced background. High time resolution astronomical and remote sensing applications can be addressed with microchannel plate based imaging, photon time tagging detector sealed tube schemes. Photon counting imaging readouts for these devices vary from cross strip (XS) and cross delay line (XDL) to stripline anodes and pad arrays, depending on the intended application. The XS and XDL readouts have been implemented in formats from 22 mm and 50 mm to 20 cm. Both use MCP charge signals detected on two orthogonal layers of conductive fingers to encode event X-Y positions. XDL readout uses signal propagation delay to encode positions, while XS readout uses charge cloud centroiding. Spatial resolution of XS detectors can be better than 20 microns FWHM, with good image linearity, while using low gain (<10^6), allowing high local counting rates and longer overall tube lifetime. XS tubes with electronics can encode event

  11. The SCUBA-2 Cosmology Legacy Survey: the EGS deep field - I. Deep number counts and the redshift distribution of the recovered cosmic infrared background at 450 and 850 μ m

    NASA Astrophysics Data System (ADS)

    Zavala, J. A.; Aretxaga, I.; Geach, J. E.; Hughes, D. H.; Birkinshaw, M.; Chapin, E.; Chapman, S.; Chen, Chian-Chou; Clements, D. L.; Dunlop, J. S.; Farrah, D.; Ivison, R. J.; Jenness, T.; Michałowski, M. J.; Robson, E. I.; Scott, Douglas; Simpson, J.; Spaans, M.; van der Werf, P.

    2017-01-01

    We present deep observations at 450 and 850 μm in the Extended Groth Strip field taken with the SCUBA-2 camera mounted on the James Clerk Maxwell Telescope as part of the deep SCUBA-2 Cosmology Legacy Survey (S2CLS), achieving a central instrumental depth of σ₄₅₀ = 1.2 mJy beam⁻¹ and σ₈₅₀ = 0.2 mJy beam⁻¹. We detect 57 sources at 450 μm and 90 at 850 μm with signal-to-noise ratio >3.5 over ~70 arcmin². From these detections, we derive the number counts at flux densities S₄₅₀ > 4.0 mJy and S₈₅₀ > 0.9 mJy, which represent the deepest number counts at these wavelengths derived using directly extracted sources from only blank-field observations with a single-dish telescope. Our measurements smoothly connect the gap between previous shallower blank-field single-dish observations and deep interferometric ALMA results. We estimate the contribution of our SCUBA-2 detected galaxies to the cosmic infrared background (CIB), as well as the contribution of 24 μm-selected galaxies through a stacking technique, which add a total of 0.26 ± 0.03 and 0.07 ± 0.01 MJy sr⁻¹ at 450 and 850 μm, respectively. These surface brightnesses correspond to 60 ± 20 and 50 ± 20 per cent of the total CIB measurements, where the errors are dominated by those of the total CIB. Using the photometric redshifts of the 24 μm-selected sample and the redshift distributions of the submillimetre galaxies, we find that the redshift distribution of the recovered CIB is different at each wavelength, with a peak at z ~ 1 for 450 μm and at z ~ 2 for 850 μm, consistent with previous observations and theoretical models.

  12. Exceeding Pitch Count Recommendations in Little League Baseball Increases the Chance of Requiring Tommy John Surgery as a Professional Baseball Pitcher

    PubMed Central

    Erickson, Brandon J.; Chalmers, Peter N.; Axe, Michael J.; Romeo, Anthony A.

    2017-01-01

    Background: Empirical evidence has suggested a connection between youth pitch counts and subsequent elbow injury. For players within the Little League World Series (LLWS), detailed historical player data are available. Some of these players progress to both professional play and require an ulnar collateral ligament reconstruction (UCLR). Purpose: To determine the percentage of LLWS pitchers who proceed to play professional (major or minor league) baseball, the rate of UCLR in former LLWS pitchers who played professional baseball, and the risk to those who exceeded current pitch count recommendations while playing in the LLWS. Study Design: Cohort study; Level of evidence, 3. Methods: All LLWS pitchers from 2001 through 2009 from all teams and countries were identified, and all performance data were extracted. A professional (major and minor league) baseball database was then searched to determine whether each former LLWS pitcher played professional baseball. These professional players were then searched for using publicly available databases to determine whether they underwent UCLR. Results: Overall, 638 adolescents pitched in the LLWS between 2001 and 2009; 62 (10%) progressed to professional play. Of the 56 minor league players, 25 (45%) pitched. Of the 6 Major League Baseball players, 3 (50%) pitched. Three former LLWS pitchers (5%) who played professionally underwent UCLR. In former LLWS pitchers who exceeded pitch counts and played professionally, 50% (2/4) required UCLR, while only 1.7% (1/58) of those who did not exceed pitch count recommendations required UCLR (P = .009). Similarly, among former LLWS pitchers who subsequently played professionally, 23.1% of those who played as a pitcher required UCLR while 0% of those who also played other positions required UCLR (P = .008). Conclusion: Progression from LLWS pitching to professional baseball is uncommon. Among youth players, both diversification (playing other positions besides pitcher) as well as following
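    The abstract does not state the exact test used, but a one-sided hypergeometric (Fisher-style) tail probability on the quoted counts, 62 professionals of whom 4 exceeded pitch count recommendations and 3 required UCLR, reproduces the reported P = .009:

```python
from math import comb

def hypergeom_p_at_least(k, N, K, n):
    """P(X >= k) for X ~ Hypergeometric(N, K, n): draw n UCLR cases
    from N pitchers, K of whom exceeded pitch counts."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(K, n) + 1)) / comb(N, n)

# 62 professional ex-LLWS pitchers, 4 exceeded recommendations,
# 3 required UCLR, 2 of them among the 4 who exceeded
p = hypergeom_p_at_least(2, N=62, K=4, n=3)
print(round(p, 3))  # → 0.009
```

    Seeing 2 or more of the 3 UCLR cases among only 4 exceeders by chance is thus about a 1-in-100 event under the null of no association, matching the reported significance.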

  13. Virological failure and development of new resistance mutations according to CD4 count at combination antiretroviral therapy initiation.

    PubMed

    Jose, S; Quinn, K; Dunn, D; Cox, A; Sabin, C; Fidler, S

    2016-05-01

    No randomized controlled trials have yet reported an individual patient benefit of initiating combination antiretroviral therapy (cART) at CD4 counts > 350 cells/μL. It is hypothesized that earlier initiation of cART in asymptomatic and otherwise healthy individuals may lead to poorer adherence and subsequently higher rates of resistance development. In a large cohort of HIV-positive individuals, we investigated the emergence of new resistance mutations upon virological treatment failure according to the CD4 count at the initiation of cART. Of 7918 included individuals, 6514 (82.3%), 996 (12.6%) and 408 (5.2%) started cART with a CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Virological rebound occurred while on cART in 488 (7.5%), 46 (4.6%) and 30 (7.4%) with a baseline CD4 count ≤ 350, 351-499 and ≥ 500 cells/μL, respectively. Only four (13.0%) individuals with a baseline CD4 count > 350 cells/μL in receipt of a resistance test at viral load rebound were found to have developed new resistance mutations. This compared to 107 (41.2%) of those with virological failure who had initiated cART with a CD4 count < 350 cells/μL. We found no evidence of increased rates of resistance development when cART was initiated at CD4 counts above 350 cells/μL. © 2015 The Authors. HIV Medicine published by John Wiley & Sons Ltd on behalf of British HIV Association.

  14. Recursive algorithms for phylogenetic tree counting.

    PubMed

    Gavryushkina, Alexandra; Welch, David; Drummond, Alexei J

    2013-10-28

    In Bayesian phylogenetic inference we are interested in distributions over a space of trees. The number of trees in a tree space is an important characteristic of the space and is useful for specifying prior distributions. When all samples come from the same time point and no prior information is available on divergence times, the tree counting problem is easy. However, when fossil evidence is used in the inference to constrain the tree, or when data are sampled serially, new tree spaces arise and counting the number of trees is more difficult. We describe an algorithm, polynomial in the number of sampled individuals, for counting the resolutions of a constraint tree, assuming that the number of constraints is fixed. We generalise this algorithm to counting resolutions of a fully ranked constraint tree. We describe a quadratic algorithm for counting the number of possible fully ranked trees on n sampled individuals. We introduce a new type of tree, called a fully ranked tree with sampled ancestors, and describe a cubic-time algorithm for counting the number of such trees on n sampled individuals. These algorithms should be employed in Bayesian Markov chain Monte Carlo inference when fossil data are included or data are serially sampled.
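    For the easy case the abstract mentions, where all samples are contemporaneous and divergence times are unconstrained, the count of fully ranked rooted binary trees has a well-known closed form, n!(n-1)!/2^(n-1):

```python
from math import factorial

def ranked_tree_count(n):
    """Number of ranked, rooted, binary labeled trees on n
    contemporaneous tips: n! (n-1)! / 2^(n-1)."""
    return factorial(n) * factorial(n - 1) // 2 ** (n - 1)

print([ranked_tree_count(n) for n in range(2, 6)])  # → [1, 3, 18, 180]
```

    The serially sampled and sampled-ancestor spaces the paper treats have no such simple formula, which is what motivates the recursive counting algorithms.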

  15. Addressing immunization registry population inflation in adolescent immunization rates.

    PubMed

    Robison, Steve G

    2015-01-01

    While U.S. adolescent immunization rates are available annually at national and state levels, finding pockets of need may require county or sub-county information. Immunization information systems (IISs) are one tool for assessing local immunization rates. However, the presence of IIS records dating back to early childhood, together with the difficulty of capturing mobility out of IIS areas, typically leads to denominator inflation. We examined the feasibility of weighting adolescent immunization records by length of time since last report to produce more accurate county adolescent counts and immunization rates. We compared weighted and unweighted adolescent denominators from the Oregon ALERT IIS, along with county-level Census Bureau estimates, against school enrollment counts from Oregon's annual review of seventh-grade school immunization compliance for public and private schools. Adolescent immunization rates calculated using the weighted data for the state as a whole were also checked against comparable National Immunization Survey (NIS) rates. Weighting individual records by the length of time since last activity substantially improved the fit of IIS data to county populations for adolescents. Of all the estimates examined, a nonlinear logarithmic (ogive) weight produced the best fit to the school count data. Overall, the ogive-weighted results matched NIS adolescent rates for Oregon. The problem of mobility-inflated counts of teenagers can thus be addressed by weighting individual records based on time since last immunization. Well-populated IISs can rely on their own data to produce adolescent immunization rates and find pockets of need.
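    The weighting idea can be sketched as follows. The logistic (ogive-shaped) curve and its 4-year scale below are illustrative guesses; the abstract does not give the actual ALERT IIS weighting function:

```python
import math

def activity_weight(years_since_last_report, scale=4.0):
    """Hypothetical ogive-style weight: recently active records count
    fully toward the denominator, long-inactive records are discounted
    toward zero (they likely reflect people who moved away)."""
    return 1.0 / (1.0 + math.exp(years_since_last_report - scale))

def weighted_denominator(years_list):
    return sum(activity_weight(y) for y in years_list)

# hypothetical adolescent records: years since each last report
years = [0.2, 0.5, 1, 3, 6, 9, 12]
print(round(weighted_denominator(years), 2), "vs raw", len(years))
```

    The weighted denominator is smaller than the raw record count, which is the direction of the correction: mobility-inflated denominators shrink toward the true resident population.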

  16. New non-invasive automatic cough counting program based on 6 types of classified cough sounds.

    PubMed

    Murata, Akira; Ohota, Nao; Shibuya, Atsuo; Ono, Hiroshi; Kudoh, Shoji

    2006-01-01

    Cough, consisting of an initial deep inspiration, glottal closure, and an explosive expiration accompanied by a sound, is one of the most common symptoms of respiratory disease. Despite its clinical importance, standard methods for objective cough analysis have yet to be established. We investigated the acoustic characteristics of cough sounds, designed a program to discriminate cough sounds from other sounds, and finally developed a new objective method of non-invasive cough counting. In addition, we evaluated the clinical efficacy of that program. We recorded cough sounds in free field using a memory stick IC recorder from 2 patients and analyzed the intensity of 534 recorded coughs in the time domain. First we squared the sound waveform of the recorded cough sounds, which was then smoothed over a 20 ms window. Five parameters and a set of definitions to discriminate cough sounds from other noise were identified, and the cough sounds were classified into 6 groups. Next, we applied this method to develop a new automatic cough counting program. Finally, to evaluate the accuracy and clinical usefulness of this program, we counted cough sounds collected from another 10 patients using both our program and conventional manual counting, and analyzed the sensitivity, specificity, and discrimination rate of the program. The program successfully discriminated recorded cough sounds out of 1902 sound events collected from 10 patients at a rate of 93.1%. The sensitivity was 90.2% and the specificity was 96.5%. Our new cough counting program should prove useful for clinical studies.
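    The first signal-processing step described, squaring the waveform and smoothing over a 20 ms window, can be sketched as a running-mean energy envelope. The sampling rate and test signal below are assumptions of this example:

```python
import math

def smoothed_energy(signal, fs_hz=8000, window_ms=20):
    """Square the waveform and smooth with a moving average over a
    20 ms window: the energy envelope used to detect cough bursts."""
    sq = [s * s for s in signal]
    w = max(1, int(fs_hz * window_ms / 1000))
    out, acc = [], 0.0
    for i, v in enumerate(sq):
        acc += v
        if i >= w:
            acc -= sq[i - w]          # drop sample leaving the window
        out.append(acc / min(i + 1, w))
    return out

# toy burst: a 1 kHz tone embedded in silence at 8 kHz sampling
sig = ([0.0] * 400
       + [math.sin(2 * math.pi * 1000 * t / 8000) for t in range(400)]
       + [0.0] * 800)
env = smoothed_energy(sig)
print(round(max(env), 2))
```

    The envelope is flat at zero during silence and rises to the mean-square level of the tone (0.5 for a unit sine), so simple amplitude and duration thresholds on it can mark candidate cough events.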

  17. Effect of metronome rates on the quality of bag-mask ventilation during metronome-guided 30:2 cardiopulmonary resuscitation: A randomized simulation study

    PubMed Central

    Na, Ji Ung; Han, Sang Kuk; Choi, Pil Cho; Shin, Dong Hyuk

    2017-01-01

    BACKGROUND: Metronome guidance is a feasible and effective feedback technique to improve the quality of cardiopulmonary resuscitation (CPR). The rate of the metronome should be set between 100 and 120 ticks/minute, and the speed of ventilation may have a crucial effect on the quality of ventilation. We compared three different metronome rates (100, 110, 120 ticks/minute) to investigate their effect on the quality of ventilation during metronome-guided 30:2 CPR. METHODS: This is a prospective, randomized, crossover observational study using a RespiTrainer®. To simulate 30 chest compressions, one investigator counted from 1 to 30 in cadence with the metronome rate (1 count for every 1 tick), and the participant performed 2 consecutive ventilations immediately following the counting of 30. Thirty physicians performed 5 sets of 2 consecutive (total 10) bag-mask ventilations for each metronome rate. Participants were instructed to squeeze the bag over 2 ticks (1.0 to 1.2 seconds depending on the rate of metronome) and deflate the bag over 2 ticks. The sequence of the three metronome rates was randomized. RESULTS: Mean tidal volume significantly decreased as the metronome rate was increased from 110 ticks/minute to 120 ticks/minute (343±84 mL vs. 294±90 mL, P=0.004). Peak airway pressure significantly increased as metronome rate increased from 100 ticks/minute to 110 ticks/minute (18.7 vs. 21.6 mmHg, P=0.006). CONCLUSION: In metronome-guided 30:2 CPR, a higher metronome rate may adversely affect the quality of bag-mask ventilations. In cases of cardiac arrest where adequate ventilation support is necessary, 100 ticks/minute may be better than 110 or 120 ticks/minute to deliver adequate tidal volume during audio tone guided 30:2 CPR. PMID:28458759

  18. Heart-rate pulse-shift detector

    NASA Technical Reports Server (NTRS)

    Anderson, M.

    1974-01-01

    Detector circuit accurately separates and counts phase-shift pulses over wide range of basic pulse-rate frequency, and also provides reasonable representation of full repetitive EKG waveform. Single telemeter implanted in small animal monitors not only body temperature but also animal movement and heart rate.

  19. Count trends for migratory Bald Eagles reveal differences between two populations at a spring site along the Lake Ontario shoreline

    PubMed Central

    2016-01-01

    The recovery of Bald Eagles (Haliaeetus leucocephalus), after DDT and other organochlorine insecticides were banned in the United States, can be regarded as one of the most iconic success stories resulting from the Endangered Species Act. Interest remains high in the recovery and growth of the Bald Eagle population. Growth and recovery rates are commonly evaluated from counts at nesting sites and analyses of individuals fledged per season. But this is merely one snapshot, and it ignores survival rates as eagles grow to maturity. By analyzing indices from migration counts, we get a different snapshot that better reflects the survival of young birds. Different populations of Bald Eagles breed at different sites at different times of the year. Typical migration count analyses do not separate the populations. A separation of two distinct populations can be achieved at spring count sites by taking advantage of the tendency for northern summer breeding birds to migrate north in spring earlier than southern winter breeding birds, which disperse north later in spring. In this paper I analyze migratory indices at a spring site along Lake Ontario. The analysis shows that eagles considered to be primarily of the northern summer breeding population showed an estimated growth rate of 5.3 ± 0.85% (SE) per year with 49% of eagles tallied in adult plumage, whereas the migrants considered to be primarily of the southern breeding population had an estimated growth rate of 14.0 ± 1.79% with only 22% in adult plumage. Together these results argue that the populations of southern breeding Bald Eagles are growing at a substantially higher rate than northern breeding eagles. These findings suggest that aggregate population indices for a species at migration counting sites can sometimes obscure important differences among separate populations at any given site, and that separating counts by time period can be a useful way to check for differences among sub-populations. PMID:27231647
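An annual growth rate like the 5.3%/year quoted above is typically estimated by regressing the logarithm of a count index on year: the fitted slope is the instantaneous rate r, and annual growth is exp(r) − 1. A minimal sketch with invented yearly counts (not the paper's data), assuming a simple ordinary-least-squares log-linear fit:

```python
import math

# Hypothetical yearly migration-count indices (illustrative only):
# a population growing roughly 5% per year.
years = list(range(2000, 2010))
counts = [100, 106, 109, 116, 123, 127, 135, 141, 148, 155]

# OLS slope of ln(count) on year gives the instantaneous growth
# rate r; annual percentage growth is exp(r) - 1.
n = len(years)
xbar = sum(years) / n
ybar = sum(math.log(c) for c in counts) / n
slope = (sum((x - xbar) * (math.log(c) - ybar) for x, c in zip(years, counts))
         / sum((x - xbar) ** 2 for x in years))
annual_growth_pct = (math.exp(slope) - 1) * 100
print(f"estimated growth: {annual_growth_pct:.1f}% per year")
```

The standard errors reported in the abstract would come from the uncertainty of the fitted slope; the paper's actual index construction and model may differ from this sketch.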

  20. Background radiation: natural and man-made.

    PubMed

    Thorne, M C

    2003-03-01

    A brief overview and comparison are given of dose rates arising from natural background radiation and the fallout from atmospheric testing of nuclear weapons. Although there are considerable spatial variations in exposure to natural background radiation, it is useful to give estimates of worldwide average overall exposures from the various components of that background. Cosmic-ray secondaries of low linear energy transfer (LET), mainly muons and photons, deliver about 280 microSv a(-1). Cosmic-ray neutrons deliver about another 100 microSv a(-1). These low- and high-LET exposures are relatively uniform to the whole body. The effective dose rate from cosmogenic radionuclides is dominated by the contribution of 12 microSv a(-1) from 14C. This is due to relatively uniform irradiation of all organs and tissues from low-energy beta particles. Primordial radionuclides and their progeny (principally the 238U and 232Th series, and 40K) contribute about 480 microSv a(-1) of effective dose by external irradiation. This is relatively uniform photon irradiation of the whole body. Internally incorporated 40K contributes a further 165 microSv a(-1) of effective dose in adults, mainly from beta particles, but with a significant gamma component. Equivalent doses from 40K are somewhat higher in muscle than in other soft tissues, but the difference is less than a factor of three. Uranium and thorium series radionuclides give rise to an average effective dose rate of around 120 microSv a(-1). This includes a major alpha particle component, and exposures of radiosensitive tissues in lung, liver, kidney and the skeleton are recognised as important contributors to effective dose. Overall, these various sources give a worldwide average effective dose rate of about 1160 microSv a(-1). Exposure to 222Rn, 220Rn and their short-lived progeny has to be considered separately. This is very variable both within and between countries. For 222Rn and its progeny, a worldwide average effective dose