Sample records for integral counting method

  1. Ultra-fast photon counting with a passive quenching silicon photomultiplier in the charge integration regime

    NASA Astrophysics Data System (ADS)

Zhang, Guoqing; Liu, Lina

    2018-02-01

An ultra-fast photon counting method is proposed based on the charge integration of the output electrical pulses of passive quenching silicon photomultipliers (SiPMs). Numerical analysis with actual SiPM parameters shows that the maximum photon counting rate of a state-of-the-art passive quenching SiPM can reach the ~THz level, which is far higher than that of existing photon counting devices. An experimental procedure is proposed based on this method. This photon counting regime of SiPMs is promising in many fields, such as large-dynamic-range light power detection.
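
The core of the regime described above can be sketched in a few lines: rather than resolving individual pulses, the integrated output charge is divided by the mean charge of a single-photon-equivalent pulse. The gain value and the 5% gain spread below are illustrative assumptions, not parameters from the paper.

```python
import random

# Hypothetical SiPM parameters (illustrative values, not from the paper)
GAIN = 1.0e6           # SiPM gain (electrons per fired microcell)
Q_E = 1.602e-19        # elementary charge [C]
Q_SPE = GAIN * Q_E     # charge of a single-photon-equivalent (SPE) pulse [C]

def count_by_charge_integration(total_charge, q_spe=Q_SPE):
    """Estimate the number of detected photons from the integrated
    output charge, rather than by resolving individual pulses."""
    return total_charge / q_spe

# Simulate n_true photons, each contributing one SPE charge with ~5% gain spread
random.seed(1)
n_true = 10_000
total_q = sum(random.gauss(Q_SPE, 0.05 * Q_SPE) for _ in range(n_true))
n_est = count_by_charge_integration(total_q)
```

Because the estimate averages over many pulses, the relative error shrinks as the number of photons grows, which is why the method tolerates rates far above the pulse-resolving limit.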

  2. Pulse pileup statistics for energy discriminating photon counting x-ray detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Adam S.; Harrison, Daniel; Lobastov, Vladimir

Purpose: Energy discriminating photon counting x-ray detectors can be subject to a wide range of flux rates if applied in clinical settings. Even when the incident rate is a small fraction of the detector's maximum periodic rate N0, pulse pileup leads to count rate losses and spectral distortion. Although the deterministic effects can be corrected, the detrimental effect of pileup on image noise is not well understood and may limit the performance of photon counting systems. Therefore, the authors devise a method to determine the detector count statistics and imaging performance. Methods: The detector count statistics are derived analytically for an idealized pileup model with delta pulses of a nonparalyzable detector. These statistics are then used to compute the performance (e.g., contrast-to-noise ratio) for both single material and material decomposition contrast detection tasks via the Cramer-Rao lower bound (CRLB) as a function of the detector input count rate. With more realistic unipolar and bipolar pulse pileup models of a nonparalyzable detector, the imaging task performance is determined by Monte Carlo simulations and also approximated by a multinomial method based solely on the mean detected output spectrum. Photon counting performance at different count rates is compared with ideal energy integration, which is unaffected by count rate. Results: The authors found that an ideal photon counting detector with perfect energy resolution outperforms energy integration for our contrast detection tasks, but when the input count rate exceeds 20% of N0, many of these benefits disappear. The benefit with iodine contrast falls rapidly with increased count rate, while water contrast is not as sensitive to count rate. The performance with a delta pulse model is overoptimistic when compared to the more realistic bipolar pulse model. The multinomial approximation predicts imaging performance very close to the prediction from Monte Carlo simulations.
The monoenergetic image with maximum contrast-to-noise ratio from dual energy imaging with ideal photon counting is only slightly better than with dual kVp energy integration, and with a bipolar pulse model, energy integration outperforms photon counting for this particular metric because of the count rate losses. However, the material resolving capability of photon counting can be superior to energy integration with dual kVp even in the presence of pileup because of the energy information available to photon counting. Conclusions: A computationally efficient multinomial approximation of the count statistics that is based on the mean output spectrum can accurately predict imaging performance. This enables photon counting system designers to relate the effect of pileup directly to its impact on imaging statistics, and to determine how best to take advantage of the benefits of energy discriminating photon counting detectors, such as material separation with spectral imaging.
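
The count rate loss behind figures like "20% of N0" can be illustrated with the classic nonparalyzable dead-time model used in the idealized analysis above: the recorded rate saturates as m = n/(1 + n·tau). The 100 ns dead time below is an assumed, illustrative value.

```python
def recorded_rate_nonparalyzable(n, tau):
    """Observed count rate m for true input rate n and dead time tau
    in the classic nonparalyzable detector model: m = n / (1 + n*tau)."""
    return n / (1.0 + n * tau)

tau = 100e-9          # assumed 100 ns dead time (illustrative)
n0_max = 1.0 / tau    # maximum periodic rate N0 = 1/tau
for frac in (0.05, 0.20, 0.50):
    n = frac * n0_max
    loss = 1.0 - recorded_rate_nonparalyzable(n, tau) / n
    # at n = 0.20*N0 the detector already loses 1/6 of all counts (~16.7%)
```

The fractional loss f/(1+f) at input rate f·N0 grows quickly, which is consistent with the paper's observation that many photon-counting benefits disappear above 20% of N0.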

  3. Better Than Counting: Density Profiles from Force Sampling

    NASA Astrophysics Data System (ADS)

    de las Heras, Daniel; Schmidt, Matthias

    2018-05-01

Calculating one-body density profiles in equilibrium via particle-based simulation methods involves counting particle occurrences at (histogram-resolved) space points. Here, we investigate an alternative method based on a histogram of the local force density. Via an exact sum rule, the density profile is obtained by a simple spatial integration. The method circumvents the inherent ideal-gas fluctuations. We have tested the method in Monte Carlo, Brownian dynamics, and molecular dynamics simulations. The results carry a statistical uncertainty smaller than that of the standard counting method, thereby reducing the computation time.
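
The sum rule underlying the method can be demonstrated for an ideal gas in a one-dimensional harmonic trap (with beta = k = 1, an illustrative setup not taken from the paper): histogramming the local force density and integrating it over space reproduces the Boltzmann density profile.

```python
import math
import random

random.seed(2)
beta, k = 1.0, 1.0                 # inverse temperature, spring constant
nsamp, nbins, L = 200_000, 80, 8.0 # samples, histogram bins, box [-L/2, L/2]
dx = L / nbins

# Equilibrium sampling for V(x) = k*x**2/2: Boltzmann density is a
# Gaussian with standard deviation 1/sqrt(beta*k)
std = 1.0 / math.sqrt(beta * k)
force_hist = [0.0] * nbins
for _ in range(nsamp):
    x = random.gauss(0.0, std)
    b = math.floor((x + L / 2) / dx)
    if 0 <= b < nbins:
        force_hist[b] += -k * x    # external force F(x) = -V'(x)

# Force-sampled density via the sum rule: rho(x) = beta * int_{-L/2}^{x} f(x') dx'
f_density = [h / (nsamp * dx) for h in force_hist]
rho, acc = [], 0.0
for f in f_density:
    acc += beta * f * dx
    rho.append(acc)
# rho[39] sits at x = 0, where the exact density is 1/sqrt(2*pi) ~ 0.3989
```

The integration acts as a low-pass filter on the histogram noise, which is the intuition behind the smaller statistical uncertainty reported above.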

  4. Unified method to integrate and blend several, potentially related, sources of information for genetic evaluation.

    PubMed

    Vandenplas, Jérémie; Colinet, Frederic G; Gengler, Nicolas

    2014-09-30

    A condition to predict unbiased estimated breeding values by best linear unbiased prediction is to use simultaneously all available data. However, this condition is not often fully met. For example, in dairy cattle, internal (i.e. local) populations lead to evaluations based only on internal records while widely used foreign sires have been selected using internally unavailable external records. In such cases, internal genetic evaluations may be less accurate and biased. Because external records are unavailable, methods were developed to combine external information that summarizes these records, i.e. external estimated breeding values and associated reliabilities, with internal records to improve accuracy of internal genetic evaluations. Two issues of these methods concern double-counting of contributions due to relationships and due to records. These issues could be worse if external information came from several evaluations, at least partially based on the same records, and combined into a single internal evaluation. Based on a Bayesian approach, the aim of this research was to develop a unified method to integrate and blend simultaneously several sources of information into an internal genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. This research resulted in equations that integrate and blend simultaneously several sources of information and avoid double-counting of contributions due to relationships and due to records. The performance of the developed equations was evaluated using simulated and real datasets. The results showed that the developed equations integrated and blended several sources of information well into a genetic evaluation. The developed equations also avoided double-counting of contributions due to relationships and due to records. 
Furthermore, because all available external sources of information were correctly propagated, relatives of external animals benefited from the integrated information and, therefore, more reliable estimated breeding values were obtained. The proposed unified method integrated and blended several sources of information well into a genetic evaluation by avoiding double-counting of contributions due to relationships and due to records. The unified method can also be extended to other types of situations such as single-step genomic or multi-trait evaluations, combining information across different traits.

  5. High-Throughput Method for Automated Colony and Cell Counting by Digital Image Analysis Based on Edge Detection

    PubMed Central

    Choudhry, Priya

    2016-01-01

    Counting cells and colonies is an integral part of high-throughput screens and quantitative cellular assays. Due to its subjective and time-intensive nature, manual counting has hindered the adoption of cellular assays such as tumor spheroid formation in high-throughput screens. The objective of this study was to develop an automated method for quick and reliable counting of cells and colonies from digital images. For this purpose, I developed an ImageJ macro Cell Colony Edge and a CellProfiler Pipeline Cell Colony Counting, and compared them to other open-source digital methods and manual counts. The ImageJ macro Cell Colony Edge is valuable in counting cells and colonies, and measuring their area, volume, morphology, and intensity. In this study, I demonstrate that Cell Colony Edge is superior to other open-source methods, in speed, accuracy and applicability to diverse cellular assays. It can fulfill the need to automate colony/cell counting in high-throughput screens, colony forming assays, and cellular assays. PMID:26848849

  6. Pile-up correction algorithm based on successive integration for high count rate medical imaging and radiation spectroscopy

    NASA Astrophysics Data System (ADS)

    Mohammadian-Behbahani, Mohammad-Reza; Saramad, Shahyar

    2018-07-01

    In high count rate radiation spectroscopy and imaging, detector output pulses tend to pile up due to high interaction rate of the particles with the detector. Pile-up effects can lead to a severe distortion of the energy and timing information. Pile-up events are conventionally prevented or rejected by both analog and digital electronics. However, for decreasing the exposure times in medical imaging applications, it is important to maintain the pulses and extract their true information by pile-up correction methods. The single-event reconstruction method is a relatively new model-based approach for recovering the pulses one-by-one using a fitting procedure, for which a fast fitting algorithm is a prerequisite. This article proposes a fast non-iterative algorithm based on successive integration which fits the bi-exponential model to experimental data. After optimizing the method, the energy spectra, energy resolution and peak-to-peak count ratios are calculated for different counting rates using the proposed algorithm as well as the rejection method for comparison. The obtained results prove the effectiveness of the proposed method as a pile-up processing scheme designed for spectroscopic and medical radiation detection applications.
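
The successive-integration idea can be illustrated in a simplified single-exponential form (the paper fits a bi-exponential pulse model; this sketch only shows why the fit is non-iterative): since I(t), the running integral of y, satisfies y(t) = A - I(t)/tau for y = A*exp(-t/tau), a single linear least-squares step recovers both parameters.

```python
import math

def fit_exponential_by_integration(ts, ys):
    """Non-iterative fit of y = A*exp(-t/tau) via the successive-
    integration identity y(t) = A - I(t)/tau, where I(t) is the
    cumulative integral of y. (Simplified single-exponential version
    of the bi-exponential fitting idea.)"""
    # cumulative trapezoidal integral I(t)
    I, acc = [0.0], 0.0
    for i in range(1, len(ts)):
        acc += 0.5 * (ys[i] + ys[i - 1]) * (ts[i] - ts[i - 1])
        I.append(acc)
    # least-squares line y = a + b*I  ->  A = a, tau = -1/b
    n = len(ts)
    mI, my = sum(I) / n, sum(ys) / n
    b = sum((Ii - mI) * (yi - my) for Ii, yi in zip(I, ys)) / \
        sum((Ii - mI) ** 2 for Ii in I)
    a = my - b * mI
    return a, -1.0 / b

# Synthetic noiseless pulse tail with A = 2.5, tau = 0.7 (illustrative)
ts = [0.01 * i for i in range(500)]
ys = [2.5 * math.exp(-t / 0.7) for t in ts]
A_hat, tau_hat = fit_exponential_by_integration(ts, ys)
```

Because the fit reduces to one linear solve, it is fast enough to run on every pulse, which is the prerequisite the paper names for single-event reconstruction.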

  7. Information theoretic approach for assessing image fidelity in photon-counting arrays.

    PubMed

    Narravula, Srikanth R; Hayat, Majeed M; Javidi, Bahram

    2010-02-01

The method of photon-counting integral imaging has recently been introduced for three-dimensional object sensing, visualization, recognition and classification of scenes under photon-starved conditions. This paper presents an information-theoretic model for the photon-counting imaging (PCI) method, thereby providing a rigorous foundation for the merits of PCI in terms of image fidelity. This, in turn, can facilitate our understanding of the demonstrated success of photon-counting integral imaging in compressive imaging and classification. The mutual information between the source and photon-counted images is derived in a Markov random field setting and normalized by the source image's entropy, yielding a fidelity metric between zero and unity, corresponding respectively to complete loss of information and full preservation of information. Calculations suggest that the PCI fidelity metric increases with spatial correlation in the source image, from which we infer that the PCI method is particularly effective for source images with high spatial correlation; the metric also increases as photon-number uncertainty decreases. As an application of the theory, an image-classification problem is considered, showing a congruous relationship between the fidelity metric and the classifier's performance.
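
A toy version of the normalized fidelity metric I(X;Y)/H(X) can be computed for a two-level source observed through an ideal Poisson photon-counting channel; this is a simplified stand-in for the paper's Markov-random-field derivation, with illustrative mean photon numbers.

```python
import math

def poisson_pmf(lam, k):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def pci_fidelity(lam_dark, lam_bright, p_bright=0.5, kmax=30):
    """Fidelity metric I(X;Y)/H(X) for a two-level source X imaged by
    an ideal photon-counting channel Y (a toy stand-in for the paper's
    Markov-random-field model). Lies between 0 and 1."""
    px = [1.0 - p_bright, p_bright]
    lams = [lam_dark, lam_bright]
    hx = -sum(p * math.log2(p) for p in px if p > 0)
    mi = 0.0
    for k in range(kmax + 1):
        py = sum(px[i] * poisson_pmf(lams[i], k) for i in range(2))
        for i in range(2):
            pxy = px[i] * poisson_pmf(lams[i], k)
            if pxy > 0 and py > 0:
                mi += pxy * math.log2(pxy / (px[i] * py))
    return mi / hx

# More photons per pixel -> less photon-number uncertainty -> higher fidelity
low = pci_fidelity(0.1, 1.0)
high = pci_fidelity(1.0, 10.0)
```

Even this toy channel shows the monotone behavior reported above: the metric rises toward unity as photon-number uncertainty shrinks.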

  8. Generalized Seasonal Autoregressive Integrated Moving Average Models for Count Data with Application to Malaria Time Series with Low Case Numbers

    PubMed Central

    Briët, Olivier J. T.; Amerasinghe, Priyanie H.; Vounatsou, Penelope

    2013-01-01

Introduction: With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions’ impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during “consolidation” and “pre-elimination” phases. Methods: Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. Results: The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. Conclusions: G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low. PMID:23785448
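
A minimal sketch of the kind of data these models target, assuming a gamma-Poisson (negative binomial) count process with a declining trend and annual seasonality (all parameter values illustrative, not fitted to the Sri Lanka series):

```python
import math
import random

random.seed(3)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for the small means used here)."""
    L, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < L:
            return k
        k += 1

def neg_binomial(mean, disp):
    """Negative binomial as a gamma-Poisson mixture with dispersion
    parameter disp (variance = mean + mean**2/disp)."""
    lam = random.gammavariate(disp, mean / disp)
    return poisson(lam)

# Monthly series with a declining trend and annual seasonality,
# mimicking low-count malaria surveillance data (illustrative numbers)
series = []
for t in range(120):                      # 10 years of months
    mean = math.exp(1.5 - 0.01 * t + 0.6 * math.sin(2 * math.pi * t / 12))
    series.append(neg_binomial(mean, disp=2.0))
```

Counts like these are small, overdispersed non-negative integers, which is exactly where a Gaussian approximation (allowing negative or fractional predictions) breaks down.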

  9. Red Blood Cell Count Automation Using Microscopic Hyperspectral Imaging Technology.

    PubMed

    Li, Qingli; Zhou, Mei; Liu, Hongying; Wang, Yiting; Guo, Fangmin

    2015-12-01

Red blood cell counts have been proven to be one of the most frequently performed blood tests and are valuable for the early diagnosis of some diseases. This paper describes an automated red blood cell counting method based on microscopic hyperspectral imaging technology. Unlike light microscopy-based red blood cell counting methods, the proposed approach identifies red blood cells with a combined spatial and spectral algorithm that integrates active contour models and automated two-dimensional k-means with a spectral angle mapper algorithm. Experimental results show that the proposed algorithm performs better than a purely spatial algorithm because it can jointly use the spatial and spectral information of blood cells.
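
The spectral angle mapper step mentioned above reduces to the angle between two spectra viewed as vectors: a small angle means a similar spectral shape regardless of brightness. A minimal sketch with hypothetical spectra:

```python
import math

def spectral_angle(a, b):
    """Spectral angle mapper (SAM): angle in radians between two
    spectra treated as vectors; a small angle means a similar
    spectral shape, independent of overall intensity."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return math.acos(max(-1.0, min(1.0, dot / (na * nb))))

# Hypothetical reference spectrum of a red blood cell vs. two pixels
ref = [0.20, 0.35, 0.60, 0.80, 0.55]
rbc_pixel = [0.42, 0.72, 1.22, 1.61, 1.12]   # same shape, brighter -> small angle
bg_pixel = [0.80, 0.60, 0.40, 0.25, 0.20]    # different shape -> large angle
```

Because SAM is insensitive to illumination scale, it complements the purely spatial cues (shape, contour) used by the k-means and active-contour stages.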

  10. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    PubMed

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.

  11. Do thawing and warming affect the integrity of human milk?

    PubMed

    Handa, D; Ahrabi, A F; Codipilly, C N; Shah, S; Ruff, S; Potak, D; Williams, J E; McGuire, M A; Schanler, R J

    2014-11-01

    To evaluate the integrity of the human milk (pH, bacterial counts, host defense factors and nutrients) subjected to thawing, warming, refrigeration and maintenance at room temperature. Mothers in the neonatal intensive care unit donated freshly expressed milk. A baseline sample was stored at -80 °C and the remainder of the milk was divided and stored for 7 days at -20 °C. The milk was then subjected to two methods of thawing and warming: tepid water and waterless warmer. Thawed milk also was refrigerated for 24 h prior to warming. Lastly, warmed milk was maintained at room temperature for 4 h to simulate a feeding session. Samples were analyzed for pH, bacterial colony counts, total fat and free fatty acids, and the content of protein, secretory IgA and lactoferrin. Data were analyzed by repeated-measures analysis of variance and paired t test. There were no differences between processing methods and no changes in fat, protein, lactoferrin and secretory immunoglobulin A with processing steps. Milk pH and bacterial colony counts declined while free fatty acids rose with processing. Refrigeration of thawed milk resulted in greater declines in pH and bacteria and increases in free fatty acids. Bacterial colony counts and free fatty acids increased with maintenance at room temperature. The integrity of the milk was affected similarly by the two thawing and warming methods. Thawing and warming change the integrity of previously frozen human milk, but not adversely. Concerns about maintaining warmed milk at room temperature need to be explored.

  12. Generalized seasonal autoregressive integrated moving average models for count data with application to malaria time series with low case numbers.

    PubMed

    Briët, Olivier J T; Amerasinghe, Priyanie H; Vounatsou, Penelope

    2013-01-01

With the renewed drive towards malaria elimination, there is a need for improved surveillance tools. While time series analysis is an important tool for surveillance, prediction and for measuring interventions' impact, approximations by commonly used Gaussian methods are prone to inaccuracies when case counts are low. Therefore, statistical methods appropriate for count data are required, especially during "consolidation" and "pre-elimination" phases. Generalized autoregressive moving average (GARMA) models were extended to generalized seasonal autoregressive integrated moving average (GSARIMA) models for parsimonious observation-driven modelling of non-Gaussian, non-stationary and/or seasonal time series of count data. The models were applied to monthly malaria case time series in a district in Sri Lanka, where malaria has decreased dramatically in recent years. The malaria series showed long-term changes in the mean, unstable variance and seasonality. After fitting negative-binomial Bayesian models, both a GSARIMA and a GARIMA deterministic seasonality model were selected based on different criteria. Posterior predictive distributions indicated that negative-binomial models provided better predictions than Gaussian models, especially when counts were low. The G(S)ARIMA models were able to capture the autocorrelation in the series. G(S)ARIMA models may be particularly useful in the drive towards malaria elimination, since episode count series are often seasonal and non-stationary, especially when control is increased. Although building and fitting GSARIMA models is laborious, they may provide more realistic prediction distributions than do Gaussian methods and may be more suitable when counts are low.

  13. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    PubMed Central

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643

  14. Detector motion method to increase spatial resolution in photon-counting detectors

    NASA Astrophysics Data System (ADS)

    Lee, Daehee; Park, Kyeongjin; Lim, Kyung Taek; Cho, Gyuseong

    2017-03-01

Medical imaging requires high spatial resolution to identify fine lesions. Photon-counting detectors in medical imaging have recently been rapidly replacing energy-integrating detectors due to the former's high spatial resolution, high efficiency and low noise. Spatial resolution in a photon counting image is determined by the pixel size. Therefore, the smaller the pixel size, the higher the spatial resolution that can be obtained in an image. However, detector redesigning is required to reduce pixel size, and an expensive fine process is required to integrate a signal processing unit with reduced pixel size. Furthermore, as the pixel size decreases, charge sharing severely deteriorates spatial resolution. To increase spatial resolution, we propose a detector motion method using a large-pixel detector that is less affected by charge sharing. To verify the proposed method, we utilized a UNO-XRI photon-counting detector (1-mm CdTe, Timepix chip) at a maximum X-ray tube voltage of 80 kVp. A spatial resolution similar to that of a 55-μm-pixel image was achieved by applying the proposed method to a 110-μm-pixel detector, with a higher signal-to-noise ratio. The proposed method could be a way to increase spatial resolution without a pixel redesign when pixels suffer severely from charge sharing as pixel size is reduced.

  15. Statistical approaches to the analysis of point count data: A little extra information can go a long way

    USGS Publications Warehouse

    Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.

    2005-01-01

    Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
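
One of the tools mentioned above, the time-depletion (removal) method, has a closed-form two-interval estimator (often attributed to Zippin): birds first detected in the first versus the second half of the count give both the per-interval detection probability and abundance. A minimal sketch with made-up counts:

```python
def removal_estimate(n1, n2):
    """Two-interval removal estimator: n1 birds first detected in the
    first half of a point count and n2 in the second half give the
    per-interval detection probability p and total abundance N."""
    if n1 <= n2:
        raise ValueError("estimator requires n1 > n2")
    p = 1.0 - n2 / n1          # per-interval detection probability
    N = n1 ** 2 / (n1 - n2)    # estimated number of birds present
    return p, N

# e.g. 30 birds first detected in minutes 0-5, 12 more in minutes 5-10
p_hat, N_hat = removal_estimate(30, 12)   # p = 0.6, N = 50
```

This illustrates the article's central point: a little extra structure in the protocol (splitting the count into intervals) converts a raw index into an estimate of detection probability and true abundance.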

  16. Investigation of energy weighting using an energy discriminating photon counting detector for breast CT

    PubMed Central

    Kalluri, Kesava S.; Mahd, Mufeed; Glick, Stephen J.

    2013-01-01

Purpose: Breast CT is an emerging imaging technique that can portray the breast in 3D and improve visualization of important diagnostic features. Early clinical studies have suggested that breast CT has sufficient spatial and contrast resolution for accurate detection of masses and microcalcifications in the breast, reducing the structural overlap that is often a limiting factor in reading mammographic images. For a number of reasons, image quality in breast CT may be improved by use of an energy resolving photon counting detector. In this study, the authors investigate the improvements in image quality obtained when using energy weighting with an energy resolving photon counting detector as compared to a conventional energy integrating detector. Methods: Using computer simulation, realistic CT images of multiple breast phantoms were generated. The simulation modeled a prototype breast CT system using an amorphous silicon (a-Si), CsI based energy integrating detector with different x-ray spectra, and a hypothetical, ideal CZT based photon counting detector with energy discrimination capability. Three biological signals of interest were modeled as spherical lesions and inserted into breast phantoms: hydroxyapatite (HA) to represent microcalcification, infiltrating ductal carcinoma (IDC), and iodine enhanced infiltrating ductal carcinoma (IIDC). The signal-to-noise ratio (SNR) of these three lesions was measured from the CT reconstructions. In addition, a psychophysical study was conducted to evaluate observer performance in detecting microcalcifications embedded in a realistic anthropomorphic breast phantom. Results: In the energy range tested, the improvement in SNR with a photon counting detector using energy weighting over the energy integrating detector was 30%–63% for HA lesions, 4%–34% for IDC lesions, and 12%–30% (with Al filtration) to 32%–38% (with Ce filtration) for the IIDC lesion.
The average area under the receiver operating characteristic curve (AUC) for detection of microcalcifications was higher by greater than 19% (for the different energy weighting methods tested) as compared to the AUC obtained with an energy integrating detector. Conclusions: This study showed that breast CT with a CZT photon counting detector using energy weighting can provide improvements in pixel SNR, and detectability of microcalcifications as compared to that with a conventional energy integrating detector. Since a number of degrading physical factors were not modeled into the photon counting detector, this improvement should be considered as an upper bound on achievable performance. PMID:23927337
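
The energy-weighting idea can be sketched with Poisson statistics on a toy binned spectrum: plain photon counting weights each energy bin by 1, an energy-integrating detector effectively weights by E, and an energy-discriminating detector can apply a contrast-optimized weight (here E^-3, a classic projection-domain choice). All spectra and counts below are illustrative, not taken from the paper.

```python
import math

def weighted_snr(weights, n_bkg, n_sig):
    """Contrast SNR for a weighted sum over energy bins, comparing a
    signal-present vs. signal-absent measurement (Poisson noise)."""
    signal = sum(w * (b - s) for w, b, s in zip(weights, n_bkg, n_sig))
    var = sum(w * w * (b + s) for w, b, s in zip(weights, n_bkg, n_sig))
    return abs(signal) / math.sqrt(var)

# Toy 4-bin spectrum (mean counts per pixel); the lesion attenuates
# low energies more strongly -- illustrative numbers only
energies = [25.0, 35.0, 45.0, 55.0]        # keV bin centers
n_bkg = [400.0, 900.0, 700.0, 300.0]       # counts with no lesion
n_sig = [340.0, 830.0, 670.0, 292.0]       # counts behind the lesion

snr_counting = weighted_snr([1.0] * 4, n_bkg, n_sig)            # w = 1
snr_integrating = weighted_snr(energies, n_bkg, n_sig)          # w ~ E
snr_weighted = weighted_snr([e ** -3 for e in energies], n_bkg, n_sig)
```

In this toy example the E^-3 weighting emphasizes the low-energy bins where the lesion contrast lives, so it beats plain counting, which in turn beats energy integration; that ordering mirrors the SNR gains reported above.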

  17. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

The study of complex proteomes places greater demands on quantification methods using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification, and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for handling shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count is combined with the spectral count to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation of complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve quantification accuracy with a better dynamic range.
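
Spectral-count quantification with a length normalization can be sketched as a normalized spectral abundance factor (NSAF); freeQuant's actual algorithm is more elaborate (shared peptides, ion intensity), so this is only the basic idea, with hypothetical counts and lengths.

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor: spectral counts divided by
    protein length, then normalized to sum to 1 across the sample (a
    common label-free scheme; freeQuant's exact formula differs)."""
    saf = [c / L for c, L in zip(spectral_counts, lengths)]
    total = sum(saf)
    return [s / total for s in saf]

# Hypothetical three-protein example: spectral counts and sequence lengths
abund = nsaf([120, 45, 300], [600, 150, 1200])
# longer proteins accumulate more spectra by chance, so raw counts alone
# can mislead; the length normalization corrects for this
```

Here the short protein (45 counts over 150 residues) comes out most abundant despite the smallest raw count, which is why length normalization matters.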

  18. Survey of predators and sampling method comparison in sweet corn.

    PubMed

    Musser, Fred R; Nyrop, Jan P; Shelton, Anthony M

    2004-02-01

    Natural predation is an important component of integrated pest management that is often overlooked because it is difficult to quantify and perceived to be unreliable. To begin incorporating natural predation into sweet corn, Zea mays L., pest management, a predator survey was conducted and then three sampling methods were compared for their ability to accurately monitor the most abundant predators. A predator survey on sweet corn foliage in New York between 1999 and 2001 identified 13 species. Orius insidiosus (Say), Coleomegilla maculata (De Geer), and Harmonia axyridis (Pallas) were the most numerous predators in all years. To determine the best method for sampling adult and immature stages of these predators, comparisons were made among nondestructive field counts, destructive counts, and yellow sticky cards. Field counts were correlated with destructive counts for all populations, but field counts of small insects were biased. Sticky cards underrepresented immature populations. Yellow sticky cards were more attractive to C. maculata adults than H. axyridis adults, especially before pollen shed, making coccinellid population estimates based on sticky cards unreliable. Field counts were the most precise method for monitoring adult and immature stages of the three major predators. Future research on predicting predation of pests in sweet corn should be based on field counts of predators because these counts are accurate, have no associated supply costs, and can be made quickly.

  19. A new NIST primary standardization of 18F.

    PubMed

    Fitzgerald, R; Zimmerman, B E; Bergeron, D E; Cessna, J C; Pibida, L; Moreira, D S

    2014-02-01

    A new primary standardization of (18)F by NIST is reported. The standard is based on live-timed beta-gamma anticoincidence counting with confirmatory measurements by three other methods: (i) liquid scintillation (LS) counting using CIEMAT/NIST (3)H efficiency tracing; (ii) triple-to-double coincidence ratio (TDCR) counting; and (iii) NaI integral counting and HPGe γ-ray spectrometry. The results are reported as calibration factors for NIST-maintained ionization chambers (including some "dose calibrators"). The LS-based methods reveal evidence for cocktail instability for one LS cocktail. Using an ionization chamber to link this work with previous NIST results, the new value differs from the previous reports by about 4%, but appears to be in good agreement with the key comparison reference value (KCRV) of 2005. © 2013 Published by Elsevier Ltd.

  20. Evaluation of surveillance methods for monitoring house fly abundance and activity on large commercial dairy operations.

    PubMed

    Gerry, Alec C; Higginbotham, G E; Periera, L N; Lam, A; Shelton, C R

    2011-06-01

Relative house fly, Musca domestica L., activity at three large dairies in central California was monitored during the peak fly activity period from June to August 2005 by using spot cards, fly tapes, bait traps, and Alsynite traps. Counts for all monitoring methods were significantly related at two of three dairies, with spot card counts significantly related to fly tape counts recorded the same week, and both spot card counts and fly tape counts significantly related to bait trap counts 1-2 wk later. Mean fly counts differed significantly between dairies, but a significant interaction between dairies sampled and monitoring methods used demonstrates that between-dairy comparisons are unwise. Estimate precision was determined by the coefficient of variability (CV = SE/mean). Using CV = 0.15 as the desired level of estimate precision and assuming an integrated pest management (IPM) action threshold near the peak house fly activity measured by each monitoring method, house fly monitoring at a large dairy would require 12 spot cards placed in midafternoon shaded fly resting sites near cattle or seven bait traps placed in open areas near cattle. Software (FlySpotter; http://ucanr.org/sites/FlySpotter/download/) using computer vision technology was developed to count fly spots on a scanned image of a spot card, dramatically reducing the time invested in monitoring house flies. Counts provided by the FlySpotter software were highly correlated with visual counts. The use of spot cards for monitoring house flies is recommended for dairy IPM programs.
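
Sample-size figures like "12 spot cards" follow from the standard relation between replicate count and precision: the CV of a mean of n replicates is (s/mean)/sqrt(n), so solving for n gives the sketch below. The SD and mean values are assumed for illustration; they are chosen so the arithmetic happens to land on 12 cards, not taken from the paper's data.

```python
import math

def replicates_for_cv(sample_sd, sample_mean, cv_target=0.15):
    """Number of monitoring devices needed so that SE/mean of the
    abundance estimate reaches the target CV: n = (s / (mean*CV))**2,
    rounded up to a whole device."""
    return math.ceil((sample_sd / (sample_mean * cv_target)) ** 2)

# e.g. spot cards averaging 80 fly spots with a between-card SD of 40
n_cards = replicates_for_cv(40.0, 80.0, cv_target=0.15)   # -> 12 cards
```

The quadratic dependence on the SD/mean ratio explains why noisier methods (like sticky cards for immatures) need disproportionately more replicates for the same precision.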

  1. Comparison of Three Bed Bug Management Strategies in a Low-Income Apartment Building.

    PubMed

    Wang, Changlu; Saltzmann, Kurt; Bennett, Gary; Gibb, Timothy

    2012-04-02

    Bed bug (Cimex lectularius L.) infestations are currently controlled by a variety of non-chemical and chemical methods. There have been few studies on the comparative effectiveness of these control techniques. We evaluated three bed bug management strategies in an apartment building: (1) non-chemical methods only (n = 9); (2) insecticides only (n = 6); and (3) integrated pest management including both non-chemical methods and insecticides (n = 9). The apartments were one-bedroom units occupied by seniors or people with disabilities. Bed bug numbers in each apartment were determined by visual inspection and/or installing intercepting devices under bed and sofa legs. The median (min, max) bed bug counts in the non-chemical methods only, insecticides only, and integrated pest management (IPM) treatment were: 4 (1, 57), 19 (1, 250), and 14 (1, 219), respectively prior to the treatments. The apartments were retreated if found necessary during biweekly to monthly inspections. After 10 weeks, bed bugs were found to be eliminated from 67, 33, and 44% of the apartments in the three treatment groups, respectively. The final (after 10 weeks) median (min, max) bed bug counts in the non-chemical methods only, insecticides only, and IPM treatment were: 0 (0, 134), 11.5 (0, 58), and 1 (0, 38), respectively. There were no significant differences in the speed of bed bug count reduction or the final bed bug counts. Lack of resident cooperation partially contributed to the failure in eliminating bed bugs from some of the apartments. Results of this study suggest that non-chemical methods can effectively eliminate bed bugs in lightly infested apartments.

  2. Accuracy of neutron self-activation method with iodine-containing scintillators for quantifying 128I generation using decay-fitting technique

    NASA Astrophysics Data System (ADS)

    Nohtomi, Akihiro; Wakabayashi, Genichiro

    2015-11-01

    We evaluated the accuracy of a self-activation method with iodine-containing scintillators in quantifying 128I generation in an activation detector; the self-activation method was recently proposed for photo-neutron on-line measurements around X-ray radiotherapy machines. Here, we consider the accuracy of determining the initial count rate R0, observed just after termination of neutron irradiation of the activation detector. The value R0 is directly related to the amount of activity generated by incident neutrons; the detection efficiency of radiation emitted from the activity should be taken into account for such an evaluation. Decay curves of 128I activity were numerically simulated by a computer program for various conditions including different initial count rates (R0) and background rates (RB), as well as counting statistical fluctuations. The data points sampled at minute intervals and integrated over the same period were fit by a non-linear least-squares fitting routine to obtain the value R0 as a fitting parameter with an associated uncertainty. The corresponding background rate RB was simultaneously calculated in the same fitting routine. Identical data sets were also evaluated by a well-known integration algorithm used for conventional activation methods and the results were compared with those of the proposed fitting method. When we fixed RB = 500 cpm, the relative uncertainty σR0/R0 ≤ 0.02 was achieved for R0/RB ≥ 20 with 20 data points from 1 min to 20 min following the termination of neutron irradiation used in the fitting; σR0/R0 ≤ 0.01 was achieved for R0/RB ≥ 50 with the same data points. Reasonable relative uncertainties to evaluate initial count rates were reached by the decay-fitting method using practically realistic sampling numbers. These results clarified the theoretical limits of the fitting method. 
The integration method was found to be potentially vulnerable to short-term variations in background levels, especially instantaneous contaminations by spike-like noise. The fitting method easily detects and removes such spike-like noise.
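
    The decay-fitting idea can be sketched numerically. Because the 128I decay constant is known, the model R(t) = R0·exp(-λt) + RB is linear in (R0, RB), so ordinary least squares suffices for this illustration (the paper uses a general non-linear routine); all rates below are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # 128I decay constant from its ~25 min half-life
    LAM = np.log(2.0) / 25.0              # 1/min
    R0_TRUE, RB_TRUE = 10_000.0, 500.0    # cpm: initial 128I rate, background

    # Counts in 1-min bins at t = 1..20 min after irradiation stops,
    # approximating each bin integral by the rate at the bin time,
    # with Poisson counting fluctuations.
    t = np.arange(1.0, 21.0)
    counts = rng.poisson(R0_TRUE * np.exp(-LAM * t) + RB_TRUE).astype(float)

    # With LAM known, R(t) = R0*exp(-LAM*t) + RB is linear in (R0, RB):
    # fit both parameters simultaneously by least squares.
    X = np.column_stack([np.exp(-LAM * t), np.ones_like(t)])
    (r0_fit, rb_fit), *_ = np.linalg.lstsq(X, counts, rcond=None)
    ```

    With R0/RB = 20 as in this example, the recovered R0 lands within a few percent of the true value, consistent with the relative uncertainties quoted above.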

  3. Children's Counting Strategies for Time Quantification and Integration.

    ERIC Educational Resources Information Center

    Wilkening, Friedrich; And Others

    1987-01-01

    Investigated whether and how children age 5 to 7 employed counting to measure and integrate the duration of two events, which were accompanied by metronome beats for half the children. The rhythm enhanced use of counting in younger children. By age 7, most counted spontaneously, using sensible counting strategies. (SKC)

  4. Si-strip photon counting detectors for contrast-enhanced spectral mammography

    NASA Astrophysics Data System (ADS)

    Chen, Buxin; Reiser, Ingrid; Wessel, Jan C.; Malakhov, Nail; Wawrzyniak, Gregor; Hartsough, Neal E.; Gandhi, Thulasi; Chen, Chin-Tu; Iwanczyk, Jan S.; Barber, William C.

    2015-08-01

    We report on the development of silicon strip detectors for energy-resolved clinical mammography. Most commercial systems use X-ray integrating detectors based on scintillating cesium iodide (CsI(Tl)) or amorphous selenium (a-Se). Recently, mammography instrumentation based on photon counting Si strip detectors has been introduced. The required performance for mammography in terms of output count rate, spatial resolution, and dynamic range must be obtained with a sufficient field of view for the application, thus requiring the tiling of pixel arrays and particular scanning techniques. Room-temperature Si strip detectors, operating as direct-conversion x-ray sensors, can provide the required speed when connected to application-specific integrated circuits (ASICs) operating at fast peaking times with multiple fixed thresholds per pixel, provided that the sensors are designed for rapid signal formation across the X-ray energy ranges of the application. We present our methods and results from the optimization of Si-strip detectors for contrast-enhanced spectral mammography. We describe the method being developed for quantifying iodine contrast using the energy-resolved detector with fixed thresholds. We demonstrate the feasibility of the method by scanning an iodine phantom with clinically relevant contrast levels.
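
    Two fixed thresholds yield two energy bins, and iodine can then be separated from soft tissue by solving a small linear system in the per-bin log-attenuations. A minimal sketch under assumed conditions (the attenuation coefficients and counts below are illustrative, not the authors' calibrated values):

    ```python
    import numpy as np

    # Hypothetical effective attenuation coefficients (1/cm) of the two basis
    # materials (water-like, iodine-like) in the low and high energy bins.
    M = np.array([[0.30, 8.0],    # low bin:  [mu_water, mu_iodine]
                  [0.20, 3.0]])   # high bin: [mu_water, mu_iodine]

    def decompose(counts, blank):
        """Recover basis-material thicknesses (cm) from two-bin counts."""
        L = np.log(np.asarray(blank) / np.asarray(counts))  # per-bin line integrals
        return np.linalg.solve(M, L)

    # Forward-simulate a phantom: 4 cm water plus 0.05 cm iodine equivalent
    t_true = np.array([4.0, 0.05])
    blank = np.array([1e6, 1e6])                 # unattenuated counts per bin
    counts = blank * np.exp(-M @ t_true)
    water, iodine = decompose(counts, blank)
    ```

    The decomposition is well conditioned precisely because iodine's attenuation changes much more steeply between the two bins than water's.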

  5. Comparison of Three Bed Bug Management Strategies in a Low-Income Apartment Building

    PubMed Central

    Wang, Changlu; Saltzmann, Kurt; Bennett, Gary; Gibb, Timothy

    2012-01-01

    Bed bug (Cimex lectularius L.) infestations are currently controlled by a variety of non-chemical and chemical methods. There have been few studies on the comparative effectiveness of these control techniques. We evaluated three bed bug management strategies in an apartment building: (1) non-chemical methods only (n = 9); (2) insecticides only (n = 6); and (3) integrated pest management including both non-chemical methods and insecticides (n = 9). The apartments were one-bedroom units occupied by seniors or people with disabilities. Bed bug numbers in each apartment were determined by visual inspection and/or installing intercepting devices under bed and sofa legs. The median (min, max) bed bug counts in the non-chemical methods only, insecticides only, and integrated pest management (IPM) treatment were: 4 (1, 57), 19 (1, 250), and 14 (1, 219), respectively prior to the treatments. The apartments were retreated if found necessary during biweekly to monthly inspections. After 10 weeks, bed bugs were found to be eliminated from 67, 33, and 44% of the apartments in the three treatment groups, respectively. The final (after 10 weeks) median (min, max) bed bug counts in the non-chemical methods only, insecticides only, and IPM treatment were: 0 (0, 134), 11.5 (0, 58), and 1 (0, 38), respectively. There were no significant differences in the speed of bed bug count reduction or the final bed bug counts. Lack of resident cooperation partially contributed to the failure in eliminating bed bugs from some of the apartments. Results of this study suggest that non-chemical methods can effectively eliminate bed bugs in lightly infested apartments. PMID:26466533

  6. 2D dark-count-rate modeling of PureB single-photon avalanche diodes in a TCAD environment

    NASA Astrophysics Data System (ADS)

    Knežević, Tihomir; Nanver, Lis K.; Suligoj, Tomislav

    2018-02-01

    PureB silicon photodiodes have nm-shallow p+n junctions with which photons/electrons with penetration depths of a few nanometers can be detected. PureB Single-Photon Avalanche Diodes (SPADs) were fabricated and analysed by 2D numerical modeling as an extension to TCAD software. The very shallow p+ anode has high perimeter curvature that enhances the electric field. In SPADs, noise is quantified by the dark count rate (DCR), a measure of the number of false counts triggered by unwanted processes in the non-illuminated device. Just as for desired events, the probability of a dark count increases with increasing electric field, so the perimeter conditions are critical. In this work, the DCR was studied by two 2D methods of analysis: the "quasi-2D" (Q-2D) method, where vertical 1D cross-sections were assumed for calculating the electron/hole avalanche probabilities, and the "ionization-integral 2D" (II-2D) method, where cross-sections were placed where the maximum ionization integrals were calculated. The Q-2D method gave satisfactory results in structures where the peripheral regions had a small contribution to the DCR, such as in devices with conventional deep-junction guard rings (GRs). Otherwise, the II-2D method proved to be much more precise. The results show that the DCR simulation methods are useful for optimizing the compromise between fill-factor and p-/n-doping profile design in SPAD devices. For the experimentally investigated PureB SPADs, excellent agreement between the measured and simulated DCR was achieved. This shows that although an implicit GR is attractively compact, the very shallow pn-junction gives a risk of having such a low breakdown voltage at the perimeter that the DCR of the device may be negatively impacted.
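
    The ionization integral at the heart of the II-2D analysis can be sketched along one cross-section: integrate the impact-ionization rate over the depletion region and test whether it reaches unity (the breakdown condition). The field profile and Chynoweth coefficients below are illustrative placeholders, not the paper's TCAD values:

    ```python
    import numpy as np

    def ionization_integral(x, alpha):
        """Trapezoidal ionization integral along a 1D cross-section;
        avalanche breakdown becomes possible where the integral reaches 1."""
        return float(np.sum(0.5 * (alpha[1:] + alpha[:-1]) * np.diff(x)))

    # Illustrative 1D cut: triangular field over a 1-um depletion region
    x = np.linspace(0.0, 1e-4, 2001)          # position (cm)
    E = 6e5 * (1.0 - x / 1e-4)                # field (V/cm), peak at the junction

    # Chynoweth-type impact-ionization rate with illustrative coefficients
    a, b = 7e5, 1.23e6                        # 1/cm and V/cm
    alpha = a * np.exp(-b / np.maximum(E, 1.0))

    I = ionization_integral(x, alpha)
    breakdown = I >= 1.0
    ```

    Because alpha grows exponentially with field, the integral is dominated by the high-field region near the junction, which is why curvature-enhanced fields at the perimeter matter so much for the DCR.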

  7. Multiple-Event, Single-Photon Counting Imaging Sensor

    NASA Technical Reports Server (NTRS)

    Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.

    2011-01-01

    The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for photon count number registration. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time can't be too short, this leads to very low dynamic range and makes the sensor useful only in very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency substantially undermines any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register a million or more photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array is capable of performing single-photon counting from ultra-low-light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.

  8. A counting-weighted calibration method for a field-programmable-gate-array-based time-to-digital converter

    NASA Astrophysics Data System (ADS)

    Chen, Yuan-Ho

    2017-05-01

    In this work, we propose a counting-weighted calibration method for a field-programmable-gate-array (FPGA)-based time-to-digital converter (TDC) to provide non-linearity calibration for use in positron emission tomography (PET) scanners. To deal with the non-linearity in the FPGA, we developed a counting-weighted delay line (CWD) that counts the delay time of the delay cells in the TDC in order to reduce the differential non-linearity (DNL) values based on code density counts. The linearity of the proposed CWD-TDC far exceeds that of a TDC with a traditional tapped delay line (TDL) architecture, without the need for non-linearity calibration. When implemented in a Xilinx Virtex-5 FPGA device, the proposed CWD-TDC achieved a time resolution of 60 ps with integral non-linearity (INL) and DNL of [-0.54, 0.24] and [-0.66, 0.65] least-significant-bits (LSB), respectively. This is a clear indication of the suitability of the proposed FPGA-based CWD-TDC for use in PET scanners.
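
    The code-density idea behind the DNL/INL figures can be sketched briefly: hits uniformly distributed in time land in each delay cell in proportion to its delay, so normalized hit counts estimate the bin widths. A minimal sketch with hypothetical counts (not measured TDC data):

    ```python
    import numpy as np

    def dnl_inl_from_code_density(hits):
        """DNL and INL (in LSB) from a code-density test: with uniformly
        distributed hit times, each bin's count is proportional to the delay
        of its cell, so normalized counts give the bin widths."""
        hits = np.asarray(hits, dtype=float)
        widths = hits / hits.sum() * hits.size   # ideal bin width -> 1.0 LSB
        dnl = widths - 1.0                       # per-bin width error
        inl = np.cumsum(dnl)                     # accumulated edge error
        return dnl, inl

    # Hypothetical 8-cell delay line with uneven cell delays
    counts = np.array([980, 1010, 1005, 950, 1060, 995, 1040, 960])
    dnl, inl = dnl_inl_from_code_density(counts)
    ```

    By construction the DNL sums to zero over the full line; the INL at any tap is the running sum of the DNL up to that tap.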

  9. Automated Mobile System for Accurate Outdoor Tree Crop Enumeration Using an Uncalibrated Camera.

    PubMed

    Nguyen, Thuy Tuong; Slaughter, David C; Hanson, Bradley D; Barber, Andrew; Freitas, Amy; Robles, Daniel; Whelan, Erin

    2015-07-28

    This paper demonstrates an automated computer vision system for outdoor tree crop enumeration in a seedling nursery. The complete system incorporates both hardware components (including an embedded microcontroller, an odometry encoder, and an uncalibrated digital color camera) and software algorithms (including microcontroller algorithms and the proposed algorithm for tree crop enumeration) required to obtain robust performance in a natural outdoor environment. The enumeration system uses a three-step image analysis process based upon: (1) an orthographic plant projection method integrating a perspective transform with automatic parameter estimation; (2) a plant counting method based on projection histograms; and (3) a double-counting avoidance method based on a homography transform. Experimental results demonstrate the ability to count large numbers of plants automatically with no human effort. Results show that, for tree seedlings having a height up to 40 cm and a within-row tree spacing of approximately 10 cm, the algorithms successfully estimated the number of plants with an average accuracy of 95.2% for trees within a single image and 98% for counting of the whole plant population in a large sequence of images.
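
    Step (2) of the pipeline, counting by projection histograms, can be sketched on a toy binary mask (this simplified sketch counts runs in a column-wise projection; the thresholds and mask are illustrative, not the authors' implementation):

    ```python
    import numpy as np

    def count_plants_by_projection(mask, min_width=3):
        """Count plants in a binary top-view mask by projecting foreground
        pixels onto the row axis and counting sufficiently wide runs."""
        profile = mask.sum(axis=0)          # column-wise projection histogram
        on = profile > 0
        runs, width = [], 0
        for v in on:                        # run-length encode the 'on' columns
            if v:
                width += 1
            elif width:
                runs.append(width)
                width = 0
        if width:
            runs.append(width)
        # each run at least min_width columns wide is counted as one plant
        return sum(1 for w in runs if w >= min_width)

    # Toy mask: three seedlings along a nursery row
    mask = np.zeros((10, 30), dtype=int)
    mask[:, 2:6] = 1
    mask[:, 11:16] = 1
    mask[:, 22:27] = 1
    n_plants = count_plants_by_projection(mask)
    ```

    The width filter plays the role of noise rejection: isolated foreground columns narrower than a plausible seedling are ignored.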

  10. Automated Mobile System for Accurate Outdoor Tree Crop Enumeration Using an Uncalibrated Camera

    PubMed Central

    Nguyen, Thuy Tuong; Slaughter, David C.; Hanson, Bradley D.; Barber, Andrew; Freitas, Amy; Robles, Daniel; Whelan, Erin

    2015-01-01

    This paper demonstrates an automated computer vision system for outdoor tree crop enumeration in a seedling nursery. The complete system incorporates both hardware components (including an embedded microcontroller, an odometry encoder, and an uncalibrated digital color camera) and software algorithms (including microcontroller algorithms and the proposed algorithm for tree crop enumeration) required to obtain robust performance in a natural outdoor environment. The enumeration system uses a three-step image analysis process based upon: (1) an orthographic plant projection method integrating a perspective transform with automatic parameter estimation; (2) a plant counting method based on projection histograms; and (3) a double-counting avoidance method based on a homography transform. Experimental results demonstrate the ability to count large numbers of plants automatically with no human effort. Results show that, for tree seedlings having a height up to 40 cm and a within-row tree spacing of approximately 10 cm, the algorithms successfully estimated the number of plants with an average accuracy of 95.2% for trees within a single image and 98% for counting of the whole plant population in a large sequence of images. PMID:26225982

  11. A polychromatic adaption of the Beer-Lambert model for spectral decomposition

    NASA Astrophysics Data System (ADS)

    Sellerer, Thorsten; Ehn, Sebastian; Mechlem, Korbinian; Pfeiffer, Franz; Herzen, Julia; Noël, Peter B.

    2017-03-01

    We present a semi-empirical forward model for spectral photon-counting CT which is fully compatible with state-of-the-art maximum-likelihood estimators (MLE) for basis material line integrals. The model relies on a minimal calibration effort, making the method applicable in routine clinical set-ups that require periodic re-calibration. In this work we present an experimental verification of our proposed method. The proposed method uses an adapted Beer-Lambert model, describing the energy-dependent attenuation of a polychromatic x-ray spectrum using additional exponential terms. In an experimental dual-energy photon-counting CT setup based on a CdTe detector, the model demonstrates an accurate prediction of the registered counts for an attenuated polychromatic spectrum. Deviations between model and measurement data lie within the Poisson statistical limit of the performed acquisitions, providing an effectively unbiased forward model. The experimental data also show that the model is capable of handling spectral distortions introduced by the photon-counting detector and CdTe sensor. The simplicity and high accuracy of the proposed model make it a viable forward model for MLE-based spectral decomposition methods without the need for costly and time-consuming characterization of the system response.
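
    The structure of such an adapted Beer-Lambert model can be sketched as a sum of exponential terms in the basis-material line integrals. The number of terms, weights, and coefficients below are illustrative assumptions standing in for calibrated values:

    ```python
    import numpy as np

    def forward_counts(A, weights, coeffs):
        """Adapted Beer-Lambert forward model sketch: the polychromatic
        transmission is approximated by a sum of exponential terms in the
        basis-material line integrals A:
            N(A) = sum_k w_k * exp(-(c_k . A))
        The weights w_k and coefficient vectors c_k come from calibration."""
        A = np.asarray(A, dtype=float)
        return sum(w * np.exp(-np.dot(c, A)) for w, c in zip(weights, coeffs))

    # Illustrative two-term model for one detector bin (not calibrated values)
    weights = [6e5, 4e5]
    coeffs = [np.array([0.20, 0.35]),   # [water-like, bone-like] coefficients
              np.array([0.28, 0.60])]

    N0 = forward_counts([0.0, 0.0], weights, coeffs)   # unattenuated counts
    N = forward_counts([10.0, 2.0], weights, coeffs)   # 10 cm + 2 cm path
    ```

    Each exponential term acts like an effective monoenergetic component, which is how a handful of calibrated terms can mimic the beam-hardening behavior of a full polychromatic spectrum.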

  12. In vivo macular pigment measurements: a comparison of resonance Raman spectroscopy and heterochromatic flicker photometry

    PubMed Central

    Hogg, R E; Anderson, R S; Stevenson, M R; Zlatkova, M B; Chakravarthy, U

    2007-01-01

    Aim To investigate whether two methods of measuring macular pigment—namely, heterochromatic flicker photometry (HFP) and resonance Raman spectroscopy (RRS)—yield comparable data. Methods Macular pigment was measured using HFP and RRS in the right eye of 107 participants aged 20–79 years. Correlations between methods were sought and regression models generated. RRS was recorded as Raman counts and HFP as macular pigment optical density (MPOD). The average of the top three of five Raman counts was compared with MPOD obtained at 0.5° eccentricity, and an integrated measure (spatial profile; MPODsp) computed from four stimulus sizes on HFP. Results The coefficient of variation was 12.0% for MPODsp and 13.5% for Raman counts. MPODsp exhibited significant correlations with Raman counts (r = 0.260, p = 0.012), whereas MPOD at 0.5° did not correlate significantly (r = 0.163, p = 0.118). MPODsp was not significantly correlated with age (p = 0.062), whereas MPOD at 0.5° was positively correlated (p = 0.011). Raman counts showed a significant decrease with age (p = 0.002) and were significantly lower when pupil size was smaller (p = 0.015). Conclusions Although statistically significant, the correlations were weak, with more than 90% of the variance between MPODsp and Raman counts remaining unexplained, meriting further research. PMID:16825281

  13. Signal to noise ratio of energy selective x-ray photon counting systems with pileup

    PubMed Central

    Alvarez, Robert E.

    2014-01-01

    Purpose: To derive fundamental limits on the effect of pulse pileup and quantum noise in photon counting detectors on the signal to noise ratio (SNR) and noise variance of energy selective x-ray imaging systems. Methods: An idealized model of the response of counting detectors to pulse pileup is used. The model assumes a nonparalyzable response and delta function pulse shape. The model is used to derive analytical formulas for the noise and energy spectrum of the recorded photons with pulse pileup. These formulas are first verified with a Monte Carlo simulation. They are then used with a method introduced in a previous paper [R. E. Alvarez, “Near optimal energy selective x-ray imaging system performance with simple detectors,” Med. Phys. 37, 822–841 (2010)] to compare the signal to noise ratio with pileup to the ideal SNR with perfect energy resolution. Detectors studied include photon counting detectors with pulse height analysis (PHA), detectors that simultaneously measure the number of photons and the integrated energy (NQ detector), and conventional energy integrating and photon counting detectors. The increase in the A-vector variance with dead time is also computed and compared to the Monte Carlo results. A formula for the covariance of the NQ detector is developed. The validity of the constant covariance approximation to the Cramér–Rao lower bound (CRLB) for larger counts is tested. Results: The SNR becomes smaller than the conventional energy integrating detector (Q) SNR at 0.52, 0.65, and 0.78 expected photons per dead time for counting (N), two-bin, and four-bin PHA detectors, respectively. The NQ detector SNR is always larger than the N and Q SNR but only marginally so for larger dead times. Its noise variance increases by a factor of approximately 3 and 5 for the A1 and A2 components as the dead time parameter increases from 0 to 0.8 photons per dead time. With four-bin PHA data, the increase in variance is approximately 2 and 4 times. 
The constant covariance approximation to the CRLB is valid for larger counts such as those used in medical imaging. Conclusions: The SNR decreases rapidly as dead time increases. This decrease places stringent limits on allowable dead times with the high count rates required for medical imaging systems. The probability distribution of the idealized data with pileup is shown to be accurately described as a multivariate normal for expected counts greater than those typically utilized in medical imaging systems. The constant covariance approximation to the CRLB is also shown to be valid in this case. A new formula for the covariance of the NQ detector with pileup is derived and validated. PMID:25370642
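
    The nonparalyzable dead-time model underlying these results can be checked with a short simulation (an idealized sketch in arbitrary units, not the paper's full multi-bin analysis): for a true rate n and dead time tau, the recorded rate is m = n / (1 + n·tau).

    ```python
    import numpy as np

    def recorded_rate(n, tau):
        """Idealized nonparalyzable dead-time model: for a true rate n and
        dead time tau, the recorded rate is m = n / (1 + n*tau)."""
        return n / (1.0 + n * tau)

    rng = np.random.default_rng(1)
    tau = 1.0        # dead time (time units)
    n_true = 0.5     # true rate: 0.5 expected photons per dead time

    # Monte Carlo check: Poisson arrivals; events falling inside the dead
    # window after a recorded event are lost (nonparalyzable response).
    arrivals = np.cumsum(rng.exponential(1.0 / n_true, size=100_000))
    recorded, last = 0, -np.inf
    for t in arrivals:
        if t - last >= tau:
            recorded += 1
            last = t

    m_sim = recorded / arrivals[-1]
    m_theory = recorded_rate(n_true, tau)
    ```

    At 0.5 photons per dead time, a third of the incident events is already lost, which illustrates why the SNR degrades so quickly in the regimes quoted above.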

  14. A New Statistics-Based Online Baseline Restorer for a High Count-Rate Fully Digital System.

    PubMed

    Li, Hongdi; Wang, Chao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Liu, Shitao; An, Shaohui; Wong, Wai-Hoi

    2010-04-01

    The goal of this work is to develop a novel, accurate, real-time digital baseline restorer using online statistical processing for a high count-rate digital system such as positron emission tomography (PET). In high count-rate nuclear instrumentation applications, analog signals are DC-coupled for better performance. However, the detectors, pre-amplifiers and other front-end electronics cause a signal baseline drift in a DC-coupled system, which degrades the energy resolution and positioning accuracy. Event pileups normally exist in a high count-rate system, and baseline drift creates errors in the event pileup correction. Hence, a baseline restorer (BLR) is required in a high count-rate system to remove the DC drift ahead of the pileup correction. Many BLR methods have been reported, from classic analog circuits to digital filter solutions. However, a single-channel analog BLR can only work below a 500 kcps count rate, and an analog front-end application-specific integrated circuit (ASIC) is normally required for applications involving hundreds of BLRs, such as a PET camera. We have developed a simple statistics-based online baseline restorer (SOBLR) for a high count-rate fully digital system. In this method, we acquire additional samples, excluding the real gamma pulses, from the existing free-running ADC in the digital system, and perform online statistical processing to generate a baseline value. This baseline value is subtracted from the digitized waveform to retrieve the original pulse with zero baseline drift. This method can self-track the baseline without involving a micro-controller. The circuit consists of two digital counter/timers, one comparator, one register and one subtraction unit. Simulation shows a single channel works at a 30 Mcps count rate under pileup conditions. A total of 336 baseline restorer circuits have been implemented in 12 field-programmable gate arrays (FPGAs) for our new fully digital PET system.
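
    The statistical idea (estimate the baseline from pulse-free samples, then subtract it) can be sketched in software. The actual design runs in FPGA logic with counters, a comparator, a register, and a subtractor; the drift, noise level, and threshold below are illustrative:

    ```python
    import numpy as np

    def restore_baseline(samples, pulse_threshold):
        """Statistics-based baseline restorer sketch: estimate the DC baseline
        from samples below the pulse threshold (i.e., not part of gamma
        pulses) and subtract it from the whole digitized waveform."""
        samples = np.asarray(samples, dtype=float)
        quiet = samples[samples < pulse_threshold]   # exclude pulse samples
        baseline = quiet.mean()                      # online mean in hardware
        return samples - baseline, baseline

    rng = np.random.default_rng(2)
    drift = 37.5                                     # slow DC offset (ADC codes)
    waveform = drift + rng.normal(0.0, 2.0, size=5000)
    waveform[1000:1005] += 800.0                     # one gamma pulse
    restored, est = restore_baseline(waveform, pulse_threshold=drift + 100)
    ```

    Excluding the pulse samples before averaging is the key point: a plain running mean would be pulled upward by every gamma event, defeating the restoration at high count rates.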

  15. Social network extraction based on Web: 3. the integrated superficial method

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.

    2018-03-01

    The Web as a source of information has become part of social behavior information. Although it involves only the limited information disclosed by search engines in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trusted but also enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information or in a surmise-laden and consequently unrepresentative social structure. The integrated superficial method, in addition to generating the core social network, also generates an expanded network so as to reach the scope of relation clues, with a number of edges computationally close to n(n - 1)/2 for n social actors.

  16. Spitzer deep and wide legacy mid- and far-infrared number counts and lower limits of cosmic infrared background

    NASA Astrophysics Data System (ADS)

    Béthermin, M.; Dole, H.; Beelen, A.; Aussel, H.

    2010-03-01

    Aims: We aim to place stronger lower limits on the cosmic infrared background (CIB) brightness at 24 μm, 70 μm and 160 μm and measure the extragalactic number counts at these wavelengths in a homogeneous way from various surveys. Methods: Using Spitzer legacy data over 53.6 deg2 of various depths, we build catalogs with the same extraction method at each wavelength. Completeness and photometric accuracy are estimated with Monte-Carlo simulations. Number count uncertainties are estimated with a counts-in-cells moment method to take galaxy clustering into account. Furthermore, we use a stacking analysis to estimate number counts of sources not detected at 70 μm and 160 μm. This method is validated by simulations. The integration of the number counts gives new CIB lower limits. Results: Number counts reach 35 μJy, 3.5 mJy and 40 mJy at 24 μm, 70 μm, and 160 μm, respectively. We reach deeper flux densities of 0.38 mJy at 70 μm and 3.1 mJy at 160 μm with a stacking analysis. We confirm the number count turnover at 24 μm and 70 μm, and observe it for the first time at 160 μm at about 20 mJy, together with a power-law behavior below 10 mJy. These mid- and far-infrared counts: 1) are homogeneously built by combining fields of different depths and sizes, providing a legacy over about three orders of magnitude in flux density; 2) are the deepest to date at 70 μm and 160 μm; 3) agree with previously published results in the common measured flux density range; 4) globally agree with the Lagache et al. (2004) model, except at 160 μm, where the model slightly overestimates the counts around 20 and 200 mJy. Conclusions: These counts are integrated to estimate new CIB firm lower limits of 2.29 (+0.09/-0.09) nW m-2 sr-1, 5.4 (+0.4/-0.4) nW m-2 sr-1, and 8.9 (+1.1/-1.1) nW m-2 sr-1 at 24 μm, 70 μm, and 160 μm, respectively, and extrapolated to give new estimates of the CIB due to galaxies of 2.86 (+0.19/-0.16) nW m-2 sr-1, 6.6 (+0.7/-0.6) nW m-2 sr-1, and 14.6 (+7.1/-2.9) nW m-2 sr-1, respectively. 
Products (point spread function, counts, CIB contributions, software) are publicly available for download at http://www.ias.u-psud.fr/irgalaxies/. Counts and CIB contributions are only available in electronic form at the CDS via anonymous ftp to cdsarc.u-strasbg.fr (130.79.128.5) or via http://cdsweb.u-strasbg.fr/cgi-bin/qcat?J/A+A/512/A78

  17. Counting local integrals of motion in disordered spinless-fermion and Hubbard chains

    NASA Astrophysics Data System (ADS)

    Mierzejewski, Marcin; Kozarzewski, Maciej; Prelovšek, Peter

    2018-02-01

    We develop a procedure which systematically generates all conserved operators in the disordered models of interacting fermions. Among these operators, we identify and count the independent and local integrals of motion (LIOM), which represent the hallmark of the many-body localization (MBL). The method is tested first on the prototype disordered chain of interacting spinless fermions. As expected for full MBL, we find for large enough disorder N_M = 2^M - 1 independent and quasilocal LIOM with support on M consecutive sites. On the other hand, the study of the disordered Hubbard chain reveals that 3^M - 1

  18. Edge detection of optical subaperture image based on improved differential box-counting method

    NASA Astrophysics Data System (ADS)

    Li, Yi; Hui, Mei; Liu, Ming; Dong, Liquan; Kong, Lingqin; Zhao, Yuejin

    2018-01-01

    Optical synthetic aperture imaging technology is an effective approach to improving imaging resolution. Compared with a monolithic mirror system, the image from an optical synthetic aperture system is often more complex at the edge, and the gaps between segments make stitching a difficult problem. It is therefore necessary to extract the edges of subaperture images to achieve effective stitching. The fractal dimension, as a measurable feature, can describe image surface texture characteristics, which provides a new approach for edge detection. In our research, an improved differential box-counting method is used to calculate the fractal dimension of the image, and the obtained fractal dimension is then mapped to a grayscale image to detect edges. Compared with the original differential box-counting method, this method has two improvements: first, by modifying the box-counting mechanism, a box with a fixed height is replaced by a box with adaptive height, which solves the problem of over-counting the number of boxes covering the image intensity surface; second, an image reconstruction method based on a super-resolution convolutional neural network is used to enlarge small images, which solves the problem that the fractal dimension cannot be calculated accurately for small images while maintaining the scale invariance of the fractal dimension. The experimental results show that the proposed algorithm can effectively eliminate noise and has a lower false detection rate than traditional edge detection algorithms. In addition, this algorithm maintains the integrity and continuity of image edges while retaining important edge information.
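
    The baseline fixed-height box-counting step that the improved method builds on can be sketched as follows (a minimal differential box-counting estimator with illustrative grid sizes and test surfaces, not the authors' adaptive-height variant):

    ```python
    import numpy as np

    def dbc_fractal_dimension(img, sizes=(2, 4, 8, 16)):
        """Differential box-counting estimate of the fractal dimension of a
        square grayscale image (values 0-255), treated as a 3D intensity
        surface: D is the slope of log(N_r) versus log(1/r)."""
        M = img.shape[0]
        G = 256
        log_inv_r, log_N = [], []
        for s in sizes:
            h = max(1.0, s * G / M)              # fixed box height for grid size s
            N = 0
            for i in range(0, M, s):
                for j in range(0, M, s):
                    block = img[i:i + s, j:j + s]
                    # boxes needed to cover the surface over this block
                    N += int(block.max() // h) - int(block.min() // h) + 1
            log_inv_r.append(np.log(M / s))
            log_N.append(np.log(N))
        slope, _ = np.polyfit(log_inv_r, log_N, 1)
        return slope

    rng = np.random.default_rng(3)
    smooth = np.tile(np.linspace(0, 255, 64), (64, 1))         # smooth ramp
    rough = rng.integers(0, 256, size=(64, 64)).astype(float)  # noisy surface
    d_smooth = dbc_fractal_dimension(smooth)
    d_rough = dbc_fractal_dimension(rough)
    ```

    A smooth surface stays near dimension 2 while a rough, noise-like surface pushes toward 3; it is this contrast, mapped pixel-by-pixel over local windows, that makes fractal dimension usable as an edge and texture indicator.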

  19. A comparison of artificial compressibility and fractional step methods for incompressible flow computations

    NASA Technical Reports Server (NTRS)

    Chan, Daniel C.; Darian, Armen; Sindir, Munir

    1992-01-01

    We have applied and compared the efficiency and accuracy of two commonly used numerical methods for the solution of the Navier-Stokes equations. The artificial compressibility method augments the continuity equation with a transient pressure term and allows one to solve the modified equations as a coupled system. Due to its implicit nature, one can take a large temporal integration step at the expense of higher memory requirements and larger operation counts per step. Meanwhile, the fractional step method splits the Navier-Stokes equations into a sequence of differential operators and integrates them in multiple steps. The memory requirement and operation count per time step are low; however, the restriction on the size of the time marching step is more severe. To explore the strengths and weaknesses of these two methods, we used them for the computation of a two-dimensional driven cavity flow with Reynolds numbers of 100 and 1000, respectively. Three grid sizes, 41 x 41, 81 x 81, and 161 x 161, were used. The computations were considered converged after the L2-norm of the change of the dependent variables in two consecutive time steps had fallen below 10(exp -5).

  20. An accurate derivation of the air dose-rate and the deposition concentration distribution by aerial monitoring in a low level contaminated area

    NASA Astrophysics Data System (ADS)

    Nishizawa, Yukiyasu; Sugita, Takeshi; Sanada, Yukihisa; Torii, Tatsuo

    2015-04-01

    Since 2011, MEXT (the Ministry of Education, Culture, Sports, Science and Technology, Japan) has been conducting aerial monitoring to investigate the distribution of radioactive cesium dispersed into the atmosphere after the accident at the Fukushima Dai-ichi Nuclear Power Plant (FDNPP) of the Tokyo Electric Power Company. Distribution maps of the air dose-rate at 1 m above the ground and of the radioactive cesium deposition concentration on the ground are prepared using spectra obtained by aerial monitoring. The radioactive cesium deposition is derived from its dose rate, which is calculated by subtracting the dose rate of background radiation due to natural radionuclides from the air dose-rate at 1 m above the ground. The first step of the current method for calculating the natural-radionuclide dose rate is to calculate, for areas where no radioactive cesium is detected, the ratio of the total count rate to the count rate at energies of 1,400 keV or higher (the BG-Index). The air dose rate due to natural radionuclides in the area where radioactive cesium is distributed is then obtained by multiplying the BG-Index by the integrated count rate at 1,400 keV or higher. In high dose-rate areas, however, the count rate of the 1,365-keV peak of Cs-134, though small, is included in the integrated count rate at 1,400 keV or higher, which can cause an overestimation of the natural-radionuclide air dose rate. We developed a method for accurately evaluating the distribution of the natural air dose-rate by excluding the effect of radioactive cesium even in contaminated areas, and obtained an accurate map of the air dose-rate attributed to the radioactive cesium deposited on the ground. Furthermore, the natural dose-rate distribution throughout Japan has been obtained by this method.
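    A minimal numeric sketch of the two-step BG-Index calculation described above (the function names, rates, and the count-to-dose conversion factor are invented illustrative values, not the survey's calibration):

```python
def bg_index(total_cps_clean, high_energy_cps_clean):
    """BG-Index: ratio of the total count rate to the >1,400 keV count
    rate, evaluated over areas where no radiocesium is detected."""
    return total_cps_clean / high_energy_cps_clean

def cesium_dose_rate(total_cps, high_energy_cps, index, dose_per_cps):
    """In a contaminated area, BG-Index x (>1,400 keV rate) reconstructs
    the natural-background count rate; the remainder of the total rate
    is attributed to radiocesium and converted to a dose rate."""
    natural_cps = index * high_energy_cps
    return (total_cps - natural_cps) * dose_per_cps
```

    For example, a clean-area spectrum with 5,000 cps total and 250 cps above 1,400 keV gives a BG-Index of 20; a contaminated area measuring 12,000 cps total and 250 cps above 1,400 keV is then assigned 5,000 cps of natural background and 7,000 cps of cesium signal. The Cs-134 peak at 1,365 keV leaking into the >1,400 keV window inflates natural_cps, which is exactly the overestimation the record corrects.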

  1. Integrating count and detection–nondetection data to model population dynamics

    USGS Publications Warehouse

    Zipkin, Elise F.; Rossman, Sam; Yackulic, Charles B.; Wiens, David; Thorson, James T.; Davis, Raymond J.; Grant, Evan H. Campbell

    2017-01-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture–recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection–nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection–nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection–nondetection data (1995–2014) with newly collected count data (2015–2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance.

  2. Integrating count and detection-nondetection data to model population dynamics.

    PubMed

    Zipkin, Elise F; Rossman, Sam; Yackulic, Charles B; Wiens, J David; Thorson, James T; Davis, Raymond J; Grant, Evan H Campbell

    2017-06-01

    There is increasing need for methods that integrate multiple data types into a single analytical framework as the spatial and temporal scale of ecological research expands. Current work on this topic primarily focuses on combining capture-recapture data from marked individuals with other data types into integrated population models. Yet, studies of species distributions and trends often rely on data from unmarked individuals across broad scales where local abundance and environmental variables may vary. We present a modeling framework for integrating detection-nondetection and count data into a single analysis to estimate population dynamics, abundance, and individual detection probabilities during sampling. Our dynamic population model assumes that site-specific abundance can change over time according to survival of individuals and gains through reproduction and immigration. The observation process for each data type is modeled by assuming that every individual present at a site has an equal probability of being detected during sampling processes. We examine our modeling approach through a series of simulations illustrating the relative value of count vs. detection-nondetection data under a variety of parameter values and survey configurations. We also provide an empirical example of the model by combining long-term detection-nondetection data (1995-2014) with newly collected count data (2015-2016) from a growing population of Barred Owl (Strix varia) in the Pacific Northwest to examine the factors influencing population abundance over time. Our model provides a foundation for incorporating unmarked data within a single framework, even in cases where sampling processes yield different detection probabilities. This approach will be useful for survey design and to researchers interested in incorporating historical or citizen science data into analyses focused on understanding how demographic rates drive population abundance. 
© 2017 by the Ecological Society of America.
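    The shared state-space structure described in the two records above can be illustrated with a small simulator: one latent abundance process per site, observed both as binomial counts and as detection-nondetection records. All parameter names and values here are ours, not the authors', and for simplicity the detection-nondetection data are derived from the same simulated survey rather than from an independent one with its own detection probability:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate(n_sites=50, n_years=10, lam0=5.0, surv=0.8, gain=1.0, p=0.4):
    """Simulate the dynamic population model: site abundance changes by
    binomial survival plus Poisson gains (reproduction/immigration);
    counts are binomial thinnings of abundance, and detection-nondetection
    records whether at least one individual was detected."""
    N = rng.poisson(lam0, n_sites)  # initial site-specific abundance
    counts, detections = [], []
    for _ in range(n_years):
        y = rng.binomial(N, p)                 # count data
        counts.append(y)
        detections.append((y > 0).astype(int)) # detection-nondetection
        # survival of individuals plus gains
        N = rng.binomial(N, surv) + rng.poisson(gain, n_sites)
    return np.array(counts), np.array(detections)
```

    Fitting the model (rather than simulating from it) would place likelihoods on both data streams that share the latent N, which is what lets sparse detection-nondetection data borrow strength from the counts.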

  3. Detector Design Considerations in High-Dimensional Artificial Immune Systems

    DTIC Science & Technology

    2012-03-22

    a method known as randomized RNS [15]. In this approach, Monte Carlo integration is used to determine the size of self and non-self within the given...feature space, then a number of randomly placed detectors are chosen according to Monte Carlo integration calculations. Simulated annealing is then...detector is only counted once). This value is termed 'actual content' because it does not include overlapping content, but only that content that is

  4. Standardization of Ga-68 by coincidence measurements, liquid scintillation counting and 4πγ counting.

    PubMed

    Roteta, Miguel; Peyres, Virginia; Rodríguez Barquero, Leonor; García-Toraño, Eduardo; Arenillas, Pablo; Balpardo, Christian; Rodrígues, Darío; Llovera, Roberto

    2012-09-01

    The radionuclide (68)Ga is one of the few positron emitters that can be prepared in-house without the use of a cyclotron. It disintegrates to the ground state of (68)Zn partially by positron emission (89.1%) with a maximum energy of 1899.1 keV, and partially by electron capture (10.9%). This nuclide has been standardized in the frame of a cooperation project between the Radionuclide Metrology laboratories from CIEMAT (Spain) and CNEA (Argentina). Measurements involved several techniques: 4πβ-γ coincidences, integral gamma counting and Liquid Scintillation Counting using the triple to double coincidence ratio and the CIEMAT/NIST methods. Given the short half-life of the radionuclide assayed, a direct comparison between results from both laboratories was excluded and a comparison of experimental efficiencies of similar NaI detectors was used instead. Copyright © 2012 Elsevier Ltd. All rights reserved.

  5. Simulated fissioning of uranium and testing of the fission-track dating method

    USGS Publications Warehouse

    McGee, V.E.; Johnson, N.M.; Naeser, C.W.

    1985-01-01

    A computer program (FTD-SIM) faithfully simulates the fissioning of 238U with time and of 235U with neutron dose. The simulation is based on first principles of physics, where the fissioning of 238U over time is described by Ns = λf(238U)t and the fissioning of 235U under a neutron fluence is described by Ni = σ(235U)Φ. The Poisson law is used to set the stochastic variation of fissioning within the uranium population. The life history of a given crystal can thus be traced under an infinite variety of age and irradiation conditions. A single dating attempt, or up to 500 dating attempts on a given crystal population, can be simulated by specifying the age of the crystal population, the size and variation of the areas to be counted, the amount and distribution of uranium, the neutron dose to be used and its variation, and the desired ratio of 238U to 235U. A variety of probability distributions can be applied to the uranium content and counting area. The Price and Walker age equation is used to estimate age. The output of FTD-SIM includes the tabulated results of each individual dating attempt (sample) on demand and/or the summary statistics and histograms for multiple dating attempts (samples), including the sampling age. An analysis of the results from FTD-SIM shows that: (1) The external detector method is intrinsically more precise than the population method. (2) For the external detector method, a correlation between the spontaneous track count, Ns, and the induced track count, Ni, results when the population of grains has a stochastic uranium content and/or when the counting areas differ stochastically between grains. For the population method no such correlation can exist. (3) In the external detector method, the sampling distribution of age is independent of the number of grains counted; in the population method it is highly dependent on the number of grains counted. 
(4) Grains with zero-track counts, in either Ns or Ni, are an integral part of fissioning theory and under certain circumstances must be included in any estimate of age. (5) In estimating the standard error of age, the standard errors of Ns, Ni, and Φ must be accurately estimated and propagated through the age equation. Several statistical models are presently available to do so. © 1985.
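    The Poisson core of such a simulation can be sketched in a few lines using the small-age (linear) limit of the age equation. Apart from the 238U spontaneous-fission decay constant and the natural 238U/235U atom ratio, the constants below are invented for illustration, and the full Price and Walker equation with calibration is omitted:

```python
import numpy as np

rng = np.random.default_rng(0)

LAM_F = 8.46e-17        # 238U spontaneous-fission decay constant (1/yr)
RATIO_238_235 = 137.88  # natural 238U/235U atom ratio
SIGMA_PHI = 1e-8        # cross-section x neutron fluence (invented value)

def simulate_counts(age_yr, n238_atoms, n_grains=200):
    """Poisson track counts per grain: Ns = lam_f * [238U] * t (spontaneous)
    and Ni = sigma * [235U] * Phi (induced), as in the abstract."""
    ns = rng.poisson(LAM_F * n238_atoms * age_yr, n_grains)
    ni = rng.poisson(SIGMA_PHI * n238_atoms / RATIO_238_235, n_grains)
    return ns, ni

def age_estimate(ns, ni):
    """Small-age limit of the age equation:
    t ~ (Ns/Ni) * sigma*Phi / (lam_f * 137.88), with counts pooled
    over all grains."""
    return ns.sum() / ni.sum() * SIGMA_PHI / (LAM_F * RATIO_238_235)
```

    Repeating simulate_counts many times gives the sampling distribution of age, which is how points (1)-(3) above (external detector versus population method precision) can be examined numerically.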

  6. Low-pressure membrane integrity tests for drinking water treatment: A review.

    PubMed

    Guo, H; Wyart, Y; Perot, J; Nauleau, F; Moulin, P

    2010-01-01

    Low-pressure membrane systems, including microfiltration (MF) and ultrafiltration (UF) membranes, are increasingly used in drinking water treatment due to their high level of pathogen removal. However, pathogens will pass through the membrane and contaminate the product if membrane integrity is compromised. Therefore, an effective on-line integrity monitoring method for MF and UF membrane systems is essential to guarantee that regulatory requirements for pathogen removal are met. Much work on low-pressure membrane integrity tests has been conducted. This paper reviews the different low-pressure membrane integrity monitoring methods for drinking water treatment, including direct methods (pressure-based tests, the acoustic sensor test, liquid porosimetry, etc.) and indirect methods (particle counting, particle monitoring, turbidity monitoring, and surrogate challenge tests). Some information about the operation of membrane integrity tests is also presented. This review makes clear that it remains urgent to develop an alternative on-line detection technique for quick, accurate, simple, continuous, and relatively inexpensive evaluation of low-pressure membrane integrity. To better satisfy regulatory requirements for drinking water treatment, the characteristics of this ideal membrane integrity test are proposed at the end of the paper.

  7. The unbiasedness of a generalized mirage boundary correction method for Monte Carlo integration estimators of volume

    Treesearch

    Thomas B. Lynch; Jeffrey H. Gove

    2014-01-01

    The typical "double counting" application of the mirage method of boundary correction cannot be applied to sampling systems such as critical height sampling (CHS) that are based on a Monte Carlo sample of a tree (or debris) attribute because the critical height (or other random attribute) sampled from a mirage point is generally not equal to the critical...

  8. Differential white cell count by centrifugal microfluidics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sommer, Gregory Jon; Tentori, Augusto M.; Schaff, Ulrich Y.

    We present a method for counting white blood cells that is uniquely compatible with centrifugation-based microfluidics. Blood is deposited on top of one or more layers of density media within a microfluidic disk. Spinning the disk causes the cell populations within whole blood to settle through the media, reaching an equilibrium based on the density of each cell type. Separation and fluorescence measurement of cell types stained with a DNA dye are demonstrated using this technique. The integrated signal from bands of fluorescent microspheres is shown to be proportional to their initial concentration in suspension. Among the current generation of medical diagnostics are devices based on the principle of centrifuging a CD-sized disk functionalized with microfluidics. These portable 'lab on a disk' devices are capable of conducting multiple assays directly from a blood sample, embodied by platforms developed by Gyros, Samsung, and Abaxis [1,2]. However, no centrifugal platform to date includes a differential white blood cell count, an important metric complementary to diagnostic assays. Measuring the differential white blood cell count (the relative fractions of granulocytes, lymphocytes, and monocytes) is a standard medical diagnostic technique useful for identifying sepsis, leukemia, AIDS, radiation exposure, and a host of other conditions that affect the immune system. Several methods exist for measuring the relative white blood cell count, including flow cytometry, electrical impedance, and visual identification from a stained drop of blood under a microscope. However, none of these methods is easily incorporated into a centrifugal microfluidic diagnostic platform.

  9. Blood–brain barrier integrity, intrathecal immunoactivation, and neuronal injury in HIV

    PubMed Central

    Yilmaz, Aylin; Hagberg, Lars; Zetterberg, Henrik; Nilsson, Staffan; Brew, Bruce J.; Fuchs, Dietmar; Price, Richard W.; Gisslén, Magnus

    2016-01-01

    Objective: Although blood–brain barrier (BBB) impairment has been reported in HIV-infected individuals, characterization of this impairment has not been clearly defined. Methods: BBB integrity was measured by CSF/plasma albumin ratio in this cross-sectional study of 631 HIV-infected individuals and 71 controls. We also analyzed CSF and blood HIV RNA and neopterin, CSF leukocyte count, and neurofilament light chain protein (NFL) concentrations. The HIV-infected participants included untreated neuroasymptomatic patients, patients with untreated HIV-associated dementia (HAD), and participants on suppressive antiretroviral treatment (ART). Results: The albumin ratio was significantly increased in patients with HAD compared to all other groups. There were no significant differences between untreated neuroasymptomatic participants, treated participants, and controls. BBB integrity, however, correlated significantly with CSF leukocyte count, CSF HIV RNA, serum and CSF neopterin, and age in untreated neuroasymptomatic participants. In a multiple linear regression analysis, age, CSF neopterin, and CSF leukocyte count stood out as independent predictors of albumin ratio. A significant correlation was found between albumin ratio and CSF NFL in untreated neuroasymptomatic patients and in participants on ART. Albumin ratio, age, and CD4 cell count were confirmed as independent predictors of CSF NFL in multivariable analysis. Conclusions: BBB disruption was mainly found in patients with HAD, where BBB damage correlated with CNS immunoactivation. Albumin ratios also correlated with CSF inflammatory markers and NFL in untreated neuroasymptomatic participants. These findings give support to the association among BBB deterioration, intrathecal immunoactivation, and neuronal injury in untreated neuroasymptomatic HIV-infected individuals. PMID:27868081

  10. Investigation of bicycle and pedestrian continuous and short duration count technologies in Oregon : final report : SPR 772.

    DOT National Transportation Integrated Search

    2016-05-01

    While motorized traffic counts are systematic and comprehensive, bicycle and pedestrian counts are : often unknown or inaccurate. This research presents recommendations to increase bicycle and : pedestrian count accuracy while integrating bicycle and...

  11. Integrating chronological uncertainties for annually laminated lake sediments using layer counting, independent chronologies and Bayesian age modelling (Lake Ohau, South Island, New Zealand)

    NASA Astrophysics Data System (ADS)

    Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher

    2018-05-01

    Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology, and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.

  12. Improvement of spatial resolution in a Timepix based CdTe photon counting detector using ToT method

    NASA Astrophysics Data System (ADS)

    Park, Kyeongjin; Lee, Daehee; Lim, Kyung Taek; Kim, Giyoon; Chang, Hojong; Yi, Yun; Cho, Gyuseong

    2018-05-01

    Photon counting detectors (PCDs) have been recognized as potential candidates for X-ray radiography and computed tomography due to their many advantages over conventional energy-integrating detectors. In particular, a PCD-based X-ray system shows an improved contrast-to-noise ratio and a reduced radiation exposure dose, and, more importantly, exhibits a capability for material decomposition with energy binning. Some applications require very high resolution, which translates into smaller pixel sizes. Unfortunately, small pixels may suffer from spectral distortion (degraded energy resolution) due to charge sharing effects (CSEs). In this work, we propose a method for correcting CSEs by measuring the point of interaction of an incident X-ray photon with the time-over-threshold (ToT) method. Moreover, we show that it is possible to obtain an X-ray image with a reduced effective pixel size by using the concept of virtual pixels at a given pixel size. To verify the proposed method, modulation transfer function (MTF) and signal-to-noise ratio (SNR) measurements were carried out with the Timepix chip combined with a CdTe pixel sensor. The X-ray test condition was set at 80 kVp with 5 μA, and a tungsten edge phantom and a lead line phantom were used for the measurements. Enhanced spatial resolution was achieved by applying the proposed method compared to the conventional photon counting method: the MTF increased from 6.3 lp/mm (conventional counting method) to 8.3 lp/mm (proposed method) at 0.3 MTF. On the other hand, the SNR decreased from 33.08 to 26.85 dB due to the four virtual pixels.

  13. Advances in the computation of the Sjöstrand, Rossi, and Feynman distributions

    DOE PAGES

    Talamo, A.; Gohar, Y.; Gabrielli, F.; ...

    2017-02-01

    This study illustrates recent computational advances in the application of the Sjöstrand (area), Rossi, and Feynman methods to estimate the effective multiplication factor of a subcritical system driven by an external neutron source. The methodologies introduced in this study have been validated against experimental results from the KUCA facility in Japan using Monte Carlo (MCNP6 and MCNPX) and deterministic (ERANOS, VARIANT, and PARTISN) codes. When the assembly is driven by a pulsed neutron source generated by a particle accelerator and delayed neutrons are at equilibrium, the Sjöstrand method becomes extremely fast if the integral of the reaction rate from a single pulse is split into two parts. These two integrals distinguish between the neutron counts during and after the pulse period. Finally, when the facility is driven by a spontaneous fission neutron source, the timestamps of the detector neutron counts can be obtained with nanosecond precision using MCNP6, which allows the Rossi and Feynman distributions to be obtained.
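    For the pulsed-source case, the area (Sjöstrand) method reduces to a ratio of two integrals of the detector response: the prompt area above the delayed-neutron plateau and the delayed area under it. A toy version with a synthetic pulse (beta_eff and all signal parameters are illustrative, not KUCA values):

```python
import numpy as np

def keff_area_method(counts_per_s, dt, delayed_level, beta_eff=0.0075):
    """Sjostrand (area) method: split the reaction-rate integral over one
    pulse period into the prompt area (counts above the delayed-neutron
    plateau) and the delayed area (the plateau itself). The reactivity in
    dollars is -A_prompt/A_delayed, so
    k_eff = 1 / (1 + beta_eff * A_prompt / A_delayed)."""
    total_area = counts_per_s.sum() * dt
    delayed_area = delayed_level * len(counts_per_s) * dt
    prompt_area = total_area - delayed_area
    return 1.0 / (1.0 + beta_eff * prompt_area / delayed_area)

# Synthetic response: a decaying prompt burst on a delayed-neutron plateau.
dt = 1e-4                                 # s per bin
t = np.arange(10000) * dt                 # one 1 s pulse period
signal = 1e5 * np.exp(-t / 0.01) + 100.0  # prompt decay + delayed plateau
```

    Splitting the integral into during-pulse and after-pulse parts, as the record describes, only changes how the two areas are accumulated, not the final ratio.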

  14. Upgraded photon calorimeter with integrating readout for Hall A Compton Polarimeter at Jefferson Lab

    DOE PAGES

    Friend, M.; Parno, D.; Benmokhtar, F.; ...

    2012-06-01

    The photon arm of the Compton polarimeter in Hall A of Jefferson Lab has been upgraded to allow electron beam polarization measurements with better than 1% accuracy. The data acquisition system (DAQ) now includes an integrating mode, which eliminates several systematic uncertainties inherent in the original counting-DAQ setup. The photon calorimeter has been replaced with a Ce-doped Gd2SiO5 crystal, which has a bright output and fast response and works well for measurements using the new integrating method at electron beam energies from 1 to 6 GeV.

  15. Characterization of Sphinx1 ASIC X-ray detector using photon counting and charge integration

    NASA Astrophysics Data System (ADS)

    Habib, A.; Arques, M.; Moro, J.-L.; Accensi, M.; Stanchina, S.; Dupont, B.; Rohr, P.; Sicard, G.; Tchagaspanian, M.; Verger, L.

    2018-01-01

    Sphinx1 is a novel pixel architecture adapted for X-ray imaging that detects radiation by photon counting and by charge integration. In photon counting mode, each photon is compensated by one or more counter-charges, each typically of 100 electrons (e-). The number of counter-charges required gives a measure of the incoming photon energy, thus allowing spectrometric detection. Pixels can also detect radiation by integrating the charges deposited by all incoming photons during one image frame and converting this analog value into a digital response with a 100-electron least significant bit (LSB), based on the same counter-charge concept. A proof-of-concept test chip measuring 5 mm × 5 mm, with 200 μm × 200 μm pixels, has been produced and characterized. This paper details the architecture and the counter-charge design and describes the two modes of operation: photon counting and charge integration. The first performance measurements for this test chip are presented. Noise was found to be ~80 e- rms in photon counting mode, with a power consumption of only 0.9 μW/pixel for the static analog part and 0.3 μW/pixel for the static digital part.
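    The counter-charge concept amounts to quantizing the collected charge in 100 e- steps; a schematic version (function name is ours, and the silicon pair-creation energy used in the comment is an assumption, since the record does not give the sensor material):

```python
import math

LSB_ELECTRONS = 100  # one counter-charge = 100 e-, per the abstract

def counter_charges(charge_electrons):
    """Number of 100 e- counter-charges injected to balance the collected
    charge. This integer is the digitized output in both modes: per photon
    in counting mode, per frame in integration mode."""
    return math.ceil(charge_electrons / LSB_ELECTRONS)
```

    For instance, a photon depositing 2,778 e- (roughly 10 keV if one assumes a silicon-like 3.6 eV per electron-hole pair) is balanced by 28 counter-charges, so the 100 e- LSB sets the energy resolution floor of the scheme.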

  16. Integrating molecular diagnostic and flow cytometric reporting for improved longitudinal monitoring of HIV patients.

    PubMed Central

    Asare, A. L.; Huda, H.; Klimczak, J. C.; Caldwell, C. W.

    1998-01-01

    Studies have shown that monitoring HIV-infected patients undergoing antiretroviral therapy is best represented by combined measurement of plasma HIV-1 RNA and CD4+ T-lymphocytes [1]. This pilot study at the University of Missouri-Columbia integrates molecular diagnostic and flow cytometric data reporting to provide current and historical HIV-1 RNA levels and CD4+ T-cell counts. The development of a single database for storage and retrieval of these values facilitates composite report generation that includes longitudinal HIV-1 RNA levels and CD4+ T-cell counts for all patients. Results are displayed in tables and plotted graphically within a web browser. This method of data presentation converts individual data points to more useful medical information and could provide clinicians with decision support for improved monitoring of HIV patients undergoing antiretroviral therapy. PMID:9929359

  17. CROSS-DISCIPLINARY PHYSICS AND RELATED AREAS OF SCIENCE AND TECHNOLOGY: Surface Micromachined Adjustable Micro-Concave Mirror for Bio-Detection Applications

    NASA Astrophysics Data System (ADS)

    Kuo, Ju-Nan; Chen, Wei-Lun; Jywe, Wen-Yuh

    2009-08-01

    We present a bio-detection system integrated with an adjustable micro-concave mirror. The bio-detection system consists of an adjustable micro-concave mirror, micro flow cytometer chip and optical detection module. The adjustable micro-concave mirror can be fabricated with ease using commercially available MEMS foundry services (such as multiuser MEMS processes, MUMPs) and its curvature can be controlled utilizing thermal or electrical effects. Experimental results show that focal lengths of the micro-concave mirror ranging from 313.5 to 2275.0 μm are achieved. The adjustable micro-concave mirror can be used to increase the efficiency of optical detection and provide a high signal-to-noise ratio. The developed micro-concave mirror is integrated with a micro flow cytometer for cell counting applications. Successful counting of fluorescent-labeled beads is demonstrated using the developed method.

  18. [Interconnection of stress and physical development processes in young persons].

    PubMed

    Barbarash, N A; Kuvshinkov, D Iu; Tul'chinskiĭ, M Ia

    2003-01-01

    The physical development (PD) rates, constitutional peculiarities, and an integral level of different manifestations of stress reactivity (SR) were evaluated in 201 students of a medical academy (73 men and 138 women) aged 17-21. These parameters were assessed by the Luscher color test, Taylor's anxiety scale, 'individual minute' measurements, iridoscopic counts of iris nervous rings, the 'mathematical count' technique, and the index of regulatory-system tension calculated from heart rate variability. In young men, the highest total SR index, including its cardiac manifestations, was found in those with the lowest PD index, and the integral SR level correlated inversely with the PD parameters. These relations are more pronounced in individuals of the abdominal somatic type. The mechanisms and biological significance of the correlations between SR and the processes of growth and development are discussed.

  19. Relations between elliptic multiple zeta values and a special derivation algebra

    NASA Astrophysics Data System (ADS)

    Broedel, Johannes; Matthes, Nils; Schlotterer, Oliver

    2016-04-01

    We investigate relations between elliptic multiple zeta values (eMZVs) and describe a method to derive the number of indecomposable elements of given weight and length. Our method is based on representing eMZVs as iterated integrals over Eisenstein series and exploiting the connection with a special derivation algebra. Its commutator relations give rise to constraints on the iterated integrals over Eisenstein series relevant for eMZVs and thereby allow counting of the indecomposable representatives. Conversely, the above connection suggests apparently new relations in the derivation algebra. At https://tools.aei.mpg.de/emzv we provide relations for eMZVs over a wide range of weights and lengths.

  20. Signal to noise ratio of energy selective x-ray photon counting systems with pileup.

    PubMed

    Alvarez, Robert E

    2014-11-01

    To derive fundamental limits on the effect of pulse pileup and quantum noise in photon counting detectors on the signal to noise ratio (SNR) and noise variance of energy selective x-ray imaging systems. An idealized model of the response of counting detectors to pulse pileup is used. The model assumes a nonparalyzable response and a delta-function pulse shape, and is used to derive analytical formulas for the noise and energy spectrum of the recorded photons with pulse pileup. These formulas are first verified with a Monte Carlo simulation. They are then used with a method introduced in a previous paper [R. E. Alvarez, "Near optimal energy selective x-ray imaging system performance with simple detectors," Med. Phys. 37, 822-841 (2010)] to compare the signal to noise ratio with pileup to the ideal SNR with perfect energy resolution. Detectors studied include photon counting detectors with pulse height analysis (PHA), detectors that simultaneously measure the number of photons and the integrated energy (NQ detectors), and conventional energy integrating and photon counting detectors. The increase in the A-vector variance with dead time is also computed and compared to the Monte Carlo results. A formula for the covariance of the NQ detector is developed, and the validity of the constant-covariance approximation to the Cramér-Rao lower bound (CRLB) for larger counts is tested. The SNR becomes smaller than that of the conventional energy integrating (Q) detector at 0.52, 0.65, and 0.78 expected photons per dead time for counting (N), two-bin, and four-bin PHA detectors, respectively. The NQ detector SNR is always larger than the N and Q SNR, but only marginally so for larger dead times. Its noise variance increases by factors of approximately 3 and 5 for the A1 and A2 components as the dead time parameter increases from 0 to 0.8 photons per dead time; with four-bin PHA data, the increases in variance are approximately 2 and 4 times. 
The constant-covariance approximation to the CRLB is valid for larger counts such as those used in medical imaging. The SNR decreases rapidly as dead time increases, which places stringent limits on allowable dead times at the high count rates required for medical imaging systems. The probability distribution of the idealized data with pileup is shown to be accurately described as multivariate normal for expected counts greater than those typically utilized in medical imaging systems, and the constant-covariance approximation to the CRLB is also shown to be valid in this case. A new formula for the covariance of the NQ detector with pileup is derived and validated.
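    The nonparalyzable model assumed in this analysis has a simple closed form for count-rate loss, which is also what the dead-time parameter (expected photons per dead time) in the results refers to; a sketch:

```python
def recorded_rate(true_rate, dead_time):
    """Nonparalyzable dead-time model: each recorded event blocks the
    detector for `dead_time` seconds, so the recorded rate is
    m = n / (1 + n * dead_time) at true rate n."""
    return true_rate / (1.0 + true_rate * dead_time)

def corrected_rate(measured_rate, dead_time):
    """Invert the relation to recover the true rate (valid while the
    measured rate stays below 1/dead_time)."""
    return measured_rate / (1.0 - measured_rate * dead_time)
```

    At 0.1 expected photons per dead time (e.g. 10^6 photons/s into a 100 ns dead time), about 9% of events are lost; the mean rate can be corrected this way, but the spectral distortion and extra variance analyzed in the record cannot.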

  1. Flexible Method for Inter-object Communication in C++

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Gould, Jack J.

    1994-01-01

    A method has been developed for organizing and sharing large amounts of information between objects in C++ code. This method uses a set of object classes to define variables and group them into tables. The variable tables presented here provide a convenient way of defining and cataloging data, as well as a user-friendly input/output system, a standardized set of access functions, mechanisms for ensuring data integrity, methods for interprocessor data transfer, and an interpretive language for programming relationships between parameters. The object-oriented nature of these variable tables enables the use of multiple data types, each with unique attributes and behavior. Because each variable provides its own access methods, redundant table lookup functions can be bypassed, thus decreasing access times while maintaining data integrity. In addition, a method for automatic reference counting was developed to manage memory safely.

  2. Concept for an off-line gain stabilisation method.

    PubMed

    Pommé, S; Sibbens, G

    2004-01-01

    Conceptual ideas are presented for an off-line gain stabilisation method for spectrometry, in particular for alpha-particle spectrometry at low count rate. The method involves list mode storage of individual energy and time stamp data pairs. The 'Stieltjes integral' of measured spectra with respect to a reference spectrum is proposed as an indicator for gain instability. 'Exponentially moving averages' of the latter show the gain shift as a function of time. With this information, the data are relocated stochastically on a point-by-point basis.
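
The 'exponentially moving average' step of the method can be sketched in a few lines. This is an illustration of EMA drift tracking on synthetic list-mode energies, not the authors' Stieltjes-integral indicator; all names and numbers are hypothetical:

```python
def ema_gain_shift(energies, ref_energy, alpha=0.01):
    """Track the relative gain shift event-by-event with an
    exponentially moving average (illustrative sketch)."""
    shifts, ema = [], 0.0
    for e in energies:
        ema = (1 - alpha) * ema + alpha * (e / ref_energy - 1.0)
        shifts.append(ema)
    return shifts

# synthetic list-mode energies drifting upward by 2% over the run
events = [100.0 * (1 + 0.02 * i / 9999) for i in range(10000)]
trace = ema_gain_shift(events, ref_energy=100.0)
```

The final EMA value approaches the imposed 2% drift, while the point-by-point trace shows the shift as a function of event index, mirroring the "gain shift as a function of time" role described above.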

  3. Assessing the transferability of a hybrid Taguchi-objective function method to optimize image segmentation for detecting and counting cave roosting birds using terrestrial laser scanning data

    NASA Astrophysics Data System (ADS)

    Idrees, Mohammed Oludare; Pradhan, Biswajeet; Buchroithner, Manfred F.; Shafri, Helmi Zulhaidi Mohd; Khairunniza Bejo, Siti

    2016-07-01

    As far back as the early 15th century, during the reign of the Ming Dynasty (1368 to 1644 AD), Gomantong cave in Sabah (Malaysia) has been known as one of the largest roosting sites for wrinkle-lipped bats (Chaerephon plicata) and swiftlet birds (Aerodramus maximus and Aerodramus fuciphagus) in very large colonies. Until recently, no study had been done to quantify or estimate the colony sizes of these inhabitants, in spite of the grave danger posed to this avifauna by human activities and potential habitat loss to postspeleogenetic processes. This paper evaluates the transferability of a hybrid optimization image-analysis-based method developed to detect and count cave roosting birds. The method utilizes high-resolution terrestrial laser scanning intensity images. First, segmentation parameters were optimized by integrating the objective function and statistical Taguchi methods. Thereafter, the optimized parameters were used as input into the segmentation and classification processes using two images selected from Simud Hitam (lower cave) and Simud Putih (upper cave) of the Gomantong cave. The results show that the method is capable of detecting birds (and bats) from the image for accurate population censusing. A total of 9998 swiftlet birds were counted from the first image, while 1132 individuals, comprising both bats and birds, were obtained from the second image. Furthermore, the transferability evaluation yielded overall accuracies of 0.93 and 0.94 (area under the receiver operating characteristic curve) for the first and second image, respectively, with a p-value of <0.0001 at the 95% confidence level. The findings indicate that the method is not only efficient for detecting and counting the cave birds for which it was developed but also useful for counting bats; thus, it can be adopted in any cave.

  4. Absolute dose calibration of an X-ray system and dead time investigations of photon-counting techniques

    NASA Astrophysics Data System (ADS)

    Carpentieri, C.; Schwarz, C.; Ludwig, J.; Ashfaq, A.; Fiederle, M.

    2002-07-01

    High precision in the dose calibration of X-ray sources is required when counting and integrating methods are compared. The dose calibration for a dental X-ray tube was carried out with dedicated dose calibration equipment (a dosimeter) as a function of exposure time and rate. Results were compared with a benchmark spectrum and agree within ±1.5%. Dead time investigations with the Medipix1 photon-counting chip (PCC) have been performed by rate variations. Two different types of dead time, paralysable and non-paralysable, are discussed. The dead time depends on the settings of the front-end electronics and is a function of signal height, which might lead to systematic errors. Dead time losses in excess of 30% have been found for the PCC at 200 kHz absorbed photons per pixel.
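
The two dead-time models have standard closed forms: a non-paralysable detector records m = n/(1 + nτ), while a paralysable one records m = n·exp(−nτ). A small sketch, with a hypothetical dead time and input rate chosen to land near the 30% loss figure quoted above (not the PCC's actual parameters):

```python
import math

def nonparalyzable(n, tau):
    """Recorded rate for a non-paralysable detector."""
    return n / (1 + n * tau)

def paralyzable(n, tau):
    """Recorded rate for a paralysable detector."""
    return n * math.exp(-n * tau)

def fractional_loss(model, n, tau):
    """Fraction of incident counts lost to dead time."""
    return 1.0 - model(n, tau) / n

tau = 1e-6        # hypothetical 1 microsecond dead time
n = 0.43e6        # input rate (counts/s), so n*tau = 0.43
loss_np = fractional_loss(nonparalyzable, n, tau)   # ~30% losses
loss_p = fractional_loss(paralyzable, n, tau)       # slightly worse
```

For the same n·τ the paralysable model always loses more counts, which is why distinguishing the two behaviours matters when correcting measured rates.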

  5. Note: Fully integrated active quenching circuit achieving 100 MHz count rate with custom technology single photon avalanche diodes.

    PubMed

    Acconcia, G; Labanca, I; Rech, I; Gulinatti, A; Ghioni, M

    2017-02-01

    The minimization of Single Photon Avalanche Diodes (SPADs) dead time is a key factor to speed up photon counting and timing measurements. We present a fully integrated Active Quenching Circuit (AQC) able to provide a count rate as high as 100 MHz with custom technology SPAD detectors. The AQC can also operate the new red enhanced SPAD and provide the timing information with a timing jitter Full Width at Half Maximum (FWHM) as low as 160 ps.

  6. Gender counts: A systematic review of evaluations of gender-integrated health interventions in low- and middle-income countries.

    PubMed

    Schriver, Brittany; Mandal, Mahua; Muralidharan, Arundati; Nwosu, Anthony; Dayal, Radhika; Das, Madhumita; Fehringer, Jessica

    2017-11-01

    As a result of new global priorities, there is a growing need for high-quality evaluations of gender-integrated health programmes. This systematic review examined 99 peer-reviewed articles on evaluations of gender-integrated (accommodating and transformative) health programmes with regard to their theory of change (ToC), study design, gender integration in data collection, analysis, and gender measures used. Half of the evaluations explicitly described a ToC or conceptual framework (n = 50) that guided strategies for their interventions. Over half (61%) of the evaluations used quantitative methods exclusively; 11% used qualitative methods exclusively; and 28% used mixed methods. Qualitative methods were not commonly detailed. Evaluations of transformative interventions were less likely than those of accommodating interventions to employ randomised controlled trials. Two-thirds of the reviewed evaluations reported including at least one specific gender-related outcome (n = 18 accommodating, n = 44 transformative). To strengthen evaluations of gender-integrated programmes, we recommend use of ToCs, explicitly including gender in the ToC, use of gender-sensitive measures, mixed-method designs, in-depth descriptions of qualitative methods, and attention to gender-related factors in data collection logistics. We also recommend further research to develop valid and reliable gender measures that are globally relevant.

  7. Deriving the Contribution of Blazars to the Fermi-LAT Extragalactic γ-ray Background at E > 10 GeV with Efficiency Corrections and Photon Statistics

    NASA Astrophysics Data System (ADS)

    Di Mauro, M.; Manconi, S.; Zechlin, H.-S.; Ajello, M.; Charles, E.; Donato, F.

    2018-04-01

    The Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indexes of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ∼10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.
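
The broken power law reported above can be turned into a flux contribution by integrating S·dN/dS over the resolved flux range. The sketch below uses the abstract's break flux and indices but an arbitrary normalization, so only relative comparisons are meaningful:

```python
import math

S_BREAK = 3.5e-11      # break flux from the abstract, ph cm^-2 s^-1

def dnds(s, k=1.0, g1=2.09, g2=1.07):
    """Broken power-law differential source counts dN/dS.
    Indices are from the abstract; k is an arbitrary normalization."""
    if s >= S_BREAK:
        return k * (s / S_BREAK) ** (-g1)
    return k * (s / S_BREAK) ** (-g2)

def total_flux(s_min, s_max, n=20000):
    """Trapezoidal integral of S * dN/dS on a log-spaced grid."""
    step = (math.log(s_max) - math.log(s_min)) / n
    xs = [math.exp(math.log(s_min) + i * step) for i in range(n + 1)]
    total = 0.0
    for a, b in zip(xs, xs[1:]):
        total += 0.5 * (a * dnds(a) + b * dnds(b)) * (b - a)
    return total

# flux resolved above the analysis sensitivity vs. above the break
flux_resolved = total_flux(7.5e-12, 1e-8)
flux_above_break = total_flux(S_BREAK, 1e-8)
```

With a measured normalization and an estimate of the total extragalactic background intensity, the ratio of such integrals is what yields a resolved fraction like the (42 ± 8)% quoted in the abstract.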

  8. Integration of semiconductor quantum dots into nano-bio-chip systems for enumeration of CD4+ T cell counts at the point-of-need

    PubMed Central

    Jokerst, Jesse V.; Floriano, Pierre N.; Christodoulides, Nicolaos; Simmons, Glennon W.; McDevitt, John T.

    2010-01-01

    Recent humanitarian efforts have led to the widespread release of antiretroviral drugs for the treatment of the more than 33 million HIV-afflicted people living in resource-scarce settings. Here, the enumeration of CD4+ T lymphocytes is required to establish the level at which the immune system has been compromised. The gold standard method used in developed countries, based on flow cytometry, though widely accepted and accurate, is precluded from widespread use in resource-scarce settings due to its high expense, high technical requirements, difficulty of operation and maintenance, and the lack of portability of these sophisticated laboratory-confined systems. As part of continuing efforts to develop practical diagnostic instrumentation, the integration of semiconductor nanocrystals (quantum dots, QDs) into a portable microfluidic-based lymphocyte capture and detection device is completed. This integrated system is capable of isolating and counting selected lymphocyte sub-populations (CD3+CD4+) from whole blood samples. By combining the unique optical properties of the QDs with the sample handling capabilities and cost effectiveness of novel microfluidic systems, a practical, portable lymphocyte measurement modality that correlates well with flow cytometry (R2 = 0.97) has been developed. This QD-based system significantly reduces the optical requirements relative to molecular fluorophores, and the mini-CD4 counting device is projected to be suitable for use in both point-of-need and resource-scarce settings. PMID:19023471

  9. Radiation and Temperature Hard Multi-Pixel Avalanche Photodiodes

    NASA Technical Reports Server (NTRS)

    Bensaoula, Abdelhak (Inventor); Starikov, David (Inventor); Pillai, Rajeev (Inventor)

    2017-01-01

    The structure and method of fabricating a radiation and temperature hard avalanche photodiode with integrated radiation and temperature hard readout circuit, comprising a substrate, an avalanche region, an absorption region, and a plurality of Ohmic contacts are presented. The present disclosure provides for tuning of spectral sensitivity and high device efficiency, resulting in photon counting capability with decreased crosstalk and reduced dark current.

  10. Amplitude distributions of dark counts and photon counts in NbN superconducting single-photon detectors integrated with the HEMT readout

    NASA Astrophysics Data System (ADS)

    Kitaygorsky, J.; Słysz, W.; Shouten, R.; Dorenbos, S.; Reiger, E.; Zwiller, V.; Sobolewski, Roman

    2017-01-01

    We present a new operation regime of NbN superconducting single-photon detectors (SSPDs), obtained by integrating them with a low-noise cryogenic high-electron-mobility transistor and a high-load resistor. The integrated sensors are designed to give a better understanding of the origin of dark counts triggered by the detector, as our scheme allows us to distinguish the origin of dark pulses from the actual photon pulses in SSPDs. The presented approach is based on a statistical analysis of the amplitude distributions of recorded trains of the SSPD photoresponse transients. It also enables us to obtain information on the energy of the incident photons and demonstrates some photon-number-resolving capability of meander-type SSPDs.

  11. Prediction of noise field of a propfan at angle of attack

    NASA Technical Reports Server (NTRS)

    Envia, Edmane

    1991-01-01

    A method for predicting the noise field of a propfan operating at an angle of attack to the oncoming flow is presented. The method takes advantage of the high-blade-count of the advanced propeller designs to provide an accurate and efficient formula for predicting their noise field. The formula, which is written in terms of the Airy function and its derivative, provides a very attractive alternative to the use of numerical integration. A preliminary comparison shows rather favorable agreement between the predictions from the present method and the experimental data.

  12. Avian leucocyte counting using the hemocytometer

    USGS Publications Warehouse

    Dein, F.J.; Wilson, A.; Fischer, D.; Langenberg, P.

    1994-01-01

    Automated methods for counting leucocytes in avian blood are not available because of the presence of nucleated erythrocytes and thrombocytes. Therefore, total white blood cell counts are performed by hand using a hemocytometer. The Natt and Herrick and the Unopette methods are the most common stain and diluent preparations for this procedure. Replicate hemocytometer counts using these two methods were performed on blood from four birds of different species. Cells present in each square of the hemocytometer were counted. Counting cells in the corner, side, or center hemocytometer squares produced statistically equivalent results; counting four squares per chamber provided a result similar to that obtained by counting nine squares; and the Unopette method was more precise for hemocytometer counting than was the Natt and Herrick method. The Unopette method is easier to learn and perform but is an indirect process, utilizing the differential count from a stained smear. The Natt and Herrick method is a direct total count, but cell identification is more difficult.
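
The hemocytometer arithmetic behind such counts is standard Neubauer-chamber bookkeeping: concentration = cells counted / (squares counted × 0.1 µL) × dilution. A small sketch (the dilution and cell numbers are hypothetical, not from the study):

```python
def wbc_per_ul(cells_counted, squares_counted, dilution,
               square_volume_ul=0.1):
    """Leucocytes per microliter of whole blood: each large
    Neubauer square holds 0.1 uL under the coverslip."""
    counted_volume = squares_counted * square_volume_ul
    return (cells_counted / counted_volume) * dilution

# e.g. 180 cells seen over 9 large squares at a 1:100 dilution
count = wbc_per_ul(180, 9, 100)    # 20000 cells per uL
```

The finding that four squares give results similar to nine simply means the per-square average is stable, so `squares_counted` can be reduced without biasing the estimate, only widening its variance.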

  13. Bi-photon spectral correlation measurements from a silicon nanowire in the quantum and classical regimes

    PubMed Central

    Jizan, Iman; Helt, L. G.; Xiong, Chunle; Collins, Matthew J.; Choi, Duk-Yong; Joon Chae, Chang; Liscidini, Marco; Steel, M. J.; Eggleton, Benjamin J.; Clark, Alex S.

    2015-01-01

    The growing requirement for photon pairs with specific spectral correlations in quantum optics experiments has created a demand for fast, high resolution and accurate source characterisation. A promising tool for such characterisation uses classical stimulated processes, in which an additional seed laser stimulates photon generation yielding much higher count rates, as recently demonstrated for a χ(2) integrated source in A. Eckstein et al. Laser Photon. Rev. 8, L76 (2014). In this work we extend these results to χ(3) integrated sources, directly measuring for the first time the relation between spectral correlation measurements via stimulated and spontaneous four wave mixing in an integrated optical waveguide, a silicon nanowire. We directly confirm the speed-up due to higher count rates and demonstrate that this allows additional resolution to be gained when compared to traditional coincidence measurements without any increase in measurement time. As the pump pulse duration can influence the degree of spectral correlation, all of our measurements are taken for two different pump pulse widths. This allows us to confirm that the classical stimulated process correctly captures the degree of spectral correlation regardless of pump pulse duration, and cements its place as an essential characterisation method for the development of future quantum integrated devices. PMID:26218609

  14. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    PubMed

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improved sensitivity in differential expression analyses.
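
MSblender's probability model is not spelled out in the abstract; as a simpler stand-in, the classic target-decoy estimate shows how a false discovery rate is read off a score threshold (the scores below are synthetic, and this is not MSblender's exact procedure):

```python
def target_decoy_fdr(target_scores, decoy_scores, threshold):
    """Estimate FDR at a score threshold as (#decoys passing) /
    (#targets passing) -- the classic target-decoy heuristic."""
    passing_targets = sum(s >= threshold for s in target_scores)
    passing_decoys = sum(s >= threshold for s in decoy_scores)
    return passing_decoys / passing_targets if passing_targets else 0.0

targets = [10.1, 9.4, 8.2, 7.7, 3.1, 2.5]   # synthetic PSM scores
decoys = [4.0, 3.3, 1.2]
fdr_loose = target_decoy_fdr(targets, decoys, 2.0)   # 2 of 6 pass
fdr_strict = target_decoy_fdr(targets, decoys, 5.0)  # no decoys pass
```

Sweeping the threshold and reporting identifications at a fixed FDR is the comparison the abstract makes when it says MSblender "identifies more PSMs than any single search engine at the same false discovery rate."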

  15. MSblender: a probabilistic approach for integrating peptide identifications from multiple database search engines

    PubMed Central

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.

    2011-01-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improved sensitivity in differential expression analyses. PMID:21488652

  16. Extended range radiation dose-rate monitor

    DOEpatents

    Valentine, Kenneth H.

    1988-01-01

    An extended range dose-rate monitor is provided which utilizes the pulse pileup phenomenon that occurs in conventional counting systems to alter the dynamic response of the system to extend the dose-rate counting range. The current pulses from a solid-state detector generated by radiation events are amplified and shaped prior to applying the pulses to the input of a comparator. The comparator generates one logic pulse for each input pulse which exceeds the comparator reference threshold. These pulses are integrated and applied to a meter calibrated to indicate the measured dose-rate in response to the integrator output. A portion of the output signal from the integrator is fed back to vary the comparator reference threshold in proportion to the output count rate to extend the sensitive dynamic detection range by delaying the asymptotic approach of the integrator output toward full scale as measured by the meter.

  17. Semantic similarity measure in biomedical domain leverage web search engine.

    PubMed

    Chen, Chi-Huang; Hsieh, Sheau-Ling; Weng, Yung-Ching; Chang, Wen-Yung; Lai, Feipei

    2010-01-01

    Semantic similarity measures play an essential role in Information Retrieval and Natural Language Processing. In this paper we propose a page-count-based semantic similarity measure and apply it in biomedical domains. Previous research in semantic-web-related applications has deployed various semantic similarity measures. Despite the usefulness of these measures in those applications, measuring the semantic similarity between two terms remains a challenging task. The proposed method exploits page counts returned by a Web search engine. We define various similarity scores for two given terms P and Q, using the page counts for the queries P, Q, and P AND Q. Moreover, we propose a novel approach to compute semantic similarity using lexico-syntactic patterns with page counts. These different similarity scores are integrated by adapting support vector machines, to leverage the robustness of semantic similarity measures. Experimental results on two datasets achieve correlation coefficients of 0.798 on the dataset provided by A. Hliaoutakis, 0.705 on the dataset provided by T. Pedersen with physician scores, and 0.496 on the dataset provided by T. Pedersen et al. with expert scores.
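
Page-count similarity scores of this family are typically built from the three counts for P, Q, and P AND Q. A Jaccard-style and an overlap-style score can be sketched as follows (the cutoff heuristic and the counts are illustrative, not the paper's exact definitions):

```python
def web_jaccard(p, q, pq, cutoff=5):
    """Jaccard-style score from page counts for P, Q and 'P AND Q'.
    Co-occurrence counts below `cutoff` are treated as noise."""
    if pq < cutoff:
        return 0.0
    return pq / (p + q - pq)

def web_overlap(p, q, pq, cutoff=5):
    """Overlap (Simpson) coefficient from the same three counts."""
    if pq < cutoff:
        return 0.0
    return pq / min(p, q)

# hypothetical page counts for two biomedical terms
sim = web_jaccard(120000, 85000, 30000)
```

Several such scores computed per term pair form the feature vector that an SVM, as in the abstract, can combine into a single robust similarity estimate.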

  18. A direct viable count method for the enumeration of attached bacteria and assessment of biofilm disinfection

    NASA Technical Reports Server (NTRS)

    Yu, F. P.; Pyle, B. H.; McFeters, G. A.

    1993-01-01

    This report describes the adaptation of an in situ direct viable count (in situ DVC) method to biofilm disinfection studies. The results obtained with this technique were compared to two other enumeration methods, the plate count (PC) and conventional direct viable count (c-DVC). An environmental isolate (Klebsiella pneumoniae Kp1) was used to form biofilms on stainless steel coupons in a stirred batch reactor. The in situ DVC method was applied to directly assess the viability of bacteria in biofilms without disturbing the integrity of the interfacial community. As additional advantages, results were available after 4 h instead of the 24 h incubation required for colony formation, and the total cell numbers remaining on the substratum were enumerated. Chlorine and monochloramine were used to determine the susceptibilities of attached and planktonic bacteria to disinfection treatment using this novel analytical approach. The planktonic cells in the reactor showed no significant change in susceptibility to disinfectants during the period of biofilm formation. In addition, the attached cells did not reveal any more resistance to disinfection than planktonic cells. The disinfection studies of young biofilms indicated that 0.25 mg/l free chlorine (at pH 7.2) and 1 mg/l monochloramine (at pH 9.0) have comparable disinfection efficiencies at 25 degrees C. Although it is a weaker disinfectant, monochloramine was more effective than free chlorine in removing attached bacteria from the substratum. The in situ DVC method always showed viable cell densities at least one log higher than the PC method, suggesting that the in situ DVC method is more efficient in the enumeration of biofilm bacteria. The results also indicated that the in situ DVC method can provide more accurate information regarding the cell numbers and viability of bacteria within biofilms following disinfection.

  19. An integrated modeling approach to estimating Gunnison Sage-Grouse population dynamics: combining index and demographic data.

    USGS Publications Warehouse

    Davis, Amy J.; Hooten, Mevin B.; Phillips, Michael L.; Doherty, Paul F.

    2014-01-01

    Evaluation of population dynamics for rare and declining species is often limited to data that are sparse and/or of poor quality. Frequently, the best data available for rare bird species are based on large-scale, population count data. These data are commonly based on sampling methods that lack consistent sampling effort, do not account for detectability, and are complicated by observer bias. For some species, short-term studies of demographic rates have been conducted as well, but the data from such studies are typically analyzed separately. To utilize the strengths and minimize the weaknesses of these two data types, we developed a novel Bayesian integrated model that links population count data and population demographic data through population growth rate (λ) for Gunnison sage-grouse (Centrocercus minimus). The long-term population index data available for Gunnison sage-grouse are annual (years 1953–2012) male lek counts. An intensive demographic study was also conducted from years 2005 to 2010. We were able to reduce the variability in expected population growth rates across time, while correcting for potential small sample size bias in the demographic data. We found the population of Gunnison sage-grouse to be variable and slightly declining over the past 16 years.

  20. Nutsedge Counts Predict Meloidogyne incognita Juvenile Counts in an Integrated Management System.

    PubMed

    Ou, Zhining; Murray, Leigh; Thomas, Stephen H; Schroeder, Jill; Libbin, James

    2008-06-01

    The southern root-knot nematode (Meloidogyne incognita), yellow nutsedge (Cyperus esculentus) and purple nutsedge (Cyperus rotundus) are important pests in crops grown in the southern US. Management of the individual pests rather than the pest complex is often unsuccessful due to mutually beneficial pest interactions. In an integrated pest management scheme using alfalfa to suppress nutsedges and M. incognita, we evaluated quadratic polynomial regression models for prediction of the number of M. incognita J2 in soil samples as a function of yellow and purple nutsedge plant counts, squares of nutsedge counts and the cross-product between nutsedge counts. In May 2005, purple nutsedge plant count was a significant predictor of M. incognita count. In July and September 2005, counts of both nutsedges and the cross-product were significant predictors. In 2006, the second year of the alfalfa rotation, counts of all three species were reduced. As a likely consequence, the predictive relationship between nutsedges and M. incognita was not significant for May and July. In September 2006, purple nutsedge was a significant predictor of M. incognita. These results lead us to conclude that nutsedge plant counts in a field infested with the M. incognita-nutsedge pest complex can be used as a visual predictor of M. incognita J2 populations, unless the numbers of nutsedge plants and M. incognita are all very low.
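
The quadratic-with-cross-product model described above (intercept, both linear terms, both squared terms, and the yellow × purple interaction) can be fit by ordinary least squares. A self-contained sketch on synthetic counts, using only the standard library (all coefficients and data are hypothetical):

```python
def design_row(y, p):
    """Quadratic-response-surface terms: intercept, linear terms,
    squares, and the yellow x purple cross-product."""
    return [1.0, y, p, y * y, p * p, y * p]

def fit_lstsq(X, z):
    """Solve the normal equations X'X b = X'z by Gaussian elimination."""
    k = len(X[0])
    A = [[sum(row[r] * row[c] for row in X) for c in range(k)]
         for r in range(k)]
    b = [sum(X[i][r] * z[i] for i in range(len(X))) for r in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for r in reversed(range(k)):              # back substitution
        s = sum(A[r][c] * beta[c] for c in range(r + 1, k))
        beta[r] = (b[r] - s) / A[r][r]
    return beta

# synthetic example: J2 counts generated from a known surface
truth = [10.0, 2.0, 3.0, 0.5, 0.25, 1.5]
data = [(y, p) for y in range(6) for p in range(6)]
X = [design_row(y, p) for y, p in data]
z = [sum(c * t for c, t in zip(design_row(y, p), truth)) for y, p in data]
beta = fit_lstsq(X, z)
```

On noise-free synthetic data the recovered coefficients match the generating surface; with field data, the significance of each term (as tested in the abstract) would come from the usual regression t-tests.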

  1. Nutsedge Counts Predict Meloidogyne incognita Juvenile Counts in an Integrated Management System

    PubMed Central

    Ou, Zhining; Murray, Leigh; Thomas, Stephen H.; Schroeder, Jill; Libbin, James

    2008-01-01

    The southern root-knot nematode (Meloidogyne incognita), yellow nutsedge (Cyperus esculentus) and purple nutsedge (Cyperus rotundus) are important pests in crops grown in the southern US. Management of the individual pests rather than the pest complex is often unsuccessful due to mutually beneficial pest interactions. In an integrated pest management scheme using alfalfa to suppress nutsedges and M. incognita, we evaluated quadratic polynomial regression models for prediction of the number of M. incognita J2 in soil samples as a function of yellow and purple nutsedge plant counts, squares of nutsedge counts and the cross-product between nutsedge counts. In May 2005, purple nutsedge plant count was a significant predictor of M. incognita count. In July and September 2005, counts of both nutsedges and the cross-product were significant predictors. In 2006, the second year of the alfalfa rotation, counts of all three species were reduced. As a likely consequence, the predictive relationship between nutsedges and M. incognita was not significant for May and July. In September 2006, purple nutsedge was a significant predictor of M. incognita. These results lead us to conclude that nutsedge plant counts in a field infested with the M. incognita-nutsedge pest complex can be used as a visual predictor of M. incognita J2 populations, unless the numbers of nutsedge plants and M. incognita are all very low. PMID:19259526

  2. Allele-specific copy-number discovery from whole-genome and whole-exome sequencing

    PubMed Central

    Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J.; Szatkiewicz, Jin P.

    2015-01-01

    Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol are freely available at https://sourceforge.net/projects/asgenseng/. PMID:25883151

  3. Improving the counting efficiency in time-correlated single photon counting experiments by dead-time optimization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peronio, P.; Acconcia, G.; Rech, I.

    Time-Correlated Single Photon Counting (TCSPC) has long been recognized as the most sensitive method for fluorescence lifetime measurements, but it often requires long data acquisition times. This drawback is related to the limited counting capability of the TCSPC technique, due to pile-up and counting loss effects. In recent years, multi-module TCSPC systems have been introduced to overcome this issue. Splitting the light into several detectors connected to independent TCSPC modules proportionally increases the counting capability. Of course, multi-module operation also increases the system cost and can cause space and power supply problems. In this paper, we propose an alternative approach based on a new detector and processing electronics designed to reduce the overall system dead time, thus enabling efficient photon collection at high excitation rates. We present a fast active quenching circuit for single-photon avalanche diodes which features a minimum dead time of 12.4 ns. We also introduce a new Time-to-Amplitude Converter (TAC) able to attain extra-short dead time thanks to the combination of a scalable array of monolithically integrated TACs and a sequential router. The fast TAC (F-TAC) makes it possible to operate the system towards the upper limit of detector count rate capability (∼80 Mcps) with reduced pile-up losses, addressing one of the historic criticisms of TCSPC. Preliminary measurements on the F-TAC are presented and discussed.

  4. Low photon-count tip-tilt sensor

    NASA Astrophysics Data System (ADS)

    Saathof, Rudolf; Schitter, Georg

    2016-07-01

Due to the low photon count from dark areas of the universe, the signal strength of a tip-tilt sensor is low, limiting the sky coverage of reliable tip-tilt measurements. This paper presents the low photon-count tip-tilt (LPC-TT) sensor, which potentially achieves improved signal strength. Its optical design spatially samples and integrates the scene, which increases the probability that several individual sources coincide on a detector segment. Laboratory experiments show the feasibility of spatial sampling and integration and the ability to measure tilt angles. Simulations show an SNR improvement of 10 dB over conventional tip-tilt sensors.

  5. Reliably counting atomic planes of few-layer graphene (n > 4).

    PubMed

    Koh, Yee Kan; Bae, Myung-Ho; Cahill, David G; Pop, Eric

    2011-01-25

We demonstrate a reliable technique for counting atomic planes (n) of few-layer graphene (FLG) on SiO2/Si substrates by Raman spectroscopy. Our approach is based on measuring the ratio of the integrated intensity of the graphene G peak to that of the optical phonon peak of Si, I(G)/I(Si), and is particularly useful in the range n > 4 where few methods exist. We compare our results with atomic force microscopy (AFM) measurements and Fresnel equation calculations. Then, we apply our method to unambiguously identify n of FLG devices on SiO2 and find that the mobility (μ ≈ 2000 cm² V⁻¹ s⁻¹) is independent of layer thickness for n > 4. Our findings suggest that electrical transport in gated FLG devices is dominated by carriers near the FLG/SiO2 interface and is thus limited by the environment, even for n > 4.

  6. Initial steps toward the realization of large area arrays of single photon counting pixels based on polycrystalline silicon TFTs

    NASA Astrophysics Data System (ADS)

    Liang, Albert K.; Koniczek, Martin; Antonuk, Larry E.; El-Mohri, Youcef; Zhao, Qihua; Jiang, Hao; Street, Robert A.; Lu, Jeng Ping

    2014-03-01

The thin-film semiconductor processing methods that enabled creation of inexpensive liquid crystal displays based on amorphous silicon transistors for cell phones and televisions, as well as desktop, laptop and mobile computers, also facilitated the development of devices that have become ubiquitous in medical x-ray imaging environments. These devices, called active matrix flat-panel imagers (AMFPIs), measure the integrated signal generated by incident x rays and offer detection areas as large as ~43 × 43 cm². In recent years, there has been growing interest in medical x-ray imagers that record information from x-ray photons on an individual basis. However, such photon counting devices have generally been based on crystalline silicon, a material not inherently suited to the cost-effective manufacture of monolithic devices of a size comparable to that of AMFPIs. Motivated by these considerations, we have developed an initial set of small-area prototype arrays using thin-film processing methods and polycrystalline silicon transistors. These prototypes were developed in the spirit of exploring the possibility of creating large area arrays offering single photon counting capabilities and, to our knowledge, are the first photon counting arrays fabricated using thin-film techniques. In this paper, the architecture of the prototype pixels is presented and considerations that influenced the design of the pixel circuits, including amplifier noise, TFT performance variations, and minimum feature size, are discussed.

  7. FAST NEUTRON DOSIMETER FOR HIGH TEMPERATURE OPERATION BY MEASUREMENT OF THE AMOUNT OF CESIUM 137 FORMED FROM A THORIUM WIRE

    DOEpatents

    McCune, D.A.

    1964-03-17

A method and device for measurement of integrated fast neutron flux in the presence of a large thermal neutron field are described. The device comprises a thorium wire surrounded by a thermal neutron attenuator that is, in turn, enclosed by heat-resistant material. The method consists of irradiating the device in a neutron field whereby neutrons with energies in excess of 1.1 MeV cause fast fissions in the thorium, then removing the thorium wire, separating the cesium-137 fission product by chemical means from the thorium, and finally counting the radioactivity of the cesium to determine the number of fissions which have occurred so that the integrated fast flux may be obtained. (AEC)

  8. Deriving the Contribution of Blazars to the Fermi-LAT Extragalactic γ-ray Background at E > 10 GeV with Efficiency Corrections and Photon Statistics

    DOE PAGES

    Di Mauro, M.; Manconi, S.; Zechlin, H. -S.; ...

    2018-03-29

Here, the Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indexes of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ~10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.
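
The final integration step, summing the flux contributed by sources drawn from a broken power-law dN/dS, can be sketched numerically with the break flux and indices quoted above (the normalization K and the flux limits are hypothetical; only flux fractions are meaningful here):

```python
import numpy as np

# Broken power law dN/dS with the quoted break flux and indices.
S_b, n_above, n_below, K = 3.5e-11, 2.09, 1.07, 1.0

def dNdS(S):
    S = np.asarray(S, dtype=float)
    return np.where(S >= S_b,
                    K * (S / S_b) ** -n_above,
                    K * (S / S_b) ** -n_below)

def flux_from_sources(S_lo, S_hi, n=100_000):
    """Trapezoidal integral of S * dN/dS over [S_lo, S_hi] on a log grid."""
    S = np.logspace(np.log10(S_lo), np.log10(S_hi), n)
    y = S * dNdS(S)
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(S)))

total = flux_from_sources(1e-13, 1e-8)        # assumed full flux range
resolved = flux_from_sources(7.5e-12, 1e-8)   # above the analysis reach
print(resolved / total)                       # fraction of the flux resolved
```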

  9. Deriving the Contribution of Blazars to the Fermi-LAT Extragalactic γ-ray Background at E > 10 GeV with Efficiency Corrections and Photon Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Di Mauro, M.; Manconi, S.; Zechlin, H. -S.

Here, the Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indexes of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ~10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.

  10. Forecasting Emergency Department Crowding: An External, Multi-Center Evaluation

    PubMed Central

    Hoot, Nathan R.; Epstein, Stephen K.; Allen, Todd L.; Jones, Spencer S.; Baumlin, Kevin M.; Chawla, Neal; Lee, Anna T.; Pines, Jesse M.; Klair, Amandeep K.; Gordon, Bradley D.; Flottemesch, Thomas J.; LeBlanc, Larry J.; Jones, Ian; Levin, Scott R.; Zhou, Chuan; Gadd, Cynthia S.; Aronsky, Dominik

    2009-01-01

    Objective To apply a previously described tool to forecast ED crowding at multiple institutions, and to assess its generalizability for predicting the near-future waiting count, occupancy level, and boarding count. Methods The ForecastED tool was validated using historical data from five institutions external to the development site. A sliding-window design separated the data for parameter estimation and forecast validation. Observations were sampled at consecutive 10-minute intervals during 12 months (n = 52,560) at four sites and 10 months (n = 44,064) at the fifth. Three outcome measures – the waiting count, occupancy level, and boarding count – were forecast 2, 4, 6, and 8 hours beyond each observation, and forecasts were compared to observed data at corresponding times. The reliability and calibration were measured following previously described methods. After linear calibration, the forecasting accuracy was measured using the median absolute error (MAE). Results The tool was successfully used for five different sites. Its forecasts were more reliable, better calibrated, and more accurate at 2 hours than at 8 hours. The reliability and calibration of the tool were similar between the original development site and external sites; the boarding count was an exception, which was less reliable at four out of five sites. Some variability in accuracy existed among institutions; when forecasting 4 hours into the future, the MAE of the waiting count ranged between 0.6 and 3.1 patients, the MAE of the occupancy level ranged between 9.0 and 14.5% of beds, and the MAE of the boarding count ranged between 0.9 and 2.7 patients. Conclusion The ForecastED tool generated potentially useful forecasts of input and throughput measures of ED crowding at five external sites, without modifying the underlying assumptions. Noting the limitation that this was not a real-time validation, ongoing research will focus on integrating the tool with ED information systems. PMID:19716629
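
The accuracy measure described above, linear calibration followed by the median absolute error, can be sketched on synthetic data (the forecast and observation series below are invented stand-ins, not ForecastED output):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-ins for one site: forecasts of the waiting count made
# 4 h ahead, and the values actually observed at the target times.
observed = rng.poisson(8, size=500).astype(float)
forecast = 0.8 * observed + rng.normal(0.0, 2.0, size=500) + 1.5

# Linear calibration (least squares), then median absolute error (MAE),
# mirroring the evaluation pipeline described above.
slope, intercept = np.polyfit(forecast, observed, deg=1)
calibrated = slope * forecast + intercept
mae = float(np.median(np.abs(calibrated - observed)))
print(round(mae, 2))
```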

  11. High event rate ROICs (HEROICs) for astronomical UV photon counting detectors

    NASA Astrophysics Data System (ADS)

    Harwit, Alex; France, Kevin; Argabright, Vic; Franka, Steve; Freymiller, Ed; Ebbets, Dennis

    2014-07-01

The next generation of astronomical photocathode / microchannel plate based UV photon counting detectors will overcome existing count rate limitations by replacing the anode arrays and external cabled electronics with anode arrays integrated into imaging Read Out Integrated Circuits (ROICs). We have fabricated a High Event Rate ROIC (HEROIC) consisting of a 32 × 32 array of 55 μm square pixels on a 60 μm pitch. The pixel sensitivity (threshold) is designed to be globally programmable between 1 × 10³ and 1 × 10⁶ electrons. To achieve the sensitivity of 1 × 10³ electrons, parasitic capacitances had to be minimized, which was achieved by fabricating the ROIC in a 65 nm CMOS process. The ROIC has been designed to support pixel counts of up to 4096 events per integration period at rates up to 1 MHz per pixel. Integration periods can be controlled via an external signal with a time resolution of less than 1 microsecond, enabling temporally resolved imaging and spectroscopy of astronomical sources. An electrical injection port is provided to verify the functionality and performance of each ROIC prior to vacuum integration with a photocathode and microchannel plate amplifier. Test results on the first ROICs using the electrical injection port demonstrate that sensitivities between 3 × 10³ and 4 × 10⁵ electrons are achieved. A number of fixes have been identified for a re-spin of this ROIC.

  12. The supercontinuum laser as a flexible source for quasi-steady state and time resolved fluorescence studies

    NASA Astrophysics Data System (ADS)

    Fenske, Roger; Näther, Dirk U.; Dennis, Richard B.; Smith, S. Desmond

    2010-02-01

Commercial fluorescence lifetime spectrometers have long suffered from the lack of a simple, compact and relatively inexpensive broad-spectral-band light source that can be flexibly employed for both quasi-steady state and time resolved measurements (using Time Correlated Single Photon Counting [TCSPC]). This paper reports the integration of an optically pumped photonic crystal fibre supercontinuum source (Fianium model SC400PP) as a light source in fluorescence lifetime spectrometers (Edinburgh Instruments FLS920 and Lifespec II), with single photon counting detectors (a micro-channel plate photomultiplier and a near-infrared photomultiplier) covering the UV to NIR range. An innovative method of spectral selection of the supercontinuum source involving wedge interference filters is also discussed.

  13. An alternative method for immediate dose estimation using CaSO4:Dy based TLD badges

    NASA Astrophysics Data System (ADS)

    Singh, A. K.; Menon, S. N.; Dhabekar, Bhushan; Kadam, Sonal; Chougaonkar, M. P.; Babu, D. A. R.

    2014-11-01

CaSO4:Dy based thermoluminescence dosimeters (TLDs) are used in the country-wide personnel monitoring program in India. The TL glow curve of CaSO4:Dy consists of a dosimetric peak at 220 °C and a low-temperature peak at 120 °C which is unstable at room temperature. The TL integral counts in CaSO4:Dy reduce by 15% in the seven days after irradiation due to the thermal fading of the 120 °C TL peak. As the dosimetric procedure involves the total integrated counts for dose conversion, the dosimeters are typically read about a week after receipt. However, in the event of a suspected overexposure, where urgent processing is expected, this poses a limitation. A post-irradiation annealing treatment is used in such cases for immediate readout of cards. In this paper we report a new and easier-to-use technique based on optical bleaching for the urgent processing of TLD cards. Optical bleaching with a green LED (~555 nm photons) at 25,000 lux for one and a half hours removes the low-temperature TL peak without affecting the dosimetric peak. This method can be used for immediate dose estimation using CaSO4:Dy based TLD badges.

  14. On-Line Identification of Simulation Examples for Forgetting Methods to Track Time Varying Parameters Using the Alternative Covariance Matrix in Matlab

    NASA Astrophysics Data System (ADS)

    Vachálek, Ján

    2011-12-01

The paper compares the ability of forgetting methods to track the time-varying parameters of two different simulated models with different types of excitation. The quantities observed in the simulations are the integral sum of the Euclidean norm of the deviation of the parameter estimates from their true values, and a selected-band prediction error count. As supplementary information, we observe the eigenvalues of the covariance matrix. In the paper we used a modified method of Regularized Exponential Forgetting with Alternative Covariance Matrix (REFACM) along with Directional Forgetting (DF) and three standard regularized methods.

  15. Swarm Observations: Implementing Integration Theory to Understand an Opponent Swarm

    DTIC Science & Technology

    2012-09-01

(Record excerpt from the report's list of figures: box counts and local dimension plots, and spatial entropy over time, for the "Rally", "Avoid", and "Rally-integration" scenarios.)

  16. Assessment of cell concentration and viability of isolated hepatocytes using flow cytometry.

    PubMed

    Wigg, Alan J; Phillips, John W; Wheatland, Loretta; Berry, Michael N

    2003-06-01

The assessment of cell concentration and viability of freshly isolated hepatocyte preparations has traditionally been performed by manual counting with a Neubauer counting chamber and staining for trypan blue exclusion. Despite the simple and rapid nature of this assessment, concerns exist about the accuracy of these methods. Simple flow cytometry techniques that determine cell concentration and viability are available, yet surprisingly they have not been extensively used or validated with isolated hepatocyte preparations. We therefore investigated the use of flow cytometry with TRUCOUNT Tubes and propidium iodide staining to measure the cell concentration and viability of isolated rat hepatocytes in suspension. Analysis using TRUCOUNT Tubes provided more accurate and reproducible measurement of cell concentration than manual cell counting. Hepatocyte viability, assessed using propidium iodide, correlated more closely than trypan blue exclusion with all measured indicators of hepatocyte integrity and function (lactate dehydrogenase leakage, cytochrome P450 content, cellular ATP concentration, ammonia and lactate removal, urea and albumin synthesis). We conclude that flow cytometry techniques can be used to measure the cell concentration and viability of isolated hepatocyte preparations. The techniques are simple, rapid, and more accurate than manual cell counting and trypan blue staining, and the results are not affected by protein-containing media.

  17. Comment on: 'A Poisson resampling method for simulating reduced counts in nuclear medicine images'.

    PubMed

    de Nijs, Robin

    2015-07-21

In order to calculate half-count images from already acquired data, White and Lawson published their method based on Poisson resampling. They verified their method experimentally with measurements of a Co-57 flood source. In this comment their results are reproduced and confirmed by a direct numerical simulation in Matlab. Not only Poisson resampling but also two direct redrawing methods were investigated, based on a Poisson and a Gaussian distribution, respectively. The mean, standard deviation, skewness and excess kurtosis half-count/full-count ratios were determined for all methods and compared to the theoretical values for a Poisson distribution. The statistical parameters showed the same behavior as in the original note and confirmed the superiority of the Poisson resampling method. Rounding off before saving the half-count image had a severe impact on the counting statistics for counts below 100. Only Poisson resampling was unaffected by this, while Gaussian redrawing was less affected by it than Poisson redrawing. Poisson resampling is the method of choice when simulating half-count (or lower-count) images from full-count images: it correctly reproduces the statistical properties, also in the case of rounding off of the images.
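
The three simulation schemes compared in the note can be sketched numerically (synthetic counts, not the Co-57 data; "Poisson resampling" is implemented here as standard binomial thinning, which preserves Poisson statistics):

```python
import numpy as np

rng = np.random.default_rng(1)
full = rng.poisson(50.0, size=100_000)        # synthetic full-count pixels

# Poisson resampling via binomial thinning: each recorded count is kept
# independently with probability 1/2.
half_resampled = rng.binomial(full, 0.5)

# Direct redrawing from a Poisson / Gaussian with half the measured mean.
half_poisson = rng.poisson(full / 2.0)
half_gauss = np.round(rng.normal(full / 2.0, np.sqrt(full / 2.0)))

# Thinning a Poisson(50) image yields exactly Poisson(25): mean = variance.
# Redrawing inflates the variance, because the measured means are themselves
# noisy, which is why resampling is the statistically correct choice.
for img in (half_resampled, half_poisson, half_gauss):
    print(round(float(img.mean()), 2), round(float(img.var()), 2))
```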

  18. A qualitative evaluation of a physician-delivered pedometer-based step count prescription strategy with insight from participants and treating physicians.

    PubMed

    Cooke, Alexandra B; Pace, Romina; Chan, Deborah; Rosenberg, Ellen; Dasgupta, Kaberi; Daskalopoulou, Stella S

    2018-05-01

    The integration of pedometers into clinical practice has the potential to enhance physical activity levels in patients with chronic disease. Our SMARTER randomized controlled trial demonstrated that a physician-delivered step count prescription strategy has measurable effects on daily steps, glycemic control, and insulin resistance in patients with type 2 diabetes and/or hypertension. In this study, we aimed to understand perceived barriers and facilitators influencing successful uptake and sustainability of the strategy, from patient and physician perspectives. Qualitative in-depth interviews were conducted in a purposive sample of physicians (n = 10) and participants (n = 20), including successful and less successful cases in terms of pedometer-assessed step count improvements. Themes that achieved saturation in either group through thematic analysis are presented. All participants appreciated the pedometer-based monitoring combined with step count prescriptions. Accountability to physicians and support offered by the trial coordinator influenced participant motivation. Those who increased step counts adopted strategies to integrate more steps into their routines and were able to overcome weather-related barriers by finding indoor alternative options to outdoor steps. Those who decreased step counts reported difficulty in overcoming weather-related challenges, health limitations and work constraints. Physicians indicated the strategy provided a framework for discussing physical activity and motivating patients, but emphasized the need for support from allied professionals to help deliver the strategy in busy clinical settings. A physician-delivered step count prescription strategy was feasibly integrated into clinical practice and successful in engaging most patients; however, continual support is needed for maximal engagement and sustained use. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Maximum likelihood positioning and energy correction for scintillation detectors

    NASA Astrophysics Data System (ADS)

    Lerche, Christoph W.; Salomon, André; Goldschmidt, Benjamin; Lodomez, Sarah; Weissler, Björn; Solf, Torsten

    2016-02-01

An algorithm for determining the crystal pixel and the gamma-ray energy with scintillation detectors for PET is presented. The algorithm uses Likelihood Maximisation (ML) and therefore is inherently robust to missing data caused by defective or paralysed photodetector pixels. We tested the algorithm on a highly integrated MRI-compatible small animal PET insert. The scintillation detector blocks of the PET gantry were built with the newly developed digital Silicon Photomultiplier (SiPM) technology from Philips Digital Photon Counting and LYSO pixel arrays with a pitch of 1 mm and length of 12 mm. Light sharing was used to read out the scintillation light from the 30 × 30 scintillator pixel array with an 8 × 8 SiPM array. For the performance evaluation of the proposed algorithm, we measured the scanner's spatial resolution, energy resolution, singles and prompt count rate performance, and image noise. These values were compared to corresponding values obtained with Center of Gravity (CoG) based positioning methods for different scintillation light trigger thresholds and also for different energy windows. While all positioning algorithms showed similar spatial resolution, a clear advantage for the ML method was observed when comparing the PET scanner's overall single and prompt detection efficiency, image noise, and energy resolution to the CoG based methods. Further, ML positioning reduces the dependence of image quality on scanner configuration parameters and was the only method that allowed achieving the highest energy resolution, count rate performance and spatial resolution at the same time.
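
A minimal 1-D sketch of why ML positioning tolerates a dead photodetector pixel while centre-of-gravity positioning does not (the geometry, light model and all numbers below are invented for illustration, not the paper's detector):

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D detector: 8 photosensor pixels reading light shared from 30
# crystal positions; the expected light pattern per crystal is a Gaussian.
sensors = np.arange(8)
crystals = np.linspace(0.0, 7.0, 30)
expected = np.exp(-0.5 * ((sensors[None, :] - crystals[:, None]) / 1.2) ** 2)
expected *= 400.0 / expected.sum(axis=1, keepdims=True)   # ~400 photons/event

true_id = 5
counts = rng.poisson(expected[true_id]).astype(float)
counts[0] = 0.0                       # a dead/paralysed photodetector pixel
alive = np.ones(8, bool); alive[0] = False

# Centre of gravity uses the corrupted counts directly, so it is biased.
cog = float((sensors * counts).sum() / counts.sum())

# ML: maximise the Poisson log-likelihood over crystals, skipping dead pixels.
mu = expected[:, alive]
loglik = (counts[alive] * np.log(mu) - mu).sum(axis=1)
ml_id = int(np.argmax(loglik))
print(cog, crystals[ml_id])           # ML stays close to crystals[true_id]
```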

  20. K-edge energy-based calibration method for photon counting detectors

    NASA Astrophysics Data System (ADS)

    Ge, Yongshuai; Ji, Xu; Zhang, Ran; Li, Ke; Chen, Guang-Hong

    2018-01-01

    In recent years, potential applications of energy-resolved photon counting detectors (PCDs) in the x-ray medical imaging field have been actively investigated. Unlike conventional x-ray energy integration detectors, PCDs count the number of incident x-ray photons within certain energy windows. For PCDs, the interactions between x-ray photons and photoconductor generate electronic voltage pulse signals. The pulse height of each signal is proportional to the energy of the incident photons. By comparing the pulse height with the preset energy threshold values, x-ray photons with specific energies are recorded and sorted into different energy bins. To quantitatively understand the meaning of the energy threshold values, and thus to assign an absolute energy value to each energy bin, energy calibration is needed to establish the quantitative relationship between the threshold values and the corresponding effective photon energies. In practice, the energy calibration is not always easy, due to the lack of well-calibrated energy references for the working energy range of the PCDs. In this paper, a new method was developed to use the precise knowledge of the characteristic K-edge energy of materials to perform energy calibration. The proposed method was demonstrated using experimental data acquired from three K-edge materials (viz., iodine, gadolinium, and gold) on two different PCDs (Hydra and Flite, XCounter, Sweden). Finally, the proposed energy calibration method was further validated using a radioactive isotope (Am-241) with a known decay energy spectrum.
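
The calibration step described above, mapping threshold settings to effective photon energies through known K-edge energies, can be sketched as a simple linear fit (the DAC threshold readings below are made up for illustration; only the K-edge energies are physical constants):

```python
import numpy as np

# K-edge energies (keV) of the three calibration materials used above, and
# hypothetical DAC threshold values at which each K-edge step is observed.
k_edges_keV = {"I": 33.17, "Gd": 50.24, "Au": 80.72}
thresholds = {"I": 412.0, "Gd": 598.0, "Au": 930.0}   # invented DAC units

t = np.array([thresholds[m] for m in k_edges_keV])
e = np.array([k_edges_keV[m] for m in k_edges_keV])

# Least-squares line mapping threshold value -> effective photon energy.
gain, offset = np.polyfit(t, e, deg=1)

def threshold_to_keV(dac):
    return gain * dac + offset

print(threshold_to_keV(500.0))   # energy assigned to a threshold of 500 DAC
```

A known decay line (such as the Am-241 source mentioned above) can then serve as an independent check point for the fitted line.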

  1. Allele-specific copy-number discovery from whole-genome and whole-exome sequencing.

    PubMed

    Wang, WeiBo; Wang, Wei; Sun, Wei; Crowley, James J; Szatkiewicz, Jin P

    2015-08-18

Copy-number variants (CNVs) are a major form of genetic variation and a risk factor for various human diseases, so it is crucial to accurately detect and characterize them. It is conceivable that allele-specific reads from high-throughput sequencing data could be leveraged to both enhance CNV detection and produce allele-specific copy number (ASCN) calls. Although statistical methods have been developed to detect CNVs using whole-genome sequence (WGS) and/or whole-exome sequence (WES) data, information from allele-specific read counts has not yet been adequately exploited. In this paper, we develop an integrated method, called AS-GENSENG, which incorporates allele-specific read counts in CNV detection and estimates ASCN using either WGS or WES data. To evaluate the performance of AS-GENSENG, we conducted extensive simulations, generated empirical data using existing WGS and WES data sets and validated predicted CNVs using an independent methodology. We conclude that AS-GENSENG not only predicts accurate ASCN calls but also improves the accuracy of total copy number calls, owing to its unique ability to exploit information from both total and allele-specific read counts while accounting for various experimental biases in sequence data. Our novel, user-friendly and computationally efficient method and a complete analytic protocol are freely available at https://sourceforge.net/projects/asgenseng/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.

  2. Waveguide integrated superconducting single-photon detectors with high internal quantum efficiency at telecom wavelengths

    PubMed Central

    Kahl, Oliver; Ferrari, Simone; Kovalyuk, Vadim; Goltsman, Gregory N.; Korneev, Alexander; Pernice, Wolfram H. P.

    2015-01-01

Superconducting nanowire single-photon detectors (SNSPDs) provide high efficiency for detecting individual photons while keeping dark counts and timing jitter minimal. Besides superior detection performance over a broad optical bandwidth, compatibility with an integrated optical platform is a crucial requirement for applications in emerging quantum photonic technologies. Here we present SNSPDs embedded in nanophotonic integrated circuits which achieve internal quantum efficiencies close to unity at 1550 nm wavelength. This allows the SNSPDs to be operated at bias currents far below the critical current, where unwanted dark count events reach milli-Hz levels while on-chip detection efficiencies above 70% are maintained. The measured dark count rates correspond to noise-equivalent powers in the 10⁻¹⁹ W Hz⁻¹/² range and the timing jitter is as low as 35 ps. Our detectors are fully scalable and interface directly with waveguide-based optical platforms. PMID:26061283
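
The quoted noise-equivalent powers follow from the standard dark-count-limited NEP relation, NEP = (hν/η)·sqrt(2·DCR); a small sketch (the efficiency and dark count rates below are illustrative operating points, not the device's measured values):

```python
from math import sqrt

H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s

def nep(dark_count_rate_hz: float, efficiency: float,
        wavelength_m: float = 1550e-9) -> float:
    """Dark-count-limited noise-equivalent power, (h*nu/eta) * sqrt(2*DCR)."""
    photon_energy = H * C / wavelength_m
    return (photon_energy / efficiency) * sqrt(2.0 * dark_count_rate_hz)

# Illustrative dark count rates spanning the low-bias to high-bias regimes:
for dcr in (1e-3, 0.1, 1.0):
    print(f"DCR = {dcr:g} Hz -> NEP = {nep(dcr, 0.70):.2e} W/sqrt(Hz)")
```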

  3. Waveguide integrated superconducting single-photon detectors with high internal quantum efficiency at telecom wavelengths.

    PubMed

    Kahl, Oliver; Ferrari, Simone; Kovalyuk, Vadim; Goltsman, Gregory N; Korneev, Alexander; Pernice, Wolfram H P

    2015-06-10

Superconducting nanowire single-photon detectors (SNSPDs) provide high efficiency for detecting individual photons while keeping dark counts and timing jitter minimal. Besides superior detection performance over a broad optical bandwidth, compatibility with an integrated optical platform is a crucial requirement for applications in emerging quantum photonic technologies. Here we present SNSPDs embedded in nanophotonic integrated circuits which achieve internal quantum efficiencies close to unity at 1550 nm wavelength. This allows the SNSPDs to be operated at bias currents far below the critical current, where unwanted dark count events reach milli-Hz levels while on-chip detection efficiencies above 70% are maintained. The measured dark count rates correspond to noise-equivalent powers in the 10⁻¹⁹ W Hz⁻¹/² range and the timing jitter is as low as 35 ps. Our detectors are fully scalable and interface directly with waveguide-based optical platforms.

  4. Counts, serovars, and antimicrobial resistance phenotypes of Salmonella on raw chicken meat at retail in Colombia.

    PubMed

    Donado-Godoy, Pilar; Clavijo, Viviana; León, Maribel; Arevalo, Alejandra; Castellanos, Ricardo; Bernal, Johan; Tafur, Mc Allister; Ovalle, Maria Victoria; Alali, Walid Q; Hume, Michael; Romero-Zuñiga, Juan Jose; Walls, Isabel; Doyle, Michael P

    2014-02-01

The objective of this study was to determine Salmonella counts, serovars, and antimicrobial-resistance phenotypes on retail raw chicken carcasses in Colombia. A total of 301 chicken carcasses were collected from six departments (one city per department) in Colombia. Samples were analyzed for Salmonella counts using the most-probable-number method as recommended by the U.S. Department of Agriculture, Food Safety Inspection Service protocol. A total of 378 isolates (268 from our previous study) were serotyped and tested for antimicrobial susceptibility. The overall Salmonella count (mean log most probable number per carcass ± 95% confidence interval) and prevalence were 2.1 (2.0 to 2.3) and 37%, respectively. There were significant differences (P < 0.05) in Salmonella levels (i.e., counts and prevalence) by storage temperature (i.e., frozen, chilled, or ambient), retail store type (wet markets, supermarkets, and independent markets), and poultry company (chicken produced by an integrated or nonintegrated company). Frozen chicken had the lowest Salmonella levels of the storage temperatures, chickens from wet markets had higher levels than those from other retail store types, and chicken produced by integrated companies had lower levels than that from nonintegrated companies. Thirty-one Salmonella serovars were identified among the 378 isolates, with Salmonella Paratyphi B tartrate-positive (i.e., Salmonella Paratyphi B dT+) the most prevalent (44.7%), followed by Heidelberg (19%), Enteritidis (17.7%), Typhimurium (5.3%), and Anatum (2.1%). Of all the Salmonella isolates, 35.2% were resistant to 1 to 5 antimicrobial agents, 24.6% to 6 to 10, and 33.9% to 11 to 15. Among all the serovars obtained, Salmonella Paratyphi B dT+ and Salmonella Heidelberg were the most antimicrobial resistant. Salmonella prevalence was determined to be high, whereas cell numbers were relatively low. These data can be used in developing risk assessment models for preventing the transmission of Salmonella from chicken to humans in Colombia.

  5. A fast and high-sensitive dual-wavelength diffuse optical tomography system using digital lock-in photon-counting technique

    NASA Astrophysics Data System (ADS)

    Chen, Weiting; Yi, Xi; Zhao, Huijuan; Gao, Feng

    2014-09-01

We present a novel dual-wavelength diffuse optical imaging system that can perform 2-D or 3-D imaging quickly and with high sensitivity for monitoring dynamic changes of optical parameters. A newly proposed lock-in photon-counting detection method was adopted for weak optical signal collection, which brings excellent performance as well as a simplified geometry. The fundamental principles of lock-in photon-counting detection are demonstrated in detail, and its feasibility was strictly verified by a linearity experiment. The systemic performance of the prototype was experimentally assessed, including stray light rejection and inherent interference. Results showed that the system possesses superior anti-interference capability (under 0.58% in a darkroom) compared with traditional photon-counting detection, and the crosstalk between the two wavelengths was lower than 2.28%. For a comprehensive assessment, 2-D phantom experiments on a relatively large model (diameter of 4 cm) were conducted. Different absorption targets were imaged to investigate detection sensitivity. The reconstructed images under all conditions were encouraging, with a desirable SNR. A study of image quality vs. integration time suggests a new way to achieve higher SNR at the cost of measurement speed. In summary, the newly developed system shows great potential for improving detection sensitivity as well as measurement speed. This should enable substantial progress in dynamically tracking blood concentration distributions in many clinical areas, such as small-animal disease models, human brain activity research and thick-tissue (for example, breast) diagnosis.

  6. Integrated circuit authentication using photon-limited x-ray microscopy.

    PubMed

    Markman, Adam; Javidi, Bahram

    2016-07-15

A counterfeit integrated circuit (IC) may contain subtle changes to its circuit configuration. These changes may be observed when the IC is imaged using x-rays; however, the energy from the x-rays can potentially damage the IC. We have investigated a technique to authenticate ICs under photon-limited x-ray imaging. We modeled a lower-energy x-ray image by generating a photon-limited image from a real x-ray image using a weighted photon-counting method. We performed feature extraction on the image using the speeded-up robust features (SURF) algorithm. We then authenticated the IC by comparing the SURF features to a database of SURF features from authentic and counterfeit ICs. Our experimental results with real and counterfeit ICs using an x-ray microscope demonstrate that we can correctly authenticate an IC image captured using orders-of-magnitude lower-energy x-rays. To the best of our knowledge, this Letter is the first to use a photon-counting x-ray imaging model and relevant algorithms to authenticate ICs while preventing potential damage.
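A generic photon-limited imaging model of the kind described can be simulated by drawing a Poisson count at each pixel with mean proportional to its normalized irradiance. This is a sketch of such a model under our own assumptions (function names and the Knuth Poisson sampler are ours), not the authors' exact weighted photon-counting method.

```python
import math
import random

def sample_poisson(rng, lam):
    """Knuth's multiplicative Poisson sampler (adequate only for modest means)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def photon_limited(image, n_photons, seed=0):
    """Simulate a photon-limited image: each pixel receives a Poisson
    count whose mean is n_photons times its normalized irradiance."""
    rng = random.Random(seed)
    total = float(sum(sum(row) for row in image))
    return [[sample_poisson(rng, n_photons * v / total) for v in row]
            for row in image]

# 4x4 toy irradiance pattern with an expected budget of 1,000 photons
image = [[r + c + 1 for c in range(4)] for r in range(4)]
photons = photon_limited(image, 1_000)
detected = sum(sum(row) for row in photons)  # fluctuates around 1,000
```

Feature extraction (e.g., SURF) would then be run on the simulated low-count image rather than the full-dose one.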

  7. Monolithically compatible impedance measurement

    DOEpatents

    Ericson, Milton Nance; Holcomb, David Eugene

    2002-01-01

A monolithic sensor includes a reference channel and at least one sensing channel. Each sensing channel has an oscillator and a counter driven by the oscillator. The reference channel and the at least one sensing channel are formed integrally with a substrate and intimately nested with one another on the substrate; thus, the oscillators and counters have matched component values and temperature coefficients. A frequency-determining component of the sensing oscillator is formed integrally with the substrate and has an impedance parameter which varies with the environmental parameter to be measured by the sensor. A gating control, responsive to an output signal generated by the reference channel, terminates counting in the at least one sensing channel at an output count, whereby the output count is indicative of the environmental parameter, and successive output counts are indicative of changes in the environmental parameter.
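The gating scheme can be sketched numerically: the reference channel holds the gate open for a fixed number of its own cycles, so the sensing counter's output is proportional to the sensing oscillator's frequency, which in turn tracks the measured impedance. The frequencies below are hypothetical illustrations, not values from the patent.

```python
def gated_count(f_sense, f_ref, n_ref):
    """Cycles accumulated by the sensing counter while the gate,
    opened for n_ref reference cycles, stays open."""
    gate_time = n_ref / f_ref            # seconds the gate is open
    return round(f_sense * gate_time)    # sensing-oscillator cycles counted

# Hypothetical frequencies: the sensing oscillator slows as the sensed
# impedance (e.g., a moisture-dependent capacitance) increases
count_dry = gated_count(f_sense=1.0e6, f_ref=1.0e6, n_ref=100_000)
count_wet = gated_count(f_sense=0.8e6, f_ref=1.0e6, n_ref=100_000)
```

Because both channels share the substrate, temperature drifts affect f_sense and f_ref together, and the ratio (hence the count) stays a clean function of the sensed impedance.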

  8. Waveguide integrated low noise NbTiN nanowire single-photon detectors with milli-Hz dark count rate

    PubMed Central

    Schuck, Carsten; Pernice, Wolfram H. P.; Tang, Hong X.

    2013-01-01

Superconducting nanowire single-photon detectors are an ideal match for integrated quantum photonic circuits due to their high detection efficiency for telecom wavelength photons. Quantum optical technology also requires single-photon detection with low dark count rate and high timing accuracy. Here we present very low noise superconducting nanowire single-photon detectors based on NbTiN thin films patterned directly on top of Si3N4 waveguides. We systematically investigate a large variety of detector designs and characterize their detection noise performance. Milli-Hz dark count rates are demonstrated over the entire operating range of the nanowire detectors, which also feature low timing jitter. The ultra-low dark count rate, in combination with the high detection efficiency inherent to our travelling wave detector geometry, gives rise to a measured noise equivalent power at the 10^-20 W/Hz^1/2 level. PMID:23714696

  9. TU-FG-209-03: Exploring the Maximum Count Rate Capabilities of Photon Counting Arrays Based On Polycrystalline Silicon

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liang, A K; Koniczek, M; Antonuk, L E

Purpose: Photon counting arrays (PCAs) offer several advantages over conventional, fluence-integrating x-ray imagers, such as improved contrast by means of energy windowing. For that reason, we are exploring the feasibility and performance of PCA pixel circuitry based on polycrystalline silicon. This material, unlike the crystalline silicon commonly used in photon counting detectors, lends itself toward the economic manufacture of radiation-tolerant, monolithic, large-area (e.g., ∼43×43 cm2) devices. In this presentation, exploration of maximum count rate, a critical performance parameter for such devices, is reported. Methods: Count rate performance for a variety of pixel circuit designs was explored through detailed circuit simulations over a wide range of parameters (including pixel pitch and operating conditions) with the additional goal of preserving good energy resolution. The count rate simulations assume input events corresponding to a 72 kVp x-ray spectrum with 20 mm Al filtration interacting with a CZT detector at various input flux rates. Output count rates are determined at various photon energy threshold levels, and the percentage of counts lost (e.g., due to deadtime or pile-up) is calculated from the ratio of output to input counts. The energy resolution simulations involve thermal and flicker noise originating from each circuit element in a design. Results: Circuit designs compatible with pixel pitches ranging from 250 to 1000 µm that allow count rates over a megacount per second per pixel appear feasible. Such rates are expected to be suitable for radiographic and fluoroscopic imaging. Results for the analog front-end circuitry of the pixels show that acceptable energy resolution can also be achieved. Conclusion: PCAs created using polycrystalline silicon have the potential to offer monolithic large-area detectors with count rate performance comparable to that of crystalline silicon detectors. Further improvement through detailed circuit simulations and prototyping is expected. This work was partially supported by NIH grant R01-EB000558.
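For the idealized nonparalyzable detector model invoked throughout these records, the relation between input and output count rates has the standard closed form m = n / (1 + n·τ), from which the fraction of counts lost follows directly. The dead time value below is illustrative, not taken from the abstract.

```python
def output_rate(input_rate, dead_time):
    """Observed rate of an ideal nonparalyzable detector: m = n / (1 + n*tau)."""
    return input_rate / (1.0 + input_rate * dead_time)

def fraction_lost(input_rate, dead_time):
    """Fraction of incident counts lost to dead time: 1 - m/n."""
    return 1.0 - output_rate(input_rate, dead_time) / input_rate

# Illustrative numbers: 1 Mcps input with a 100 ns dead time
loss = fraction_lost(1.0e6, 100e-9)  # n*tau = 0.1 -> about 9.1% of counts lost
```

This is why abstracts in this area quote input rate as a fraction of the maximum periodic rate: once n·τ grows past ~0.2, both count losses and spectral distortion rise quickly.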

  10. A study on evaluation of the dependences of the function and the shape in a 99mTc-DMSA renal scan on the difference in acquisition count

    NASA Astrophysics Data System (ADS)

    Dong, Kyung-Rae; Shim, Dong-Oh; Kim, Ho-Sung; Park, Yong-Soon; Chung, Woon-Kwan; Cho, Jae-Hwan

    2013-02-01

In nuclear medicine examinations, methods to acquire a static image include the preset count method and the preset time method. The preset count method is used mainly in static renal scans that utilize 99mTc-DMSA (dimercaptosuccinic acid), whereas the preset time method is used occasionally. When the preset count method is used, the same number of acquisition counts is acquired each time, but the scan time varies. When the preset time method is used, the scan time is constant, but the number of counts acquired is not. Therefore, this study examined how the counts acquired during a 99mTc-DMSA renal scan affect the functional and shape information obtained for both kidneys. The study involved patients who had 40-60% relative function in one kidney among patients who underwent a 99mTc-DMSA renal scan in the Nuclear Medicine Department during the period from January 11 to March 31, 2012. A gamma camera was used to obtain acquisitions of 100,000 counts, 300,000 counts, and a 7-minute acquisition time (exceeding 300,000 counts). The function and the shape of the kidneys were evaluated by measuring the relative function of both kidneys, the geometric mean, and the kidney size before comparative analysis. According to the results, neither the relative function nor the geometric mean of the two kidneys varied significantly with the acquisition count. On the other hand, the measured kidney size tended to be larger with increasing acquisition count.
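The relative-function metric mentioned in the abstract is conventionally computed from the geometric mean of anterior and posterior counts for each kidney, which compensates for kidney depth. A minimal sketch with hypothetical counts (the per-view numbers are ours, not the study's data):

```python
import math

def relative_renal_function(ant_left, post_left, ant_right, post_right):
    """Split renal function from the geometric means of anterior and
    posterior counts for each kidney."""
    gm_left = math.sqrt(ant_left * post_left)
    gm_right = math.sqrt(ant_right * post_right)
    total = gm_left + gm_right
    return gm_left / total, gm_right / total

# Hypothetical per-kidney counts from a 300,000-count acquisition
left, right = relative_renal_function(52_000, 48_000, 51_000, 49_000)
```

Because the split is a ratio, it is largely insensitive to the total acquisition count, which is consistent with the study's finding that relative function did not vary significantly between 100,000- and 300,000-count images.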

  11. Testing the Hypothesis of Young Martian Volcanism: Studies of the Tharsis Volcanoes and Adjacent Lava Plains

    NASA Technical Reports Server (NTRS)

    Grier, Jennifer A.

    2005-01-01

We experienced much success in reaching the goals stated in our original MDAP proposal. Our work made substantial contributions toward an integrated understanding of the counting and calibration of crater data on Mars, the changing nature of the Martian surface as influenced by craters, water, and wind, and their general relationship to Martian geothermal history. We accomplished this while remaining responsive to the rapid changes in the field brought about by several key NASA missions that returned data during the life of the grant. Our integrated effort included three stages: The first major area of research (Crater Count Research) was conducted by Jennifer Grier (P.I.), Lazslo Keszthelyi (Collaborator), and William Hartmann (Collaborator), with assistance from Dan Berman (Graduate Student), and concerned the mapping and collection of crater count data on various Martian terrains. The second major area of study (Absolute Age Calibration) was conducted by William Bottke (Co-I) at SwRI and concerned constraining the nature of the Moon and Mars impactor populations to create better absolute age calibrations for counted areas. The third major area of study was the integration and leverage of this effort with ongoing related Mars crater work at PSI (Integrated and Continuing Studies - Older Volcanoes), headed by David Crown (PSI Scientist), assisted by Les Bleamaster (PSI Scientist) and Dan Berman (Graduate Student).

  12. Intercomparison of active and passive instruments for radon and radon progeny in North America

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    George, A.C.; Tu, Keng-Wu; Knutson, E.O.

    1995-02-01

An intercomparison exercise for radon and radon progeny instruments and methods was held at the Environmental Measurements Laboratory (EML) from April 22 to May 2, 1994. The exercise was conducted in the new EML radon test and calibration facility, in which exposure conditions are very well controlled. The detection systems of the intercompared instruments consisted of: (1) pulse ionization chambers, (2) electret ionization chambers, (3) scintillation detectors, (4) alpha particle spectrometers with silicon diode, surface barrier, or diffused junction detectors, (5) registration of nuclear tracks in solid-state materials, and (6) activated carbon collectors counted by gamma-ray spectrometry or by alpha- and beta-liquid scintillation counting. Twenty-three private firms, government laboratories, and universities participated with 165 passive integrating devices (activated carbon collectors, nuclear alpha track detectors, and electret ionization chambers) and 11 active and passive continuous radon monitors. Five portable integrating and continuous instruments were intercompared for radon progeny. Forty grab samples for radon progeny were taken by five groups that participated in person to test and evaluate their primary instruments and methods for measuring individual radon progeny and the potential alpha energy concentration (PAEC) in indoor air. Results indicate that more than 80% of the radon measurements, performed with a variety of instruments, are within ±10% of the actual value. The majority of the instruments that measure individual radon progeny and the PAEC gave results in good agreement with the EML reference value. Radon progeny measurements made with continuous and integrating instruments are satisfactory, with room for improvement.

  13. Revision of the NIST Standard for (223)Ra: New Measurements and Review of 2008 Data.

    PubMed

    Zimmerman, B E; Bergeron, D E; Cessna, J T; Fitzgerald, R; Pibida, L

    2015-01-01

    After discovering a discrepancy in the transfer standard currently being disseminated by the National Institute of Standards and Technology (NIST), we have performed a new primary standardization of the alpha-emitter (223)Ra using Live-timed Anticoincidence Counting (LTAC) and the Triple-to-Double Coincidence Ratio Method (TDCR). Additional confirmatory measurements were made with the CIEMAT-NIST efficiency tracing method (CNET) of liquid scintillation counting, integral γ-ray counting using a NaI(Tl) well counter, and several High Purity Germanium (HPGe) detectors in an attempt to understand the origin of the discrepancy and to provide a correction. The results indicate that a -9.5 % difference exists between activity values obtained using the former transfer standard relative to the new primary standardization. During one of the experiments, a 2 % difference in activity was observed between dilutions of the (223)Ra master solution prepared using the composition used in the original standardization and those prepared using 1 mol·L(-1) HCl. This effect appeared to be dependent on the number of dilutions or the total dilution factor to the master solution, but the magnitude was not reproducible. A new calibration factor ("K-value") has been determined for the NIST Secondary Standard Ionization Chamber (IC "A"), thereby correcting the discrepancy between the primary and secondary standards.

  14. An integrated circuit floating point accumulator

    NASA Technical Reports Server (NTRS)

    Goldsmith, T. C.

    1977-01-01

Goddard Space Flight Center has developed a large scale integrated circuit (type 623) which can perform pulse counting, storage, floating point compression, and serial transmission, using a single monolithic device. Counts of 27 or 19 bits can be converted to transmitted values of 12 or 8 bits, respectively. Use of the 623 has resulted in substantial savings in weight, volume, and dollar resources on at least 11 scientific instruments to be flown on 4 NASA spacecraft. The design, construction, and application of the 623 are described.
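The general idea of floating-point count compression (keep the most significant bits as a mantissa plus a right-shift exponent) can be sketched as below. The exact type 623 word layout is not given in the abstract, so the bit allocation here is purely illustrative.

```python
def compress(count, mant_bits=8):
    """Floating-point compress a counter value: keep the top mant_bits bits
    as a mantissa plus a right-shift exponent. Illustrative scheme only;
    the actual type 623 format is not specified in the abstract."""
    exp = 0
    while count >= (1 << mant_bits):
        count >>= 1
        exp += 1
    return (exp << mant_bits) | count   # e.g., a 12-bit word with 4 exponent bits

def expand(word, mant_bits=8):
    """Recover an approximation of the original count."""
    exp, mant = word >> mant_bits, word & ((1 << mant_bits) - 1)
    return mant << exp

lossless = expand(compress(1000))    # small counts round-trip exactly or nearly so
approx = expand(compress(123_456))   # error is bounded by 2**exp (relative ~2**-8)
```

The appeal for telemetry is that relative precision is roughly constant across the full dynamic range of the counter, while the transmitted word stays a fixed, small size.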

  15. Integrated four-channel all-fiber up-conversion single-photon-detector with adjustable efficiency and dark count.

    PubMed

    Zheng, Ming-Yang; Shentu, Guo-Liang; Ma, Fei; Zhou, Fei; Zhang, Hai-Ting; Dai, Yun-Qi; Xie, Xiuping; Zhang, Qiang; Pan, Jian-Wei

    2016-09-01

The up-conversion single photon detector (UCSPD) has been widely used in many research fields including quantum key distribution, lidar, optical time domain reflectometry, and deep space communication. For the first time in the laboratory, we have developed an integrated four-channel all-fiber UCSPD which can work in both free-running and gated modes. This compact module can satisfy different experimental demands with adjustable detection efficiency and dark count. We have characterized the key parameters of the UCSPD system.

  16. A New Integrated Onboard Charger and Accessory Power Converter for Plug-in Electric Vehicles

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Su, Gui-Jia; Tang, Lixin

    2014-01-01

In this paper, a new approach is presented for integrating the function of onboard battery charging into the traction drive system and accessory dc-dc converter of a plug-in electric vehicle (PEV). The idea is to utilize the segmented traction drive system of a PEV as the front-end converter of the charging circuit, together with the transformer and high-voltage converter of the 14 V accessory dc-dc converter, to form a galvanically isolated onboard charger. Moreover, a control method is presented for suppressing the battery current ripple component at twice the grid frequency with a reduced dc bus capacitor in the segmented inverter. The resultant integrated charger has lower cost, weight, and volume than a standalone charger due to a substantially reduced component count. The proposed integrated charger topology was verified by modeling and experimental results on a 5.8 kW charger prototype.

  17. Evaluation of mouse red blood cell and platelet counting with an automated hematology analyzer.

    PubMed

    Fukuda, Teruko; Asou, Eri; Nogi, Kimiko; Goto, Kazuo

    2017-10-07

    An evaluation of mouse red blood cell (RBC) and platelet (PLT) counting with an automated hematology analyzer was performed with three strains of mice, C57BL/6 (B6), BALB/c (BALB) and DBA/2 (D2). There were no significant differences in RBC and PLT counts between manual and automated optical methods in any of the samples, except for D2 mice. For D2, RBC counts obtained using the manual method were significantly lower than those obtained using the automated optical method (P<0.05), and PLT counts obtained using the manual method were higher than those obtained using the automated optical method (P<0.05). An automated hematology analyzer can be used for RBC and PLT counting; however, an appropriate method should be selected when D2 mice samples are used.

  18. Disruption of diphthamide synthesis genes and resulting toxin resistance as a robust technology for quantifying and optimizing CRISPR/Cas9-mediated gene editing.

    PubMed

    Killian, Tobias; Dickopf, Steffen; Haas, Alexander K; Kirstenpfad, Claudia; Mayer, Klaus; Brinkmann, Ulrich

    2017-11-13

We have devised an effective and robust method for the characterization of gene-editing events. The efficacy of editing-mediated mono- and bi-allelic gene inactivation and integration events is quantified based on colony counts. The combination of diphtheria toxin (DT) and puromycin (PM) selection enables analyses of 10,000-100,000 individual cells, assessing hundreds of clones with inactivated genes per experiment. Mono- and bi-allelic gene inactivation is differentiated by DT resistance, which occurs only upon bi-allelic inactivation. PM resistance indicates integration. The robustness and generalizability of the method were demonstrated by quantifying the frequency of gene inactivation and cassette integration under different editing approaches: CRISPR/Cas9-mediated complete inactivation was ~30-50-fold more frequent than cassette integration. Mono-allelic inactivation without integration occurred >100-fold more frequently than integration. Assessment of gRNA length confirmed 20mers to be the most effective length for inactivation, while 16-18mers provided the highest overall integration efficacy. The overall efficacy was ~2-fold higher for CRISPR/Cas9 than for zinc-finger nuclease and was significantly increased upon modulation of non-homologous end joining or homology-directed repair. The frequencies and ratios of editing events were similar for two different DPH genes (independent of the target sequence or chromosomal location), which indicates that the optimization parameters identified with this method can be generalized.
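The colony-count readout reduces to simple per-cell frequencies: DT-resistant colonies per plated cell give the bi-allelic inactivation frequency, PM-resistant colonies the integration frequency. A sketch with hypothetical colony counts chosen only to mirror the reported ~30-50x ratio:

```python
def editing_frequencies(cells_plated, dt_resistant, pm_resistant):
    """Per-cell frequencies of bi-allelic inactivation (diphtheria toxin
    resistance) and cassette integration (puromycin resistance)."""
    return dt_resistant / cells_plated, pm_resistant / cells_plated

# Hypothetical colony counts (not the paper's data)
knockout, integration = editing_frequencies(100_000, 4_000, 100)
ratio = knockout / integration  # fold difference between the two event classes
```

Counting selected colonies rather than sequencing clones is what lets a single experiment score tens of thousands of cells.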

  19. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    NASA Astrophysics Data System (ADS)

    Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.

    2018-01-01

Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher order count rates (i.e., quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have also only recently been formulated. The current paper presents the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL), and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. The DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.

  20. Ways to improve your correlation functions

    NASA Technical Reports Server (NTRS)

    Hamilton, A. J. S.

    1993-01-01

    This paper describes a number of ways to improve on the standard method for measuring the two-point correlation function of large scale structure in the Universe. Issues addressed are: (1) the problem of the mean density, and how to solve it; (2) how to estimate the uncertainty in a measured correlation function; (3) minimum variance pair weighting; (4) unbiased estimation of the selection function when magnitudes are discrete; and (5) analytic computation of angular integrals in background pair counts.
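Hamilton's paper is associated with the pair-count estimator ξ = DD·RR/DR² − 1, which cancels the leading-order uncertainty in the mean density. A toy 1-D illustration (the pair-counting helpers and catalog sizes are ours; real analyses work in 3-D with survey selection functions):

```python
import random

def norm_pairs(a, b, r_max, auto):
    """Normalized pair count within separation r_max (1-D toy example)."""
    if auto:
        n = sum(1 for i in range(len(a)) for j in range(i + 1, len(a))
                if abs(a[i] - a[j]) <= r_max)
        return n / (len(a) * (len(a) - 1) / 2)
    n = sum(1 for x in a for y in b if abs(x - y) <= r_max)
    return n / (len(a) * len(b))

def hamilton_xi(data, rand_cat, r_max):
    """Hamilton-style estimator xi = DD*RR/DR**2 - 1 from normalized counts."""
    dd = norm_pairs(data, None, r_max, auto=True)
    rr = norm_pairs(rand_cat, None, r_max, auto=True)
    dr = norm_pairs(data, rand_cat, r_max, auto=False)
    return dd * rr / dr ** 2 - 1.0

rng = random.Random(1)
data = [rng.random() for _ in range(300)]      # unclustered "galaxies"
rand_cat = [rng.random() for _ in range(300)]  # random comparison catalog
xi = hamilton_xi(data, rand_cat, 0.1)          # near zero for an unclustered sample
```

Because the estimator is a ratio of pair counts, an overall mis-estimate of the mean density largely divides out, which is the point the abstract raises.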

  1. Addition of DNase Improves the In Vitro Activity of Antifungal Drugs against Candida albicans Biofilms

    PubMed Central

    Martins, Margarida; Henriques, Mariana; Lopez-Ribot, José L.; Oliveira, Rosário

    2011-01-01

SUMMARY Background: Cells within Candida albicans biofilms display decreased susceptibility to most clinically used antifungal agents. We recently demonstrated that extracellular DNA (eDNA) plays an important role in biofilm integrity, as a component of the biofilm matrix. Objective: To gain insight into the contribution of eDNA to the antifungal susceptibility of C. albicans biofilms by investigating the impact of the combined use of deoxyribonuclease I (DNase) and antifungals to treat biofilms. Methods: C. albicans biofilms were formed using a simple and reproducible 96-well plate-based method. The activity of the combined use of 0.13 mg l−1 DNase and antifungals was estimated by the 2,3-bis(2-methoxy-4-nitro-5-sulfophenyl)-5-[(phenylamino)carbonyl]-2H-tetrazolium hydroxide (XTT) reduction assay and total viable counts. Results and Conclusions: Here we report the improved efficacy of amphotericin B in combination with DNase against C. albicans biofilms, as assessed by XTT readings and viable counts. Furthermore, although DNase increased the efficacy of caspofungin in reducing mitochondrial activity, no changes were observed in terms of culturable cells. DNase did not affect the susceptibility of biofilm cells to fluconazole. This work suggests that agents targeting processes that affect biofilm structural integrity may have potential use as adjuvants in a catheter-lock therapy. PMID:21668524

  2. Performance Assessment of Different Pulse Reconstruction Algorithms for the ATHENA X-Ray Integral Field Unit

    NASA Technical Reports Server (NTRS)

Peille, Phillip; Ceballos, Maria Teresa; Cobo, Beatriz; Wilms, Joern; Bandler, Simon; Smith, Stephen J.; Dauser, Thomas; Brand, Thorsten; den Hartog, Roland; de Plaa, Jelle

    2016-01-01

The X-ray Integral Field Unit (X-IFU) microcalorimeter on board Athena, with its focal plane comprising 3840 Transition Edge Sensors (TESs) operating at 90 mK, will provide unprecedented spectral-imaging capability in the 0.2-12 keV energy range. It will rely on the on-board digital processing of current pulses induced by the heat deposited in the TES absorber, so as to recover the energy of each individual event. Assessing the capabilities of the pulse reconstruction is required to understand the overall scientific performance of the X-IFU, notably in terms of energy resolution degradation with both increasing energies and count rates. Using synthetic data streams generated by the X-IFU End-to-End simulator, we present here a comprehensive benchmark of various pulse reconstruction techniques, ranging from standard optimal filtering to more advanced algorithms based on noise covariance matrices. Besides deriving the spectral resolution achieved by the different algorithms, a first assessment of the computing power and ground calibration needs is presented. Overall, all methods show similar performance, with the reconstruction based on noise covariance matrices showing the best improvement with respect to the standard optimal filtering technique. Due to prohibitive calibration needs, this method might however not be applicable to the X-IFU, and the best compromise currently appears to be the so-called resistance space analysis, which also features very promising high count rate capabilities.

  3. Estimation and correction of visibility bias in aerial surveys of wintering ducks

    USGS Publications Warehouse

    Pearse, A.T.; Gerard, P.D.; Dinsmore, S.J.; Kaminski, R.M.; Reinecke, K.J.

    2008-01-01

Incomplete detection of all individuals leading to negative bias in abundance estimates is a pervasive source of error in aerial surveys of wildlife, and correcting that bias is a critical step in improving surveys. We conducted experiments using duck decoys as surrogates for live ducks to estimate bias associated with surveys of wintering ducks in Mississippi, USA. We found detection of decoy groups was related to wetland cover type (open vs. forested), group size (1-100 decoys), and interaction of these variables. Observers who detected decoy groups reported counts that averaged 78% of the decoys actually present, and this counting bias was not influenced by either covariate cited above. We integrated this sightability model into estimation procedures for our sample surveys with weight adjustments derived from probabilities of group detection (estimated by logistic regression) and count bias. To estimate variances of abundance estimates, we used bootstrap resampling of transects included in aerial surveys and data from the bias-correction experiment. When we implemented bias correction procedures on data from a field survey conducted in January 2004, we found bias-corrected estimates of abundance increased 36-42%, and associated standard errors increased 38-55%, depending on species or group estimated. We deemed our method successful for integrating correction of visibility bias in an existing sample survey design for wintering ducks in Mississippi, and we believe this procedure could be implemented in a variety of sampling problems for other locations and species.
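The core adjustment can be sketched as follows: divide the raw count by the estimated probability that the group was detected and by the proportional counting bias (78% in the study). The multiplicative combination and the detection probability value below are our illustrative assumptions, not the paper's full weighting scheme.

```python
def corrected_count(observed, p_detect, count_bias=0.78):
    """Adjust a raw aerial count for incomplete detection: divide by the
    group-detection probability (from logistic regression in the study)
    and by the proportional counting bias (0.78 reported in the study)."""
    return observed / (p_detect * count_bias)

# A group of 50 ducks counted, with a hypothetical 0.9 detection probability
estimate = corrected_count(50, 0.9)  # corrected group size, ~71 ducks
```

Both corrections inflate the estimate, which matches the reported 36-42% increase in bias-corrected abundance.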

  4. Identification of CSF fistulas by radionuclide counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamamoto, Y.; Kunishio, K.; Sunami, N.

    1990-07-01

A radionuclide counting method, performed with the patient prone and the neck flexed, was used successfully to diagnose CSF rhinorrhea in two patients. A normal radionuclide ratio (radionuclide counts in pledget/radionuclide counts in 1-ml blood sample) was obtained in 11 normal control subjects. A ratio greater than 0.37 was determined to be significant. Use of the radionuclide counting method for determining CSF rhinorrhea is recommended when other methods have failed to locate a site of leakage or when posttraumatic meningitis suggests subclinical CSF rhinorrhea.
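The decision rule described reduces to a single ratio against the reported 0.37 cutoff. A sketch with hypothetical count values:

```python
def csf_leak_suspected(pledget_counts, blood_counts_per_ml, threshold=0.37):
    """Apply the reported cutoff: a pledget-to-blood radionuclide count
    ratio above 0.37 suggests CSF rhinorrhea."""
    return pledget_counts / blood_counts_per_ml > threshold

positive = csf_leak_suspected(1200, 2000)  # ratio 0.60, above the cutoff
negative = csf_leak_suspected(500, 2000)   # ratio 0.25, below the cutoff
```

Normalizing against the patient's own blood activity makes the test robust to differences in administered dose and counting geometry.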

  5. Single photon detection in a waveguide-coupled Ge-on-Si lateral avalanche photodiode.

    PubMed

    Martinez, Nicholas J D; Gehl, Michael; Derose, Christopher T; Starbuck, Andrew L; Pomerene, Andrew T; Lentine, Anthony L; Trotter, Douglas C; Davids, Paul S

    2017-07-10

We examine gated-Geiger mode operation of an integrated waveguide-coupled Ge-on-Si lateral avalanche photodiode (APD) and demonstrate single photon detection at low dark count for this mode of operation. Our integrated waveguide-coupled APD is fabricated using a selective epitaxial Ge-on-Si growth process, resulting in a separate absorption and charge multiplication (SACM) design compatible with our silicon photonics platform. Single photon detection efficiency and dark count rate are measured as a function of temperature in order to understand and optimize performance characteristics in this device. We report a single photon detection efficiency of 5.27% at 1310 nm and a dark count rate of 534 kHz at 80 K for a Ge-on-Si single photon avalanche diode. This dark count rate is the lowest for a Ge-on-Si single photon detector in this temperature range, while the device maintains competitive detection efficiency. A jitter of 105 ps was measured for this device.

  6. Effects of Extended Freezer Storage on the Integrity of Human Milk.

    PubMed

    Ahrabi, Ali Faraghi; Handa, Deepali; Codipilly, Champa N; Shah, Syed; Williams, Janet E; McGuire, Mark A; Potak, Debra; Aharon, Grace Golda; Schanler, Richard J

    2016-10-01

To examine the integrity (pH, bacterial counts, host defense factors, nutrient contents, and osmolality) of freshly expressed and previously refrigerated human milk subjected to long-term freezer storage. Mothers donated 100 mL of freshly expressed milk. Samples were divided into baseline, storage at -20°C (fresh frozen) for 1, 3, 6, and 9 months, and prior storage at +4°C for 72 hours (refrigerated frozen) before storage at -20°C for 1 to 9 months. Samples were analyzed for pH, total bacterial colony count, gram-positive and gram-negative colony counts, and concentrations of total protein, fat, nonesterified fatty acids, lactoferrin, secretory IgA, and osmolality. Milk pH, total bacterial colony count, and Gram-positive colony counts decreased significantly with freezer storage (P < .001); bacterial counts decreased most rapidly in the refrigerated frozen group. The gram-negative colony count decreased significantly over time (P < .001). Nonesterified fatty acid concentrations increased significantly with time in storage (P < .001). Freezing for up to 9 months did not affect total protein, fat, lactoferrin, secretory IgA, or osmolality in either group. Freezer storage of human milk for 9 months at -20°C is associated with decreasing pH and bacterial counts, but preservation of key macronutrients and immunoactive components, with or without prior refrigeration for 72 hours. These data support current guidelines for freezer storage of human milk for up to 9 months for both freshly expressed and refrigerated milk.

  7. HETEROTROPHIC PLATE COUNT (HPC) METHODOLOGY IN THE UNITED STATES

    EPA Science Inventory

    ABSTRACT

    In the United States (U.S.), the history of bacterial plate counting methods used for water can be traced largely through Standard Methods for the Examination of Water and Wastewater (Standard Methods). The bacterial count method has evolved from the original St...

  8. Characteristic performance evaluation of a photon counting Si strip detector for low dose spectral breast CT imaging

    PubMed Central

    Cho, Hyo-Min; Barber, William C.; Ding, Huanjun; Iwanczyk, Jan S.; Molloi, Sabee

    2014-01-01

Purpose: The possible clinical applications which can be performed using a newly developed detector depend on the detector's characteristic performance in a number of metrics including the dynamic range, resolution, uniformity, and stability. The authors have evaluated a prototype energy resolved fast photon counting x-ray detector based on a silicon (Si) strip sensor used in an edge-on geometry with an application specific integrated circuit to record the number of x-rays and their energies at high flux and fast frame rates. The investigated detector was integrated with a dedicated breast spectral computed tomography (CT) system to make use of the detector's high spatial and energy resolution and low noise performance under conditions suitable for clinical breast imaging. The aim of this article is to investigate the intrinsic characteristics of the detector, in terms of maximum output count rate, spatial and energy resolution, and noise performance of the imaging system. Methods: The maximum output count rate was obtained with a 50 W x-ray tube with a maximum continuous output of 50 kVp at 1.0 mA. A 109Cd source, with a characteristic x-ray peak at 22 keV from Ag, was used to measure the energy resolution of the detector. The axial plane modulation transfer function (MTF) was measured using a 67 μm diameter tungsten wire. The two-dimensional (2D) noise power spectrum (NPS) was measured using flat field images and noise equivalent quanta (NEQ) were calculated using the MTF and NPS results. The image quality parameters were studied as a function of various radiation doses and reconstruction filters. The one-dimensional (1D) NPS was used to investigate the effect of electronic noise elimination by varying the minimum energy threshold. Results: A maximum output count rate of 100 million counts per second per square millimeter (cps/mm2) has been obtained (1 million cps per 100 × 100 μm pixel). The electrical noise floor was less than 4 keV. 
The energy resolution measured with the 22 keV photons from a 109Cd source was less than 9%. A reduction of image noise was shown at all spatial frequencies in the 1D NPS as a result of the elimination of the electronic noise. The spatial resolution was measured at just above 5 line pairs per mm (lp/mm), with 10% of the MTF corresponding to 5.4 mm−1. The 2D NPS and NEQ show a low noise floor and a linear dependence on dose. The reconstruction filter choice affected both the MTF and NPS results, but had a weak effect on the NEQ. Conclusions: The prototype energy resolved photon counting Si strip detector can offer superior imaging performance for dedicated breast CT as compared to a conventional energy-integrating detector due to its high output count rate, high spatial and energy resolution, and low noise characteristics, which are essential for spectral breast CT imaging. PMID:25186390

  9. Mapping of Bird Distributions from Point Count Surveys

    Treesearch

    John R. Sauer; Grey W. Pendleton; Sandra Orsillo

    1995-01-01

    Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes...

  10. Platelet counting using the Coulter electronic counter.

    PubMed

    Eggleton, M J; Sharp, A A

    1963-03-01

    A method for counting platelets in dilutions of platelet-rich plasma using the Coulter electronic counter is described.(1) The results obtained show that such platelet counts are at least as accurate as the best methods of visual counting. The various technical difficulties encountered are discussed.

  11. Three-dimensional passive sensing photon counting for object classification

    NASA Astrophysics Data System (ADS)

    Yeom, Seokwon; Javidi, Bahram; Watson, Edward

    2007-04-01

    In this keynote address, we discuss three-dimensional (3D) distortion-tolerant object recognition using photon-counting integral imaging (II). A photon-counting linear discriminant analysis (LDA) is discussed for classification of photon-limited images. We develop a compact distortion-tolerant recognition system based on the multiple-perspective imaging of II. Experimental and simulation results have shown that a low level of photons is sufficient to classify out-of-plane rotated objects.

  12. Image-based red cell counting for wild animals blood.

    PubMed

    Mauricio, Claudio R M; Schneider, Fabio K; Dos Santos, Leonilda Correia

    2010-01-01

    An image-based red blood cell (RBC) automatic counting system is presented for wild animal blood analysis. Images with 2048×1536-pixel resolution acquired on an optical microscope using Neubauer chambers are used to evaluate RBC counting for three animal species (Leopardus pardalis, Cebus apella and Nasua nasua), and the error found using the proposed method is similar to that obtained with the inter-observer visual counting method, i.e., around 10%. Smaller errors (e.g., 3%) can be obtained in regions with fewer grid artifacts. These promising results allow the use of the proposed method either as a fully automatic counting tool in laboratories for wild animal blood analysis or as a first counting stage in a semi-automatic counting tool.

  13. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    NASA Astrophysics Data System (ADS)

    Stephenson, W. Kirk

    2009-08-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes.
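
    The conventional rules that a significant-figure counter encodes can be sketched in a few lines. This is an illustrative stand-in for the paper's visual box-and-dot procedure (which works by boxing digits on paper, not by code); the function name and string-based approach are ours:

    ```python
    def count_sig_figs(numeral: str) -> int:
        """Count significant figures in a decimal numeral written as a string.

        Conventional rules: leading zeros never count; zeros between nonzero
        digits count; trailing zeros count only when a decimal point is written.
        (Illustrative sketch -- not code from the paper.)
        """
        numeral = numeral.lstrip("+-")
        digits = numeral.replace(".", "").lstrip("0")  # leading zeros never count
        if "." not in numeral:
            digits = digits.rstrip("0")  # bare trailing zeros are ambiguous, not counted
        return len(digits)
    ```

    For example, "0.00520" counts 3 significant figures, while "1500" counts only 2 under the usual convention.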

  14. 17 CFR 275.203(b)(3)-2 - Methods for counting clients in certain private funds.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 3 2010-04-01 2010-04-01 false Methods for counting clients....203(b)(3)-2 Methods for counting clients in certain private funds. (a) For purposes of section 203(b)(3) of the Act (15 U.S.C. 80b-3(b)(3)), you must count as clients the shareholders, limited partners...

  15. Platelet counting using the Coulter electronic counter

    PubMed Central

    Eggleton, M. J.; Sharp, A. A.

    1963-01-01

    A method for counting platelets in dilutions of platelet-rich plasma using the Coulter electronic counter is described.1 The results obtained show that such platelet counts are at least as accurate as the best methods of visual counting. The various technical difficulties encountered are discussed. PMID:16811002

  16. Disrupted Sleep in Narcolepsy: Exploring the Integrity of Galanin Neurons in the Ventrolateral Preoptic Area

    PubMed Central

    Gavrilov, Yury V.; Ellison, Brian A.; Yamamoto, Mihoko; Reddy, Hasini; Haybaeck, Johannes; Mignot, Emmanuel; Baumann, Christian R.; Scammell, Thomas E.; Valko, Philipp O.

    2016-01-01

    Study Objectives: To examine the integrity of sleep-promoting neurons of the ventrolateral preoptic nucleus (VLPO) in postmortem brains of narcolepsy type 1 patients. Methods: Postmortem examination of five narcolepsy and eight control brains. Results: VLPO galanin neuron count did not differ between narcolepsy patients (11,151 ± 3,656) and controls (13,526 ± 9,544). Conclusions: A normal number of galanin-immunoreactive VLPO neurons in narcolepsy type 1 brains at autopsy suggests that VLPO cell loss is an unlikely explanation for the sleep fragmentation that often accompanies the disease. Citation: Gavrilov YV, Ellison BA, Yamamoto M, Reddy H, Haybaeck J, Mignot E, Baumann CR, Scammell TE, Valko PO. Disrupted sleep in narcolepsy: exploring the integrity of galanin neurons in the ventrolateral preoptic area. SLEEP 2016;39(5):1059–1062. PMID:26951397

  17. Dark-count-less photon-counting x-ray computed tomography system using a YAP-MPPC detector

    NASA Astrophysics Data System (ADS)

    Sato, Eiichi; Sato, Yuich; Abudurexiti, Abulajiang; Hagiwara, Osahiko; Matsukiyo, Hiroshi; Osawa, Akihiro; Enomoto, Toshiyuki; Watanabe, Manabu; Kusachi, Shinya; Sato, Shigehiro; Ogawa, Akira; Onagawa, Jun

    2012-10-01

    A highly sensitive X-ray computed tomography (CT) system is useful for decreasing the absorbed dose to patients, and a dark-count-less photon-counting CT system was developed. X-ray photons are detected using a YAP(Ce) [cerium-doped yttrium aluminum perovskite] single-crystal scintillator and an MPPC (multipixel photon counter). Photocurrents are amplified by a high-speed current-voltage amplifier, and smoothed event pulses from an integrator are sent to a high-speed comparator. Logical pulses produced by the comparator are then counted by a counter card. Tomography is accomplished by repeated linear scans and rotations of the object, and projection curves of the object are obtained during the linear scan. The image contrast of the gadolinium medium fell slightly with increasing lower-level voltage (Vl) of the comparator. The dark count rate was 0 cps, and the count rate for the CT was approximately 250 kcps.

  18. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    ERIC Educational Resources Information Center

    Stephenson, W. Kirk

    2009-01-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes. (Contains 4 notes.)

  19. Inter-rater reliability of malaria parasite counts and comparison of methods

    PubMed Central

    2009-01-01

    Background: The introduction of artemisinin-based treatment for falciparum malaria has led to a shift away from symptom-based diagnosis. Diagnosis may be achieved by using rapid non-microscopic diagnostic tests (RDTs), of which there are many available. Light microscopy, however, has a central role in parasite identification and quantification; it remains the main method of parasite-based diagnosis in clinic and hospital settings and is necessary for monitoring the accuracy of RDTs. The World Health Organization has prepared a proficiency testing panel containing a range of malaria-positive blood samples of known parasitaemia, to be used for the assessment of commercially available malaria RDTs. Different blood film and counting methods may be used for this purpose, which raises questions regarding accuracy and reproducibility. A comparison was made of the established methods for parasitaemia estimation to determine which would give the least inter-rater and inter-method variation. Methods: Experienced malaria microscopists counted asexual parasitaemia on different slides using three methods: the thin film method using the total erythrocyte count, the thick film method using the total white cell count, and the Earle and Perez method. All the slides were stained using Giemsa pH 7.2. Analysis of variance (ANOVA) models were used to find the inter-rater reliability for the different methods. The paired t-test was used to assess any systematic bias between the two methods, and a regression analysis was used to see if there was a changing bias with parasite count level. Results: The thin blood film gave parasite counts around 30% higher than those obtained by the thick film and Earle and Perez methods, but exhibited a loss of sensitivity at low parasitaemia. The thick film and Earle and Perez methods showed little or no bias in counts between the two methods; however, estimated inter-rater reliability was slightly better for the thick film method. 
Conclusion: The thin film method gave results closer to the true parasite count but is not feasible at a parasitaemia below 500 parasites per microlitre. The thick film method was both reproducible and practical for this project. The determination of malarial parasitaemia must be performed by skilled operators using standardized techniques. PMID:19939271
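
    The two film-based estimates compared in this study follow standard conversion formulas, which can be sketched as below. The function names and the default totals (the conventional 8,000 WBC/µL and a nominal 5,000,000 RBC/µL) are our illustrative assumptions, not values taken from the study:

    ```python
    def thick_film_parasites_per_ul(parasites_seen: int, wbc_seen: int,
                                    wbc_per_ul: float = 8000.0) -> float:
        """Thick film: scale parasites counted against white cells counted
        by the patient's (or an assumed) total white cell count per uL."""
        return parasites_seen * wbc_per_ul / wbc_seen

    def thin_film_parasites_per_ul(parasites_seen: int, rbc_seen: int,
                                   rbc_per_ul: float = 5_000_000.0) -> float:
        """Thin film: fraction of infected red cells times total red cell count per uL."""
        return parasites_seen * rbc_per_ul / rbc_seen
    ```

    For example, 80 parasites counted against 200 white cells gives 80 × 8000 / 200 = 3,200 parasites/µL by the thick film formula.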

  20. A straightforward experimental method to evaluate the Lamb-Mössbauer factor of a 57Co/Rh source

    NASA Astrophysics Data System (ADS)

    Spina, G.; Lantieri, M.

    2014-01-01

    In analyzing Mössbauer spectra by means of the integral transmission function, a correct evaluation of the recoilless fs factor of the source at the position of the sample is needed. A novel method to evaluate fs for a 57Co source is proposed. The method uses the standard transmission experimental set-up and requires no measurements beyond those that are mandatory in order to center the Mössbauer line and to calibrate the Mössbauer transducer. Firstly, the background counts are evaluated by collecting a standard Multi Channel Scaling (MCS) spectrum of a thick metal iron foil absorber and two Pulse Height Analysis (PHA) spectra with the same live-time, setting the maximum velocity of the transducer at the same value as for the MCS spectrum. Secondly, fs is evaluated by fitting the collected MCS spectrum through the integral transmission approach. A test of the suitability of the technique is also presented.

  1. Direct Solve of Electrically Large Integral Equations for Problem Sizes to 1M Unknowns

    NASA Technical Reports Server (NTRS)

    Shaeffer, John

    2008-01-01

    Matrix methods for solving integral equations via direct LU factorization are presently limited to weeks to months of very expensive supercomputer time for problem sizes of several hundred thousand unknowns. This report presents matrix LU factor solutions for electromagnetic scattering problems for problem sizes up to one million unknowns with thousands of right-hand sides that run in mere days on PC-level hardware. This EM solution is accomplished by exploiting the numerical low-rank nature of spatially blocked unknowns, using the Adaptive Cross Approximation to compress the rank-deficient blocks of the system Z matrix, the L and U factors, the right-hand-side forcing function, and the final current solution. This compressed matrix solution is applied to a frequency-domain EM solution of Maxwell's equations using the standard Method of Moments approach. Compressed matrix storage and operation counts lead to orders-of-magnitude reductions in memory and run time.

  2. Smartphone and GPS technology for free-roaming dog population surveillance - a methodological study.

    PubMed

    Barnard, Shanis; Ippoliti, Carla; Di Flaviano, Daniele; De Ruvo, Andrea; Messori, Stefano; Giovannini, Armando; Dalla Villa, Paolo

    2015-01-01

    Free-roaming dogs (FRD) represent a potential threat to the quality of life in cities from an ecological, social and public health point of view. One of the most urgent concerns is the role of uncontrolled dogs as reservoirs of infectious diseases transmittable to humans, above all rabies. An estimate of the FRD population size and characteristics in a given area is the first step for any relevant intervention programme. Direct count methods are still prominent because of their non-invasive approach, and information technologies can support such methods by facilitating data collection and allowing more efficient data handling. This paper presents a new framework for data collection using a topological algorithm implemented as an ArcScript in ESRI® ArcGIS software, which allows for a random selection of the sampling areas. It also supplies a mobile phone application for Android® operating system devices which integrates the Global Positioning System (GPS) and Google Maps™. The potential of such a framework was tested in two Italian regions. Coupling technological and innovative solutions with common counting methods facilitates data collection and transcription. It also paves the way to future applications, which could support dog population management systems.

  3. Evaluation of petrifilm series 2000 as a possible rapid method to count coliforms in foods.

    PubMed

    Priego, R; Medina, L M; Jordano, R

    2000-08-01

    This research note is a preliminary comparison between the Petrifilm 2000 method and a widely used traditional enumeration method (on violet red bile agar); six batches of different foods (egg, frozen green beans, fresh sausage, a bakery product, raw minced meat, and raw milk) were studied. The reliability of the presumptive counts taken at 10, 12, and 14 h of incubation using this method was also verified by comparing the counts with the total confirmed counts at 24 h. In all the batches studied, results obtained with Petrifilm 2000 showed a close correlation with those obtained using violet red bile agar (r = 0.860) and greater sensitivity (93.33% of the samples displayed higher counts on Petrifilm 2000), showing that this method is a reliable and efficient alternative. The count taken at 10 h of incubation is of clear interest as an early indicator of results in microbiological food control, since it accounted for 90% of the final count in all the batches analyzed. Counts taken at 12 and 14 h bore a greater similarity to those taken at 24 h. The Petrifilm 2000 method provides results in less than 12 h of incubation, making it a possible rapid method that adapts well to the hazard analysis critical control point (HACCP) system by enabling microbiological quality control of the processing.

  4. Evaluating the quality of a cell counting measurement process via a dilution series experimental design.

    PubMed

    Sarkar, Sumona; Lund, Steven P; Vyzasatya, Ravi; Vanguri, Padmavathy; Elliott, John T; Plant, Anne L; Lin-Gibson, Sheng

    2017-12-01

    Cell counting measurements are critical in the research, development and manufacturing of cell-based products, yet determining cell quantity with accuracy and precision remains a challenge. Validating and evaluating a cell counting measurement process can be difficult because of the lack of appropriate reference material. Here we describe an experimental design and statistical analysis approach to evaluate the quality of a cell counting measurement process in the absence of appropriate reference materials or reference methods. The experimental design is based on a dilution series study with replicate samples and observations as well as measurement process controls. The statistical analysis evaluates the precision and proportionality of the cell counting measurement process and can be used to compare the quality of two or more counting methods. As an illustration of this approach, cell counting measurement processes (automated and manual methods) were compared for a human mesenchymal stromal cell (hMSC) preparation. For the hMSC preparation investigated, results indicated that the automated method performed better than the manual counting methods in terms of precision and proportionality. By conducting well controlled dilution series experimental designs coupled with appropriate statistical analysis, quantitative indicators of repeatability and proportionality can be calculated to provide an assessment of cell counting measurement quality. This approach does not rely on the use of a reference material or comparison to "gold standard" methods known to have limited assurance of accuracy and precision. The approach presented here may help the selection, optimization, and/or validation of a cell counting measurement process. Published by Elsevier Inc.
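
    The core idea of the dilution-series evaluation can be illustrated with standard-library Python: per-dilution coefficients of variation quantify repeatability, and the fit quality of a zero-intercept (proportional) model quantifies proportionality. This is a simplified sketch of the general approach, not the paper's exact statistical model, and all names are illustrative:

    ```python
    from statistics import mean, stdev

    def dilution_series_quality(data):
        """Summarize a dilution-series cell counting experiment.

        `data` maps dilution fraction (e.g. 1.0, 0.5, 0.25) to a list of
        replicate counts. Returns (cv_by_dilution, r2), where r2 is the
        coefficient of determination of a zero-intercept proportional fit.
        """
        # Precision: coefficient of variation of replicates at each dilution.
        cv = {f: stdev(reps) / mean(reps) for f, reps in data.items()}
        # Proportionality: least-squares slope through the origin, then R^2.
        xs = [f for f, reps in data.items() for _ in reps]
        ys = [y for reps in data.values() for y in reps]
        slope = sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)
        ybar = mean(ys)
        ss_res = sum((y - slope * x) ** 2 for x, y in zip(xs, ys))
        ss_tot = sum((y - ybar) ** 2 for y in ys)
        return cv, 1 - ss_res / ss_tot
    ```

    A counting process with good precision and proportionality yields small CVs at every dilution and an R² near 1 for the proportional fit.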

  5. Randomized Controlled Trial of Antiseptic Hand Hygiene Methods in an Outpatient Surgery Clinic.

    PubMed

    Therattil, Paul J; Yueh, Janet H; Kordahi, Anthony M; Cherla, Deepa V; Lee, Edward S; Granick, Mark S

    2015-12-01

    Outpatient wound care plays an integral part in any plastic surgery practice. However, compliance with hand hygiene measures has been shown to be low, due to skin irritation and lack of time. The objective of this trial was to determine whether single-use, long-acting antiseptics can be as effective as standard multiple-use hand hygiene methods in an outpatient surgical setting. A prospective, randomized controlled trial was performed in the authors' outpatient plastic surgery clinic at Rutgers New Jersey Medical School, Newark, NJ to compare the efficacy of an ethyl alcohol-based sanitizer (Avagard D Instant Hand Antiseptic, 3M Health Care, St. Paul, MN), a benzalkonium chloride-based sanitizer (Soft & Shield, Bioderm Technologies, Inc, Trenton, NJ, distributed by NAPP Technologies, Hackensack, NJ), and soap-and-water handwashing. Subjects included clinic personnel, who were followed throughout the course of a 3-hour clinic session with hourly hand bacterial counts taken. During the course of the trial, 95 subjects completed the clinic session utilizing 1 of the hand hygiene methods (36 ethyl alcohol-based sanitizer, 38 benzalkonium chloride-based sanitizer, and 21 soap-and-water handwashing). There was no difference between hand bacterial counts using the different methods at the 4 hourly time points (P greater than 0.05). Hand bacterial counts increased significantly over the 3-hour clinic session with the ethyl alcohol-based sanitizer (9.24 to 21.90 CFU, P less than 0.05), the benzalkonium chloride-based sanitizer (6.69 to 21.59 CFU, P less than 0.05), and soap-and-water handwashing (8.43 to 22.75 CFU, P less than 0.05). There does not appear to be any difference in efficacy between single-use, long-acting sanitizers and standard multiple-use hand hygiene methods. Hand bacterial counts increased significantly over the course of the 3-hour clinic session regardless of the hand hygiene measure used. 
Hand condition of subjects was improved with the ethyl alcohol-based sanitizer and the benzalkonium chloride-based sanitizer compared with soap-and-water handwashing.

  6. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE PAGES

    Lockhart, M.; Henzlova, D.; Croft, S.; ...

    2017-09-20

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e., quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have likewise only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.

  7. Experimental evaluation of the extended Dytlewski-style dead time correction formalism for neutron multiplicity counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lockhart, M.; Henzlova, D.; Croft, S.

    Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead time correct higher order count rates (i.e., quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have likewise only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.

  8. Incorporating Neutrophil-to-lymphocyte Ratio and Platelet-to-lymphocyte Ratio in Place of Neutrophil Count and Platelet Count Improves Prognostic Accuracy of the International Metastatic Renal Cell Carcinoma Database Consortium Model

    PubMed Central

    Chrom, Pawel; Stec, Rafal; Bodnar, Lubomir; Szczylik, Cezary

    2018-01-01

    Purpose: The study investigated whether a replacement of neutrophil count and platelet count by the neutrophil-to-lymphocyte ratio (NLR) and platelet-to-lymphocyte ratio (PLR) within the International Metastatic Renal Cell Carcinoma Database Consortium (IMDC) model would improve its prognostic accuracy. Materials and Methods: This retrospective analysis included consecutive patients with metastatic renal cell carcinoma treated with first-line tyrosine kinase inhibitors. The IMDC and modified-IMDC models were compared using: concordance index (CI), bias-corrected concordance index (BCCI), calibration plots, the Grønnesby and Borgan test, Bayesian Information Criterion (BIC), generalized R2, Integrated Discrimination Improvement (IDI), and continuous Net Reclassification Index (cNRI) for individual risk factors and the three risk groups. Results: Three hundred and twenty-one patients were eligible for analysis. The modified-IMDC model with an NLR value of 3.6 and a PLR value of 157 was selected for comparison with the IMDC model. Both models were well calibrated. All other measures favoured the modified-IMDC model over the IMDC model (CI, 0.706 vs. 0.677; BCCI, 0.699 vs. 0.671; BIC, 2,176.2 vs. 2,190.7; generalized R2, 0.238 vs. 0.202; IDI, 0.044; cNRI, 0.279 for individual risk factors; and CI, 0.669 vs. 0.641; BCCI, 0.669 vs. 0.641; BIC, 2,183.2 vs. 2,198.1; generalized R2, 0.163 vs. 0.123; IDI, 0.045; cNRI, 0.165 for the three risk groups). Conclusion: Incorporation of NLR and PLR in place of neutrophil count and platelet count improved the prognostic accuracy of the IMDC model. These findings require external validation before introduction into clinical practice. PMID:28253564

  9. Comparison of plate counts, Petrifilm, dipslides, and adenosine triphosphate bioluminescence for monitoring bacteria in cooling-tower waters.

    PubMed

    Mueller, Sherry A; Anderson, James E; Kim, Byung R; Ball, James C

    2009-04-01

    Effective bacterial control in cooling-tower systems requires accurate and timely methods to count bacteria. Plate-count methods are difficult to implement on-site, because they are time- and labor-intensive and require sterile techniques. Several field-applicable methods (dipslides, Petrifilm, and adenosine triphosphate [ATP] bioluminescence) were compared with the plate count for two sample matrices--phosphate-buffered saline solution containing a pure culture of Pseudomonas fluorescens and cooling-tower water containing an undefined mixed bacterial culture. For the pure culture, (1) counts determined on nutrient agar and plate-count agar (PCA) media and expressed as colony-forming units (CFU) per milliliter were equivalent to those on R2A medium (p = 1.0 and p = 1.0, respectively); (2) Petrifilm counts were not significantly different from R2A plate counts (p = 0.99); (3) the dipslide counts were up to 2 log units higher than R2A plate counts, but this discrepancy was not statistically significant (p = 0.06); and (4) a discernable correlation (r2 = 0.67) existed between ATP readings and plate counts. 
For cooling-tower water samples (n = 62), (1) bacterial counts using R2A medium were higher (but not significant; p = 0.63) than nutrient agar and significantly higher than tryptone-glucose yeast extract (TGE; p = 0.03) and PCA (p < 0.001); (2) Petrifilm counts were significantly lower than nutrient agar or R2A (p = 0.02 and p < 0.001, respectively), but not statistically different from TGE, PCA, and dipslides (p = 0.55, p = 0.69, and p = 0.91, respectively); (3) the dipslide method yielded bacteria counts 1 to 3 log units lower than nutrient agar and R2A (p < 0.001), but was not significantly different from Petrifilm (p = 0.91), PCA (p = 1.00) or TGE (p = 0.07); (4) the differences between dipslides and the other methods became greater with a 6-day incubation time; and (5) the correlation between ATP readings and plate counts varied from system to system, was poor (r2 values ranged from < 0.01 to 0.47), and the ATP method was not sufficiently sensitive to measure counts below approximately 10(4) CFU/mL.

  10. Validation of an automated colony counting system for group A Streptococcus.

    PubMed

    Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R

    2016-02-08

    The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. 
This consistency was also observed over all phases of the growth cycle and when plated on blood agar following bactericidal assays. Agreement between these methods suggests that the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria, enabling a more efficient and accurate measurement of bacterial concentration in culture.
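
    The Bland-Altman analysis of percentage differences used in this validation can be sketched generically as follows; the function name, the 95% limits-of-agreement convention, and the example data are ours, with the ±10% acceptance cut-off mirroring the criterion described above:

    ```python
    from statistics import mean, stdev

    def bland_altman_pct(manual, automated):
        """Percentage-difference Bland-Altman summary for paired counts.

        Returns (bias, lower, upper): the mean percentage difference between
        methods and its approximate 95% limits of agreement (bias +/- 1.96 SD).
        """
        # Percentage difference of each pair, relative to the pair's mean.
        pct = [200.0 * (a - m) / (a + m) for m, a in zip(manual, automated)]
        bias = mean(pct)
        loa = 1.96 * stdev(pct)
        return bias, bias - loa, bias + loa
    ```

    Under an acceptance rule like the one above, a counting method would pass if the mean percentage difference stays within ±10% of the manual reference.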

  11. HgCdTe APD-based linear-mode photon counting components and ladar receivers

    NASA Astrophysics Data System (ADS)

    Jack, Michael; Wehner, Justin; Edwards, John; Chapman, George; Hall, Donald N. B.; Jacobson, Shane M.

    2011-05-01

    Linear mode photon counting (LMPC) provides significant advantages in comparison with Geiger mode (GM) photon counting, including the absence of after-pulsing, nanosecond pulse-to-pulse temporal resolution, and robust operation in the presence of high-density obscurants or variable-reflectivity objects. For this reason Raytheon has developed and previously reported on unique linear mode photon counting components and modules based on combining advanced APDs and advanced high gain circuits. By using HgCdTe APDs we enable Poisson-number-preserving photon counting. Key metrics of photon counting technology are dark count rate and detection probability. In this paper we report on a performance breakthrough resulting from improvements in design, process, and readout operation, enabling a >10x reduction in dark count rate, to ~10,000 cps, and a >10^4x reduction in surface dark current, enabling long 10 ms integration times. Our analysis of key dark current contributors suggests that substantial further reduction in DCR, to ~1/sec or less, can be achieved by optimizing wavelength, operating voltage, and temperature.

  12. The importance of independent chronology in integrating records of past climate change for the 60-8 ka INTIMATE time interval

    NASA Astrophysics Data System (ADS)

    Brauer, Achim; Hajdas, Irka; Blockley, Simon P. E.; Bronk Ramsey, Christopher; Christl, Marcus; Ivy-Ochs, Susan; Moseley, Gina E.; Nowaczyk, Norbert N.; Rasmussen, Sune O.; Roberts, Helen M.; Spötl, Christoph; Staff, Richard A.; Svensson, Anders

    2014-12-01

    This paper provides a brief overview of the most common dating techniques applied in palaeoclimate and palaeoenvironmental studies, including four radiometric and isotopic dating methods (radiocarbon, 230Th disequilibrium, luminescence, cosmogenic nuclides) and two incremental methods based on layer counting (ice layers, varves). For each method, concise background information about the fundamental principles and methodological approaches is provided. We concentrate on the time interval of focus for the INTIMATE (Integrating Ice core, MArine and TErrestrial records) community (60-8 ka). This dating guide addresses palaeoclimatologists who aim to interpret their often regional and local proxy time series in a wider spatial context and, therefore, have to rely on correlation with proxy records obtained from different archives in various regions. For this reason, we especially emphasise scientific approaches for harmonising chronologies for sophisticated and robust proxy data integration. In this respect, up-to-date age modelling techniques are presented, as well as tools for linking records by age equivalence, including tephrochronology, cosmogenic 10Be and palaeomagnetic variations. Finally, to avoid inadequate documentation of chronologies and assure reliable correlation of proxy time series, this paper provides recommendations for minimum standards of uncertainty and age datum reporting.

  13. 29 CFR 2590.701-5 - Evidence of creditable coverage.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... under paragraph (b)(2) of this section (relating to the alternative method of counting creditable... benefits described in § 2590.701-4(c) (relating to the alternative method of counting creditable coverage... using the alternative method of counting creditable coverage—(1) In general. After an individual...

  14. Determination of confidence limits for experiments with low numbers of counts. [Poisson-distributed photon counts from astrophysical sources

    NASA Technical Reports Server (NTRS)

    Kraft, Ralph P.; Burrows, David N.; Nousek, John A.

    1991-01-01

    Two different methods, classical and Bayesian, for determining confidence intervals involving Poisson-distributed data are compared. Particular consideration is given to cases where the number of counts observed is small and is comparable to the mean number of background counts. Reasons for preferring the Bayesian over the classical method are given. Tables of confidence limits calculated by the Bayesian method are provided for quick reference.
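The Bayesian construction referenced above can be illustrated numerically: for N observed counts and a known mean background B, the posterior for the source intensity S is proportional to the Poisson likelihood e^(-(S+B))(S+B)^N/N!, normalized over S ≥ 0, and the upper limit is the S at which the cumulative posterior reaches the desired confidence. A stdlib-only sketch using grid integration (an approximation in the spirit of the method, not the paper's tabulation code):

```python
import math

def bayes_poisson_upper_limit(N, B, cl=0.90, ds=1e-3, s_max=50.0):
    """Bayesian upper limit on source counts S, given N observed
    counts and known mean background B (flat prior on S >= 0).
    Simple grid integration; illustrative sketch only."""
    # Normalization: the integral of the unnormalized posterior equals
    # the cumulative Poisson probability of <= N counts from background.
    norm = sum(math.exp(-B) * B**n / math.factorial(n) for n in range(N + 1))
    cum, s = 0.0, 0.0
    while s < s_max:
        f = math.exp(-(s + B)) * (s + B) ** N / math.factorial(N) / norm
        cum += f * ds
        if cum >= cl:
            return s
        s += ds
    return s_max

# 3 counts observed over an expected background of 1:
print(f"90% upper limit: {bayes_poisson_upper_limit(3, 1.0):.2f}")
```

With B = 0 and N = 0 the limit reduces to the familiar -ln(1 - cl) ≈ 2.30 for 90% confidence.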

  15. Perfect count: a novel approach for the single platform enumeration of absolute CD4+ T-lymphocytes.

    PubMed

    Storie, Ian; Sawle, Alex; Goodfellow, Karen; Whitby, Liam; Granger, Vivian; Ward, Rosalie Y; Peel, Janet; Smart, Theresa; Reilly, John T; Barnett, David

    2004-01-01

    The derivation of reliable CD4(+) T lymphocyte counts is vital for the monitoring of disease progression and therapeutic effectiveness in HIV(+) individuals. Flow cytometry has emerged as the method of choice for CD4(+) T lymphocyte enumeration, with single-platform technology, coupled with reference counting beads, fast becoming the "gold standard." However, although single-platform, bead-based sample acquisition requires the ratio of beads to cells to remain unchanged, until recently there was no available method to monitor this. Perfect Count beads were developed to address this issue, incorporating two bead populations of different densities to allow the detection of inadequate mixing. Comparison of the relative proportions of both beads with the manufacturer's defined limits enables an internal QC check during sample acquisition. In this study, we have compared CD4(+) T lymphocyte counts, obtained from 104 HIV(+) patients, using TruCount beads with MultiSet software (defined as the predicate method) and the new Perfect Count beads, incorporating an in-house sequential gating strategy. We have demonstrated an excellent degree of correlation between the predicate method and the Perfect Count system (r(2) = 0.9955; Bland-Altman bias +27 CD4(+) T lymphocytes/microl). The Perfect Count system is a robust method for performing single platform absolute counts and has the added advantage of having internal QC checks. Such an approach enables the operator to identify potential problems during sample preparation, acquisition and analysis. Copyright 2003 Wiley-Liss, Inc.
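The single-platform bead arithmetic underlying such systems is straightforward: the absolute count scales the cell-to-bead event ratio by the known bead concentration in the tube. A hypothetical illustration (the bead and event numbers are invented, not TruCount or Perfect Count specifications):

```python
def absolute_count(cell_events, bead_events, beads_per_tube, volume_ul):
    """Single-platform absolute count (cells/µl): the ratio of cell
    events to bead events acquired, times the known number of beads
    per tube, divided by the sample volume. Illustrative only."""
    return cell_events / bead_events * beads_per_tube / volume_ul

# 5,000 CD4+ events, 10,000 bead events, 50,000 beads in 100 µl:
print(absolute_count(5000, 10000, 50000, 100), "cells/µl")
```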

  16. ddClone: joint statistical inference of clonal populations from single cell and bulk tumour sequencing data.

    PubMed

    Salehi, Sohrab; Steif, Adi; Roth, Andrew; Aparicio, Samuel; Bouchard-Côté, Alexandre; Shah, Sohrab P

    2017-03-01

    Next-generation sequencing (NGS) of bulk tumour tissue can identify constituent cell populations in cancers and measure their abundance. This requires computational deconvolution of allelic counts from somatic mutations, which may be incapable of fully resolving the underlying population structure. Single cell sequencing (SCS) is a more direct method, although its replacement of NGS is impeded by technical noise and sampling limitations. We propose ddClone, which analytically integrates NGS and SCS data, leveraging their complementary attributes through joint statistical inference. We show on real and simulated datasets that ddClone produces more accurate results than can be achieved by either method alone.

  17. A new approach for the estimation of phytoplankton cell counts associated with algal blooms.

    PubMed

    Nazeer, Majid; Wong, Man Sing; Nichol, Janet Elizabeth

    2017-07-15

    This study proposes a method for estimating phytoplankton cell counts associated with an algal bloom, using satellite images coincident with in situ and meteorological parameters. Satellite images from Landsat Thematic Mapper (TM), Enhanced Thematic Mapper Plus (ETM+), Operational Land Imager (OLI) and HJ-1 A/B Charge Couple Device (CCD) sensors were integrated with the meteorological observations to provide an estimate of phytoplankton cell counts. All images were atmospherically corrected using the Second Simulation of the Satellite Signal in the Solar Spectrum (6S) atmospheric correction method with a possible error of 1.2%, 2.6%, 1.4% and 2.3% for blue (450-520 nm), green (520-600 nm), red (630-690 nm) and near infrared (NIR, 760-900 nm) wavelengths, respectively. Results showed that the developed Artificial Neural Network (ANN) model yields a correlation coefficient (R) of 0.95 with the in situ validation data with Sum of Squared Error (SSE) of 0.34 cells/ml, Mean Relative Error (MRE) of 0.154 cells/ml and a bias of -504.87. The integration of the meteorological parameters with remote sensing observations provided a promising estimation of the algal scum as compared to previous studies. The applicability of the ANN model was tested over Hong Kong as well as over Lake Kasumigaura, Japan and Lake Okeechobee, Florida USA, where algal blooms were also reported. Further, a 40-year (1975-2014) red tide occurrence map was developed and revealed that the eastern and southern waters of Hong Kong are more vulnerable to red tides. Over the 40 years, 66% of red tide incidents were associated with the Dinoflagellates group, while the remainder were associated with the Diatom group (14%) and several other minor groups (20%). The developed technology can be applied to other similar environments in an efficient and cost-saving manner. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Multiplexed Detection of Cytokines Based on Dual Bar-Code Strategy and Single-Molecule Counting.

    PubMed

    Li, Wei; Jiang, Wei; Dai, Shuang; Wang, Lei

    2016-02-02

    Cytokines play important roles in the immune system and have been regarded as biomarkers. Because a single cytokine is not specific and accurate enough to meet strict diagnostic requirements in practice, in this work we constructed a multiplexed detection method for cytokines based on a dual bar-code strategy and single-molecule counting. Taking interferon-γ (IFN-γ) and tumor necrosis factor-α (TNF-α) as model analytes, first, the magnetic nanobead was functionalized with the second antibody and primary bar-code strands, forming a magnetic nanoprobe. Then, through the specific reaction between the second antibody and the antigen fixed by the primary antibody, a sandwich-type immunocomplex was formed on the substrate. Next, the primary bar-code strands as amplification units triggered a multibranched hybridization chain reaction (mHCR), producing nicked double-stranded polymers with multiple branched arms, which served as secondary bar-code strands. Finally, the secondary bar-code strands hybridized with the multimolecule-labeled fluorescence probes, generating enhanced fluorescence signals. The numbers of fluorescence dots were counted one by one for quantification with an epi-fluorescence microscope. By integrating the primary and secondary bar-code-based amplification strategy and the multimolecule-labeled fluorescence probes, this method displayed an excellent sensitivity, with detection limits of 5 fM for both targets. Unlike the typical bar-code assay, in which the bar-code strands must be released and identified on a microarray, this method is more direct. Moreover, because of the selective immune reaction and the dual bar-code mechanism, the resulting method could detect the two targets simultaneously. Multiplexed analysis in human serum was also performed, suggesting that our strategy is reliable and has great potential application in early clinical diagnosis.

  19. 21 CFR 1210.16 - Method of bacterial count.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... FEDERAL IMPORT MILK ACT Inspection and Testing § 1210.16 Method of bacterial count. The bacterial count of milk and cream refers to the number of viable bacteria as determined by the standard plate method of...

  20. 21 CFR 1210.16 - Method of bacterial count.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED... FEDERAL IMPORT MILK ACT Inspection and Testing § 1210.16 Method of bacterial count. The bacterial count of milk and cream refers to the number of viable bacteria as determined by the standard plate method of...

  1. 45 CFR 146.115 - Certification and disclosure of previous coverage.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... to the alternative method of counting creditable coverage). Moreover, if the individual's coverage... benefits described in § 146.113(c) (relating to the alternative method of counting creditable coverage... of coverage to a plan or issuer using the alternative method of counting creditable coverage—(1) In...

  2. 26 CFR 54.9801-5 - Evidence of creditable coverage.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... under paragraph (b)(2) of this section (relating to the alternative method of counting creditable... in § 54.9801-4(c) (relating to the alternative method of counting creditable coverage). However, if... Act. (b) Disclosure of coverage to a plan or issuer using the alternative method of counting...

  3. Comparison of culture and qPCR methods in detection of mycobacteria from drinking waters.

    PubMed

    Räsänen, Noora H J; Rintala, Helena; Miettinen, Ilkka T; Torvinen, Eila

    2013-04-01

    Environmental mycobacteria are common bacteria in man-made water systems and may cause infections and hypersensitivity pneumonitis via exposure to water. We compared a generally used cultivation method and a quantitative polymerase chain reaction (qPCR) method to detect mycobacteria in 3 types of drinking waters: surface water, ozone-treated surface water, and groundwater. There was a correlation between the numbers of mycobacteria obtained by cultivation and qPCR methods, but the ratio of the counts obtained by the 2 methods varied among the types of water. The qPCR counts in the drinking waters produced from surface or groundwater were 5 to 34 times higher than culturable counts. In ozone-treated surface waters, both methods gave similar counts. The ozone-treated drinking waters had the highest concentration of assimilable organic carbon, which may explain the good culturability. In warm tap waters, qPCR gave 43 times higher counts than cultivation, but both qPCR counts and culturable counts were lower than those in the drinking waters collected from the same sites. The TaqMan qPCR method is a rapid and sensitive tool for total quantitation of mycobacteria in different types of clean waters. The raw water source and treatments affect both culturability and total numbers of mycobacteria in drinking waters.

  4. Hybrid statistics-simulations based method for atom-counting from ADF STEM images.

    PubMed

    De Wael, Annelies; De Backer, Annick; Jones, Lewys; Nellist, Peter D; Van Aert, Sandra

    2017-06-01

    A hybrid statistics-simulations based method for atom-counting from annular dark field scanning transmission electron microscopy (ADF STEM) images of monotype crystalline nanostructures is presented. Different atom-counting methods already exist for model-like systems. However, the increasing relevance of radiation damage in the study of nanostructures demands a method that allows atom-counting from low dose images with a low signal-to-noise ratio. Therefore, the hybrid method directly includes prior knowledge from image simulations into the existing statistics-based method for atom-counting, and accounts in this manner for possible discrepancies between actual and simulated experimental conditions. It is shown by means of simulations and experiments that this hybrid method outperforms the statistics-based method, especially for low electron doses and small nanoparticles. The analysis of a simulated low dose image of a small nanoparticle suggests that this method allows for far more reliable quantitative analysis of beam-sensitive materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Silicon Quantum Dots with Counted Antimony Donor Implants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, Meenakshi; Pacheco, Jose L.; Perry, Daniel Lee

    2015-10-01

    Deterministic control over the location and number of donors is crucial to donor spin quantum bits (qubits) in semiconductor-based quantum computing. A focused ion beam is used to implant antimony donors close to quantum dots. Ion detectors are integrated next to the quantum dots to sense the implants. The number of ions implanted can be counted to a precision of a single ion. Regular Coulomb blockade is observed from the quantum dots. Charge offsets, indicative of donor ionization, are observed in devices with counted implants.

  6. Some analytical and numerical approaches to understanding trap counts resulting from pest insect immigration.

    PubMed

    Bearup, Daniel; Petrovskaya, Natalia; Petrovskii, Sergei

    2015-05-01

    Monitoring of pest insects is an important part of integrated pest management. It aims to provide information about pest insect abundance at a given location. This includes data collection, usually using traps, and their subsequent analysis and/or interpretation. However, interpretation of trap counts (the number of insects caught over a fixed time) remains a challenging problem. First, an increase in either the population density or insect activity can result in a similar increase in the number of insects trapped (the so-called "activity-density" problem). Second, a genuine increase of the local population density can be attributed to qualitatively different ecological mechanisms such as multiplication or immigration. Identification of the true factor causing an increase in trap counts is important, as different mechanisms require different control strategies. In this paper, we consider a mean-field mathematical model of insect trapping based on the diffusion equation. Although the diffusion equation is a well-studied model, its analytical solution in closed form is available only for a few special cases, whilst in the more general case the problem has to be solved numerically. We choose finite differences as the baseline numerical method and show that numerical solution of the problem, especially in the realistic 2D case, is not at all straightforward, as it requires a sufficiently accurate approximation of the diffusion fluxes. Once the numerical method is justified and tested, we apply it to the corresponding boundary problem, where different types of boundary forcing describe different scenarios of pest insect immigration, and reveal the corresponding patterns in the trap count growth. Copyright © 2015 Elsevier Inc. All rights reserved.
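As a toy illustration of the modelling approach described above (a 1D sketch, not the paper's 2D scheme), an explicit finite-difference solution of the diffusion equation with an absorbing boundary playing the role of the trap accumulates the boundary flux as the trap count. Parameters are invented:

```python
# u_t = D * u_xx on [0, L]; absorbing trap at x = 0, reflecting far wall.
D, L, nx = 1.0, 10.0, 201
dx = L / (nx - 1)
dt = 0.4 * dx * dx / D          # explicit stability: dt <= dx^2 / (2D)
u = [1.0] * nx                  # initially uniform insect density
u[0] = 0.0                      # trap keeps the boundary empty
trap_count = 0.0
for _ in range(2000):
    new = u[:]
    for i in range(1, nx - 1):
        new[i] = u[i] + D * dt / dx**2 * (u[i+1] - 2*u[i] + u[i-1])
    new[0] = 0.0                # absorbing (trap) boundary
    new[-1] = new[-2]           # zero-flux far boundary
    # Flux into the trap, approximated by D * u[1] / dx, integrated in time.
    trap_count += D * new[1] / dx * dt
    u = new
print(f"accumulated trap count: {trap_count:.3f}")
```

For this setup the semi-infinite analytic result, 2·sqrt(DT/π) ≈ 1.6 at T = 2, gives a rough check; a too-crude flux approximation is exactly the pitfall the abstract warns about.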

  7. Three-dimensional integral imaging displays using a quick-response encoded elemental image array: an overview

    NASA Astrophysics Data System (ADS)

    Markman, A.; Javidi, B.

    2016-06-01

    Quick-response (QR) codes are barcodes that can store information such as numeric data and hyperlinks. The QR code can be scanned using a QR code reader, such as those built into smartphone devices, revealing the information stored in the code. Moreover, the QR code is robust to noise, rotation, and illumination when scanning due to the error correction built into the QR code design. Integral imaging is an imaging technique used to generate a three-dimensional (3D) scene by combining the information from two-dimensional (2D) elemental images (EIs), each with a different perspective of a scene. Transferring these 2D images in a secure manner can be difficult. In this work, we overview two methods to store and encrypt EIs in multiple QR codes. The first method uses run-length encoding with Huffman coding and the double-random-phase encryption (DRPE) to compress and encrypt an EI. This information is then stored in a QR code. An alternative compression scheme is to perform photon-counting on the EI prior to compression. Photon-counting is a non-linear transformation of data that creates redundant information, thus improving image compression. The compressed data is encrypted using the DRPE. Once information is stored in the QR codes, it is scanned using a smartphone device. The scanned information is decompressed and decrypted and an EI is recovered. Once all EIs have been recovered, a 3D optical reconstruction is generated.

  8. Aerial population estimates of wild horses (Equus caballus) in the adobe town and salt wells creek herd management areas using an integrated simultaneous double-count and sightability bias correction technique

    USGS Publications Warehouse

    Lubow, Bruce C.; Ransom, Jason I.

    2007-01-01

    An aerial survey technique combining simultaneous double-count and sightability bias correction methodologies was used to estimate the population of wild horses inhabiting Adobe Town and Salt Wells Creek Herd Management Areas, Wyoming. Based on 5 surveys over 4 years, we conclude that the technique produced estimates consistent with the known number of horses removed between surveys and an annual population growth rate of 16.2 percent per year. Therefore, evidence from this series of surveys supports the validity of this survey method. Our results also indicate that the ability of aerial observers to see horse groups is very strongly dependent on skill of the individual observer, size of the horse group, and vegetation cover. It is also more modestly dependent on the ruggedness of the terrain and the position of the sun relative to the observer. We further conclude that censuses, or uncorrected raw counts, are inadequate estimates of population size for this herd. Such uncorrected counts were all undercounts in our trials, and varied in magnitude from year to year and observer to observer. As of April 2007, we estimate that the population of the Adobe Town/Salt Wells Creek complex is 906 horses with a 95 percent confidence interval ranging from 857 to 981 horses.
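The simultaneous double-count idea can be illustrated with the simplest two-observer abundance estimator, Chapman's bias-corrected form of the Lincoln-Petersen index; the actual survey additionally modelled observer skill, group size, and cover as sightability covariates. The numbers below are hypothetical:

```python
def chapman_double_count(n1, n2, both):
    """Two-observer (simultaneous double-count) abundance estimate,
    Chapman's bias-corrected form. n1, n2: groups seen by each
    observer; both: groups seen by both observers."""
    return (n1 + 1) * (n2 + 1) / (both + 1) - 1

# Observer A saw 60 horse groups, observer B saw 55, 45 seen by both:
est = chapman_double_count(60, 55, 45)
print(f"estimated groups present: {est:.1f}")
```

When both observers see every group (n1 = n2 = both), the estimate collapses to the raw count, as expected.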

  9. Integrated WiFi/PDR/Smartphone Using an Unscented Kalman Filter Algorithm for 3D Indoor Localization.

    PubMed

    Chen, Guoliang; Meng, Xiaolin; Wang, Yunjia; Zhang, Yanzhe; Tian, Peng; Yang, Huachao

    2015-09-23

    Because of the high calculation cost and poor performance of a traditional planar map when dealing with complicated indoor geographic information, a WiFi fingerprint indoor positioning system cannot be widely employed on a smartphone platform. By making full use of the hardware sensors embedded in the smartphone, this study proposes an integrated approach to a three-dimensional (3D) indoor positioning system. First, an improved K-means clustering method is adopted to reduce the fingerprint database retrieval time and enhance positioning efficiency. Next, with the mobile phone's acceleration sensor, a new step counting method based on auto-correlation analysis is proposed to achieve cell phone inertial navigation positioning. Furthermore, the integration of WiFi positioning with Pedestrian Dead Reckoning (PDR) obtains higher positional accuracy with the help of the Unscented Kalman Filter algorithm. Finally, a hybrid 3D positioning system based on Unity 3D, which can carry out real-time positioning for targets in 3D scenes, is designed for the fluent operation of mobile terminals.
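The auto-correlation step-counting idea can be sketched on a synthetic accelerometer trace: the lag that maximizes the auto-correlation within a plausible step-period window gives the walking cadence. This is an illustrative reconstruction, not the paper's pipeline; the sampling rate and signal model are assumed.

```python
import math

fs = 50                        # Hz, assumed phone sampling rate
step_hz = 2.0                  # cadence of the synthetic walker
n = fs * 10                    # 10 s of data
# Synthetic acceleration magnitude: step oscillation plus slow drift.
sig = [math.sin(2 * math.pi * step_hz * t / fs) + 0.1 * math.sin(0.5 * t)
       for t in range(n)]

def autocorr(x, lag):
    """Normalized sample auto-correlation of x at the given lag."""
    m = sum(x) / len(x)
    num = sum((x[i] - m) * (x[i + lag] - m) for i in range(len(x) - lag))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# Search step periods between 0.3 s and 1.0 s for the strongest lag.
lags = range(int(0.3 * fs), int(1.0 * fs))
best = max(lags, key=lambda k: autocorr(sig, k))
print(f"cadence: {fs / best:.2f} Hz, steps in window: {n / best:.0f}")
```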

  10. Integrated WiFi/PDR/Smartphone Using an Unscented Kalman Filter Algorithm for 3D Indoor Localization

    PubMed Central

    Chen, Guoliang; Meng, Xiaolin; Wang, Yunjia; Zhang, Yanzhe; Tian, Peng; Yang, Huachao

    2015-01-01

    Because of the high calculation cost and poor performance of a traditional planar map when dealing with complicated indoor geographic information, a WiFi fingerprint indoor positioning system cannot be widely employed on a smartphone platform. By making full use of the hardware sensors embedded in the smartphone, this study proposes an integrated approach to a three-dimensional (3D) indoor positioning system. First, an improved K-means clustering method is adopted to reduce the fingerprint database retrieval time and enhance positioning efficiency. Next, with the mobile phone’s acceleration sensor, a new step counting method based on auto-correlation analysis is proposed to achieve cell phone inertial navigation positioning. Furthermore, the integration of WiFi positioning with Pedestrian Dead Reckoning (PDR) obtains higher positional accuracy with the help of the Unscented Kalman Filter algorithm. Finally, a hybrid 3D positioning system based on Unity 3D, which can carry out real-time positioning for targets in 3D scenes, is designed for the fluent operation of mobile terminals. PMID:26404314

  11. Photon counting, censor corrections, and lifetime imaging for improved detection in two-photon microscopy

    PubMed Central

    Driscoll, Jonathan D.; Shih, Andy Y.; Iyengar, Satish; Field, Jeffrey J.; White, G. Allen; Squier, Jeffrey A.; Cauwenberghs, Gert

    2011-01-01

    We present a high-speed photon counter for use with two-photon microscopy. Counting pulses of photocurrent, as opposed to analog integration, maximizes the signal-to-noise ratio so long as the uncertainty in the count does not exceed the gain-noise of the photodetector. Our system extends this improvement through an estimate of the count that corrects for the censored period after detection of an emission event. The same system can be rapidly reconfigured in software for fluorescence lifetime imaging, which we illustrate by distinguishing between two spectrally similar fluorophores in an in vivo model of microstroke. PMID:21471395
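The censor correction mentioned above has a simple non-paralyzable dead-time analogue: each detected event blinds the counter for a period τ, so the true rate can be estimated from the measured rate m as m/(1 − mτ). A sketch with invented numbers (the paper's estimator is more refined than this textbook formula):

```python
def censor_corrected_rate(counts, T, tau):
    """Dead-time (censored-period) correction for a non-paralyzable
    counter: events arriving during the censored period tau after a
    detection are missed, so the measured rate underestimates the
    true rate. Returns the corrected rate estimate."""
    m = counts / T                 # measured rate, events/s
    return m / (1.0 - m * tau)     # corrected (true) rate estimate

# 8e6 counts in 1 s with a 100 ns censored period after each event:
print(f"corrected rate: {censor_corrected_rate(8e6, 1.0, 100e-9):.3e} /s")
```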

  12. Transforming the Classroom. Technology Counts, 2016. Education Week. Volume 35, Issue 35

    ERIC Educational Resources Information Center

    Edwards, Virginia B., Ed.

    2016-01-01

    The 2016 edition of "Education Week's" long-running "Technology Counts" report combines in-depth reporting and insight from an original national survey to reveal teachers' confidence levels in ed tech, how teachers approach integrating technology into the classroom, and decision-making behind tech products. Contents include:…

  13. A compact 7-cell Si-drift detector module for high-count rate X-ray spectroscopy.

    PubMed

    Hansen, K; Reckleben, C; Diehl, I; Klär, H

    2008-05-01

    A new Si-drift detector module for fast X-ray spectroscopy experiments was developed and realized. The Peltier-cooled module comprises a sensor with 7 × 7 mm² active area, an integrated circuit for amplification, shaping and detection, storage, and derandomized readout of signal pulses in parallel, and amplifiers for line driving. The compactness and hexagonal shape of the module, with a wrench size of 16 mm, allow very short distances to the specimen and multi-module arrangements. The power dissipation is 186 mW. At a shaper peaking time of 190 ns and an integration time of 450 ns, an electronic rms noise of ~11 electrons was achieved. When operated at 7 °C, FWHM line widths around 260 and 460 eV (Cu-Kα) were obtained at low rates and at sum-count rates of 1.7 MHz, respectively. The peak shift is below 1% for a broad range of count rates. At 1.7 MHz sum-count rate the throughput loss amounts to 30%.

  14. Comparison of Drive Counts and Mark-Resight As Methods of Population Size Estimation of Highly Dense Sika Deer (Cervus nippon) Populations.

    PubMed

    Takeshita, Kazutaka; Ikeda, Takashi; Takahashi, Hiroshi; Yoshida, Tsuyoshi; Igota, Hiromasa; Matsuura, Yukiko; Kaji, Koichi

    2016-01-01

    Assessing temporal changes in abundance indices is an important issue in the management of large herbivore populations. The drive counts method has been frequently used as a deer abundance index in mountainous regions. However, despite an inherent risk for observation errors in drive counts, which increase with deer density, evaluations of the utility of drive counts at a high deer density remain scarce. We compared the drive counts and mark-resight (MR) methods in the evaluation of a highly dense sika deer population (MR estimates ranged between 11 and 53 individuals/km²) on Nakanoshima Island, Hokkaido, Japan, between 1999 and 2006. This deer population experienced two large reductions in density; approximately 200 animals in total were taken from the population through a large-scale population removal and a separate winter mass mortality event. Although the drive counts tracked temporal changes in deer abundance on the island, they overestimated the counts for all years in comparison to the MR method. Increased overestimation in drive count estimates after the winter mass mortality event may be due to a double count derived from increased deer movement and recovery of body condition secondary to the mitigation of density-dependent food limitations. Drive counts are unreliable because they are affected by unfavorable factors such as bad weather, and they are cost-prohibitive to repeat, which precludes the calculation of confidence intervals. Therefore, the use of drive counts to infer the deer abundance needs to be reconsidered.

  15. Counting pollen grains using readily available, free image processing and analysis software.

    PubMed

    Costa, Clayton M; Yang, Suann

    2009-10-01

    Although many methods exist for quantifying the number of pollen grains in a sample, there are few standard methods that are user-friendly, inexpensive and reliable. The present contribution describes a new method of counting pollen using readily available, free image processing and analysis software. Pollen was collected from anthers of two species, Carduus acanthoides and C. nutans (Asteraceae), then illuminated on slides and digitally photographed through a stereomicroscope. Using ImageJ (NIH), these digital images were processed to remove noise and sharpen individual pollen grains, then analysed to obtain a reliable total count of the number of grains present in the image. A macro was developed to analyse multiple images together. To assess the accuracy and consistency of pollen counting by ImageJ analysis, counts were compared with those made by the human eye. Image analysis produced pollen counts in 60 s or less per image, considerably faster than counting with the human eye (5-68 min). In addition, counts produced with the ImageJ procedure were similar to those obtained by eye. Because count parameters are adjustable, this image analysis protocol may be used for many other plant species. Thus, the method provides a quick, inexpensive and reliable solution to counting pollen from digital images, not only reducing the chance of error but also substantially lowering labour requirements.
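The core counting operation, labelling connected foreground regions after thresholding and noise removal, can be sketched in pure Python on a toy binary image (ImageJ was the tool actually used in the study; this only illustrates the principle its particle analysis relies on):

```python
def count_particles(img):
    """Count 4-connected foreground blobs in a binary image
    (list of rows of 0/1), via iterative flood fill."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    count = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] and not seen[y][x]:
                count += 1                 # new blob found
                stack = [(y, x)]           # flood-fill the whole blob
                while stack:
                    cy, cx = stack.pop()
                    if (0 <= cy < h and 0 <= cx < w
                            and img[cy][cx] and not seen[cy][cx]):
                        seen[cy][cx] = True
                        stack += [(cy+1, cx), (cy-1, cx),
                                  (cy, cx+1), (cy, cx-1)]
    return count

img = [[0, 1, 1, 0, 0, 0],
       [0, 1, 0, 0, 1, 0],
       [0, 0, 0, 0, 1, 1],
       [1, 0, 0, 0, 0, 0]]
print(count_particles(img))  # three separate "grains" -> 3
```

Touching grains are the hard case in practice, which is why the protocol's preprocessing (noise removal, sharpening) and adjustable count parameters matter.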

  16. Does the covariance structure matter in longitudinal modelling for the prediction of future CD4 counts?

    PubMed

    Taylor, J M; Law, N

    1998-10-30

    We investigate the importance of the assumed covariance structure for longitudinal modelling of CD4 counts. We examine how individual predictions of future CD4 counts are affected by the covariance structure. We consider four covariance structures: one based on an integrated Ornstein-Uhlenbeck stochastic process; one based on Brownian motion, and two derived from standard linear and quadratic random-effects models. Using data from the Multicenter AIDS Cohort Study and from a simulation study, we show that there is a noticeable deterioration in the coverage rate of confidence intervals if we assume the wrong covariance. There is also a loss in efficiency. The quadratic random-effects model is found to be the best in terms of correctly calibrated prediction intervals, but is substantially less efficient than the others. Incorrectly specifying the covariance structure as linear random effects gives too narrow prediction intervals with poor coverage rates. Fitting using the model based on the integrated Ornstein-Uhlenbeck stochastic process is the preferred one of the four considered because of its efficiency and robustness properties. We also use the difference between the future predicted and observed CD4 counts to assess an appropriate transformation of CD4 counts; a fourth root, cube root and square root all appear reasonable choices.

  17. Refrigerator storage of expressed human milk in the neonatal intensive care unit.

    PubMed

    Slutzah, Meredith; Codipilly, Champa N; Potak, Debra; Clark, Richard M; Schanler, Richard J

    2010-01-01

    To provide recommendations for refrigerator storage of human milk, the overall integrity (bacterial growth, cell counts, and component concentrations) of milk was examined during 96 hours of storage at 4 degrees C. Fresh milk samples (n = 36) were divided and stored at 4 degrees C for 0, 24, 48, 72, and 96 hours. At each time, pH, white cell count, and osmolality were measured and additional samples were stored at -80 degrees C until analyzed for bacteria and concentrations of lactoferrin, secretory (s)IgA, fat, fatty acids, and protein. There were no significant changes for osmolality, total and Gram-negative bacterial colony counts or concentrations of sIgA, lactoferrin, and fat. Gram-positive colony counts (2.9 to 1.6 x 10(5) colony-forming units per mL), pH (7.21 to 6.68), white blood cell counts (2.31 to 1.85 x 10(6) cells per mL), and total protein (17.5 to 16.7 g/L) declined, and free fatty acid concentrations increased (0.35 to 1.28 g/L) as storage duration increased, P < .001. Changes were minimal and the overall integrity of milk during refrigerator storage was preserved. Fresh mother's milk may be stored at refrigerator temperature for as long as 96 hours.

  18. Fractal analysis of mandibular trabecular bone: optimal tile sizes for the tile counting method

    PubMed Central

    Huh, Kyung-Hoe; Baik, Jee-Seon; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Lee, Sun-Bok; Lee, Seung-Pyo

    2011-01-01

    Purpose This study was performed to determine the optimal tile size for the fractal dimension of the mandibular trabecular bone using a tile counting method. Materials and Methods Digital intraoral radiographic images were obtained at the mandibular angle, molar, premolar, and incisor regions of 29 human dry mandibles. After preprocessing, the parameters representing morphometric characteristics of the trabecular bone were calculated. The fractal dimensions of the processed images were analyzed in various tile sizes by the tile counting method. Results The optimal range of tile size was 0.132 mm to 0.396 mm for the fractal dimension using the tile counting method. The sizes were closely related to the morphometric parameters. Conclusion The fractal dimension of mandibular trabecular bone, as calculated with the tile counting method, can be best characterized with a range of tile sizes from 0.132 to 0.396 mm. PMID:21977478
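    The tile counting method is a box-counting variant: the image is partitioned into tiles of increasing size, and the fractal dimension is the negative slope of occupied-tile count against tile size on log-log axes. A minimal sketch for a binary image (the pixel grid and tile sizes in pixels are illustrative; the study reports optimal sizes in millimetres on preprocessed radiographs):

```python
import numpy as np

def box_count_dimension(img, sizes):
    # img: 2D boolean array (trabecular structure = True);
    # sizes: tile edge lengths in pixels.
    counts = []
    for s in sizes:
        h = (img.shape[0] // s) * s
        w = (img.shape[1] // s) * s
        # Partition the cropped image into s-by-s tiles and count
        # tiles containing any structure.
        blocks = img[:h, :w].reshape(h // s, s, w // s, s)
        counts.append(blocks.any(axis=(1, 3)).sum())
    # Fractal dimension = -slope of log(count) vs. log(size).
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope
```

    A completely filled image should recover a dimension of 2; trabecular patterns fall between 1 and 2, and the choice of tile-size range (the subject of this study) determines how stable the fitted slope is.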

  19. Can reliable sage-grouse lek counts be obtained using aerial infrared technology

    USGS Publications Warehouse

    Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.

    2013-01-01

    More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-winged aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative technique to conventional ground-based methods, but further research is needed. We discuss multiple advantages to aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that releases time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.

  20. A study of pile-up in integrated time-correlated single photon counting systems

    NASA Astrophysics Data System (ADS)

    Arlt, Jochen; Tyndall, David; Rae, Bruce R.; Li, David D.-U.; Richardson, Justin A.; Henderson, Robert K.

    2013-10-01

    Recent demonstration of highly integrated, solid-state, time-correlated single photon counting (TCSPC) systems in CMOS technology is set to provide significant increases in performance over existing bulky, expensive hardware. Arrays of single photon avalanche diode (SPAD) detectors, timing channels, and signal processing can be integrated on a single silicon chip with a degree of parallelism and computational speed that is unattainable by discrete photomultiplier tube and photon counting card solutions. New multi-channel, multi-detector TCSPC sensor architectures with greatly enhanced throughput due to minimal detector transit (dead) time or timing channel dead time are now feasible. In this paper, we study the potential for future integrated, solid-state TCSPC sensors to exceed the photon pile-up limit through analytic formulas and simulation. The results are validated using a 10% fill factor SPAD array and an 8-channel, 52 ps resolution time-to-digital conversion architecture with embedded lifetime estimation. It is demonstrated that pile-up insensitive acquisition is attainable at greater than 10 times the pulse repetition rate, providing over 60 dB of extended dynamic range to the TCSPC technique. Our results predict future CMOS TCSPC sensors capable of live-cell transient observations in confocal scanning microscopy, improved resolution of near-infrared optical tomography systems, and fluorescence lifetime activated cell sorting.

  1. A study of pile-up in integrated time-correlated single photon counting systems.

    PubMed

    Arlt, Jochen; Tyndall, David; Rae, Bruce R; Li, David D-U; Richardson, Justin A; Henderson, Robert K

    2013-10-01

    Recent demonstration of highly integrated, solid-state, time-correlated single photon counting (TCSPC) systems in CMOS technology is set to provide significant increases in performance over existing bulky, expensive hardware. Arrays of single photon avalanche diode (SPAD) detectors, timing channels, and signal processing can be integrated on a single silicon chip with a degree of parallelism and computational speed that is unattainable by discrete photomultiplier tube and photon counting card solutions. New multi-channel, multi-detector TCSPC sensor architectures with greatly enhanced throughput due to minimal detector transit (dead) time or timing channel dead time are now feasible. In this paper, we study the potential for future integrated, solid-state TCSPC sensors to exceed the photon pile-up limit through analytic formulas and simulation. The results are validated using a 10% fill factor SPAD array and an 8-channel, 52 ps resolution time-to-digital conversion architecture with embedded lifetime estimation. It is demonstrated that pile-up insensitive acquisition is attainable at greater than 10 times the pulse repetition rate, providing over 60 dB of extended dynamic range to the TCSPC technique. Our results predict future CMOS TCSPC sensors capable of live-cell transient observations in confocal scanning microscopy, improved resolution of near-infrared optical tomography systems, and fluorescence lifetime activated cell sorting.
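    The pile-up limit discussed above arises because a classical single-hit TCSPC channel records at most one photon per excitation period; under Poisson statistics the resulting saturation can be inverted analytically. A minimal sketch of that standard single-channel correction (not the multi-channel architecture studied in the paper):

```python
import math

def detected_fraction(lam):
    # Probability that a classical single-hit TCSPC channel records a
    # count in an excitation period, given a mean of `lam` photons per
    # period (Poisson arrivals): P(>= 1 photon) = 1 - e^{-lam}.
    return 1.0 - math.exp(-lam)

def pileup_corrected_rate(counts, periods):
    # Invert the saturation to estimate the true mean photons per period
    # from the observed count fraction.
    p = counts / periods
    return -math.log(1.0 - p)
```

    Classical practice keeps the detected fraction below a few percent so the bias is negligible; the multi-channel sensors described in this record instead remove the single-hit restriction itself.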

  2. Comparing census methods for the endangered Kirtland's Warbler

    Treesearch

    John R. Probst; Deahn M. Donner; Mike Worland; Jerry Weinrich; Phillip Huber; Kenneth R. Ennis

    2005-01-01

    We compared transect counts used for the annual official count of male Kirtland's Warblers (Dendroica kirtlandii) to an observation-based mapping method of individually sighted males in 155 stands over 10 yrs. The annual census count almost tripled from 1990 to 1999. The transect and observation-based mapping methods showed the same increasing trend...

  3. EVALUATION OF THE USE OF DIFFERENT ANTIBIOTICS IN THE DIRECT VIABLE COUNT METHOD TO DETECT FECAL ENTEROCOCCI

    EPA Science Inventory

    The detection of fecal pollution is performed via culturing methods in spite of the fact that culturable counts can severely underestimate the densities of fecal microorganisms. One approach that has been used to enumerate bacteria is the direct viable count method (DVC). The ob...

  4. Mapping of bird distributions from point count surveys

    USGS Publications Warehouse

    Sauer, J.R.; Pendleton, G.W.; Orsillo, Sandra; Ralph, C.J.; Sauer, J.R.; Droege, S.

    1995-01-01

    Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes in proportion counted as a function of observer or habitat differences. Large-scale surveys also generally suffer from regional and temporal variation in sampling intensity. A simulated surface is used to demonstrate sampling principles for maps.

  5. Rapid enumeration of viable bacteria by image analysis

    NASA Technical Reports Server (NTRS)

    Singh, A.; Pyle, B. H.; McFeters, G. A.

    1989-01-01

    A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 degrees C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 micrograms ml-1) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.

  6. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method

    PubMed Central

    Veta, Mitko; van Diest, Paul J.; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P. W.

    2016-01-01

    Background Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. Methods The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an “external” dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. Results The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for objects of smaller size, which suggests that adding a size constraint in the mitosis counting protocol can improve reproducibility. 
The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts. PMID:27529701

  7. Portable multiplicity counter

    DOEpatents

    Newell, Matthew R [Los Alamos, NM; Jones, David Carl [Los Alamos, NM

    2009-09-01

    A portable multiplicity counter has signal input circuitry, processing circuitry and a user/computer interface disposed in a housing. The processing circuitry, which can comprise a microcontroller integrated circuit operably coupled to shift register circuitry implemented in a field programmable gate array, is configured to be operable via the user/computer interface to count input signal pulses receivable at said signal input circuitry and record time correlations thereof in total counting, coincidence counting, and/or multiplicity counting modes. The user/computer interface can be for example an LCD display/keypad and/or a USB interface. The counter can include a battery pack for powering the counter and low/high voltage power supplies for biasing external detectors so that the counter can be configured as a hand-held device for counting neutron events.
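    Multiplicity counting records, for each pulse, how many further pulses fall within a coincidence gate, building a histogram of coincidence multiplicities. A minimal software sketch of that gating logic (the triggered-gate scheme and gate width are illustrative; real shift-register electronics also measure a delayed gate to estimate accidental coincidences):

```python
import bisect

def multiplicity_histogram(times, gate):
    # times: sorted pulse arrival times; gate: coincidence gate width.
    # For each pulse at time t, count subsequent pulses in (t, t + gate].
    hist = {}
    for i, t in enumerate(times):
        j = bisect.bisect_right(times, t + gate)
        m = j - i - 1  # pulses following t inside the gate
        hist[m] = hist.get(m, 0) + 1
    return hist
```

    The resulting histogram corresponds to the singles/doubles/triples tallies that multiplicity analysis then unfolds into source properties.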

  8. MicroCT with energy-resolved photon-counting detectors

    PubMed Central

    Wang, X; Meier, D; Mikkelsen, S; Maehlum, G E; Wagenaar, D J; Tsui, B M W; Patt, B E; Frey, E C

    2011-01-01

    The goal of this paper was to investigate the benefits that could be realistically achieved on a microCT imaging system with an energy-resolved photon-counting x-ray detector. To this end, we built and evaluated a prototype microCT system based on such a detector. The detector is based on cadmium telluride (CdTe) radiation sensors and application-specific integrated circuit (ASIC) readouts. Each detector pixel can simultaneously count x-ray photons above six energy thresholds, providing the capability for energy-selective x-ray imaging. We tested the spectroscopic performance of the system using polychromatic x-ray radiation and various filtering materials with K-absorption edges. Tomographic images were then acquired of a cylindrical PMMA phantom containing holes filled with various materials. Results were also compared with those acquired using an intensity-integrating x-ray detector and single-energy (i.e. non-energy-selective) CT. This paper describes the functionality and performance of the system, and presents preliminary spectroscopic and tomographic results. The spectroscopic experiments showed that the energy-resolved photon-counting detector was capable of measuring energy spectra from polychromatic sources like a standard x-ray tube, and resolving absorption edges present in the energy range used for imaging. However, the spectral quality was degraded by spectral distortions resulting from several factors, including finite energy resolution and charge sharing. We developed a simple charge-sharing model to reproduce these distortions. The tomographic experiments showed that the availability of multiple energy thresholds in the photon-counting detector allowed us to simultaneously measure target-to-background contrasts in different energy ranges. Compared with single-energy CT with an integrating detector, this feature was especially useful to improve differentiation of materials with different attenuation coefficient energy dependences. PMID:21464527

  9. MicroCT with energy-resolved photon-counting detectors.

    PubMed

    Wang, X; Meier, D; Mikkelsen, S; Maehlum, G E; Wagenaar, D J; Tsui, B M W; Patt, B E; Frey, E C

    2011-05-07

    The goal of this paper was to investigate the benefits that could be realistically achieved on a microCT imaging system with an energy-resolved photon-counting x-ray detector. To this end, we built and evaluated a prototype microCT system based on such a detector. The detector is based on cadmium telluride (CdTe) radiation sensors and application-specific integrated circuit (ASIC) readouts. Each detector pixel can simultaneously count x-ray photons above six energy thresholds, providing the capability for energy-selective x-ray imaging. We tested the spectroscopic performance of the system using polychromatic x-ray radiation and various filtering materials with K-absorption edges. Tomographic images were then acquired of a cylindrical PMMA phantom containing holes filled with various materials. Results were also compared with those acquired using an intensity-integrating x-ray detector and single-energy (i.e. non-energy-selective) CT. This paper describes the functionality and performance of the system, and presents preliminary spectroscopic and tomographic results. The spectroscopic experiments showed that the energy-resolved photon-counting detector was capable of measuring energy spectra from polychromatic sources like a standard x-ray tube, and resolving absorption edges present in the energy range used for imaging. However, the spectral quality was degraded by spectral distortions resulting from several factors, including finite energy resolution and charge sharing. We developed a simple charge-sharing model to reproduce these distortions. The tomographic experiments showed that the availability of multiple energy thresholds in the photon-counting detector allowed us to simultaneously measure target-to-background contrasts in different energy ranges. Compared with single-energy CT with an integrating detector, this feature was especially useful to improve differentiation of materials with different attenuation coefficient energy dependences.
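    A counter with several energy thresholds reports, per pixel, the number of photons above each ascending threshold; counts falling in an energy bin follow by differencing adjacent thresholds. A minimal sketch (a six-threshold layout matches the detector described, but the counts are illustrative):

```python
def bin_counts(above_threshold):
    # above_threshold: photon counts above each of N ascending thresholds.
    # Bin i spans [threshold i, threshold i+1); the last bin is open-ended.
    out = []
    for i in range(len(above_threshold) - 1):
        out.append(above_threshold[i] - above_threshold[i + 1])
    out.append(above_threshold[-1])
    return out
```

    The per-bin counts are what enable the energy-selective contrast measurements described above; note that charge sharing and finite energy resolution distort these bins in practice, as the record discusses.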

  10. A novel concentration and viability detection method for Brettanomyces using the Cellometer image cytometry.

    PubMed

    Martyniak, Brian; Bolton, Jason; Kuksin, Dmitry; Shahin, Suzanne M; Chan, Leo Li-Ying

    2017-01-01

    Brettanomyces spp. can present unique cell morphologies composed of excessive pseudohyphae and budding, leading to difficulties in enumerating cells. The current cell counting methods include manual counting of methylene blue-stained yeasts or measuring optical densities using a spectrophotometer. However, manual counting can be time-consuming and has high operator-dependent variations due to subjectivity. Optical density measurement can also introduce uncertainties because an average of a cell population, rather than individual cells, is measured. In contrast, by utilizing the fluorescence capability of an image cytometer to detect acridine orange and propidium iodide viability dyes, individual cell nuclei can be counted directly in the pseudohyphae chains, which can improve the accuracy and efficiency of cell counting, as well as eliminate the subjectivity of manual counting. In this work, two experiments were performed to demonstrate the capability of the Cellometer image cytometer to monitor Brettanomyces concentrations, viabilities, and budding/pseudohyphae percentages. First, a yeast propagation experiment was conducted to optimize software counting parameters for monitoring the growth of Brettanomyces clausenii, Brettanomyces bruxellensis, and Brettanomyces lambicus, which showed increasing cell concentrations, and varying pseudohyphae percentages. The pseudohyphae formed during propagation were counted either as multiple nuclei or a single multi-nuclei organism, where the results of counting the yeast as a single multi-nuclei organism were directly compared to manual counting. Second, a yeast fermentation experiment was conducted to demonstrate that the proposed image cytometric analysis method can monitor the growth pattern of B. lambicus and B. clausenii during beer fermentation. The results from both experiments displayed different growth patterns, viability, and budding/pseudohyphae percentages for each Brettanomyces species. 
The proposed Cellometer image cytometry method can improve efficiency and eliminate operator-dependent variations of cell counting compared with the traditional methods, which can potentially improve the quality of beverage products employing Brettanomyces yeasts.

  11. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination.

    PubMed

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have been used in regenerative dentistry most often, with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study shows a method for direct estimation of platelet counts in PRF. To validate this method by determination of the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue-plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rate of platelets and white blood cells was 91.6 and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6 and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix.

  12. Platelet Counts in Insoluble Platelet-Rich Fibrin Clots: A Direct Method for Accurate Determination

    PubMed Central

    Kitamura, Yutaka; Watanabe, Taisuke; Nakamura, Masayuki; Isobe, Kazushige; Kawabata, Hideo; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Tanaka, Takaaki; Kawase, Tomoyuki

    2018-01-01

    Platelet-rich fibrin (PRF) clots have been used in regenerative dentistry most often, with the assumption that growth factor levels are concentrated in proportion to the platelet concentration. Platelet counts in PRF are generally determined indirectly by platelet counting in other liquid fractions. This study shows a method for direct estimation of platelet counts in PRF. To validate this method by determination of the recovery rate, whole-blood samples were obtained with an anticoagulant from healthy donors, and platelet-rich plasma (PRP) fractions were clotted with CaCl2 by centrifugation and digested with tissue-plasminogen activator. Platelet counts were estimated before clotting and after digestion using an automatic hemocytometer. The method was then tested on PRF clots. The quality of platelets was examined by scanning electron microscopy and flow cytometry. In PRP-derived fibrin matrices, the recovery rate of platelets and white blood cells was 91.6 and 74.6%, respectively, after 24 h of digestion. In PRF clots associated with small and large red thrombi, platelet counts were 92.6 and 67.2% of the respective total platelet counts. These findings suggest that our direct method is sufficient for estimating the number of platelets trapped in an insoluble fibrin matrix and for determining that platelets are distributed in PRF clots and red thrombi roughly in proportion to their individual volumes. Therefore, we propose this direct digestion method for more accurate estimation of platelet counts in most types of platelet-enriched fibrin matrix. PMID:29450197

  13. Comparison of Drive Counts and Mark-Resight As Methods of Population Size Estimation of Highly Dense Sika Deer (Cervus nippon) Populations

    PubMed Central

    Takeshita, Kazutaka; Yoshida, Tsuyoshi; Igota, Hiromasa; Matsuura, Yukiko

    2016-01-01

    Assessing temporal changes in abundance indices is an important issue in the management of large herbivore populations. The drive counts method has been frequently used as a deer abundance index in mountainous regions. However, despite an inherent risk for observation errors in drive counts, which increase with deer density, evaluations of the utility of drive counts at a high deer density remain scarce. We compared the drive counts and mark-resight (MR) methods in the evaluation of a highly dense sika deer population (MR estimates ranged between 11 and 53 individuals/km2) on Nakanoshima Island, Hokkaido, Japan, between 1999 and 2006. This deer population experienced two large reductions in density; approximately 200 animals in total were taken from the population through a large-scale population removal and a separate winter mass mortality event. Although the drive counts tracked temporal changes in deer abundance on the island, they overestimated the counts for all years in comparison to the MR method. Increased overestimation in drive count estimates after the winter mass mortality event may be due to a double count derived from increased deer movement and recovery of body condition secondary to the mitigation of density-dependent food limitations. Drive counts are unreliable because they are affected by unfavorable factors such as bad weather, and they are cost-prohibitive to repeat, which precludes the calculation of confidence intervals. Therefore, the use of drive counts to infer the deer abundance needs to be reconsidered. PMID:27711181
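    Mark-resight abundance estimates are commonly built on Lincoln-Petersen-type estimators; a minimal sketch of the bias-corrected Chapman form (illustrative only, not necessarily the exact estimator used in this study):

```python
def chapman_estimate(marked, sighted, resighted):
    # Chapman's bias-corrected Lincoln-Petersen estimator:
    #   N_hat = (M + 1)(C + 1) / (R + 1) - 1
    # marked: animals marked (M); sighted: animals seen in the resight
    # survey (C); resighted: marked animals among those seen (R).
    return (marked + 1) * (sighted + 1) / (resighted + 1) - 1
```

    Unlike drive counts, repeated resight surveys allow a variance and hence a confidence interval to accompany the point estimate, which is one of the advantages the record emphasizes.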

  14. Recommended methods for monitoring change in bird populations by counting and capture of migrants

    Treesearch

    David J. T. Hussell; C. John Ralph

    2005-01-01

    Counts and banding captures of spring or fall migrants can generate useful information on the status and trends of the source populations. To do so, the counts and captures must be taken and recorded in a standardized and consistent manner. We present recommendations for field methods for counting and capturing migrants at intensively operated sites, such as bird...

  15. Fractal analysis of mandibular trabecular bone: optimal tile sizes for the tile counting method.

    PubMed

    Huh, Kyung-Hoe; Baik, Jee-Seon; Yi, Won-Jin; Heo, Min-Suk; Lee, Sam-Sun; Choi, Soon-Chul; Lee, Sun-Bok; Lee, Seung-Pyo

    2011-06-01

    This study was performed to determine the optimal tile size for the fractal dimension of the mandibular trabecular bone using a tile counting method. Digital intraoral radiographic images were obtained at the mandibular angle, molar, premolar, and incisor regions of 29 human dry mandibles. After preprocessing, the parameters representing morphometric characteristics of the trabecular bone were calculated. The fractal dimensions of the processed images were analyzed in various tile sizes by the tile counting method. The optimal range of tile size was 0.132 mm to 0.396 mm for the fractal dimension using the tile counting method. The sizes were closely related to the morphometric parameters. The fractal dimension of mandibular trabecular bone, as calculated with the tile counting method, can be best characterized with a range of tile sizes from 0.132 to 0.396 mm.

  16. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    PubMed Central

    2012-01-01

    Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex, are not easily implemented, and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). 
Results The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. Conclusions The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint. PMID:22962944
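    The mean logarithmic score used here for model ranking averages the negative log of each leave-one-out predictive probability evaluated at the observed count. A minimal sketch (the predictive probabilities are placeholders; in the study they come from INLA's leave-one-out predictive distributions):

```python
import math

def mean_log_score(predictive_probs):
    # predictive_probs: leave-one-out predictive probability of each
    # observed count under the candidate model. Lower mean log score
    # indicates better predictive performance.
    return -sum(math.log(p) for p in predictive_probs) / len(predictive_probs)
```

    Because it averages per-observation scores, this criterion is comparatively insensitive to sample size, which is the robustness property noted in the conclusions above.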

  17. Current automated 3D cell detection methods are not a suitable replacement for manual stereologic cell counting

    PubMed Central

    Schmitz, Christoph; Eastwood, Brian S.; Tappan, Susan J.; Glaser, Jack R.; Peterson, Daniel A.; Hof, Patrick R.

    2014-01-01

    Stereologic cell counting has had a major impact on the field of neuroscience. A major bottleneck in stereologic cell counting is that the user must manually decide whether or not each cell is counted according to three-dimensional (3D) stereologic counting rules by visual inspection within hundreds of microscopic fields-of-view per investigated brain or brain region. Reliance on visual inspection forces stereologic cell counting to be very labor-intensive and time-consuming, and is the main reason why biased, non-stereologic two-dimensional (2D) “cell counting” approaches have remained in widespread use. We present an evaluation of the performance of modern automated cell detection and segmentation algorithms as a potential alternative to the manual approach in stereologic cell counting. The image data used in this study were 3D microscopic images of thick brain tissue sections prepared with a variety of commonly used nuclear and cytoplasmic stains. The evaluation compared the numbers and locations of cells identified unambiguously and counted exhaustively by an expert observer with those found by three automated 3D cell detection algorithms: nuclei segmentation from the FARSIGHT toolkit, nuclei segmentation by 3D multiple level set methods, and the 3D object counter plug-in for ImageJ. Of these methods, FARSIGHT performed best, with true-positive detection rates between 38 and 99% and false-positive rates from 3.6 to 82%. The results demonstrate that the current automated methods suffer from lower detection rates and higher false-positive rates than are acceptable for obtaining valid estimates of cell numbers. Thus, at present, stereologic cell counting with manual decision for object inclusion according to unbiased stereologic counting rules remains the only adequate method for unbiased cell quantification in histologic tissue sections. PMID:24847213
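    Object-level evaluation of this kind requires matching automatic detections to expert-marked cells before computing true-positive and false-positive rates. A minimal greedy nearest-neighbour matching sketch (the distance threshold and greedy strategy are illustrative assumptions, not the paper's exact protocol):

```python
import math

def match_rates(truth, detections, tol):
    # truth, detections: lists of (x, y, z) centroids; tol: maximum
    # distance for a detection to count as matching a true cell.
    unmatched = list(range(len(detections)))
    tp = 0
    for t in truth:
        best, best_d = None, tol
        for idx in unmatched:
            d = math.dist(t, detections[idx])
            if d <= best_d:
                best, best_d = idx, d
        if best is not None:
            unmatched.remove(best)  # each detection matches at most once
            tp += 1
    tp_rate = tp / len(truth) if truth else 0.0
    fp_rate = len(unmatched) / len(detections) if detections else 0.0
    return tp_rate, fp_rate
```

    The detection and false-positive rates quoted for FARSIGHT above are rates of this general kind, computed against the expert observer's exhaustive counts.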

  18. A matrix-inversion method for gamma-source mapping from gamma-count data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adsley, Ian; Burgess, Claire; Bull, Richard K

    In a previous paper it was proposed that a simple matrix inversion method could be used to extract source distributions from gamma-count maps, using simple models to calculate the response matrix. The method was tested using numerically generated count maps. In the present work a 100 kBq Co-60 source was placed on a gridded surface and the count rate measured using a NaI scintillation detector. The resulting map of gamma counts was used as input to the matrix inversion procedure and the source position recovered. A multi-source array was simulated by superposition of several single-source count maps, and the source distribution was again recovered using matrix inversion. The measurements were performed for several detector heights. The effects of uncertainties in source-detector distances on the matrix inversion method are also examined. The results from this work give confidence in the use of the method in practical applications, such as the segregation of highly active objects amongst fuel-element debris. (authors)
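    The recovery step can be sketched with a toy one-dimensional example: build a response matrix from a simple inverse-square model and solve the resulting linear system in the least-squares sense. All geometry, grid values, and the efficiency constant below are invented for illustration; the paper's actual response models are not reproduced here.

```python
import numpy as np

# Hypothetical 1-D grid of 4 candidate source positions and 4 detector positions.
grid = np.array([0.0, 1.0, 2.0, 3.0])   # candidate source positions (m)
det = np.array([0.2, 1.1, 2.2, 2.9])    # detector positions (m)
height = 0.5                             # detector height above surface (m)

# Response matrix from a simple inverse-square model: R[i, j] = count rate at
# detector i per unit activity at grid point j (efficiency folded into k).
k = 100.0
dist2 = (det[:, None] - grid[None, :]) ** 2 + height**2
R = k / dist2

# Forward-model a single 100 kBq source at grid point 2, then invert the
# count map via least squares to recover the source distribution.
s_true = np.array([0.0, 0.0, 100.0, 0.0])
counts = R @ s_true
s_rec, *_ = np.linalg.lstsq(R, counts, rcond=None)
print(np.round(s_rec, 3))  # recovered activities at the four grid points
```

    A multi-source map is handled the same way, since the count map is linear in the source activities.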

  19. Metamethod study of qualitative psychotherapy research on clients' experiences: Review and recommendations.

    PubMed

    Levitt, Heidi M; Pomerville, Andrew; Surace, Francisco I; Grabowski, Lauren M

    2017-11-01

    A metamethod study is a qualitative meta-analysis focused upon the methods and procedures used in a given research domain. These studies are rare in psychological research. They permit both the documentation of the informal standards within a field of research and recommendations for future work in that area. This paper presents a metamethod analysis of a substantial body of qualitative research that focused on clients' experiences in psychotherapy (109 studies). This review examined the ways that methodological integrity has been established across qualitative research methods. It identified the numbers of participants recruited and the form of data collection used (e.g., semistructured interviews, diaries). As well, it examined the types of checks employed to increase methodological integrity, such as participant counts, saturation, reflexivity techniques, participant feedback, or consensus and auditing processes. Central findings indicated that the researchers quite flexibly integrated procedures associated with one method into studies using other methods in order to strengthen their rigor. It appeared normative to adjust procedures to advance methodological integrity. These findings encourage manuscript reviewers to assess the function of procedures within a study rather than to require researchers to adhere to the set of procedures associated with a method. In addition, when epistemological approaches were mentioned they were overwhelmingly constructivist in nature, despite the increasing use of procedures traditionally associated with objectivist perspectives. It is recommended that future researchers do more to explicitly describe the functions of their procedures so that they are coherently situated within the epistemological approaches in use. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  20. Bird biodiversity assessments in temperate forest: the value of point count versus acoustic monitoring protocols.

    PubMed

    Klingbeil, Brian T; Willig, Michael R

    2015-01-01

    Effective monitoring programs for biodiversity are needed to assess trends in biodiversity and evaluate the consequences of management. This is particularly true for birds and faunas that occupy interior forest and other areas of low human population density, as these are frequently under-sampled compared to other habitats. For birds, Autonomous Recording Units (ARUs) have been proposed as a supplement or alternative to point counts made by human observers to enhance monitoring efforts. We employed two strategies (i.e., simultaneous-collection and same-season) to compare point count and ARU methods for quantifying species richness and composition of birds in temperate interior forests. The simultaneous-collection strategy compares surveys by ARUs and point counts, with methods matched in time, location, and survey duration such that the person and machine simultaneously collect data. The same-season strategy compares surveys from ARUs and point counts conducted at the same locations throughout the breeding season, but methods differ in the number, duration, and frequency of surveys. This second strategy more closely follows the ways in which monitoring programs are likely to be implemented. Site-specific estimates of richness (but not species composition) differed between methods; however, the nature of the relationship was dependent on the assessment strategy. Estimates of richness from point counts were greater than estimates from ARUs in the simultaneous-collection strategy. Woodpeckers in particular, were less frequently identified from ARUs than point counts with this strategy. Conversely, estimates of richness were lower from point counts than ARUs in the same-season strategy. Moreover, in the same-season strategy, ARUs detected the occurrence of passerines at a higher frequency than did point counts. Differences between ARU and point count methods were only detected in site-level comparisons. 
Importantly, both methods provide similar estimates of species richness and composition for the region. Consequently, if single visits to sites or short-term monitoring are the goal, point counts will likely perform better than ARUs, especially if species are rare or vocalize infrequently. However, if seasonal or annual monitoring of sites is the goal, ARUs offer a viable alternative to standard point-count methods, especially in the context of large-scale or long-term monitoring of temperate forest birds.

  1. Automatic vehicle counting using background subtraction method on gray scale images and morphology operation

    NASA Astrophysics Data System (ADS)

    Adi, K.; Widodo, A. P.; Widodo, C. E.; Pamungkas, A.; Putranto, A. B.

    2018-05-01

    Traffic monitoring requires counting the number of vehicles passing on a road, particularly for highway transportation management. Therefore, it is necessary to develop a system that is able to count the number of vehicles automatically. Video processing methods make such automatic counting possible. This research developed a vehicle counting system for toll roads. The system includes video acquisition, frame extraction, and image processing for each frame. Video acquisition was conducted in the morning, at noon, in the afternoon, and in the evening. The system employs background subtraction and morphology methods on gray scale images for vehicle counting. The best vehicle counting results were obtained in the morning, with a counting accuracy of 86.36 %, whereas the lowest accuracy was in the evening, at 21.43 %. The difference between morning and evening results is caused by the different illumination, which changes the values of the image pixels.
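    The pipeline of background subtraction, thresholding, morphological cleanup and blob counting can be sketched on synthetic frames. This is a minimal numpy/scipy illustration, not the paper's implementation; the frames, threshold, and structuring element are all invented.

```python
import numpy as np
from scipy import ndimage

# Synthetic grayscale frames (values 0-255); a real system would read video frames.
background = np.full((20, 20), 50, dtype=np.uint8)
frame = background.copy()
frame[3:7, 4:8] = 200      # "vehicle" 1
frame[12:16, 10:14] = 220  # "vehicle" 2
frame[1, 18] = 255         # single-pixel noise

# Background subtraction on gray scale images, then thresholding.
diff = np.abs(frame.astype(int) - background.astype(int))
mask = diff > 40

# Morphological opening removes small noise; labelling counts the remaining blobs.
clean = ndimage.binary_opening(mask, structure=np.ones((3, 3)))
labels, n_vehicles = ndimage.label(clean)
print(n_vehicles)  # 2
```

    The opening step is what suppresses the isolated noise pixel while keeping the larger vehicle blobs intact.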

  2. Evaluation of the platelet counting by Abbott CELL-DYN SAPPHIRE haematology analyser compared with flow cytometry.

    PubMed

    Grimaldi, E; Del Vecchio, L; Scopacasa, F; Lo Pardo, C; Capone, F; Pariante, S; Scalia, G; De Caterina, M

    2009-04-01

    The Abbott Cell-Dyn Sapphire is a new generation haematology analyser. The system uses optical/fluorescence flow cytometry in combination with electronic impedance to produce a full blood count. Optical and impedance are the default methods for platelet counting, while automated CD61-immunoplatelet analysis can be run as a selectable test. The aim of this study was to determine the platelet count performance of the three counting methods available on the instrument and to compare the results with those provided by a Becton Dickinson FACSCalibur flow cytometer used as the reference method. A lipid interference experiment was also performed. Linearity, carryover and precision were good, and satisfactory agreement with the reference method was found for the impedance, optical and CD61-immunoplatelet analyses, although the latter provided the closest results in comparison with flow cytometry. In the lipid interference experiment, a moderate inaccuracy of optical and immunoplatelet counts was observed starting from a very high lipid value.

  3. voom: precision weights unlock linear model analysis tools for RNA-seq read counts

    PubMed Central

    2014-01-01

    New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods. PMID:24485249

  4. voom: Precision weights unlock linear model analysis tools for RNA-seq read counts.

    PubMed

    Law, Charity W; Chen, Yunshun; Shi, Wei; Smyth, Gordon K

    2014-02-03

    New normal linear modeling strategies are presented for analyzing read counts from RNA-seq experiments. The voom method estimates the mean-variance relationship of the log-counts, generates a precision weight for each observation and enters these into the limma empirical Bayes analysis pipeline. This opens access for RNA-seq analysts to a large body of methodology developed for microarrays. Simulation studies show that voom performs as well or better than count-based RNA-seq methods even when the data are generated according to the assumptions of the earlier methods. Two case studies illustrate the use of linear modeling and gene set testing methods.
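    The core idea of voom, estimating a smooth mean-variance trend on log-counts and converting it into per-observation precision weights, can be sketched on simulated data. This is a simplified illustration only: it substitutes a quadratic polynomial for limma's lowess smoother, and all counts and parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated count matrix: 200 genes x 6 samples.
counts = rng.negative_binomial(n=10, p=0.001, size=(200, 6))
lib = counts.sum(axis=0)  # library sizes

# log2 counts-per-million with the 0.5 offset used by voom.
logcpm = np.log2((counts + 0.5) / (lib + 1.0) * 1e6)

# Mean-variance trend: fit sqrt(sd) against mean log-CPM per gene
# (quadratic fit here as a stand-in for limma's lowess smoother).
gmean = logcpm.mean(axis=1)
gsd = logcpm.std(axis=1, ddof=1)
coef = np.polyfit(gmean, np.sqrt(gsd), deg=2)

# Precision weight for each observation: inverse of the predicted variance,
# i.e. 1 / (predicted sqrt-sd)^4, clipped away from zero for stability.
pred_sqrt_sd = np.polyval(coef, logcpm)
weights = 1.0 / np.maximum(pred_sqrt_sd, 1e-6) ** 4
print(weights.shape)  # (200, 6)
```

    The resulting weight matrix is what voom passes into the limma weighted linear-model pipeline.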

  5. Mitosis Counting in Breast Cancer: Object-Level Interobserver Agreement and Comparison to an Automatic Method.

    PubMed

    Veta, Mitko; van Diest, Paul J; Jiwa, Mehdi; Al-Janabi, Shaimaa; Pluim, Josien P W

    2016-01-01

    Tumor proliferation speed, most commonly assessed by counting of mitotic figures in histological slide preparations, is an important biomarker for breast cancer. Although mitosis counting is routinely performed by pathologists, it is a tedious and subjective task with poor reproducibility, particularly among non-experts. Inter- and intraobserver reproducibility of mitosis counting can be improved when a strict protocol is defined and followed. Previous studies have examined only the agreement in terms of the mitotic count or the mitotic activity score. Studies of the observer agreement at the level of individual objects, which can provide more insight into the procedure, have not been performed thus far. The development of automatic mitosis detection methods has received considerable interest in recent years. Automatic image analysis is viewed as a solution for the problem of subjectivity of mitosis counting by pathologists. In this paper we describe the results from an interobserver agreement study between three human observers and an automatic method, and make two unique contributions. For the first time, we present an analysis of the object-level interobserver agreement on mitosis counting. Furthermore, we train an automatic mitosis detection method that is robust with respect to staining appearance variability and compare it with the performance of expert observers on an "external" dataset, i.e. on histopathology images that originate from pathology labs other than the pathology lab that provided the training data for the automatic method. The object-level interobserver study revealed that pathologists often do not agree on individual objects, even if this is not reflected in the mitotic count. The disagreement is larger for smaller objects, which suggests that adding a size constraint to the mitosis counting protocol can improve reproducibility.
The automatic mitosis detection method can perform mitosis counting in an unbiased way, with substantial agreement with human experts.

  6. Blood-brain barrier integrity, intrathecal immunoactivation, and neuronal injury in HIV.

    PubMed

    Anesten, Birgitta; Yilmaz, Aylin; Hagberg, Lars; Zetterberg, Henrik; Nilsson, Staffan; Brew, Bruce J; Fuchs, Dietmar; Price, Richard W; Gisslén, Magnus

    2016-12-01

    Although blood-brain barrier (BBB) impairment has been reported in HIV-infected individuals, characterization of this impairment has not been clearly defined. BBB integrity was measured by CSF/plasma albumin ratio in this cross-sectional study of 631 HIV-infected individuals and 71 controls. We also analyzed CSF and blood HIV RNA and neopterin, CSF leukocyte count, and neurofilament light chain protein (NFL) concentrations. The HIV-infected participants included untreated neuroasymptomatic patients, patients with untreated HIV-associated dementia (HAD), and participants on suppressive antiretroviral treatment (ART). The albumin ratio was significantly increased in patients with HAD compared to all other groups. There were no significant differences between untreated neuroasymptomatic participants, treated participants, and controls. BBB integrity, however, correlated significantly with CSF leukocyte count, CSF HIV RNA, serum and CSF neopterin, and age in untreated neuroasymptomatic participants. In a multiple linear regression analysis, age, CSF neopterin, and CSF leukocyte count stood out as independent predictors of albumin ratio. A significant correlation was found between albumin ratio and CSF NFL in untreated neuroasymptomatic patients and in participants on ART. Albumin ratio, age, and CD4 cell count were confirmed as independent predictors of CSF NFL in multivariable analysis. BBB disruption was mainly found in patients with HAD, where BBB damage correlated with CNS immunoactivation. Albumin ratios also correlated with CSF inflammatory markers and NFL in untreated neuroasymptomatic participants. These findings give support to the association among BBB deterioration, intrathecal immunoactivation, and neuronal injury in untreated neuroasymptomatic HIV-infected individuals.

  7. Preliminary investigation of a water-based method for fast integrating mobility spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spielman, Steven R.; Hering, Susanne V.; Kuang, Chongai

    A water-based condensational growth channel was developed for imaging mobility-separated particles within a parallel plate separation channel of the Fast Integrated Mobility Spectrometer (FIMS). Reported are initial tests of that system, in which the alcohol condenser of the FIMS was replaced by a water-based condensational growth channel. Tests with monodispersed sodium chloride aerosol verify that the water-condensational growth maintained the laminar flow, while providing sufficient growth for particle imaging. Particle positions mapped onto particle mobility, in accordance with theoretical expectations. Particles ranging in size from 12 nm to 100 nm were counted with the same efficiency as with a butanol-based ultrafine particle counter, once inlet and line losses were taken into account.

  8. Preliminary investigation of a water-based method for fast integrating mobility spectrometry

    DOE PAGES

    Spielman, Steven R.; Hering, Susanne V.; Kuang, Chongai; ...

    2017-06-06

    A water-based condensational growth channel was developed for imaging mobility-separated particles within a parallel plate separation channel of the Fast Integrated Mobility Spectrometer (FIMS). Reported are initial tests of that system, in which the alcohol condenser of the FIMS was replaced by a water-based condensational growth channel. Tests with monodispersed sodium chloride aerosol verify that the water-condensational growth maintained the laminar flow, while providing sufficient growth for particle imaging. Particle positions mapped onto particle mobility, in accordance with theoretical expectations. Particles ranging in size from 12 nm to 100 nm were counted with the same efficiency as with a butanol-based ultrafine particle counter, once inlet and line losses were taken into account.

  9. Research integrity and everyday practice of science.

    PubMed

    Grinnell, Frederick

    2013-09-01

    Science traditionally is taught as a linear process based on logic and carried out by objective researchers following the scientific method. Practice of science is a far more nuanced enterprise, one in which intuition and passion become just as important as objectivity and logic. Whether the activity is committing to study a particular research problem, drawing conclusions about a hypothesis under investigation, choosing whether to count results as data or experimental noise, or deciding what information to present in a research paper, ethical challenges inevitably will arise because of the ambiguities inherent in practice. Unless these ambiguities are acknowledged and their sources understood explicitly, responsible conduct of science education will not adequately prepare the individuals receiving the training for the kinds of decisions essential to research integrity that they will have to make as scientists.

  10. Investigation of ultra low-dose scans in the context of quantum-counting clinical CT

    NASA Astrophysics Data System (ADS)

    Weidinger, T.; Buzug, T. M.; Flohr, T.; Fung, G. S. K.; Kappler, S.; Stierstorfer, K.; Tsui, B. M. W.

    2012-03-01

    In clinical computed tomography (CT), images from patient examinations taken with conventional scanners exhibit noise characteristics governed by electronics noise, when scanning strongly attenuating obese patients or with an ultra-low X-ray dose. Unlike CT systems based on energy integrating detectors, a system with a quantum counting detector does not suffer from this drawback. Instead, the noise from the electronics mainly affects the spectral resolution of these detectors. Therefore, it does not contribute to the image noise in spectrally non-resolved CT images. This promises improved image quality due to image noise reduction in scans obtained from clinical CT examinations with lowest X-ray tube currents or obese patients. To quantify the benefits of quantum counting detectors in clinical CT we have carried out an extensive simulation study of the complete scanning and reconstruction process for both kinds of detectors. The simulation chain encompasses modeling of the X-ray source, beam attenuation in the patient, and calculation of the detector response. Moreover, in each case the subsequent image preprocessing and reconstruction is modeled as well. The simulation-based, theoretical evaluation is validated by experiments with a novel prototype quantum counting system and a Siemens Definition Flash scanner with a conventional energy integrating CT detector. We demonstrate and quantify the improvement from image noise reduction achievable with quantum counting techniques in CT examinations with ultra-low X-ray dose and strong attenuation.

  11. UNO DMRG CASCI calculations of effective exchange integrals for m-phenylene-bis-methylene spin clusters

    NASA Astrophysics Data System (ADS)

    Kawakami, Takashi; Sano, Shinsuke; Saito, Toru; Sharma, Sandeep; Shoji, Mitsuo; Yamada, Satoru; Takano, Yu; Yamanaka, Shusuke; Okumura, Mitsutaka; Nakajima, Takahito; Yamaguchi, Kizashi

    2017-09-01

    Theoretical examinations of the ferromagnetic coupling in the m-phenylene-bis-methylene molecule and its oligomer were carried out. These systems are good candidates for exchange-coupled systems to investigate strong electronic correlations. We studied effective exchange integrals (J), which indicate the magnetic coupling between interacting spins in these species. First, theoretical calculations based on a broken-symmetry single-reference (SR) procedure, i.e. the UHF, UMP2, UMP4, UCCSD(T) and UB3LYP methods, were carried out with a GAUSSIAN program code under an SR wave function. From these results, the J value from the UHF method was largely positive because of the strong ferromagnetic spin polarisation effect. The UCCSD(T) and UB3LYP methods mitigated this overestimation by including dynamical electronic correlation. Next, the magnetic coupling among these spins was studied using a CAS-based, symmetry-adapted multireference procedure. Thus, the UNO DMRG CASCI (UNO, unrestricted natural orbital; DMRG, density matrix renormalised group; CASCI, complete active space configuration interaction) method was mainly employed with a combination of ORCA and BLOCK program codes. DMRG CASCI calculations with valence electron counting, which included all orbitals up to full valence CI, provided the most reliable results and supported the use of the UB3LYP method for extended systems.

  12. Comparison of cell counting methods in rodent pulmonary toxicity studies: automated and manual protocols and considerations for experimental design

    PubMed Central

    Zeidler-Erdely, Patti C.; Antonini, James M.; Meighan, Terence G.; Young, Shih-Houng; Eye, Tracy J.; Hammer, Mary Ann; Erdely, Aaron

    2016-01-01

    Pulmonary toxicity studies often use bronchoalveolar lavage (BAL) to investigate potential adverse lung responses to a particulate exposure. The BAL cellular fraction is counted, using automated (i.e. Coulter Counter®), flow cytometry or manual (i.e. hemocytometer) methods, to determine inflammatory cell influx. The goal of the study was to compare the different counting methods to determine which is optimal for examining BAL cell influx after exposure by inhalation or intratracheal instillation (ITI) to different particles with varying inherent pulmonary toxicities in both rat and mouse models. General findings indicate that total BAL cell counts using the automated and manual methods tended to agree after inhalation or ITI exposure to particle samples that are relatively nontoxic or at later time points after exposure to a pneumotoxic particle when the response resolves. However, when the initial lung inflammation and cytotoxicity was high after exposure to a pneumotoxic particle, significant differences were observed when comparing cell counts from the automated, flow cytometry and manual methods. When using total BAL cell count for differential calculations from the automated method, depending on the cell diameter size range cutoff, the data suggest that the number of lung polymorphonuclear leukocytes (PMN) varies. Importantly, the automated counts, regardless of the size cutoff, still indicated a greater number of total lung PMN when compared with the manual method, which agreed more closely with flow cytometry. The results suggest that either the manual method or flow cytometry would be better suited for BAL studies where cytotoxicity is an unknown variable. PMID:27251196

  13. Comparison of cell counting methods in rodent pulmonary toxicity studies: automated and manual protocols and considerations for experimental design.

    PubMed

    Zeidler-Erdely, Patti C; Antonini, James M; Meighan, Terence G; Young, Shih-Houng; Eye, Tracy J; Hammer, Mary Ann; Erdely, Aaron

    2016-08-01

    Pulmonary toxicity studies often use bronchoalveolar lavage (BAL) to investigate potential adverse lung responses to a particulate exposure. The BAL cellular fraction is counted, using automated (i.e. Coulter Counter®), flow cytometry or manual (i.e. hemocytometer) methods, to determine inflammatory cell influx. The goal of the study was to compare the different counting methods to determine which is optimal for examining BAL cell influx after exposure by inhalation or intratracheal instillation (ITI) to different particles with varying inherent pulmonary toxicities in both rat and mouse models. General findings indicate that total BAL cell counts using the automated and manual methods tended to agree after inhalation or ITI exposure to particle samples that are relatively nontoxic or at later time points after exposure to a pneumotoxic particle when the response resolves. However, when the initial lung inflammation and cytotoxicity was high after exposure to a pneumotoxic particle, significant differences were observed when comparing cell counts from the automated, flow cytometry and manual methods. When using total BAL cell count for differential calculations from the automated method, depending on the cell diameter size range cutoff, the data suggest that the number of lung polymorphonuclear leukocytes (PMN) varies. Importantly, the automated counts, regardless of the size cutoff, still indicated a greater number of total lung PMN when compared with the manual method, which agreed more closely with flow cytometry. The results suggest that either the manual method or flow cytometry would be better suited for BAL studies where cytotoxicity is an unknown variable.

  14. Multivariate random-parameters zero-inflated negative binomial regression model: an application to estimate crash frequencies at intersections.

    PubMed

    Dong, Chunjiao; Clarke, David B; Yan, Xuedong; Khattak, Asad; Huang, Baoshan

    2014-09-01

    Crash data are collected through police reports and integrated with road inventory data for further analysis. Integrated police reports and inventory data yield correlated multivariate data for roadway entities (e.g., segments or intersections). Analysis of such data reveals important relationships that can help focus on high-risk situations and identify safety countermeasures. To understand relationships between crash frequencies and associated variables, while taking full advantage of the available data, multivariate random-parameters models are appropriate since they can simultaneously consider the correlation among specific crash types and account for unobserved heterogeneity. However, a key issue that arises with correlated multivariate data is that the number of crash-free samples increases as crash counts are disaggregated into many categories. In this paper, we describe a multivariate random-parameters zero-inflated negative binomial (MRZINB) regression model for jointly modeling crash counts. The full Bayesian method is employed to estimate the model parameters. Crash frequencies at urban signalized intersections in Tennessee are analyzed. The paper investigates the performance of MZINB and MRZINB regression models in establishing the relationship between crash frequencies, pavement conditions, traffic factors, and geometric design features of roadway intersections. Compared to the MZINB model, the MRZINB model identifies additional statistically significant factors and provides better goodness of fit in developing the relationships. The empirical results show that the MRZINB model possesses most of the desirable statistical properties in terms of its ability to accommodate unobserved heterogeneity and excess zero counts in correlated data. Notably, in the random-parameters MZINB model, the estimated parameters vary significantly across intersections for different crash types. Copyright © 2014 Elsevier Ltd. All rights reserved.
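    The zero-inflation mechanism underlying such models mixes a point mass at zero (structural zeros) with a negative binomial count distribution. A minimal univariate sketch with invented parameters follows; it is not the paper's multivariate random-parameters formulation.

```python
import numpy as np
from scipy import stats

def zinb_pmf(y, pi, r, p):
    """Zero-inflated negative binomial pmf: with probability pi the count is a
    structural zero; otherwise it follows NB(r, p). Parameters are illustrative."""
    base = stats.nbinom.pmf(y, r, p)
    return np.where(y == 0, pi + (1 - pi) * base, (1 - pi) * base)

y = np.arange(0, 6)
pmf = zinb_pmf(y, pi=0.3, r=2.0, p=0.5)
print(np.round(pmf, 3))
```

    The excess mass at zero relative to a plain NB(r, p) is what lets the model accommodate the many crash-free observations noted above.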

  15. Comparison of methods for estimating density of forest songbirds from point counts

    Treesearch

    Jennifer L. Reidy; Frank R. Thompson; J. Wesley. Bailey

    2011-01-01

    New analytical methods have been promoted for estimating the probability of detection and density of birds from count data but few studies have compared these methods using real data. We compared estimates of detection probability and density from distance and time-removal models and survey protocols based on 5- or 10-min counts and outer radii of 50 or 100 m. We...

  16. Comparison of fluorescence microscopy and solid-phase cytometry methods for counting bacteria in water

    USGS Publications Warehouse

    Lisle, John T.; Hamilton, Martin A.; Willse, Alan R.; McFeters, Gordon A.

    2004-01-01

    Total direct counts of bacterial abundance are central in assessing the biomass and bacteriological quality of water in ecological and industrial applications. Several factors have been identified that contribute to the variability in bacterial abundance counts when using fluorescent microscopy, the most significant of which is retaining an adequate number of cells per filter to ensure an acceptable level of statistical confidence in the resulting data. Previous studies that have assessed the components of total-direct-count methods that contribute to this variance have attempted to maintain a bacterial cell abundance value of approximately 10(6) cells per filter. In this study we have established the lower limit for the number of bacterial cells per filter at which the statistical reliability of the abundance estimate is no longer acceptable. Our results indicate that when the numbers of bacterial cells per filter were progressively reduced below 10(5), the microscopic methods increasingly overestimated the true bacterial abundance (range, 15.0 to 99.3%). The solid-phase cytometer only slightly overestimated the true bacterial abundances and was more consistent over the same range of bacterial abundances per filter (range, 8.9 to 12.5%). The solid-phase cytometer method for conducting total direct counts of bacteria was less biased and performed significantly better than any of the microscope methods. It was also found that microscopic count data from counting 5 fields on three separate filters were statistically equivalent to data from counting 20 fields on a single filter.

  17. Integration of Pseudomonas aeruginosa and Legionella pneumophila in drinking water biofilms grown on domestic plumbing materials.

    PubMed

    Moritz, Miriam M; Flemming, Hans-Curt; Wingender, Jost

    2010-06-01

    Drinking water biofilms were grown on coupons of plumbing materials, including ethylene-propylene-diene-monomer (EPDM) rubber, silane cross-linked polyethylene (PE-X b), electron-ray cross-linked PE (PE-X c) and copper under constant flow-through of cold tap water. After 14 days, the biofilms were spiked with Pseudomonas aeruginosa, Legionella pneumophila and Enterobacter nimipressuralis (10(6) cells/mL each). The test bacteria were environmental isolates from contamination events in drinking water systems. After static incubation for 24 h, water flow was resumed and continued for 4 weeks. Total cell count and heterotrophic plate count (HPC) of biofilms were monitored, and P. aeruginosa, L. pneumophila and E. nimipressuralis were quantified, using standard culture-based methods or culture-independent fluorescence in situ hybridization (FISH). After 14 days, total cell counts and HPC values were highest on EPDM, followed by the plastic materials and copper. P. aeruginosa and L. pneumophila became incorporated into drinking water biofilms and were capable of persisting in biofilms on EPDM and PE-X materials for several weeks, while copper biofilms were colonized only by L. pneumophila in low culturable numbers. E. nimipressuralis was not detected in any of the biofilms. Application of the FISH method often yielded orders of magnitude higher levels of P. aeruginosa and L. pneumophila than culture methods. These observations indicate that drinking water biofilms grown under cold water conditions on domestic plumbing materials, especially EPDM and PE-X in the present study, can be a reservoir for P. aeruginosa and L. pneumophila that persist in these habitats mostly in a viable but non-culturable state.

  18. Understanding Poisson regression.

    PubMed

    Hayat, Matthew J; Higgins, Melinda

    2014-04-01

    Nurse investigators often collect study data in the form of counts. Traditional methods of data analysis have historically approached count data either as if they were continuous and normally distributed or by dichotomizing the counts into the categories of occurred or did not occur. These outdated methods have been replaced with more appropriate statistical methods that make use of the Poisson probability distribution, which is well suited to count data. The purpose of this article is to provide an overview of the Poisson distribution and its use in Poisson regression. Assumption violations for the standard Poisson regression model are addressed with alternative approaches, including addition of an overdispersion parameter or negative binomial regression. An illustrative example is presented with an application from the ENSPIRE study, and regression modeling of comorbidity data is included for illustrative purposes. Copyright 2014, SLACK Incorporated.
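
The Poisson regression described in this abstract can be sketched with simulated data. The fit below uses Fisher scoring (iteratively reweighted least squares) rather than any particular statistics package, and the Pearson dispersion statistic near 1 indicates no overdispersion; values well above 1 would motivate the overdispersion parameter or negative binomial alternative mentioned above. All data and coefficient values here are synthetic.

```python
import numpy as np

def poisson_irls(X, y, n_iter=50, tol=1e-8):
    """Fit Poisson regression log(mu) = X @ beta by Fisher scoring (IRLS)."""
    beta = np.zeros(X.shape[1])
    for _ in range(n_iter):
        mu = np.exp(X @ beta)
        # Fisher scoring step: solve (X' W X) step = X' (y - mu), W = diag(mu)
        step = np.linalg.solve(X.T @ (mu[:, None] * X), X.T @ (y - mu))
        beta = beta + step
        if np.max(np.abs(step)) < tol:
            break
    return beta

rng = np.random.default_rng(0)
n = 5000
x = rng.normal(size=n)
X = np.column_stack([np.ones(n), x])
y = rng.poisson(np.exp(0.4 + 0.3 * x))   # synthetic counts, true coefficients 0.4 and 0.3

beta_hat = poisson_irls(X, y)
mu_hat = np.exp(X @ beta_hat)
# Pearson dispersion statistic: ~1 when the Poisson variance assumption holds
dispersion = np.sum((y - mu_hat) ** 2 / mu_hat) / (n - X.shape[1])
```

On correctly specified Poisson data the recovered coefficients are close to the true values and the dispersion is close to 1; overdispersed data (e.g. negative binomial counts) would push the dispersion statistic well above 1.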

  19. Estimating the Effects of Detection Heterogeneity and Overdispersion on Trends Estimated from Avian Point Counts

    EPA Science Inventory

    Point counts are a common method for sampling avian distribution and abundance. Though methods for estimating detection probabilities are available, many analyses use raw counts and do not correct for detectability. We use a removal model of detection within an N-mixture approa...

  20. A THUMBNAIL HISTORY OF HETEROTROPHIC PLATE COUNT (HPC) METHODOLOGY IN THE UNITED STATES

    EPA Science Inventory

    Over the past 100 years, the method of determining the number of bacteria in water, foods or other materials has been termed variously as: bacterial plate count, total plate count, total viable plate count, aerobic plate count, standard plate count and more recently, heterotrophi...

  1. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    NASA Astrophysics Data System (ADS)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell.
In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution function method.
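
The fluctuation route to Kirkwood-Buff integrals that the abstract compares against the running-integral method can be illustrated in a few lines: count particles in open sub-volumes and convert the number fluctuations into G. The sketch below uses the same-species form of the relation and checks it on an ideal gas, where sub-volume counts are Poisson-distributed and G should vanish; the count samples are synthetic, not taken from an MD trajectory.

```python
import numpy as np

def kb_integral_from_fluctuations(counts, volume):
    """Same-species Kirkwood-Buff integral from particle-number fluctuations
    in an open sub-volume: G = V * (<N^2> - <N>^2 - <N>) / <N>^2."""
    n = np.asarray(counts, dtype=float)
    mean, var = n.mean(), n.var()
    return volume * (var - mean) / mean**2

# Ideal-gas sanity check: counts in an open sub-volume are Poisson,
# so the variance equals the mean and the integral should be ~0.
rng = np.random.default_rng(3)
counts = rng.poisson(50.0, size=200_000)
G = kb_integral_from_fluctuations(counts, volume=1.0)
```

For an interacting fluid the same estimator applied to counts from sub-volumes of several sizes, extrapolated to the thermodynamic limit, is the finite-size-scaling idea the abstract refers to.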

  2. Characterization of a hybrid energy-resolving photon-counting detector

    NASA Astrophysics Data System (ADS)

    Zang, A.; Pelzer, G.; Anton, G.; Ballabriga Sune, R.; Bisello, F.; Campbell, M.; Fauler, A.; Fiederle, M.; Llopart Cudie, X.; Ritter, I.; Tennert, F.; Wölfel, S.; Wong, W. S.; Michel, T.

    2014-03-01

    Photon-counting detectors in medical x-ray imaging provide a higher dose efficiency than integrating detectors. Even further possibilities for imaging applications arise if the energy of each photon counted is measured, as, for example, K-edge imaging or optimizing image quality by applying energy weighting factors. In this contribution, we show results of the characterization of the Dosepix detector. This hybrid photon-counting pixel detector allows energy resolved measurements with a novel concept of energy binning included in the pixel electronics. Based on ideas of the Medipix detector family, it provides three different modes of operation: An integration mode, a photon-counting mode, and an energy-binning mode. In energy-binning mode, it is possible to set 16 energy thresholds in each pixel individually to derive a binned energy spectrum in every pixel in one acquisition. The hybrid setup allows using different sensor materials. For the measurements 300 μm Si and 1 mm CdTe were used. The detector matrix consists of 16 x 16 square pixels for CdTe (16 x 12 for Si) with a pixel pitch of 220 μm. The Dosepix was originally intended for applications in the field of radiation measurement. Therefore it is not optimized towards medical imaging. The detector concept itself still promises potential as an imaging detector. We present spectra measured in one single pixel as well as in the whole pixel matrix in energy-binning mode with a conventional x-ray tube. In addition, results concerning the count rate linearity for the different sensor materials are shown as well as measurements regarding energy resolution.

  3. Use of Surveillance Data on HIV Diagnoses with HIV-Related Symptoms to Estimate the Number of People Living with Undiagnosed HIV in Need of Antiretroviral Therapy

    PubMed Central

    van Sighem, Ard; Sabin, Caroline A.; Phillips, Andrew N.

    2015-01-01

    Background It is important to have methods available to estimate the number of people who have undiagnosed HIV and are in need of antiretroviral therapy (ART). Methods The method uses the concept that a predictable level of occurrence of AIDS or other HIV-related clinical symptoms which lead to presentation for care, and hence diagnosis of HIV, arises in undiagnosed people with a given CD4 count. The method requires surveillance data on numbers of new HIV diagnoses with HIV-related symptoms, and the CD4 count at diagnosis. The CD4 count-specific rate at which HIV-related symptoms develop is estimated from cohort data. 95% confidence intervals can be constructed using a simple simulation method. Results For example, if there were 13 HIV diagnoses with HIV-related symptoms made in one year with CD4 count at diagnosis between 150–199 cells/mm3, then since the CD4 count-specific rate of HIV-related symptoms is estimated as 0.216 per person-year, the estimated number of person years lived in people with undiagnosed HIV with CD4 count 150–199 cells/mm3 is 13/0.216 = 60 (95% confidence interval: 29–100), which is considered an estimate of the number of people living with undiagnosed HIV in this CD4 count stratum. Conclusions The method is straightforward to implement within a short period once a surveillance system of all new HIV diagnoses, collecting data on HIV-related symptoms at diagnosis, is in place and is most suitable for estimating the number of undiagnosed people with CD4 count <200 cells/mm3 due to the low rate of developing HIV-related symptoms at higher CD4 counts. A potential source of bias is under-diagnosis and under-reporting of diagnoses with HIV-related symptoms. Although this method, like all approaches, has limitations, it is important for prompting increased efforts to identify undiagnosed people, particularly those with low CD4 count, and for informing levels of unmet need for ART. PMID:25768925
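
The worked example in the abstract (13 symptomatic diagnoses divided by a rate of 0.216 per person-year) can be reproduced directly, together with a simulation interval in the spirit of the method. The sketch below propagates only Poisson uncertainty in the diagnosis count; the published method also propagates uncertainty in the symptom rate, so the interval here is somewhat narrower than the reported 29–100.

```python
import numpy as np

diagnoses = 13      # symptomatic HIV diagnoses in the 150-199 cells/mm3 CD4 stratum
rate = 0.216        # CD4-specific rate of HIV-related symptoms, per person-year

point_estimate = diagnoses / rate   # estimated undiagnosed person-years, ~60

# Simple simulation interval: treat the observed count as Poisson and rescale.
rng = np.random.default_rng(1)
sims = rng.poisson(diagnoses, size=100_000) / rate
lo, hi = np.percentile(sims, [2.5, 97.5])
```

The point estimate rounds to 60, matching the abstract; the simulated 2.5th/97.5th percentiles bracket it from below and above.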

  4. Portable Cytometry Using Microscale Electronic Sensing

    PubMed Central

    Emaminejad, Sam; Paik, Kee-Hyun; Tabard-Cossa, Vincent; Javanmard, Mehdi

    2015-01-01

    In this manuscript, we present three different micro-impedance sensing architectures for electronic counting of cells and beads. The first method of sensing is based on using an open circuit sensing electrode integrated in a micro-pore, which measures the shift in potential as a micron-sized particle passes through. Our micro-pore, based on a funnel shaped microchannel, was fabricated in PDMS and was bound covalently to a glass substrate patterned with a gold open circuit electrode. The amplification circuitry was integrated onto a battery-powered custom printed circuit board. The second method is based on a three electrode differential measurement, which opens up the potential of using signal processing techniques to increase signal to noise ratio post measurement. The third architecture uses a contactless sensing approach, which significantly minimizes the cost of the consumable component of the impedance cytometer. We demonstrated proof of concept for the three sensing architectures by measuring the detected signal due to the passage of micron sized beads through the pore. PMID:27647950

  5. Fast and High Dynamic Range Imaging with Superconducting Tunnel Junction Detectors

    NASA Astrophysics Data System (ADS)

    Matsuo, Hiroshi

    2014-08-01

    We have demonstrated a combined test of the submillimeter-wave SIS photon detectors and GaAs-JFET cryogenic integrated circuits. A relatively large background photo-current can be read out by fast-reset integrating amplifiers. An integration time of 1 ms enables fast frame-rate readout and large dynamic range imaging, with an expected dynamic range of 8,000 in 1 ms. The ultimate fast and high dynamic range performance of superconducting tunnel junction (STJ) detectors will be obtained when photon counting capabilities are employed. At terahertz frequencies, when an input photon rate of 100 MHz is measured, the photon bunching gives sufficient timing resolution to be used as phase information of the intensity fluctuation. Application of photon statistics will be a new tool in the terahertz frequency region. The design parameters of STJ terahertz photon counting detectors are discussed.

  6. A Method for Counting Moving People in Video Surveillance Videos

    NASA Astrophysics Data System (ADS)

    Conte, Donatello; Foggia, Pasquale; Percannella, Gennaro; Tufano, Francesco; Vento, Mario

    2010-12-01

    People counting is an important problem in video surveillance applications. This problem has been faced either by trying to detect people in the scene and then counting them or by establishing a mapping between some scene feature and the number of people (avoiding the complex detection problem). This paper presents a novel method, following this second approach, that is based on the use of SURF features and an ε-SVR regressor to provide an estimate of this count. The algorithm takes specifically into account problems due to partial occlusions and to perspective. In the experimental evaluation, the proposed method has been compared with the algorithm by Albiol et al., winner of the PETS 2009 contest on people counting, using the same PETS 2009 database. The results confirm that the proposed method yields improved accuracy, while retaining the robustness of Albiol's algorithm.
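
The second approach the abstract describes, mapping a scene feature to a people count without detecting individuals, can be sketched with synthetic data. The linear feature-to-count relation below is hypothetical, and ordinary least squares stands in for the paper's ε-SVR so that the sketch needs only NumPy.

```python
import numpy as np

rng = np.random.default_rng(5)
people = rng.integers(0, 40, size=300).astype(float)
# hypothetical scene feature: interest-point count grows ~linearly with crowd size
features = 55.0 * people + rng.normal(0.0, 30.0, size=300)

# fit count = a * feature + b by least squares (stand-in for the epsilon-SVR)
A = np.column_stack([features, np.ones_like(features)])
coef, *_ = np.linalg.lstsq(A, people, rcond=None)

estimates = A @ coef
mean_abs_error = np.abs(estimates - people).mean()
```

A regressor of this shape, once trained, turns each frame's feature value into a crowd estimate without per-person detection, which is what makes the approach robust to partial occlusion.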

  7. [Comments on policy--elderly count for much, or counting the elderly? A new statement on policy for the elderly].

    PubMed

    Knapen, M; van der Zanden, G H

    1990-12-01

    The Dutch government has published a new white paper 'Elderly count for much' on policy for the aged. In this document the central principle for social policy is the integration of the elderly in society. Old age policy is characterized as 'integral policy', that is, it tries to integrate the traditional fields of social and economic policy, and as 'complementary' policy, that is, it tries to complement general policy. The main characteristics of the action program 1990-1994 include: prevention, the integration of housing and services, care for elderly with chronic diseases, education, strengthening of labor-market participation of those aged 50+, the position of elderly women, and societal attitudes towards aging and the elderly. In this comment it is argued that this white paper initiates positive developments, but several minor and major problems remain. We are critical of the role of education, the instruments for an active labor-market policy, the lack of attention to the European dimension, and the lack of attention to future developments in generational equity and age-rationing of service allocation. We appreciate the attention to age discrimination and the possibilities for longitudinal research. We conclude that 'integrated policy' is only in its initial phase. In this white paper the government succeeds in integrated policy only in the fields of housing and care, not in other fields like technology, labor and education. 'Complementary' policy is not enough to create a firm infrastructure in the aging field. If initiatives in the field of aging are considered as 'extras', this policy will soon be confronted with the boundaries it creates for itself. Although attention to the challenges of graying is growing, old age policy is still marginal compared to general policy.

  8. [Count of salivary Streptococci mutans in pregnant women of the metropolitan region of Chile: cross-sectional study].

    PubMed

    Villagrán, E; Linossier, A; Donoso, E

    1999-02-01

    Salivary Streptococcus mutans contamination is considered the main microbiological risk factor for the initiation of caries. To assess the oral health of pregnant women, we counted salivary Streptococcus mutans. One hundred seventy four pregnant women in the first, second and third trimesters of pregnancy, aged 27 +/- 5 years, consulting at a public primary health center, were studied. Puerperal women who had delivered two months earlier were studied as a control group. Salivary samples were obtained and Streptococcus mutans colonies were counted using quantitative and semiquantitative methods. There was good concordance between both counting methods. No differences in Streptococcus mutans counts were observed among the three groups of pregnant women, but the latter as a group had higher counts than puerperal women. Women with more than 5 caries also had higher counts. Semiquantitative Streptococcus mutans counts are easy, rapid and non-invasive and have good concordance with quantitative counts in saliva.

  9. BMPix and PEAK tools: New methods for automated laminae recognition and counting—Application to glacial varves from Antarctic marine sediment

    NASA Astrophysics Data System (ADS)

    Weber, M. E.; Reichelt, L.; Kuhn, G.; Pfeiffer, M.; Korff, B.; Thurow, J.; Ricken, W.

    2010-03-01

    We present tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray scale curves from images at pixel resolution. The PEAK tool uses the gray scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a smoothed curve. The zero-crossing algorithm counts every positive and negative halfway passage of the curve through a wide moving average, separating the record into bright and dark intervals (seasonal resolution). The same is true for the frequency truncation method, which uses Fourier transformation to decompose the curve into its frequency components before counting positive and negative passages. The algorithms are available at doi:10.1594/PANGAEA.729700. We applied the new methods successfully to tree rings, to well-dated and already manually counted marine varves from Saanich Inlet, and to marine laminae from the Antarctic continental margin. In combination with AMS 14C dating, we found convincing evidence that laminations in Weddell Sea sites represent varves, deposited continuously over several millennia during the last glacial maximum. The new tools offer several advantages over previous methods. The counting procedures are based on a moving average generated from gray scale curves instead of manual counting. Hence, results are highly objective and rely on reproducible mathematical criteria. Also, the PEAK tool measures the thickness of each year or season. Since all information required is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
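
The zero-crossing algorithm described above, counting positive and negative passages of the gray scale curve through a wide moving average, can be sketched as follows. The synthetic "varve" curve and all parameter values are illustrative only, not the PEAK tool's actual implementation.

```python
import numpy as np

def count_couplets_zero_crossing(gray, window):
    """Count bright/dark couplets by zero-crossings of the curve about a
    wide moving average; two crossings correspond to one couplet (one year)."""
    baseline = np.convolve(gray, np.ones(window) / window, mode="valid")
    trim = (len(gray) - len(baseline)) // 2          # align curve with baseline
    residual = gray[trim:trim + len(baseline)] - baseline
    sign = np.sign(residual)
    sign = sign[sign != 0]                           # ignore exact zeros
    crossings = np.count_nonzero(sign[1:] != sign[:-1])
    return crossings // 2

# synthetic laminated record: 20 annual couplets over a slow brightness trend
x = np.linspace(0.0, 20 * 2 * np.pi, 8000)
gray = np.sin(x) + 0.1 * np.linspace(0.0, 1.0, x.size)
couplets = count_couplets_zero_crossing(gray, window=300)
```

Because the baseline is a wide moving average, slow brightness trends cancel in the residual, which is what makes the count objective and reproducible; the trimmed edges lose part of the first and last cycle.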

  10. Effect of distance-related heterogeneity on population size estimates from point counts

    USGS Publications Warehouse

    Efford, Murray G.; Dawson, Deanna K.

    2009-01-01

    Point counts are used widely to index bird populations. Variation in the proportion of birds counted is a known source of error, and for robust inference it has been advocated that counts be converted to estimates of absolute population size. We used simulation to assess nine methods for the conduct and analysis of point counts when the data included distance-related heterogeneity of individual detection probability. Distance from the observer is a ubiquitous source of heterogeneity, because nearby birds are more easily detected than distant ones. Several recent methods (dependent double-observer, time of first detection, time of detection, independent multiple-observer, and repeated counts) do not account for distance-related heterogeneity, at least in their simpler forms. We assessed bias in estimates of population size by simulating counts with fixed radius w over four time intervals (occasions). Detection probability per occasion was modeled as a half-normal function of distance with scale parameter sigma and intercept g(0) = 1.0. Bias varied with sigma/w; for values of sigma inferred from published studies, bias often exceeded 50% for a 100-m fixed-radius count. More critically, the bias of adjusted counts sometimes varied more than that of unadjusted counts, and inference from adjusted counts would be less robust. The problem was not solved by using mixture models or including distance as a covariate. Conventional distance sampling performed well in simulations, but its assumptions are difficult to meet in the field. We conclude that no existing method allows effective estimation of population size from point counts.
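
The half-normal detection model used in these simulations is easy to reproduce: per-occasion detection probability falls off with distance as g(r) = g(0)·exp(−r²/2σ²) with g(0) = 1. The sketch below places birds uniformly in a disc of radius w, simulates four occasions, and checks the Monte Carlo fraction ever detected against numeric integration; the parameter values (σ/w = 0.5) are illustrative, not taken from the study.

```python
import numpy as np

def g_halfnormal(r, sigma):
    """Per-occasion detection probability at distance r, with g(0) = 1."""
    return np.exp(-r**2 / (2.0 * sigma**2))

rng = np.random.default_rng(2)
w, sigma, occasions, n_birds = 100.0, 50.0, 4, 200_000

# uniform placement in a disc: the distance r has density 2r / w^2
r = w * np.sqrt(rng.random(n_birds))
p = g_halfnormal(r, sigma)
detected = (rng.random((occasions, n_birds)) < p).any(axis=0)
frac_mc = detected.mean()

# numeric check: integrate (2r/w^2) * (1 - (1 - g(r))^occasions) over [0, w]
rr = np.linspace(0.0, w, 100_001)
integrand = (2.0 * rr / w**2) * (1.0 - (1.0 - g_halfnormal(rr, sigma)) ** occasions)
frac_exact = integrand.mean() * w
```

The fraction detected is well below 1 and varies strongly with distance, which is exactly the heterogeneity that biases estimators ignoring distance.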

  11. Comparison of Dry Medium Culture Plates for Mesophilic Aerobic Bacteria in Milk, Ice Cream, Ham, and Codfish Fillet Products

    PubMed Central

    Park, Junghyun; Kim, Myunghee

    2013-01-01

    This study was performed to compare the performance of Sanita-Kun dry medium culture plate with those of traditional culture medium and Petrifilm dry medium culture plate for the enumeration of the mesophilic aerobic bacteria in milk, ice cream, ham, and codfish fillet. Mesophilic aerobic bacteria were comparatively evaluated in milk, ice cream, ham, and codfish fillet using Sanita-Kun aerobic count (SAC), Petrifilm aerobic count (PAC), and traditional plate count agar (PCA) media. According to the results, all methods showed high correlations of 0.989~1.000 and no significant differences were observed for enumerating the mesophilic aerobic bacteria in the tested food products. SAC method was easier to perform and count colonies efficiently as compared to the PCA and PAC methods. Therefore, we concluded that the SAC method offers an acceptable alternative to the PCA and PAC methods for counting the mesophilic aerobic bacteria in milk, ice cream, ham, and codfish fillet products. PMID:24551829

  12. Emotional expressiveness and avoidance in narratives of unaccompanied refugee minors

    PubMed Central

    Huemer, Julia; Nelson, Kristin; Karnik, Niranjan; Völkl-Kernstock, Sabine; Seidel, Stefan; Ebner, Nina; Ryst, Erika; Friedrich, Max; Shaw, Richard J.; Realubit, Cassey; Steiner, Hans; Skala, Katrin

    2016-01-01

    Objective The aim of this study was to examine a cohort of unaccompanied refugee minors (URMs) by means of psycholinguistic methods in order to obtain a more subtle picture of their degree of traumatization. Methods Twenty-eight participants were included in the Stress-Inducing Speech Task (SIST) consisting of a free association (FA) and a stress (STR) condition. Narratives were examined by means of (1) quantitative parameters (word count); (2) psycholinguistic variables (temporal junctures, TJs), narrative structure, referential activity (RA)—a measure of emotional expressivity; and (3) content analysis ratings. Results Word count was significantly lower than in age-matched norms. In the FA condition, TJs were lower, but in the STR condition, rates were comparable. RA was significantly higher in both conditions. Content analysis ratings showed that the experiences described by these youths were potentially traumatic in nature. Conclusions This pattern of narrative shows a mixture of fulfilling the task demand, while containing an emotionally charged narrative. Narrative structure was absent in the FA condition, but preserved in the STR condition, as URMs struggled with the description of non-normative events. This indicates that these youths have not yet emotionally dealt with and fully integrated their trauma experiences. PMID:26955827

  13. Characterization of photon-counting multislit breast tomosynthesis.

    PubMed

    Berggren, Karl; Cederström, Björn; Lundqvist, Mats; Fredenberg, Erik

    2018-02-01

    It has been shown that breast tomosynthesis may improve sensitivity and specificity compared to two-dimensional mammography, resulting in an increased detection rate of cancers or lowered call-back rates. The purpose of this study is to characterize a spectral photon-counting multislit breast tomosynthesis system that is able to do single-scan spectral imaging with multiple collimated x-ray beams. The system differs in many aspects from conventional tomosynthesis using energy-integrating flat-panel detectors. The investigated system was a prototype consisting of a dual-threshold photon-counting detector with 21 collimated line detectors scanning across the compressed breast. A review of the system is given in terms of detector, acquisition geometry, and reconstruction methods. Three reconstruction methods were used: simple back-projection, filtered back-projection, and an iterative algebraic reconstruction technique. The image quality was evaluated by measuring the modulation transfer function (MTF), normalized noise power spectrum, detective quantum efficiency (DQE), and artifact spread function (ASF) on reconstructed spectral tomosynthesis images for a total-energy bin (defined by a low-energy threshold calibrated to remove electronic noise) and for a high-energy bin (with a threshold calibrated to split the spectrum in roughly equal parts). Acquisition was performed using a 29 kVp W/Al x-ray spectrum at a 0.24 mGy exposure. The difference in MTF between the two energy bins was negligible, that is, there was no energy dependence of resolution. The MTF dropped to 50% at 1.5 lp/mm to 2.3 lp/mm in the scan direction and 2.4 lp/mm to 3.3 lp/mm in the slit direction, depending on the reconstruction method. The full width at half maximum of the ASF ranged from 13.8 mm to 18.0 mm for the different reconstruction methods. The zero-frequency DQE of the system was found to be 0.72.
The fraction of counts in the high-energy bin was measured to be 59% of the total detected spectrum. Scan times ranged from 4 s to 16.5 s depending on voltage and current settings. The characterized system generates spectral tomosynthesis images with a dual-energy photon-counting detector. Measurements show a high DQE, enabling high image quality at a low dose, which is beneficial for low-dose applications such as screening. The single-scan spectral images open up possibilities for applications such as quantitative material decomposition and contrast-enhanced tomosynthesis. © 2017 American Association of Physicists in Medicine.

  14. Evaluation of the performance of a point-of-care method for total and differential white blood cell count in clozapine users.

    PubMed

    Bui, H N; Bogers, J P A M; Cohen, D; Njo, T; Herruer, M H

    2016-12-01

    We evaluated the performance of the HemoCue WBC DIFF, a point-of-care device for total and differential white cell count, primarily to test its suitability for the mandatory white blood cell monitoring in clozapine use. Leukocyte count and 5-part differentiation were performed by the point-of-care device and by a routine laboratory method in venous EDTA-blood samples from 20 clozapine users, 20 neutropenic patients, and 20 healthy volunteers. From the volunteers, a capillary sample was also drawn. Intra-assay reproducibility and drop-to-drop variation were tested. The correlation between both methods in venous samples was r > 0.95 for leukocyte, neutrophil, and lymphocyte counts. The correlations between point-of-care (capillary sample) and routine (venous sample) methods for these cells were 0.772, 0.817 and 0.798, respectively. Only for leukocyte and neutrophil counts was the intra-assay reproducibility sufficient. The point-of-care device can be used to screen for leukocyte and neutrophil counts. Because of the relatively high measurement uncertainty and poor correlation with venous samples, we recommend repeating the measurement with a venous sample if cell counts are in the lower reference range. In case of clozapine therapy, neutropenia can probably be excluded if high neutrophil counts are found and patients can continue their therapy. © 2016 John Wiley & Sons Ltd.

  15. Estimating consumer familiarity with health terminology: a context-based approach.

    PubMed

    Zeng-Treitler, Qing; Goryachev, Sergey; Tse, Tony; Keselman, Alla; Boxwala, Aziz

    2008-01-01

    Effective health communication is often hindered by a "vocabulary gap" between language familiar to consumers and jargon used in medical practice and research. To present health information to consumers in a comprehensible fashion, we need to develop a mechanism to quantify health terms as being more likely or less likely to be understood by typical members of the lay public. Prior research has used approaches including syllable count, easy word list, and frequency count, all of which have significant limitations. In this article, we present a new method that predicts consumer familiarity using contextual information. The method was applied to a large query log data set and validated using results from two previously conducted consumer surveys. We measured the correlation between the survey result and the context-based prediction, syllable count, frequency count, and log normalized frequency count. The correlation coefficient between the context-based prediction and the survey result was 0.773 (p < 0.001), which was higher than the correlation coefficients between the survey result and the syllable count, frequency count, and log normalized frequency count (p ≤ 0.012). The context-based approach provides a good alternative to the existing term familiarity assessment methods.

  16. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture.

    PubMed

    Zhong, Yuanhong; Gao, Junyuan; Lei, Qilun; Zhou, Yao

    2018-05-09

    Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects are labor-intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: firstly, a yellow sticky trap is installed in the surveillance area to trap flying insects and a camera is set up to collect real-time images. Then the detection and coarse counting method based on You Only Look Once (YOLO) object detection, and the classification method and fine counting based on Support Vector Machines (SVM) using global features, are designed. Finally, the insect counting and recognition system is implemented on a Raspberry Pi. Six species of flying insects including bee, fly, mosquito, moth, chafer and fruit fly are selected to assess the effectiveness of the system. Compared with the conventional methods, the test results show promising performance. The average counting accuracy is 92.50% and the average classifying accuracy is 90.18% on the Raspberry Pi. The proposed system is easy to use and provides efficient and accurate recognition data; therefore, it can be used for intelligent agriculture applications.

  17. A Vision-Based Counting and Recognition System for Flying Insects in Intelligent Agriculture

    PubMed Central

    Zhong, Yuanhong; Gao, Junyuan; Lei, Qilun; Zhou, Yao

    2018-01-01

    Rapid and accurate counting and recognition of flying insects are of great importance, especially for pest control. Traditional manual identification and counting of flying insects are labor-intensive and inefficient. In this study, a vision-based counting and classification system for flying insects is designed and implemented. The system is constructed as follows: firstly, a yellow sticky trap is installed in the surveillance area to trap flying insects and a camera is set up to collect real-time images. Then the detection and coarse counting method based on You Only Look Once (YOLO) object detection, and the classification method and fine counting based on Support Vector Machines (SVM) using global features, are designed. Finally, the insect counting and recognition system is implemented on a Raspberry Pi. Six species of flying insects including bee, fly, mosquito, moth, chafer and fruit fly are selected to assess the effectiveness of the system. Compared with the conventional methods, the test results show promising performance. The average counting accuracy is 92.50% and the average classifying accuracy is 90.18% on the Raspberry Pi. The proposed system is easy to use and provides efficient and accurate recognition data; therefore, it can be used for intelligent agriculture applications. PMID:29747429

  18. Learning to Count: School Finance Formula Count Methods and Attendance-Related Student Outcomes

    ERIC Educational Resources Information Center

    Ely, Todd L.; Fermanich, Mark L.

    2013-01-01

    School systems are under increasing pressure to improve student performance. Several states have recently explored adopting student count methods for school funding purposes that incentivize school attendance and continuous enrollment by adjusting funding for changes in enrollment or attendance over the course of the school year. However, no…

  19. COMPARISON OF LABORATORY SUBSAMPLING METHODS OF BENTHIC SAMPLES FROM BOATABLE RIVERS USING ACTUAL AND SIMULATED COUNT DATA

    EPA Science Inventory

    We examined the effects of using a fixed-count subsample of 300 organisms on metric values using macroinvertebrate samples collected with 3 field sampling methods at 12 boatable river sites. For each sample, we used metrics to compare an initial fixed-count subsample of approxima...

  20. Evaluation of seven aquatic sampling methods for amphibians and other aquatic fauna

    USGS Publications Warehouse

    Gunzburger, M.S.

    2007-01-01

    To design effective and efficient research and monitoring programs researchers must have a thorough understanding of the capabilities and limitations of their sampling methods. Few direct comparative studies exist for aquatic sampling methods for amphibians. The objective of this study was to simultaneously employ seven aquatic sampling methods in 10 wetlands to compare amphibian species richness and number of individuals detected with each method. Four sampling methods allowed counts of individuals (metal dipnet, D-frame dipnet, box trap, crayfish trap), whereas the other three methods allowed detection of species (visual encounter, aural, and froglogger). Amphibian species richness was greatest with froglogger, box trap, and aural samples. For anuran species, the sampling methods by which each life stage was detected was related to relative length of larval and breeding periods and tadpole size. Detection probability of amphibians varied across sampling methods. Box trap sampling resulted in the most precise amphibian count, but the precision of all four count-based methods was low (coefficient of variation > 145 for all methods). The efficacy of the four count sampling methods at sampling fish and aquatic invertebrates was also analyzed because these predatory taxa are known to be important predictors of amphibian habitat distribution. Species richness and counts were similar for fish with the four methods, whereas invertebrate species richness and counts were greatest in box traps. An effective wetland amphibian monitoring program in the southeastern United States should include multiple sampling methods to obtain the most accurate assessment of species community composition at each site. The combined use of frogloggers, crayfish traps, and dipnets may be the most efficient and effective amphibian monitoring protocol. © 2007 Brill Academic Publishers.

  1. Primary Standardization of 152Eu by 4πβ(LS)-γ(NaI) coincidence counting and CIEMAT-NIST method

    NASA Astrophysics Data System (ADS)

    Ruzzarin, A.; da Cruz, P. A. L.; Ferreira Filho, A. L.; Iwahara, A.

    2018-03-01

    The 4πβ-γ coincidence counting and CIEMAT/NIST liquid scintillation methods were used in the standardization of a solution of 152Eu. In the CIEMAT/NIST method, measurements were performed in a Wallac 1414 liquid scintillation counter. In the 4πβ-γ coincidence counting, the solution was standardized using a coincidence method with "beta-efficiency extrapolation". A simple 4πβ-γ coincidence system was used, with an acrylic scintillation cell coupled to two coincident photomultipliers at 180° to each other and a NaI(Tl) detector. The activity concentrations obtained were 156.934 ± 0.722 and 157.403 ± 0.113 kBq/g, respectively, for the CIEMAT/NIST and 4πβ-γ coincidence counting measurement methods.
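    In beta-efficiency extrapolation, the observed rate NβNγ/Nc is plotted against the beta inefficiency parameter (1 − εβ)/εβ and extrapolated linearly to zero inefficiency to recover the activity. A hedged sketch with synthetic, exactly linear data (the 157.4 kBq/g intercept and the slope are assumed values reused only for illustration):

```python
def linear_fit(xs, ys):
    """Ordinary least-squares fit y = a + b*x; returns (a, b)."""
    n = len(xs)
    sx, sy = sum(xs), sum(ys)
    sxx = sum(x * x for x in xs)
    sxy = sum(x * y for x, y in zip(xs, ys))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

# Synthetic extrapolation data: x = (1 - eff_beta)/eff_beta, y = Nb*Ng/Nc (kBq/g)
true_activity, slope = 157.4, 12.0          # assumed values for the sketch
xs = [0.05, 0.10, 0.15, 0.20, 0.25]
ys = [true_activity + slope * x for x in xs]
intercept, _ = linear_fit(xs, ys)
print(f"extrapolated activity ≈ {intercept:.3f} kBq/g")
```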

  2. Circulating Tumor Cell Counts Are Prognostic of Overall Survival in SWOG S0421: A Phase III Trial of Docetaxel With or Without Atrasentan for Metastatic Castration-Resistant Prostate Cancer

    PubMed Central

    Goldkorn, Amir; Ely, Benjamin; Quinn, David I.; Tangen, Catherine M.; Fink, Louis M.; Xu, Tong; Twardowski, Przemyslaw; Van Veldhuizen, Peter J.; Agarwal, Neeraj; Carducci, Michael A.; Monk, J. Paul; Datar, Ram H.; Garzotto, Mark; Mack, Philip C.; Lara, Primo; Higano, Celestia S.; Hussain, Maha; Thompson, Ian Murchie; Cote, Richard J.; Vogelzang, Nicholas J.

    2014-01-01

    Purpose Circulating tumor cell (CTC) enumeration has not been prospectively validated in standard first-line docetaxel treatment for metastatic castration-resistant prostate cancer. We assessed the prognostic value of CTCs for overall survival (OS) and disease response in S0421, a phase III trial of docetaxel plus prednisone with or without atrasentan. Patients and Methods CTCs were enumerated at baseline (day 0) and before cycle two (day 21) using CellSearch. Baseline counts and changes in counts from day 0 to 21 were evaluated for association with OS, prostate-specific antigen (PSA), and RECIST response using Cox regression as well as receiver operator characteristic (ROC) curves, integrated discrimination improvement (IDI) analysis, and regression trees. Results Median day-0 CTC count was five cells per 7.5 mL, and CTCs < versus ≥ five per 7.5 mL were significantly associated with baseline PSA, bone pain, liver disease, hemoglobin, alkaline phosphatase, and subsequent PSA and RECIST response. Median OS was 26 months for < five versus 13 months for ≥ five CTCs per 7.5 mL at day 0 (hazard ratio [HR], 2.74 [adjusting for covariates]). ROC curves had higher areas under the curve for day-0 CTCs than for PSA, and IDI analysis showed that adding day-0 CTCs to baseline PSA and other covariates increased predictive accuracy for survival by 8% to 10%. Regression trees yielded new prognostic subgroups, and rising CTC count from day 0 to 21 was associated with shorter OS (HR, 2.55). Conclusion These data validate the prognostic utility of CTC enumeration in a large docetaxel-based prospective cohort. Baseline CTC counts were prognostic, and rising CTCs at 3 weeks heralded significantly worse OS, potentially serving as an early metric to help redirect and optimize therapy in this clinical setting. PMID:24616308

  3. Experimental Study for Automatic Colony Counting System Based on Image Processing

    NASA Astrophysics Data System (ADS)

    Fang, Junlong; Li, Wenzhe; Wang, Guoxin

    Colony counting in many experiments is currently performed manually, which makes it difficult to execute quickly and accurately. A new automatic colony counting system was developed. Using image-processing technology, a study was made of the feasibility of objectively distinguishing white bacterial colonies from clear plates according to RGB color theory. An optimal chromatic value was obtained from extensive experiments on the distribution of chromatic values. It was shown that the method greatly improves the accuracy and efficiency of colony counting and that the counting result is not affected by the inoculation method or by the shape or size of the colonies. These results indicate that automatic detection of colony counts using image-processing technology can be an effective approach.
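    The pipeline described (threshold pixels by color, then count connected bright regions) can be sketched in pure Python. The brightness threshold and the tiny synthetic plate below are assumptions for illustration, not the paper's chromatic values:

```python
from collections import deque

def count_colonies(image, threshold=200):
    """Count 4-connected components of near-white (R,G,B) pixels via flood fill."""
    rows, cols = len(image), len(image[0])
    is_colony = [[min(px) > threshold for px in row] for row in image]
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if is_colony[r][c] and not seen[r][c]:
                count += 1                      # new colony: flood-fill it
                queue = deque([(r, c)])
                seen[r][c] = True
                while queue:
                    y, x = queue.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and is_colony[ny][nx] and not seen[ny][nx]):
                            seen[ny][nx] = True
                            queue.append((ny, nx))
    return count

# Tiny synthetic plate: dark agar with two separate white colonies
DARK, WHITE = (20, 20, 20), (230, 230, 230)
plate = [[DARK] * 6 for _ in range(6)]
for y, x in [(1, 1), (1, 2), (2, 1), (4, 4), (4, 5)]:
    plate[y][x] = WHITE
print(count_colonies(plate))  # 2
```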

  4. Delineating Species with DNA Barcodes: A Case of Taxon Dependent Method Performance in Moths

    PubMed Central

    Kekkonen, Mari; Mutanen, Marko; Kaila, Lauri; Nieminen, Marko; Hebert, Paul D. N.

    2015-01-01

    The accelerating loss of biodiversity has created a need for more effective ways to discover species. Novel algorithmic approaches for analyzing sequence data combined with rapidly expanding DNA barcode libraries provide a potential solution. While several analytical methods are available for the delineation of operational taxonomic units (OTUs), few studies have compared their performance. This study compares the performance of one morphology-based and four DNA-based (BIN, parsimony networks, ABGD, GMYC) methods on two groups of gelechioid moths. It examines 92 species of Finnish Gelechiinae and 103 species of Australian Elachistinae which were delineated by traditional taxonomy. The results reveal a striking difference in performance between the two taxa with all four DNA-based methods. OTU counts in the Elachistinae showed a wider range and a relatively low (ca. 65%) OTU match with reference species while OTU counts were more congruent and performance was higher (ca. 90%) in the Gelechiinae. Performance rose when only monophyletic species were compared, but the taxon-dependence remained. None of the DNA-based methods produced a correct match with non-monophyletic species, but singletons were handled well. A simulated test of morphospecies-grouping performed very poorly in revealing taxon diversity in these small, dull-colored moths. Despite the strong performance of analyses based on DNA barcodes, species delineated using single-locus mtDNA data are best viewed as OTUs that require validation by subsequent integrative taxonomic work. PMID:25849083

  5. INTERPRETING FLUX FROM BROADBAND PHOTOMETRY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Peter J.; Breeveld, Alice; Roming, Peter W. A.

    2016-10-01

    We discuss the transformation of observed photometry into flux for the creation of spectral energy distributions (SEDs) and the computation of bolometric luminosities. We do this in the context of supernova studies, particularly as observed with the Swift spacecraft, but the concepts and techniques should be applicable to many other types of sources and wavelength regimes. Traditional methods of converting observed magnitudes to flux densities are not very accurate when applied to UV photometry. Common methods for extinction correction and the integration of pseudo-bolometric fluxes can also lead to inaccurate results. The sources of inaccuracy, though, also apply to other wavelengths. Because of the complicated nature of translating broadband photometry into monochromatic flux densities, comparison between observed photometry and a spectroscopic model is best done by forward modeling the spectrum into the count rates or magnitudes of the observations. We recommend that integrated flux measurements be made using a spectrum or SED which is consistent with the multi-band photometry rather than converting individual photometric measurements to flux densities, linearly interpolating between the points, and integrating. We also highlight some specific areas where the UV flux can be mischaracterized.
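    Forward modeling a spectrum into count rates amounts to integrating the photon flux over the instrument response, e.g. rate = ∫ F_λ(λ) A_eff(λ) λ/(hc) dλ. A hedged sketch with a flat spectrum and a boxcar effective area (all numerical values are invented for illustration):

```python
H = 6.62607015e-34   # Planck constant, J s
C = 2.99792458e8     # speed of light, m/s

def count_rate(wavelengths, flux_density, eff_area):
    """Trapezoidal integral of F_lambda * A_eff * lambda / (h c)  [counts/s]."""
    integrand = [f * a * w / (H * C)
                 for w, f, a in zip(wavelengths, flux_density, eff_area)]
    total = 0.0
    for i in range(len(wavelengths) - 1):
        dw = wavelengths[i + 1] - wavelengths[i]
        total += 0.5 * (integrand[i] + integrand[i + 1]) * dw
    return total

# Illustrative flat UV spectrum over a boxcar bandpass (200-300 nm)
lam = [200e-9 + i * 1e-9 for i in range(101)]     # wavelength grid, m
F = [1e-14] * len(lam)                            # W m^-2 m^-1 (assumed)
A = [1e-2] * len(lam)                             # effective area, m^2 (assumed)
rate = count_rate(lam, F, A)

# For flat F and A the integrand is linear in lambda, so the trapezoid is exact:
analytic = 1e-14 * 1e-2 * (lam[-1]**2 - lam[0]**2) / (2 * H * C)
print(rate, analytic)
```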

  6. Study on ultra-fast single photon counting spectrometer based on PCI

    NASA Astrophysics Data System (ADS)

    Zhang, Xi-feng

    2010-10-01

    The time-correlated single photon counting spectrometer described here is based on PCI bus technology. We developed an ultrafast PCI data-acquisition card to replace the original multi-channel analyzer. The system theory and design of the spectrometer are presented in detail, and the operating procedure is introduced together with the integration of the system. Many standard samples were measured, and the data were analyzed and compared. Experimental results show that the spectrometer has single-photon sensitivity and that fluorescence lifetime and time resolution are at the picosecond level. The instrument can also measure time-resolved spectra.

  7. Recent trends in counts of migrant hawks from northeastern North America

    USGS Publications Warehouse

    Titus, K.; Fuller, M.R.

    1990-01-01

    Using simple regression, pooled-sites route-regression, and nonparametric rank-trend analyses, we evaluated trends in counts of hawks migrating past 6 eastern hawk lookouts from 1972 to 1987. The indexing variable was the total count for a season. Bald eagle (Haliaeetus leucocephalus), peregrine falcon (Falco peregrinus), merlin (F. columbarius), osprey (Pandion haliaetus), and Cooper's hawk (Accipiter cooperii) counts increased using route-regression and nonparametric methods (P < 0.10). We found no consistent trends (P > 0.10) in counts of sharp-shinned hawks (A. striatus), northern goshawks (A. gentilis), red-shouldered hawks (Buteo lineatus), red-tailed hawks (B. jamaicensis), rough-legged hawks (B. lagopus), and American kestrels (F. sparverius). Broad-winged hawk (B. platypterus) counts declined (P < 0.05) based on the route-regression method. Empirical comparisons of our results with those for well-studied species such as the peregrine falcon, bald eagle, and osprey indicated agreement with nesting surveys. We suggest that counts of migrant hawks are a useful and economical method for detecting long-term trends in species across regions, particularly for species that otherwise cannot be easily surveyed.
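    The simple-regression index above amounts to regressing each season's total count on year and examining the slope. A minimal sketch with made-up yearly totals (not the lookout data):

```python
def trend_slope(years, counts):
    """Least-squares slope of counts vs. year (birds per year)."""
    n = len(years)
    my, mc = sum(years) / n, sum(counts) / n
    num = sum((y - my) * (c - mc) for y, c in zip(years, counts))
    den = sum((y - my) ** 2 for y in years)
    return num / den

# Hypothetical osprey season totals, 1972-1987 (illustrative only)
years = list(range(1972, 1988))
counts = [120, 135, 128, 150, 160, 148, 175, 190, 185, 210,
          220, 215, 240, 255, 250, 270]
slope = trend_slope(years, counts)
print(f"trend: {slope:+.1f} birds/year")  # a positive slope -> increasing index
```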

  8. The use of multiple indices of physiological activity to assess viability in chlorine disinfected Escherichia coli O157:H7

    NASA Technical Reports Server (NTRS)

    Lisle, J. T.; Pyle, B. H.; McFeters, G. A.

    1999-01-01

    A suite of fluorescent intracellular stains and probes was used, in conjunction with viable plate counts, to assess the effect of chlorine disinfection on membrane potential (rhodamine 123; Rh123 and bis-(1,3-dibutylbarbituric acid) trimethine oxonol; DiBAC4(3)), membrane integrity (LIVE/DEAD BacLight kit), respiratory activity (5-cyano-2,3-ditolyl tetrazolium chloride; CTC) and substrate responsiveness (direct viable counts; DVC) in the commensal pathogen Escherichia coli O157:H7. After a 5 min exposure to the disinfectant, physiological indices were affected in the following order: viable plate counts > substrate responsiveness > membrane potential > respiratory activity > membrane integrity. In situ assessment of physiological activity by examining multiple targets, as demonstrated in this study, permits a more comprehensive determination of the site and extent of injury in bacterial cells following sublethal disinfection with chlorine. This approach to assessing altered bacterial physiology has application in various fields where detection of stressed bacteria is of interest.

  9. Mass Spectrometric Identification of Phospholipids in Human Tears and Tear Lipocalin

    PubMed Central

    Dean, Austin W.; Glasgow, Ben J.

    2012-01-01

    Purpose. The purpose of this article was to identify by mass spectrometry phosphocholine lipids in stimulated human tears and determine the molecules bound to tear lipocalin or other proteins. Methods. Tear proteins were separated isocratically from pooled stimulated human tears by gel filtration fast performance liquid chromatography. Separation of tear lipocalin was confirmed by SDS tricine gradient PAGE. Protein fractions were extracted with chloroform/methanol and analyzed with electrospray ionization MS/MS triple quadrupole mass spectrometry in precursor ion scan mode for select leaving groups. For quantification, integrated ion counts were derived from standard curves of authentic compounds of phosphatidylcholine (PC) and phosphatidylserine. Results. Linear approximation was possible from integration of the mass spectrometrically obtained ion peaks at 760 Da for the PC standard. Tears contained 194 ng/mL of the major intact PC (34:2), m/z 758.6. Ten other monoisotopic phosphocholines were found in tears. A peak at 703.3 Da was assigned as a sphingomyelin. Four lysophosphatidylcholines (m/z 490–540) accounted for about 80% of the total integrated ion count. The [M+H]+ compound, m/z 496.3, accounted for 60% of the signal intensity. Only the tear lipocalin–bearing fractions showed phosphocholines (104 ng/mL). Although the intact phospholipids bound to tear lipocalin corresponded precisely in mass and relative signal intensity to that found in tears, we did not identify phosphocholines between m/z 490 and 540 in any of the gel-filtration fractions. Conclusions. Phospholipids, predominantly lysophospholipids, are present in tears. The higher mass intact PCs in tears are native ligands of tear lipocalin. PMID:22395887
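    Quantification from integrated ion counts uses a standard curve: fit counts against known amounts of an authentic compound, then invert the line for the unknown. A sketch with synthetic, exactly linear calibration data (the amounts and counts are invented, not the study's):

```python
def fit_standard_curve(amounts, ion_counts):
    """Least-squares line: counts = a + b*amount; returns (a, b)."""
    n = len(amounts)
    sx, sy = sum(amounts), sum(ion_counts)
    sxx = sum(x * x for x in amounts)
    sxy = sum(x * y for x, y in zip(amounts, ion_counts))
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def quantify(a, b, ion_count):
    """Invert the calibration line to estimate the amount."""
    return (ion_count - a) / b

# Synthetic PC standard curve: integrated ion counts vs. ng on column
ng = [50, 100, 200, 400]
counts = [1.1e5, 2.1e5, 4.1e5, 8.1e5]   # assumed linear response + offset
a, b = fit_standard_curve(ng, counts)
print(f"{quantify(a, b, 3.98e5):.0f} ng")  # unknown sample with 3.98e5 counts
```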

  10. Evaluation of Pulse Counting for the Mars Organic Mass Analyzer (MOMA) Ion Trap Detection Scheme

    NASA Technical Reports Server (NTRS)

    Van Amerom, Friso H.; Short, Tim; Brinckerhoff, William; Mahaffy, Paul; Kleyner, Igor; Cotter, Robert J.; Pinnick, Veronica; Hoffman, Lars; Danell, Ryan M.; Lyness, Eric I.

    2011-01-01

    The Mars Organic Mass Analyzer is being developed at Goddard Space Flight Center to identify organics and possible biological compounds on Mars. In the process of characterizing mass spectrometer size, weight, and power consumption, the use of pulse counting was considered for ion detection. Pulse counting has advantages over analog-mode amplification of the electron multiplier signal. Some advantages are reduced size of electronic components, low power consumption, ability to remotely characterize detector performance, and avoidance of analog circuit noise. The use of pulse counting as a detection method with ion trap instruments is relatively rare. However, with the recent development of high-performance electrical components, this detection method is quite suitable and can demonstrate significant advantages over analog methods. Methods: A prototype quadrupole ion trap mass spectrometer with an internal electron ionization source was used as a test setup to develop and evaluate the pulse-counting method. The anode signal from the electron multiplier was preamplified. The amplified signal was fed into a fast comparator for pulse-level discrimination. The output of the comparator was fed directly into a Xilinx FPGA development board. Verilog HDL code was written to bin the counts at user-selectable intervals. This system was able to count pulses at rates in the GHz range. The stored ion count number per bin was transferred to custom ion trap control software. Pulse-counting mass spectra were compared with mass spectra obtained using the standard analog-mode ion detection. Preliminary Data: Preliminary mass spectra have been obtained for both analog mode and pulse-counting mode under several sets of instrument operating conditions. Comparison of the spectra revealed better peak shapes for pulse-counting mode. Noise levels are as good as, or better than, analog-mode detection noise levels.
    To artificially force ion pile-up conditions, the ion trap was overfilled and ions were ejected at very high scan rates. Pile-up of ions was not significant for the ion trap under investigation even though the ions are ejected in so-called 'ion-micro packets'. It was found that pulse-counting mode had higher dynamic range than analog mode, and that the first amplification stage in analog mode can distort mass peaks. The inherent speed of the pulse counting method also proved to be beneficial to ion trap operation and ion ejection characterization. Very high scan rates were possible with pulse counting since the digital circuitry response time is so much smaller than with the analog method. Careful investigation of the pulse-counting data also allowed observation of the applied resonant ejection frequency during mass analysis. Ejection of ion micro packets could be clearly observed in the binned data. A second oscillation frequency, much lower than the secular frequency, was also observed. Such an effect was earlier attributed to the oscillation of the total plasma cloud in the ion trap. While the components used to implement pulse counting are quite advanced, due to their prevalence in consumer electronics, the cost of this detection system is no more than that of an analog-mode system. The total cost of the pulse-counting detection system electronics is under $250.
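    In software terms, the comparator-plus-FPGA chain described above reduces to detecting rising-edge threshold crossings of the multiplier signal and binning the crossing times. A toy sketch with a synthetic sampled trace (the trace and the 0.5 discriminator level are assumptions):

```python
def count_pulses(trace, threshold):
    """Count rising-edge threshold crossings, as a fast comparator would."""
    return sum(1 for prev, cur in zip(trace, trace[1:])
               if prev < threshold <= cur)

def bin_counts(edge_times, bin_width):
    """Bin crossing times into fixed-width intervals (an FPGA-style histogram)."""
    if not edge_times:
        return []
    bins = [0] * (int(max(edge_times) // bin_width) + 1)
    for t in edge_times:
        bins[int(t // bin_width)] += 1
    return bins

# Synthetic anode trace: three pulses above a 0.5 discriminator level
trace = [0.0, 0.1, 0.9, 0.8, 0.1, 0.0, 0.7, 0.1, 0.0, 0.6, 0.9, 0.2]
print(count_pulses(trace, 0.5))   # 3
print(bin_counts([2, 6, 9], 4))   # [1, 1, 1]
```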

  11. Reduction of CMOS Image Sensor Read Noise to Enable Photon Counting.

    PubMed

    Guidash, Michael; Ma, Jiaju; Vogelsang, Thomas; Endsley, Jay

    2016-04-09

    Recent activity in photon counting CMOS image sensors (CIS) has been directed at reducing read noise, and many approaches and methods have been reported. This work is focused on providing sub-1 e− read noise through the design and operation of the binary and small-signal readout of photon counting CIS. Compensation of transfer-gate feed-through was used to provide substantially reduced CDS time and source follower (SF) bandwidth. SF read noise was reduced by a factor of 3 with this method. The method can be applied broadly to CIS devices to reduce read noise for small signals, enabling their use as photon counting sensors.

  12. Use of surveillance data on HIV diagnoses with HIV-related symptoms to estimate the number of people living with undiagnosed HIV in need of antiretroviral therapy.

    PubMed

    Lodwick, Rebecca K; Nakagawa, Fumiyo; van Sighem, Ard; Sabin, Caroline A; Phillips, Andrew N

    2015-01-01

    It is important to have methods available to estimate the number of people who have undiagnosed HIV and are in need of antiretroviral therapy (ART). The method uses the concept that a predictable level of occurrence of AIDS or other HIV-related clinical symptoms, which lead to presentation for care and hence diagnosis of HIV, arises in undiagnosed people with a given CD4 count. The method requires surveillance data on the number of new HIV diagnoses with HIV-related symptoms and the CD4 count at diagnosis. The CD4 count-specific rate at which HIV-related symptoms develop is estimated from cohort data, and 95% confidence intervals can be constructed using a simple simulation method. For example, if 13 HIV diagnoses with HIV-related symptoms were made in one year with CD4 count at diagnosis between 150 and 199 cells/mm3, then, since the CD4 count-specific rate of HIV-related symptoms is estimated as 0.216 per person-year, the estimated number of person-years lived by people with undiagnosed HIV and a CD4 count of 150-199 cells/mm3 is 13/0.216 = 60 (95% confidence interval: 29-100), which is considered an estimate of the number of people living with undiagnosed HIV in this CD4 count stratum. The method is straightforward to implement within a short period once a surveillance system of all new HIV diagnoses, collecting data on HIV-related symptoms at diagnosis, is in place, and it is most suitable for estimating the number of undiagnosed people with CD4 count <200 cells/mm3 because of the low rate of developing HIV-related symptoms at higher CD4 counts. A potential source of bias is under-diagnosis and under-reporting of diagnoses with HIV-related symptoms. Although this method has limitations, as do all approaches, it is important for prompting increased efforts to identify undiagnosed people, particularly those with low CD4 counts, and for informing levels of unmet need for ART.
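    The worked example (13 diagnoses / 0.216 per person-year ≈ 60) and a simulation-based interval can be sketched as follows. Poisson resampling of the diagnosis count is one simple reading of the paper's "simple simulation method", assumed here for illustration:

```python
import math
import random

def poisson(lam, rng):
    """Knuth's multiplication algorithm for a Poisson draw (fine for small lam)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def undiagnosed_estimate(diagnoses, rate, n_sim=10000, seed=1):
    """Point estimate D/r with a simulated 95% interval."""
    rng = random.Random(seed)
    estimate = diagnoses / rate
    sims = sorted(poisson(diagnoses, rng) / rate for _ in range(n_sim))
    lo, hi = sims[int(0.025 * n_sim)], sims[int(0.975 * n_sim)]
    return estimate, lo, hi

est, lo, hi = undiagnosed_estimate(13, 0.216)
print(f"{est:.0f} undiagnosed person-years (95% CI {lo:.0f}-{hi:.0f})")
```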

  13. Elliptic Curve Integral Points on y^2 = x^3 + 3x − 14

    NASA Astrophysics Data System (ADS)

    Zhao, Jianhong

    2018-03-01

    The positive integer points and integral points of elliptic curves are important in number theory and arithmetic algebra, with a wide range of applications in cryptography and other fields. Various results exist on the positive integer points of the elliptic curve y^2 = x^3 + ax + b, a, b ∈ Z. In 1987, D. Zagier posed the question of the integer points on y^2 = x^3 − 27x + 62, which contributed greatly to the study of the arithmetic properties of elliptic curves. In 2009, Zhu H L and Chen J H solved the problem of the integer points on y^2 = x^3 − 27x + 62 using algebraic number theory and p-adic analysis. In 2010, using elementary methods, Wu H M obtained all the integral points of the elliptic curve y^2 = x^3 − 27x − 62. In 2015, Li Y Z and Cui B J solved the problem of the integer points on y^2 = x^3 − 21x − 90 by elementary methods. In 2016, Guo J solved the problem of the integer points on y^2 = x^3 + 27x + 62 by elementary methods, and in 2017 Guo J proved that y^2 = x^3 − 21x + 90 has no integer points, again by elementary methods. Up to now, there have been no results on the integral points of the elliptic curve y^2 = x^3 + 3x − 14, which is the subject of this paper. Using congruences and the Legendre symbol, it is proved that the elliptic curve y^2 = x^3 + 3x − 14 has only one integer point: (x, y) = (2, 0).
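    The paper's conclusion can be checked numerically over a bounded range (a brute-force search under an arbitrary bound, not a proof; the proof itself uses congruences and the Legendre symbol):

```python
from math import isqrt

def integer_points(a, b, x_max):
    """Integer points (x, y), y >= 0, on y^2 = x^3 + a*x + b with |x| <= x_max."""
    points = []
    for x in range(-x_max, x_max + 1):
        rhs = x**3 + a * x + b
        if rhs >= 0:
            y = isqrt(rhs)
            if y * y == rhs:       # rhs is a perfect square
                points.append((x, y))
    return points

# y^2 = x^3 + 3x - 14: the only integer point in the searched range is (2, 0),
# consistent with the theorem proved in the paper
print(integer_points(3, -14, 10000))  # [(2, 0)]
```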

  14. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    PubMed

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.
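    The point is easy to see numerically: when detection probability p differs between sites, raw counts C can even reverse the true ranking of population sizes N, while the adjusted estimate C/p recovers it. The pond sizes and detection probabilities below are invented for illustration:

```python
def adjusted_abundance(count, detection_prob):
    """Estimate true abundance from a count via C = N*p, i.e. N_hat = C / p."""
    return count / detection_prob

# Two hypothetical ponds: A is larger, but harder to survey
N_a, p_a = 100, 0.3   # true size, detection probability (assumed)
N_b, p_b = 50, 0.8
C_a, C_b = N_a * p_a, N_b * p_b   # expected counts: 30 vs. 40

print(C_a < C_b)                         # raw counts rank B above A (wrongly)
print(adjusted_abundance(C_a, p_a),      # adjustment restores the true ranking
      adjusted_abundance(C_b, p_b))
```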

  15. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of the quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
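    Combining independent relative uncertainty components "mathematically" typically means a root-sum-of-squares combination. A sketch with invented component values (the abstract's 35% figure is used only as a ceiling to check against):

```python
import math

def combined_relative_uncertainty(components_percent):
    """Root-sum-of-squares combination of independent relative uncertainties."""
    return math.sqrt(sum(u * u for u in components_percent))

# Hypothetical components: microorganism type, product effect, reading error, media
components = [15.0, 20.0, 10.0, 12.0]   # percent, illustrative only
u_c = combined_relative_uncertainty(components)
print(f"combined relative uncertainty ≈ {u_c:.1f}%")  # below the 35% ceiling
```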

  16. [Reassessment of a combination of cerebrospinal fluid scintigraphy and nasal pledget counts in patients with suspected rhinorrhea].

    PubMed

    Kosuda, S; Arai, S; Hohshito, Y; Tokumitsu, H; Kusano, S; Ishihara, S; Shima, K

    1998-07-01

    A combination study of cerebrospinal fluid scintigraphy and nasal pledget counts was performed using 37 MBq of 111In-DTPA in 12 patients with suspected rhinorrhea. A pledget was inserted into each nasal cavity and left in place for 6 hours, with the patient prone for at least 30 minutes. A total of 18 studies was performed, and the nasal pledget counting method successfully diagnosed all cases of CSF rhinorrhea. Diagnosis was possible when pledget counts were greater than 1 kcpm. In patients with persistent, intermittent, and occult/no nasal discharge, rhinorrhea was found in 100% (5/5), 60% (3/5), and 25% (2/8) of cases, respectively. Only two cases exhibited positive scintigraphy. MRI or CT cisternography should be performed first in patients with persistent discharge, but in patients with intermittent or occult discharge the pledget counting method might take priority over other diagnostic modalities. In conclusion, the nasal pledget counting method is a simple and useful tool for detecting rhinorrhea.

  17. Reviving common standards in point-count surveys for broad inference across studies

    USGS Publications Warehouse

    Matsuoka, Steven M.; Mahon, C. Lisa; Handel, Colleen M.; Solymos, Peter; Bayne, Erin M.; Fontaine, Patricia C.; Ralph, C.J.

    2014-01-01

    We revisit the common standards recommended by Ralph et al. (1993, 1995a) for conducting point-count surveys to assess the relative abundance of landbirds breeding in North America. The standards originated from discussions among ornithologists in 1991 and were developed so that point-count survey data could be broadly compared and jointly analyzed by national data centers with the goals of monitoring populations and managing habitat. Twenty years later, we revisit these standards because (1) they have not been universally followed and (2) new methods allow estimation of absolute abundance from point counts, but these methods generally require data beyond the original standards to account for imperfect detection. Lack of standardization and the complications it introduces for analysis become apparent from aggregated data. For example, only 3% of 196,000 point counts conducted during the period 1992-2011 across Alaska and Canada followed the standards recommended for the count period and count radius. Ten-minute, unlimited-count-radius surveys increased the number of birds detected by >300% over 3-minute, 50-m-radius surveys. This effect size, which could be eliminated by standardized sampling, was ≥10 times the published effect sizes of observers, time of day, and date of the surveys. We suggest that the recommendations by Ralph et al. (1995a) continue to form the common standards when conducting point counts. This protocol is inexpensive and easy to follow but still allows the surveys to be adjusted for detection probabilities. Investigators might optionally collect additional information so that they can analyze their data with more flexible forms of removal and time-of-detection models, distance sampling, multiple-observer methods, repeated counts, or combinations of these methods. 
Maintaining the common standards as a base protocol, even as these study-specific modifications are added, will maximize the value of point-count data, allowing compilation and analysis by regional and national data centers.

  18. The BMPix and PEAK Tools: New Methods for Automated Laminae Recognition and Counting - Application to Glacial Varves From Antarctic Marine Sediment

    NASA Astrophysics Data System (ADS)

    Weber, M. E.; Reichelt, L.; Kuhn, G.; Thurow, J. W.; Ricken, W.

    2009-12-01

    We present software-based tools for rapid and quantitative detection of sediment lamination. The BMPix tool extracts color and gray-scale curves from images at ultrahigh (pixel) resolution. The PEAK tool uses the gray-scale curve and performs, for the first time, fully automated counting of laminae based on three methods. The maximum-count algorithm counts every bright peak of a couplet of two laminae (annual resolution) in a Gaussian-smoothed gray-scale curve. The zero-crossing algorithm counts every positive and negative halfway passage of the gray-scale curve through a wide moving average, separating the record into bright and dark intervals (seasonal resolution). The same is true for the frequency-truncation method, which uses the Fourier transform to decompose the gray-scale curve into its frequency components before positive and negative passages are counted. We applied the new methods successfully to tree rings and to well-dated, already manually counted marine varves from Saanich Inlet before adapting the tools to the rather complex marine laminae of the Antarctic continental margin. In combination with AMS 14C dating, we found convincing evidence that the laminations from three Weddell Sea sites represent true varves that were deposited on sediment ridges over several millennia during the last glacial maximum (LGM). There are apparently two seasonal layers of terrigenous composition: a coarser-grained bright layer and a finer-grained dark layer. The new tools offer several advantages over previous ones. The counting procedures are based on a moving average generated from gray-scale curves instead of manual counting, so results are highly objective and rely on reproducible mathematical criteria. Since PEAK associates counts with a specific depth, the thickness of each year or each season is also measured, which is an important prerequisite for later spectral analysis.
Since all information required to conduct the analysis is displayed graphically, interactive optimization of the counting algorithms can be achieved quickly and conveniently.
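    The zero-crossing algorithm can be sketched as: subtract a wide centered moving average from the gray-scale curve and count sign changes of the residual; each bright/dark couplet contributes two crossings. The square-wave test signal below is synthetic, and the window width is an assumption:

```python
def count_couplets(curve, window):
    """Zero-crossing lamina count: sign changes of (curve - moving average) // 2."""
    half = window // 2
    n = len(curve)
    centers = range(half, n - window + half + 1)   # where the window fits
    residual = [curve[i] - sum(curve[i - half:i - half + window]) / window
                for i in centers]
    crossings = sum(1 for a, b in zip(residual, residual[1:]) if a * b < 0)
    return crossings // 2   # one bright + one dark passage per couplet

# Synthetic gray-scale curve: 10 bright/dark couplets as a square wave
curve = ([1.0] * 5 + [0.0] * 5) * 10
print(count_couplets(curve, window=10))  # 9: edge couplets are trimmed by the window
```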

  19. Integration of the shallow water equations on the sphere using a vector semi-Lagrangian scheme with a multigrid solver

    NASA Technical Reports Server (NTRS)

    Bates, J. R.; Semazzi, F. H. M.; Higgins, R. W.; Barros, Saulo R. M.

    1990-01-01

    A vector semi-Lagrangian semi-implicit two-time-level finite-difference integration scheme for the shallow water equations on the sphere is presented. A C-grid is used for the spatial differencing. The trajectory-centered discretization of the momentum equation in vector form eliminates pole problems and, at comparable cost, gives greater accuracy than a previous semi-Lagrangian finite-difference scheme which used a rotated spherical coordinate system. In terms of the insensitivity of the results to increasing timestep, the new scheme is as successful as recent spectral semi-Lagrangian schemes. In addition, the use of a multigrid method for solving the elliptic equation for the geopotential allows efficient integration with an operation count which, at high resolution, is of lower order than in the case of the spectral models. The properties of the new scheme should allow finite-difference models to compete with spectral models more effectively than has previously been possible.

  20. Measurements of the thermal neutron cross-section and resonance integral for the 108Pd(n,γ)109Pd reaction

    NASA Astrophysics Data System (ADS)

    Hien, Nguyen Thi; Kim, Guinyun; Kim, Kwangsoo; Do, Nguyen Van; Khue, Pham Duc; Thanh, Kim Tien; Shin, Sung-Gyun; Cho, Moo-Hyun

    2018-06-01

    The thermal neutron capture cross-section (σ0) and resonance integral (I0) of the 108Pd(n,γ)109Pd reaction have been measured relative to that of the monitor reaction 197Au(n,γ)198Au. The measurements were carried out using the neutron activation with the cadmium ratio method. Both the samples and monitors were irradiated with and without cadmium cover of 0.5 mm thickness. The induced activities of the reaction products were measured with a well calibrated HPGe γ-ray detector. In order to improve the accuracy of the results, the necessary corrections for the counting losses were made. The thermal neutron capture cross-section and resonance integral of the 108Pd(n,γ)109Pd reaction were determined to be σ0,Pd = 8.68 ± 0.41 barn and I0,Pd = 245.6 ± 24.8 barn, respectively. The obtained results are compared with literature values and discussed.

  1. Automated vehicle counting using image processing and machine learning

    NASA Astrophysics Data System (ADS)

    Meany, Sean; Eskew, Edward; Martinez-Castro, Rosana; Jang, Shinae

    2017-04-01

    Vehicle counting is used by the government to improve roadways and the flow of traffic, and by private businesses for purposes such as determining the value of locating a new store in an area. A vehicle count can be performed manually or automatically. Manual counting requires an individual to be on-site and tally the traffic electronically or by hand; however, this can lead to miscounts due to factors such as human error. A common form of automatic counting involves pneumatic tubes, but pneumatic tubes disrupt traffic during installation and removal and can be damaged by passing vehicles. Vehicle counting can also be performed with a camera at the count site recording video of the traffic, with counting performed manually post-recording or using automatic algorithms. This paper presents a low-cost procedure to perform automatic vehicle counting using remote video cameras with an automatic counting algorithm. The procedure utilizes a Raspberry Pi micro-computer to detect when a car is in a lane and generate an accurate count of vehicle movements. The method uses background subtraction to process the images and a machine learning algorithm to provide the count. This method avoids the fatigue issues encountered in manual video counting and prevents the disruption of roadways that occurs when installing pneumatic tubes.
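    A minimal, hedged sketch of the background-subtraction stage: estimate a per-pixel median background, flag a lane region of interest as occupied when enough pixels deviate from it, and count occupancy onsets. The frame data below are synthetic, and a real pipeline on the Raspberry Pi would add the machine-learning classifier described above:

```python
def median(values):
    """Median of a list (upper median for even lengths)."""
    s = sorted(values)
    return s[len(s) // 2]

def count_vehicles(frames, diff_thresh=50, occupancy_thresh=0.5):
    """Count not-occupied -> occupied transitions of a lane region of interest."""
    n_pixels = len(frames[0])
    background = [median([f[i] for f in frames]) for i in range(n_pixels)]
    counted, prev_occupied = 0, False
    for frame in frames:
        fg = sum(1 for v, b in zip(frame, background) if abs(v - b) > diff_thresh)
        occupied = fg / n_pixels > occupancy_thresh
        if occupied and not prev_occupied:
            counted += 1            # rising edge: a vehicle entered the lane
        prev_occupied = occupied
    return counted

# Synthetic 4-pixel lane ROI over 20 frames; two "cars" brighten the road briefly
road, car = 50, 200
frames = [[road] * 4 for _ in range(20)]
for t in (3, 4, 5, 10, 11, 12):
    frames[t] = [car] * 4
print(count_vehicles(frames))  # 2
```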

  2. Advances in HgCdTe APDs and LADAR Receivers

    NASA Technical Reports Server (NTRS)

    Bailey, Steven; McKeag, William; Wang, Jinxue; Jack, Michael; Amzajerdian, Farzin

    2010-01-01

    Raytheon is developing NIR sensor chip assemblies (SCAs) for scanning and staring 3D LADAR systems. High sensitivity is obtained by integrating high-performance detectors with gain, i.e., APDs, with very-low-noise Readout Integrated Circuits. Unique aspects of these designs include independent (non-gated) acquisition of pulse returns and multiple pulse returns with both time and intensity reported, enabling full 3D reconstruction of the image. Recent breakthroughs in device design have resulted in HgCdTe APDs operating at 300 K with essentially no excess noise at gains in excess of 100, low NEP (<1 nW), and GHz bandwidths, and have demonstrated linear-mode photon counting. SCAs utilizing these high-performance APDs have been integrated and have demonstrated excellent spatial and range resolution, enabling detailed 3D imagery at both short and long ranges. In this presentation we will review progress in high-resolution scanning, staring, and ultra-high-sensitivity photon counting LADAR sensors.

  3. The solution of linear systems of equations with a structural analysis code on the NAS CRAY-2

    NASA Technical Reports Server (NTRS)

    Poole, Eugene L.; Overman, Andrea L.

    1988-01-01

    Two methods for solving linear systems of equations on the NAS Cray-2 are described. One is a direct method; the other is an iterative method. Both methods exploit the architecture of the Cray-2, particularly the vectorization, and are aimed at structural analysis applications. To demonstrate and evaluate the methods, they were installed in a finite element structural analysis code denoted the Computational Structural Mechanics (CSM) Testbed. A description of the techniques used to integrate the two solvers into the Testbed is given. Storage schemes, memory requirements, operation counts, and reformatting procedures are discussed. Finally, results from the new methods are compared with results from the initial Testbed sparse Choleski equation solver for three structural analysis problems. The new direct solvers described achieve the highest computational rates of the methods compared. The new iterative methods are not able to achieve computation rates as high as the vectorized direct solvers but are best for well-conditioned problems, which require fewer iterations to converge to the solution.
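The iterative side of this comparison can be illustrated with a conjugate-gradient solver, the classic Krylov method for symmetric positive definite systems whose iteration count shrinks as conditioning improves. This is a generic textbook sketch, not the Testbed solver; the 2×2 system is a toy example.

```python
# Conjugate-gradient sketch for a symmetric positive definite system A x = b.
# Well-conditioned systems converge in few iterations, which is why iterative
# methods shine there, as the abstract notes.

def matvec(a, x):
    return [sum(a[i][j] * x[j] for j in range(len(x))) for i in range(len(a))]

def dot(u, v):
    return sum(ui * vi for ui, vi in zip(u, v))

def conjugate_gradient(a, b, tol=1e-12, max_iter=100):
    x = [0.0] * len(b)
    r = b[:]                       # residual b - A x for the zero initial guess
    p = r[:]                       # first search direction
    rs = dot(r, r)
    for _ in range(max_iter):
        ap = matvec(a, p)
        alpha = rs / dot(p, ap)    # optimal step length along p
        x = [xi + alpha * pi for xi, pi in zip(x, p)]
        r = [ri - alpha * api for ri, api in zip(r, ap)]
        rs_new = dot(r, r)
        if rs_new < tol:
            break
        p = [ri + (rs_new / rs) * pi for ri, pi in zip(r, p)]
        rs = rs_new
    return x

a = [[4.0, 1.0], [1.0, 3.0]]       # symmetric positive definite toy matrix
b = [1.0, 2.0]
x = conjugate_gradient(a, b)       # exact solution is [1/11, 7/11]
```

A direct sparse Cholesky solver would instead factor A once and back-substitute, trading memory for a fixed operation count independent of conditioning.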

  4. Validation of analytical methods in GMP: the disposable Fast Read 102® device, an alternative practical approach for cell counting

    PubMed Central

    2012-01-01

    Background The quality and safety of advanced therapy products must be maintained throughout their production and quality control cycle to ensure their final use in patients. We validated the cell count method according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use and European Pharmacopoeia, considering the tests’ accuracy, precision, repeatability, linearity and range. Methods As the cell count is a potency test, we checked accuracy, precision, and linearity, according to ICH Q2. Briefly, our experimental approach was first to evaluate the accuracy of Fast Read 102® compared to the Bürker chamber. Once the accuracy of the alternative method was demonstrated, we checked the precision and linearity test using only Fast Read 102®. The data were statistically analyzed by average, standard deviation and inter- and intra-operator coefficient of variation percentages. Results All the tests performed met the established acceptance criteria of a coefficient of variation of less than ten percent. For the cell count, the precision reached by each operator had a coefficient of variation of less than ten percent (total cells) and under five percent (viable cells). The best dilution range for obtaining a slope value very close to 1 was between 1:8 and 1:128. Conclusions Our data demonstrated that the Fast Read 102® count method is accurate, precise and ensures the linearity of the results obtained in a range of cell dilution. Under our standard method procedures, this assay may thus be considered a good quality control method for the cell count as a batch release quality control test. Moreover, the Fast Read 102® chamber is a plastic, disposable device that allows a number of samples to be counted in the same chamber. Last but not least, it overcomes the problem of chamber washing after use and so allows a cell count in a clean environment such as that in a Cell Factory.
    In a good manufacturing practice setting, disposable cell counting devices allow a single use of the count chamber, which can then be thrown away, thus avoiding the waste disposal of vital dye (e.g. Trypan Blue) or lysing solution (e.g. Tuerk solution). PMID:22650233
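The acceptance criterion used throughout this validation (coefficient of variation below ten percent for total cells, below five percent for viable cells) is simple to state in code. The replicate counts below are hypothetical, chosen only to exercise the check.

```python
# CV% acceptance check as described in the validation: CV% = 100 * sd / mean,
# with a 10% limit for total cells and a 5% limit for viable cells.
import statistics

def cv_percent(counts):
    """Coefficient of variation of replicate counts, as a percentage."""
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)

replicates = [102, 98, 105, 99, 101]   # hypothetical repeated counts by one operator
cv = cv_percent(replicates)
passes_total = cv < 10.0               # acceptance limit for total cells
passes_viable = cv < 5.0               # stricter limit for viable cells
```

The same function applied per operator and pooled across operators gives the intra- and inter-operator precision figures the study reports.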

  5. Advances in LADAR Components and Subsystems at Raytheon

    NASA Technical Reports Server (NTRS)

    Jack, Michael; Chapman, George; Edwards, John; McKeag, William; Veeder, Tricia; Wehner, Justin; Roberts, Tom; Robinson, Tom; Neisz, James; Andressen, Cliff; hide

    2012-01-01

    Raytheon is developing NIR sensor chip assemblies (SCAs) for scanning and staring 3D LADAR systems. High sensitivity is obtained by integrating high-performance detectors with gain, i.e., APDs, with very-low-noise Readout Integrated Circuits (ROICs). Unique aspects of these designs include independent (non-gated) acquisition of pulse returns and multiple pulse returns with both time and intensity reported, enabling full 3D reconstruction of the image. Recent breakthroughs in device design have resulted in HgCdTe APDs operating at 300 K with essentially no excess noise at gains in excess of 100, low NEP (<1 nW), and GHz bandwidths, and have demonstrated linear-mode photon counting. SCAs utilizing these high-performance APDs have been integrated and have demonstrated excellent spatial and range resolution, enabling detailed 3D imagery at both short and long ranges. In the following we will review progress in real-time 3D LADAR imaging receiver products in three areas: (1) a scanning 256 x 4 configuration for the Multi-Mode Sensor Seeker (MMSS) program; (2) a staring 256 x 256 configuration for the Autonomous Landing and Hazard Avoidance Technology (ALHAT) lunar landing mission; and (3) photon-counting SCAs, which have demonstrated a dramatic reduction in dark count rate due to improved design, operation and processing.

  6. A rapid and universal bacteria-counting approach using CdSe/ZnS/SiO2 composite nanoparticles as fluorescence probe.

    PubMed

    Fu, Xin; Huang, Kelong; Liu, Suqin

    2010-02-01

    In this paper, a rapid, simple, and sensitive method was described for detection of the total bacterial count using SiO2-coated CdSe/ZnS quantum dots (QDs) as a fluorescence marker, covalently coupled with bacteria using glutaraldehyde as the crosslinker. Highly luminescent CdSe/ZnS QDs were prepared by applying cadmium oxide and zinc stearate as precursors instead of pyrophoric organometallic precursors. A reverse-microemulsion technique was used to synthesize CdSe/ZnS/SiO2 composite nanoparticles with a SiO2 surface coating. Our results showed that CdSe/ZnS/SiO2 composite nanoparticles prepared with this method were highly luminescent, biologically functional, and monodisperse, and could successfully be covalently conjugated with the bacteria. As a demonstration, the method showed higher sensitivity and could count bacteria at concentrations as low as 3 × 10² CFU/mL, below the limits of conventional plate counting and organic dye-based methods. A linear relationship between the fluorescence peak intensity (Y) and the total bacterial count (X) was established in the range of 3 × 10²-10⁷ CFU/mL, following the equation Y = 374.82X − 938.27 (R = 0.99574). The results of the determination of the total count of bacteria in seven real samples were consistent with the conventional plate count method, and the standard deviation was satisfactory.

  7. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Manjunath, Naren; Samajdar, Rhine; Jain, Sudhir R., E-mail: srjain@barc.gov.in

    Recently, the nodal domain counts of planar, integrable billiards with Dirichlet boundary conditions were shown to satisfy certain difference equations in Samajdar and Jain (2014). The exact solutions of these equations give the number of domains explicitly. For complete generality, we demonstrate this novel formulation for three additional separable systems and thus extend the statement to all integrable billiards.

  8. Where Cultural Games Count: The Voices of Primary Classroom Teachers

    ERIC Educational Resources Information Center

    Nabie, Michael Johnson

    2015-01-01

    This study explored Ghanaian primary school teachers' values and challenges of integrating cultural games in teaching mathematics. Using in-depth conversational interviews, ten (10) certificated teachers' voices on the values and challenges of integrating games were examined. Thematic data analysis was applied to the qualitative data from the…

  9. Bibliometrics, Librarians, and Bibliograms

    ERIC Educational Resources Information Center

    White, Howard D.

    2016-01-01

    This paper sets forth an integrated way of introducing bibliometrics to relatively non-quantitative audiences, such as librarians and iSchool students. The integrative device is the bibliogram, a linguistic object consisting of a seed term and the terms that co-occur with it, ranked by their co-occurrence counts with the seed--a standard…

  10. Integration and task shifting for TB/HIV care and treatment in highly resource-scarce settings: one size may not fit all.

    PubMed

    Van Rie, Annelies; Patel, Monita R; Nana, Mbonze; Vanden Driessche, Koen; Tabala, Martine; Yotebieng, Marcel; Behets, Frieda

    2014-03-01

    A crucial question in managing HIV-infected patients with tuberculosis (TB) concerns when and how to initiate antiretroviral therapy (ART). The effectiveness of CD4-stratified ART initiation in a nurse-centered, integrated TB/HIV program at the primary care level in Kinshasa, Democratic Republic of Congo, was assessed. A prospective cohort study was conducted to assess the effect of CD4-stratified ART initiation by primary care nurses (513 TB patients, August 2007 to November 2009). ART was to be initiated at 1 month of TB treatment if the CD4 count was <100 cells per cubic millimeter, at 2 months if the CD4 count was 100-350 cells per cubic millimeter, and at the end of TB treatment, after CD4 count reassessment, if the CD4 count was >350 cells per cubic millimeter. ART uptake and mortality were compared with a historical prospective cohort of 373 HIV-infected TB patients referred for ART to a centralized facility and 3577 HIV-negative TB patients (January 2006 to May 2007). ART uptake increased (17%-69%, P < 0.0001) and mortality during TB treatment decreased (20.1% vs 9.8%, P < 0.0003) after decentralized, nurse-initiated, CD4-stratified ART. Mortality among TB patients with CD4 counts >100 cells per cubic millimeter was similar to that of HIV-negative TB patients (5.6% vs 6.3%, P = 0.65), but mortality among those with CD4 counts <100 cells per cubic millimeter remained high (18.8%). Nurse-centered, CD4-stratified ART initiation at the primary care level was effective in increasing timely ART uptake and reducing mortality among TB patients but may not be adequate to prevent mortality among those presenting with severe immunosuppression. Further research is needed to determine the optimal management at the primary care level of TB patients with CD4 counts <100 cells per cubic millimeter.
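The CD4-stratified timing rule from the study can be written as a small decision function, which makes the three strata explicit. The function is a direct transcription of the protocol described above, not part of the study itself.

```python
# CD4-stratified ART start rule, as described in the study protocol:
# <100 cells/mm3 -> month 1; 100-350 -> month 2; >350 -> end of TB treatment.

def art_start(cd4_count):
    """Return the ART initiation point for a given CD4 count (cells/mm3)."""
    if cd4_count < 100:
        return "month 1 of TB treatment"
    if cd4_count <= 350:
        return "month 2 of TB treatment"
    return "end of TB treatment, after CD4 reassessment"
```

The study's key finding maps onto the first branch: patients routed to "month 1" (CD4 <100) still had 18.8% mortality, suggesting the rule alone is not enough for the most immunosuppressed group.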

  11. The use of portable 2D echocardiography and 'frame-based' bubble counting as a tool to evaluate diving decompression stress.

    PubMed

    Germonpré, Peter; Papadopoulou, Virginie; Hemelryck, Walter; Obeid, Georges; Lafère, Pierre; Eckersley, Robert J; Tang, Meng-Xing; Balestra, Costantino

    2014-03-01

    'Decompression stress' is commonly evaluated by scoring circulating bubble numbers post dive using Doppler or cardiac echography. This information may be used to develop safer decompression algorithms, assuming that the lower the numbers of venous gas emboli (VGE) observed post dive, the lower the statistical risk of decompression sickness (DCS). Current echocardiographic evaluation of VGE, using the Eftedal and Brubakk method, has some disadvantages as it is less well suited for large-scale evaluation of recreational diving profiles. We propose and validate a new 'frame-based' VGE-counting method which offers a continuous scale of measurement. Nine 'raters' of varying familiarity with echocardiography were asked to grade 20 echocardiograph recordings using both the Eftedal and Brubakk grading and the new 'frame-based' counting method. They were also asked to count the number of bubbles in 50 still-frame images, some of which were randomly repeated. A Wilcoxon Spearman ρ calculation was used to assess test-retest reliability of each rater for the repeated still frames. For the video images, weighted kappa statistics, with linear and quadratic weightings, were calculated to measure agreement between raters for the Eftedal and Brubakk method. Bland-Altman plots and intra-class correlation coefficients were used to measure agreement between raters for the frame-based counting method. Frame-based counting showed a better inter-rater agreement than the Eftedal and Brubakk grading, even with relatively inexperienced assessors, and has good intra- and inter-rater reliability. Frame-based bubble counting could be used to evaluate post-dive decompression stress, and offers possibilities for computer-automated algorithms to allow near-real-time counting.
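The Bland-Altman analysis used to compare raters of the frame-based counts reduces to a short calculation: the bias is the mean of the paired differences and the limits of agreement are that bias ± 1.96 standard deviations. The paired counts below are hypothetical, for illustration only.

```python
# Bland-Altman limits of agreement for two raters' frame-based bubble counts.
import statistics

def bland_altman(x, y):
    """Return (lower, upper) limits of agreement for paired measurements."""
    diffs = [a - b for a, b in zip(x, y)]
    bias = statistics.mean(diffs)          # systematic difference between raters
    sd = statistics.stdev(diffs)           # spread of the disagreement
    return bias - 1.96 * sd, bias + 1.96 * sd

rater_a = [12, 30, 7, 45, 19, 3]           # hypothetical bubble counts per frame
rater_b = [14, 28, 8, 43, 21, 2]
lo, hi = bland_altman(rater_a, rater_b)
```

Narrow limits centered near zero indicate good inter-rater agreement on the continuous counting scale, which is what the study reports for the frame-based method.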

  12. Detecting swift fox: Smoked-plate scent stations versus spotlighting

    Treesearch

    Daniel W. Uresk; Kieth E. Severson; Jody Javersak

    2003-01-01

    We compared two methods of detecting presence of swift fox: smoked-plate scent stations and spotlight counts. Tracks were counted on ten 1-mile (1.6-km) transects with bait/tracking plate stations every 0.1 mile (0.16 km). Vehicle spotlight counts were conducted on the same transects. Methods were compared with Spearman's rank order correlation. Repeated measures...

  13. Verification of the hygienic adequacy of beef carcass cooling processes by microbiological culture and the temperature-function integration technique.

    PubMed

    Jericho, K W; O'Laney, G; Kozub, G C

    1998-10-01

    To enhance food safety and keeping quality, beef carcasses are cooled immediately after leaving the slaughter floor. Within hazard analysis and critical control point (HACCP) systems, this cooling process needs to be monitored by the industry and verified by regulatory agencies. This study assessed the usefulness of the temperature-function integration technique (TFIT) for the verification of the hygienic adequacy of two cooling processes for beef carcasses at one abattoir. The cooling process passes carcasses through a spray cooler for at least 17 h and a holding cooler for at least 7 h. The TFIT is faster and cheaper than culture methods. For spray cooler 1, the Escherichia coli generations predicted by TFIT for carcass surfaces (pelvic and shank sites) were compared to estimated E. coli counts from 120 surface excision samples (rump, brisket, and sacrum; 5 by 5 by 0.2 cm) before and after cooling. Counts of aerobic bacteria, coliforms, and E. coli were decreased after spray cooler 1 (P < or = 0.001). The number of E. coli generations (with lag) at the pelvic site calculated by TFIT averaged 0.85 +/- 0.19 and 0.15 +/- 0.04 after emerging from spray coolers 1 and 2, respectively. The TFIT (with lag) was considered convenient and appropriate for the inspection service to verify HACCP systems for carcass cooling processes.

  14. Reduction of CMOS Image Sensor Read Noise to Enable Photon Counting

    PubMed Central

    Guidash, Michael; Ma, Jiaju; Vogelsang, Thomas; Endsley, Jay

    2016-01-01

    Recent activity in photon counting CMOS image sensors (CIS) has been directed to reduction of read noise. Many approaches and methods have been reported. This work is focused on providing sub 1 e− read noise by design and operation of the binary and small signal readout of photon counting CIS. Compensation of transfer gate feed-through was used to provide substantially reduced CDS time and source follower (SF) bandwidth. SF read noise was reduced by a factor of 3 with this method. This method can be applied broadly to CIS devices to reduce the read noise for small signals to enable use as a photon counting sensor. PMID:27070625

  15. Comparison of point counts and territory mapping for detecting effects of forest management on songbirds

    USGS Publications Warehouse

    Newell, Felicity L.; Sheehan, James; Wood, Petra Bohall; Rodewald, Amanda D.; Buehler, David A.; Keyser, Patrick D.; Larkin, Jeffrey L.; Beachy, Tiffany A.; Bakermans, Marja H.; Boves, Than J.; Evans, Andrea; George, Gregory A.; McDermott, Molly E.; Perkins, Kelly A.; White, Matthew; Wigley, T. Bently

    2013-01-01

    Point counts are commonly used to assess changes in bird abundance, including analytical approaches such as distance sampling that estimate density. Point-count methods have come under increasing scrutiny because effects of detection probability and field error are difficult to quantify. For seven forest songbirds, we compared fixed-radii counts (50 m and 100 m) and density estimates obtained from distance sampling to known numbers of birds determined by territory mapping. We applied point-count analytic approaches to a typical forest management question and compared results to those obtained by territory mapping. We used a before–after control impact (BACI) analysis with a data set collected across seven study areas in the central Appalachians from 2006 to 2010. Using a 50-m fixed radius, variance in error was at least 1.5 times that of the other methods, whereas a 100-m fixed radius underestimated actual density by >3 territories per 10 ha for the most abundant species. Distance sampling improved accuracy and precision compared to fixed-radius counts, although estimates were affected by birds counted outside 10-ha units. In the BACI analysis, territory mapping detected an overall treatment effect for five of the seven species, and effects were generally consistent each year. In contrast, all point-count methods failed to detect two treatment effects due to variance and error in annual estimates. Overall, our results highlight the need for adequate sample sizes to reduce variance, and skilled observers to reduce the level of error in point-count data. Ultimately, the advantages and disadvantages of different survey methods should be considered in the context of overall study design and objectives, allowing for trade-offs among effort, accuracy, and power to detect treatment effects.
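The conversion from a fixed-radius point count to a density (the quantity territory mapping measures directly) is simple geometry, which is worth making explicit since the 100-m radius underestimated density by >3 territories per 10 ha. The example count is hypothetical.

```python
# Convert a fixed-radius point count to a density per 10 ha:
# density = count / circle area, with the area expressed in hectares.
import math

def density_per_10ha(count, radius_m):
    """Naive density estimate from a fixed-radius count (ignores detectability)."""
    area_ha = math.pi * radius_m ** 2 / 10_000.0   # circle area in hectares
    return 10.0 * count / area_ha

d = density_per_10ha(4, 100.0)   # 4 birds detected inside a 100-m radius
```

This naive estimate assumes every bird inside the radius is detected; the study's point is precisely that detection probability falls off with distance, which is what distance sampling corrects for.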

  16. An automated approach for annual layer counting in ice cores

    NASA Astrophysics Data System (ADS)

    Winstrup, M.; Svensson, A.; Rasmussen, S. O.; Winther, O.; Steig, E.; Axelrod, A.

    2012-04-01

    The temporal resolution of some ice cores is sufficient to preserve seasonal information in the ice core record. In such cases, annual layer counting represents one of the most accurate methods to produce a chronology for the core. Yet, manual layer counting is a tedious and sometimes ambiguous job. As reliable layer recognition becomes more difficult, a manual approach increasingly relies on human interpretation of the available data. Thus, much may be gained by an automated, and therefore objective, approach to annual layer identification in ice cores. We have developed a novel method for automated annual layer counting in ice cores, which relies on Bayesian statistics. It uses algorithms from the statistical framework of Hidden Markov Models (HMM), originally developed for use in machine speech recognition. The strength of this layer detection algorithm lies in the way it is able to imitate the manual procedures for annual layer counting, while being based on purely objective criteria for annual layer identification. With this methodology, it is possible to determine the most likely position of multiple layer boundaries in an entire section of ice core data at once. It provides a probabilistic uncertainty estimate of the resulting layer count, hence ensuring a proper treatment of ambiguous layer boundaries in the data. Furthermore, multiple data series can be incorporated and used at once, allowing for a full multi-parameter annual layer counting method similar to a manual approach. In this study, the automated layer counting algorithm has been applied to data from the NGRIP ice core, Greenland. The NGRIP ice core has very high temporal resolution with depth, and hence the potential to be dated by annual layer counting far back in time. In previous studies [Andersen et al., 2006; Svensson et al., 2008], manual layer counting has been carried out back to 60 kyr BP.
A comparison between the counted annual layers based on the two approaches will be presented and their differences discussed. Within the estimated uncertainties, the two methodologies agree. This shows the potential for a fully automated annual layer counting method to be operational for data sections where the annual layering is unknown.
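The HMM idea behind the automated layer counter can be illustrated with a minimal Viterbi decoder: hidden states represent the seasonal phase within an annual layer, and the most probable state path through a noisy seasonal signal locates the layer boundaries. The states, probabilities, and observations below are toy assumptions; the actual algorithm is more elaborate (it decodes layer boundaries with uncertainty estimates rather than a single path).

```python
# Toy Viterbi decoding for a two-state seasonal HMM. State transitions from
# "winter" back to "summer" would mark annual layer boundaries.

def viterbi(obs, states, start_p, trans_p, emit_p):
    """Return the most probable hidden-state path for an observation sequence."""
    v = [{s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}]
    for o in obs[1:]:
        layer = {}
        for s in states:
            prob, path = max(
                (v[-1][prev][0] * trans_p[prev][s] * emit_p[s][o], v[-1][prev][1])
                for prev in states
            )
            layer[s] = (prob, path + [s])
        v.append(layer)
    return max(v[-1].values())[1]

states = ("summer", "winter")
obs = ("bright", "bright", "dark", "dark", "bright")   # toy seasonal signal
start_p = {"summer": 0.5, "winter": 0.5}
trans_p = {"summer": {"summer": 0.7, "winter": 0.3},
           "winter": {"summer": 0.3, "winter": 0.7}}
emit_p = {"summer": {"bright": 0.9, "dark": 0.1},
          "winter": {"bright": 0.1, "dark": 0.9}}
path = viterbi(obs, states, start_p, trans_p, emit_p)
```

In this toy run the decoder labels the two dark observations as winter, so one winter-to-summer transition (one layer boundary) is recovered from the five-sample signal.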

  17. Steam versus hot-water scalding in reducing bacterial loads on the skin of commercially processed poultry.

    PubMed

    Patrick, T E; Goodwin, T L; Collins, J A; Wyche, R C; Love, B E

    1972-04-01

    A comparison of two types of scalders was conducted to determine their effectiveness in reducing bacterial contamination of poultry carcasses. A conventional hot-water scalder and a prototype model of a steam scalder were tested under commercial conditions. Total plate counts from steam-scalded birds were significantly lower than the counts of water-scalded birds immediately after scalding and again after picking. No differences in the two methods could be found after chilling. Coliform counts from steam-scalded birds were significantly lower than the counts from water-scalded birds immediately after scalding. No significant differences in coliform counts were detected when the two scald methods were compared after defeathering and chilling.

  18. [Automated hematology analysers and spurious counts Part 3. Haemoglobin, red blood cells, cell count and indices, reticulocytes].

    PubMed

    Godon, Alban; Genevieve, Franck; Marteau-Tessier, Anne; Zandecki, Marc

    2012-01-01

    Several situations lead to abnormal haemoglobin measurements or abnormal red blood cell (RBC) counts, including hyperlipemia, agglutinins and cryoglobulins, haemolysis, and elevated white blood cell (WBC) counts. Mean (red) cell volume may also be subject to spurious determination because of agglutinins (mainly cold), high blood glucose levels, natremia, anticoagulant in excess and, at times, technological considerations. An abnormality in one measured parameter eventually leads to abnormal calculated RBC indices: mean cell haemoglobin content is certainly the most important RBC parameter to consider, maybe as important as the flags generated by the haematology analysers (HA) themselves. In many circumstances, several of the measured parameters from cell blood counts (CBC) may be altered, and the discovery of a spurious change in one parameter frequently means that the validity of other parameters should be considered. Sensitive flags now allow the identification of several spurious counts, but only the most sophisticated HA have optimal flagging, and simpler ones, especially those without any WBC differential scattergram, do not share the same capacity to detect abnormal results. Reticulocytes are integrated into the CBC in many HA, and several situations may lead to abnormal counts, including abnormal gating, interference with intraerythrocytic particles, erythroblastosis or high WBC counts.

  19. Predicting miRNA targets for head and neck squamous cell carcinoma using an ensemble method.

    PubMed

    Gao, Hong; Jin, Hui; Li, Guijun

    2018-01-01

    This study aimed to uncover potential microRNA (miRNA) targets in head and neck squamous cell carcinoma (HNSCC) using an ensemble method which combined 3 different methods: Pearson's correlation coefficient (PCC), Lasso and a causal inference method (i.e., intervention calculus when the directed acyclic graph (DAG) is absent [IDA]), based on Borda count election. The Borda count election method was used to integrate the top 100 predicted targets of each miRNA generated by individual methods. Afterwards, to validate the performance ability of our method, we checked the TarBase v6.0, miRecords v2013, miRWalk v2.0 and miRTarBase v4.5 databases to validate predictions for miRNAs. Pathway enrichment analysis of target genes in the top 1,000 miRNA-messenger RNA (mRNA) interactions was conducted to focus on significant KEGG pathways. Finally, we extracted target genes based on occurrence frequency ≥3. Based on an absolute value of PCC >0.7, we found 33 miRNAs and 288 mRNAs for further analysis. We extracted 10 target genes with predicted frequencies not less than 3. The target gene MYO5C possessed the highest frequency, which was predicted by 7 different miRNAs. Significantly, a total of 8 pathways were identified; the pathways of cytokine-cytokine receptor interaction and chemokine signaling pathway were the most significant. We successfully predicted target genes and pathways for HNSCC relying on miRNA expression data, mRNA expression profile, an ensemble method and pathway information. Our results may offer new information for the diagnosis and estimation of the prognosis of HNSCC.
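The Borda-count integration step can be sketched directly: each method contributes a ranked top-N list, a target at rank r in a list of length L earns L − r points, and targets are re-ranked by total score. The gene names other than MYO5C are hypothetical placeholders, and the three-item lists stand in for the study's top-100 lists.

```python
# Borda-count merging of per-method target rankings (PCC, Lasso, IDA in the
# study). Rank r (0-based) in a list of length L earns L - r points.

def borda_merge(rankings, n=None):
    """Combine ranked lists into one list ordered by total Borda score."""
    scores = {}
    for ranking in rankings:
        length = len(ranking)
        for r, item in enumerate(ranking):
            scores[item] = scores.get(item, 0) + (length - r)
    merged = sorted(scores, key=lambda g: -scores[g])
    return merged[:n] if n is not None else merged

# Toy top-3 lists standing in for each method's top-100 predictions:
pcc   = ["MYO5C", "GENE_A", "GENE_B"]
lasso = ["GENE_A", "MYO5C", "GENE_C"]
ida   = ["MYO5C", "GENE_C", "GENE_A"]
consensus = borda_merge([pcc, lasso, ida], n=2)
```

A target predicted highly by several methods (here MYO5C, mirroring its top occurrence frequency in the study) accumulates the most points and tops the consensus list.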

  20. A Comparison of the OSHA Modified NIOSH Physical and Chemical Analytical Method (P and CAM) 304 and the Dust Trak Photometric Aerosol Sampler for 0-Chlorobenzylidine Malonitrile

    DTIC Science & Technology

    2013-04-02

    This study compared a direct-reading, non-specific, rapid photometric particle-counting instrument, the DustTrak (TSI), to the established OSHA-modified NIOSH P&CAM 304 method to determine the correlation between the two methods.

  1. Can simple mobile phone applications provide reliable counts of respiratory rates in sick infants and children? An initial evaluation of three new applications.

    PubMed

    Black, James; Gerdtz, Marie; Nicholson, Pat; Crellin, Dianne; Browning, Laura; Simpson, Julie; Bell, Lauren; Santamaria, Nick

    2015-05-01

    Respiratory rate is an important sign that is commonly either not recorded or recorded incorrectly. Mobile phone ownership is increasing even in resource-poor settings. Phone applications may improve the accuracy and ease of counting of respiratory rates. The study assessed the reliability and initial users' impressions of four mobile phone respiratory timer approaches, compared to a 60-second count by the same participants. Three mobile applications (applying four different counting approaches plus a standard 60-second count) were created using the Java Mobile Edition and tested on Nokia C1-01 phones. Apart from the 60-second timer application, the others included a counter based on the time for ten breaths, and three based on the time interval between breaths ('Once-per-Breath', in which the user presses for each breath and the application calculates the rate after 10 or 20 breaths, or after 60s). Nursing and physiotherapy students used the applications to count respiratory rates in a set of brief video recordings of children with different respiratory illnesses. Limits of agreement (compared to the same participant's standard 60-second count), intra-class correlation coefficients and standard errors of measurement were calculated to compare the reliability of the four approaches, and a usability questionnaire was completed by the participants. There was considerable variation in the counts, with large components of the variation related to the participants and the videos, as well as the methods. None of the methods was entirely reliable, with no limits of agreement better than -10 to +9 breaths/min. Some of the methods were superior to the others, with ICCs from 0.24 to 0.92. By ICC the Once-per-Breath 60-second count and the Once-per-Breath 20-breath count were the most consistent, better even than the 60-second count by the participants. The 10-breath approaches performed least well. 
    Users' initial impressions were positive, with little difference found between the applications. This study provides evidence that applications running on simple phones can be used to count respiratory rates in children. The Once-per-Breath methods are the most reliable, outperforming the 60-second count. For children with raised respiratory rates, the 20-breath version of the Once-per-Breath method is faster, so it is a more suitable option where health workers are under time pressure. Copyright © 2015 Elsevier Ltd. All rights reserved.
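The "Once-per-Breath" calculation described above reduces to one line of arithmetic: the user taps once per breath, and n + 1 taps span n breaths, so the rate follows from the elapsed time between the first and last tap. The timestamps below are simulated; a real app would read them from the phone clock.

```python
# Once-per-Breath rate calculation: taps mark breaths, and the rate is derived
# from the tap timestamps (here, the fixed-breath-count variant).

def rate_from_taps(timestamps_s):
    """Breaths/min from tap times: n+1 taps span n breaths."""
    breaths = len(timestamps_s) - 1
    elapsed = timestamps_s[-1] - timestamps_s[0]
    return 60.0 * breaths / elapsed

# 21 taps at 1.5-s intervals = 20 breaths in 30 s -> 40 breaths/min
taps = [1.5 * i for i in range(21)]
print(round(rate_from_taps(taps)))  # 40
```

This also shows why the 20-breath variant is faster at high rates: a child breathing at 40 breaths/min finishes 20 breaths in 30 seconds, half the duration of a fixed 60-second count.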

  2. Performance evaluation of the new hematology analyzer Sysmex XN-series.

    PubMed

    Seo, J Y; Lee, S-T; Kim, S-H

    2015-04-01

    The Sysmex XN-series is a new automated hematology analyzer designed to improve the accuracy of cell counts and the specificity of the flagging events. The basic characteristics and the performance of new measurement channels of the XN were evaluated and compared with the Sysmex XE-2100 and the manual method. Fluorescent platelet count (PLT-F) was compared with the flow cytometric method. The low WBC mode and body fluid mode were also evaluated. For workflow analysis, 1005 samples were analyzed on both the XN and the XE-2100, and manual review rates were compared. All parameters measured by the XN correlated well with the XE-2100. PLT-F showed better correlation with the flow cytometric method (r² = 0.80) compared with the optical platelet count (r² = 0.73) for platelet counts <70 × 10⁹/L. The low WBC mode reported accurate leukocyte differentials for samples with a WBC count <0.5 × 10⁹/L. Relatively good correlation was found for WBC counts between the manual method and the body fluid mode (r = 0.88). The XN generated fewer flags than the XE-2100, while the sensitivities of both instruments were comparable. The XN provided reliable results on low cell counts, as well as reduced manual blood film reviews, while maintaining a proper level of diagnostic sensitivity. © 2014 John Wiley & Sons Ltd.

  3. An Evaluation of the Accuracy of the Subtraction Method Used for Determining Platelet Counts in Advanced Platelet-Rich Fibrin and Concentrated Growth Factor Preparations

    PubMed Central

    Watanabe, Taisuke; Isobe, Kazushige; Suzuki, Taiji; Kawabata, Hideo; Nakamura, Masayuki; Tsukioka, Tsuneyuki; Okudera, Toshimitsu; Okudera, Hajime; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki

    2017-01-01

    Platelet concentrates should be quality-assured for purity and identity prior to clinical use. Unlike for the liquid form of platelet-rich plasma, platelet counts cannot be directly determined in solid fibrin clots and are instead calculated by subtracting the counts in other liquid or semi-clotted fractions from those in whole blood samples. Having long suspected the validity of this method, we herein examined the possible loss of platelets in the preparation process. Blood samples collected from healthy male donors were immediately centrifuged for advanced platelet-rich fibrin (A-PRF) and concentrated growth factors (CGF) according to recommended centrifugal protocols. Blood cells in liquid and semi-clotted fractions were directly counted. Platelets aggregated on clot surfaces were observed by scanning electron microscopy. A higher centrifugal force increased the numbers of platelets and platelet aggregates in the liquid red blood cell fraction and the semi-clotted red thrombus in the presence and absence of the anticoagulant, respectively. Nevertheless, the calculated platelet counts in A-PRF/CGF preparations were much higher than expected, rendering the currently accepted subtraction method inaccurate for determining platelet counts in fibrin clots. To ensure the quality of solid types of platelet concentrates chairside in a timely manner, a simple and accurate platelet-counting method should be developed immediately. PMID:29563413
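The arithmetic flaw the study identifies in the subtraction method is easy to make concrete: any platelets lost or left uncounted during preparation are silently credited to the clot, inflating its estimated count by exactly the lost amount. All counts below are hypothetical.

```python
# The subtraction method: clot platelets = whole blood - counted liquid
# fractions. Platelets lost in preparation are not counted anywhere, so the
# estimate overshoots the clot's true content by the lost amount.

def platelets_in_clot_by_subtraction(whole_blood, liquid_fractions):
    """Clot platelet count as inferred by the subtraction method."""
    return whole_blood - sum(liquid_fractions)

whole_blood = 250e9          # platelets in the whole-blood sample (hypothetical)
liquid = [40e9, 10e9]        # platelets counted in the liquid fractions
lost = 30e9                  # platelets lost in preparation (never counted)

estimated = platelets_in_clot_by_subtraction(whole_blood, liquid)  # credited to clot
actual = whole_blood - sum(liquid) - lost                          # truly in the clot
```

Here the estimate exceeds the true clot content by the full 30e9 lost platelets, which mirrors the study's finding that calculated counts were "much higher than expected."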

  4. Technical comparison of four different extracorporeal photopheresis systems.

    PubMed

    Brosig, Andreas; Hähnel, Viola; Orsó, Evelyn; Wolff, Daniel; Holler, Ernst; Ahrens, Norbert

    2016-10-01

    Extracorporeal photopheresis (ECP) is a therapeutic technique that combines leukapheresis and ultraviolet (UV)A irradiation of the leukapheresate after 8-methoxypsoralen treatment, with subsequent retransfusion. It can be achieved with a single device (online) or by combining an apheresis machine with a separate UVA light source (offline). The comparability of both established methods is unknown. In a prospective setting, four ECP systems were evaluated: one with integrated UVA irradiation for online ECP (Therakos) and three with external UVA irradiation for offline ECP (Amicus, Optia, and Cobe Spectra). Apheresis variables and cell counts were determined by methods including flow cytometry. The duration of apheresis ranged from 120 minutes (Amicus, Optia) to 275 minutes (Therakos). Mononuclear cell (MNC) counts in the treatment bags were comparable between offline ECP methods (median, 57 × 10⁸ to 66 × 10⁸) and lower for online ECP (14 × 10⁸). CD16+ monocytes were abundant in online ECP (82%) but rarer in offline ECP (median, 14%-19%). Hematocrit ranged from 0.1% (Therakos) to 8% (Amicus). There were no side effects in any patients. All offline ECP systems studied yielded comparable cellular compositions and highly enriched populations of MNCs. In contrast, white blood cells from online ECP displayed enrichment of nonclassical monocytes. The relevance of these findings is unknown, as there is no established biomarker to predict the therapeutic efficacy of these procedures. © 2016 AABB.

  5. Electrostatically defined silicon quantum dots with counted antimony donor implants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singh, M., E-mail: msingh@sandia.gov; Luhman, D. R.; Lilly, M. P.

    2016-02-08

    Deterministic control over the location and number of donors is crucial to donor spin quantum bits (qubits) in semiconductor based quantum computing. In this work, a focused ion beam is used to implant antimony donors in 100 nm × 150 nm windows straddling quantum dots. Ion detectors are integrated next to the quantum dots to sense the implants. The numbers of donors implanted can be counted to a precision of a single ion. In low-temperature transport measurements, regular Coulomb blockade is observed from the quantum dots. Charge offsets indicative of donor ionization are also observed in devices with counted donor implants.

  6. Electrostatically defined silicon quantum dots with counted antimony donor implants

    NASA Astrophysics Data System (ADS)

    Singh, M.; Pacheco, J. L.; Perry, D.; Garratt, E.; Ten Eyck, G.; Bishop, N. C.; Wendt, J. R.; Manginell, R. P.; Dominguez, J.; Pluym, T.; Luhman, D. R.; Bielejec, E.; Lilly, M. P.; Carroll, M. S.

    2016-02-01

    Deterministic control over the location and number of donors is crucial to donor spin quantum bits (qubits) in semiconductor based quantum computing. In this work, a focused ion beam is used to implant antimony donors in 100 nm × 150 nm windows straddling quantum dots. Ion detectors are integrated next to the quantum dots to sense the implants. The numbers of donors implanted can be counted to a precision of a single ion. In low-temperature transport measurements, regular Coulomb blockade is observed from the quantum dots. Charge offsets indicative of donor ionization are also observed in devices with counted donor implants.

  7. Learning linear transformations between counting-based and prediction-based word embeddings

    PubMed Central

    Hayashi, Kohei; Kawarabayashi, Ken-ichi

    2017-01-01

    Despite the growing interest in prediction-based word embedding learning methods, it remains unclear as to how the vector spaces learnt by the prediction-based methods differ from that of the counting-based methods, or whether one can be transformed into the other. To study the relationship between counting-based and prediction-based embeddings, we propose a method for learning a linear transformation between two given sets of word embeddings. Our proposal contributes to the word embedding learning research in three ways: (a) we propose an efficient method to learn a linear transformation between two sets of word embeddings, (b) using the transformation learnt in (a), we empirically show that it is possible to predict distributed word embeddings for novel unseen words, and (c) empirically it is possible to linearly transform counting-based embeddings to prediction-based embeddings, for frequent words, different POS categories, and varying degrees of ambiguities. PMID:28926629

  8. Spot counting on fluorescence in situ hybridization in suspension images using Gaussian mixture model

    NASA Astrophysics Data System (ADS)

    Liu, Sijia; Sa, Ruhan; Maguire, Orla; Minderman, Hans; Chaudhary, Vipin

    2015-03-01

    Cytogenetic abnormalities are important diagnostic and prognostic criteria for acute myeloid leukemia (AML). A flow cytometry-based imaging approach for FISH in suspension (FISH-IS) was established that enables automated analysis of a several-log-magnitude higher number of cells compared to microscopy-based approaches. However, rotational positioning of cells can cause spots to overlap, leading to discordant spot counts. To address counting errors arising from overlapping spots, this study proposes a classification method based on a Gaussian Mixture Model (GMM). The Akaike information criterion (AIC) and Bayesian information criterion (BIC) of the GMM fits are used as global image features for this classification method. Using a Random Forest classifier, the results show that the proposed method is able to detect closely overlapping spots that cannot be separated by existing image-segmentation-based spot detection methods. The experimental results show that the proposed method yields a significant improvement in spot counting accuracy.
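
    The AIC/BIC model-selection idea used above can be sketched in a few lines: fit candidate mixtures with increasing numbers of components (candidate spot counts) and keep the one with the lowest information criterion. A minimal illustration in Python, assuming the per-candidate log-likelihoods and parameter counts have already come out of some EM fitting routine (the numbers in the example are hypothetical):

```python
import math

def aic(log_likelihood, n_params):
    # Akaike information criterion: lower values indicate a better
    # trade-off between fit quality and model complexity.
    return 2 * n_params - 2 * log_likelihood

def bic(log_likelihood, n_params, n_obs):
    # Bayesian information criterion: like AIC, but penalizes parameters
    # more strongly as the number of observations grows.
    return n_params * math.log(n_obs) - 2 * log_likelihood

def select_spot_count(fits, n_obs):
    # fits maps a candidate spot count k to (log_likelihood, n_params)
    # for the k-component Gaussian mixture fitted to the image region.
    # Returns the k whose mixture minimizes BIC.
    return min(fits, key=lambda k: bic(*fits[k], n_obs))
```

    For example, with hypothetical fits {1: (-520.0, 3), 2: (-480.0, 6), 3: (-478.0, 9)} over 200 pixels, the two-component model wins: the extra likelihood gained by a third component does not pay for its three additional parameters.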

  9. [Left ventricular volume determination by first-pass radionuclide angiocardiography using a semi-geometric count-based method].

    PubMed

    Kinoshita, S; Suzuki, T; Yamashita, S; Muramatsu, T; Ide, M; Dohi, Y; Nishimura, K; Miyamae, T; Yamamoto, I

    1992-01-01

    A new radionuclide technique for the calculation of left ventricular (LV) volume by the first-pass (FP) method was developed and examined. Using a semi-geometric count-based method, the LV volume can be measured by the following equations: Cv = Cm/(L/d); V = (CT/Cv) × d³ = (CT/Cm) × L × d², where V = LV volume, Cv = voxel count, Cm = the maximum LV count, CT = the total LV count, L = the LV depth at which the maximum count was obtained, and d = pixel size. This theorem was applied to FP LV images obtained in the 30-degree right anterior oblique position. Frame-mode acquisition was performed, and the LV end-diastolic maximum count and total count were obtained. The maximum LV depth was obtained as the maximum width of the LV on the FP end-diastolic image, using the assumption that the LV cross-section is circular. These values were substituted into the above equation and the LV end-diastolic volume (FP-EDV) was calculated. A routine equilibrium (EQ) study was done, and the end-diastolic maximum count and total count were obtained. The LV maximum depth was measured on the FP end-diastolic frame as the maximum length of the LV image. Using these values, the EQ-EDV was calculated and the FP-EDV was compared to the EQ-EDV. The correlation coefficient for these two values was r = 0.96 (n = 23, p < 0.001), and the standard error of the estimated volume was 10 ml. (ABSTRACT TRUNCATED AT 250 WORDS)
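
    The abstract's volume equation reduces to a one-line computation once CT, Cm, L and d are known. A minimal sketch in Python, assuming depth and pixel size are given in cm so the result comes out in cm³ (= ml):

```python
def lv_volume(total_count, max_count, depth_cm, pixel_cm):
    # Semi-geometric count-based LV volume:
    #   voxel count Cv = Cm / (L / d)
    #   V = (CT / Cv) * d**3 = (CT / Cm) * L * d**2
    return (total_count / max_count) * depth_cm * pixel_cm ** 2
```

    With illustrative values CT = 1000, Cm = 100, L = 8 cm and d = 0.5 cm, the formula gives 20 ml.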

  10. Tracking and people counting using Particle Filter Method

    NASA Astrophysics Data System (ADS)

    Sulistyaningrum, D. R.; Setiyono, B.; Rizky, M. S.

    2018-03-01

    In recent years, technology has developed rapidly, especially in the field of object tracking, and tracking becomes considerably harder when the objects are people and their number is large. The purpose of this research is to apply the Particle Filter method to tracking and counting people in a given area. Tracking people is difficult when obstacles are present, one of which is occlusion. The stages of the tracking and people-counting scheme in this study include pre-processing, segmentation using a Gaussian Mixture Model (GMM), tracking using a particle filter, and counting based on centroids. The Particle Filter method uses motion estimates included in the model. The test results show that tracking and people counting can be done well, with average accuracies of 89.33% and 77.33%, respectively, over six test videos. In the process of tracking people, the results are good under partial occlusion and no occlusion.
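
    The predict-weight-resample cycle at the heart of a bootstrap particle filter, the tracking stage described above, can be sketched for a scalar state. This is an illustrative toy with Gaussian motion and measurement models, not the authors' implementation:

```python
import math
import random

def particle_filter_step(particles, measurement,
                         motion_std=1.0, meas_std=2.0):
    # One predict-update-resample cycle of a bootstrap particle filter
    # tracking a scalar position (a toy stand-in for a person's centroid).
    # Predict: diffuse each particle with a Gaussian random-walk motion model.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Update: weight each particle by its Gaussian measurement likelihood.
    weights = [math.exp(-0.5 * ((p - measurement) / meas_std) ** 2)
               for p in particles]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw a new, equally weighted particle set proportionally
    # to the weights, concentrating particles near the measurement.
    return random.choices(particles, weights=weights, k=len(particles))
```

    Iterating this step while feeding in successive measurements makes the particle cloud converge on, and then follow, the tracked position.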

  11. Precision wildlife monitoring using unmanned aerial vehicles

    NASA Astrophysics Data System (ADS)

    Hodgson, Jarrod C.; Baylis, Shane M.; Mott, Rowan; Herrod, Ashley; Clarke, Rohan H.

    2016-03-01

    Unmanned aerial vehicles (UAVs) represent a new frontier in environmental research. Their use has the potential to revolutionise the field if they prove capable of improving data quality or the ease with which data are collected beyond traditional methods. We apply UAV technology to wildlife monitoring in tropical and polar environments and demonstrate that UAV-derived counts of colony nesting birds are an order of magnitude more precise than traditional ground counts. The increased count precision afforded by UAVs, along with their ability to survey hard-to-reach populations and places, will likely drive many wildlife monitoring projects that rely on population counts to transition from traditional methods to UAV technology. Careful consideration will be required to ensure the coherence of historic data sets with new UAV-derived data and we propose a method for determining the number of duplicated (concurrent UAV and ground counts) sampling points needed to achieve data compatibility.

  12. ANTHELMINTIC EFFECTS OF DRIED GROUND BANANA PLANT LEAVES (MUSA SPP.) FED TO SHEEP ARTIFICIALLY INFECTED WITH HAEMONCHUS CONTORTUS AND TRICHOSTRONGYLUS COLUBRIFORMIS

    PubMed Central

    Gregory, Lilian; Yoshihara, Eidi; Silva, Leandro Kataoaka Fernandes; Marques, Eduardo Carvalho; Ribeiro, Bruno Leonardo Mendonça; de Souza Meira, Enoch Brandão; Rossi, Rodolfo Santos; do Amarante, Alessandro Francisco Talamini; Hasegawa, Marjorie Yumi

    2017-01-01

    Background: Helminths are endoparasites that cause major losses in profitable sheep production in Brazil. The increasing development of resistant strains of endoparasites has driven the search for sustainable alternatives. The aim of this paper was to provide information about endoparasite control with banana leaves in infected sheep as an alternative control strategy and to assess its viability. Materials and Methods: In this study, we performed two trials to investigate the anthelmintic properties of banana leaves on endoparasites in sheep. In Trial 1, twelve sheep were artificially infected with Trichostrongylus colubriformis; in Trial 2, eleven sheep were artificially infected with Haemonchus contortus. Clinical examinations, packed cell volume, total protein, faecal egg counts (FECs) and egg hatchability tests (EHTs) were performed. At the end of the trials, the sheep were humanely slaughtered, and total worm counts were performed. Results: In Trials 1 and 2, no significant FEC decreases were noted, but significant differences in EHTs were observed. Total worm counts and clinical and haematological parameters did not reveal significant changes between the treatment and control groups. These results suggest that feeding dried ground banana plant leaves to sheep may reduce the viability of Trichostrongylus colubriformis eggs, and this anthelmintic activity is potentially exploitable as part of an integrated parasite management programme. Conclusion: However, further investigation is needed to establish the optimal dosage, develop a convenient delivery form and confirm the economic feasibility of using banana plantation byproducts as feed for ruminant species. Abbreviations: CT, coproculture test; FEC, faecal egg count; EHT, egg hatchability test. PMID:28480391

  13. Enumeration procedure for monitoring test microbe populations on inoculated carriers in AOAC use-dilution methods.

    PubMed

    Tomasino, Stephen F; Fiumara, Rebecca M; Cottrill, Michele P

    2006-01-01

    The AOAC Use-Dilution methods do not provide procedures to enumerate the test microbe on stainless steel carriers (penicylinders) or guidance on the expected target populations of the test microbe (i.e., a performance standard). This report describes the procedures used by the U.S. Environmental Protection Agency to enumerate the test microbe (carrier counts) associated with conducting the Use-Dilution method with Staphylococcus aureus (Method 955.15) and Pseudomonas aeruginosa (Method 964.02) and the examination of historical data. The carrier count procedure involves the random selection of carriers, shearing bacterial cells from the carrier surface through sonication, and plating of serially diluted inoculum on trypticase soy agar. For each Use-Dilution test conducted, the official AOAC method was strictly followed for carrier preparation, culture initiation, test culture preparation, and carrier inoculation steps. Carrier count data from 78 Use-Dilution tests conducted over a 6-year period were compiled and analyzed. A mean carrier count of 6.6 logs (approximately 4.0 × 10⁶ colony-forming units/carrier) was calculated for both S. aureus and P. aeruginosa. Of the mean values, 95% fell within ±2 repeatability standard deviations. The enumeration procedure and target carrier counts are desirable for standardizing the Use-Dilution methods, increasing their reproducibility, and ensuring the quality of the data.

  14. How many fish in a tank? Constructing an automated fish counting system by using PTV analysis

    NASA Astrophysics Data System (ADS)

    Abe, S.; Takagi, T.; Takehara, K.; Kimura, N.; Hiraishi, T.; Komeyama, K.; Torisawa, S.; Asaumi, S.

    2017-02-01

    Because escape from a net cage and mortality are constant problems in fish farming, health control and management of facilities are important in aquaculture. In particular, the development of an accurate fish counting system has been strongly desired by the Pacific Bluefin tuna farming industry owing to the high market value of these fish. The current fish counting method, which involves human counting, results in poor accuracy; moreover, the method is cumbersome because the aquaculture net cage is so large that fish can only be counted when they move to another net cage. Therefore, we have developed an automated fish counting system by applying particle tracking velocimetry (PTV) analysis to a shoal of swimming fish inside a net cage. In essence, we treated the swimming fish as tracer particles and estimated the number of fish by analyzing the corresponding motion vectors. The proposed fish counting system comprises two main components: image processing and motion analysis, where the image-processing component extracts the foreground and the motion analysis component traces each individual's motion. In this study, we developed a Region Extraction and Centroid Computation (RECC) method and a Kalman filter and Chi-square (KC) test for the two main components. To evaluate the efficiency of our method, we constructed a closed system, placed an underwater video camera with a spherical curved lens at the bottom of the tank, and recorded a 360° view of a swimming school of Japanese rice fish (Oryzias latipes). Our study showed that almost all fish could be extracted by the RECC method and the motion vectors could be calculated by the KC test. The recognition rate was approximately 90% when more than 180 individuals were observed within the frame of the video camera. These results suggest that the presented method has potential application as a fish counting system for industrial aquaculture.

  15. Methods of detecting and counting raptors: A review

    USGS Publications Warehouse

    Fuller, M.R.; Mosher, J.A.; Ralph, C. John; Scott, J. Michael

    1981-01-01

    Most raptors are wide-ranging, secretive, and occur at relatively low densities. These factors, in conjunction with the nocturnal activity of owls, make counting raptors by most standard census and survey efforts very time consuming and expensive. This paper reviews the most common methods of detecting and counting raptors. It is hoped that it will be of use to the ever-increasing number of biologists, land-use planners, and managers who must determine the occurrence, density, or population dynamics of raptors. Road counts of fixed-station or continuous-transect design are often used to sample large areas. Detection of spontaneous or elicited vocalizations, especially those of owls, provides a means of detecting and estimating raptor numbers. Searches for nests are accomplished from foot surveys, observations from automobiles and boats, or from aircraft when nest structures are conspicuous (e.g., Osprey). Knowledge of nest habitat, historic records, and inquiries of local residents are useful for locating nests. Often several of these techniques are combined to help find nest sites. Aerial searches have also been used to locate or count large raptors (e.g., eagles), or those that may be conspicuous in open habitats (e.g., tundra). Counts of birds entering or leaving nest colonies or colonial roosts have been attempted on a limited basis. Results from Christmas Bird Counts have provided an index of the abundance of some species. Trapping and banding have generally proven to be an inefficient method of detecting raptors or estimating their populations. Concentrations of migrants at strategically located points around the world afford the best opportunity to count many raptors in a relatively short period of time, but the influence of many unquantified variables has inhibited extensive interpretation of these counts. Few data exist to demonstrate the effectiveness of these methods.
We believe more research on sampling techniques, rather than complete counts or intensive searches, will provide adequate yet affordable estimates of raptor numbers in addition to providing methods for detecting the presence of raptors on areas of interest to researchers and managers.

  16. The 2-24 μm source counts from the AKARI North Ecliptic Pole survey

    NASA Astrophysics Data System (ADS)

    Murata, K.; Pearson, C. P.; Goto, T.; Kim, S. J.; Matsuhara, H.; Wada, T.

    2014-11-01

    We present herein galaxy number counts in the nine bands spanning the 2-24 μm range on the basis of the AKARI North Ecliptic Pole (NEP) surveys. The number counts are derived from the NEP-deep and NEP-wide surveys, which cover areas of 0.5 and 5.8 deg², respectively. To produce reliable number counts, the sources were extracted from recently updated images. Completeness and the difference between observed and intrinsic magnitudes were corrected by Monte Carlo simulation. Stellar counts were subtracted using the stellar fraction estimated from optical data. The resultant source counts are given down to the 80 per cent completeness limit: 0.18, 0.16, 0.10, 0.05, 0.06, 0.10, 0.15, 0.16 and 0.44 mJy in the 2.4, 3.2, 4.1, 7, 9, 11, 15, 18 and 24 μm bands, respectively. On the bright side of all bands, the count distribution is flat, consistent with the Euclidean universe, while on the faint side, the counts deviate, suggesting that the galaxy population of the distant universe is evolving. These results are generally consistent with previous galaxy counts in similar wavebands. We also compare our counts with evolutionary models and find them in good agreement. By integrating the models down to the 80 per cent completeness limits, we calculate that the AKARI NEP survey resolves 20-50 per cent of the cosmic infrared background, depending on the waveband.

  17. Comparison of point-of-care methods for preparation of platelet concentrate (platelet-rich plasma).

    PubMed

    Weibrich, Gernot; Kleis, Wilfried K G; Streckbein, Philipp; Moergel, Maximilian; Hitzler, Walter E; Hafner, Gerd

    2012-01-01

    This study analyzed the concentrations of platelets and growth factors in platelet-rich plasma (PRP), which are likely to depend on the method used for its production. The cellular composition and growth factor content of platelet concentrates (platelet-rich plasma) produced by six different procedures were quantitatively analyzed and compared. Platelet and leukocyte counts were determined on an automatic cell counter, and analysis of growth factors was performed using enzyme-linked immunosorbent assay. The principal differences between the analyzed PRP production methods (the blood bank method using an intermittent-flow centrifuge system/platelet apheresis and the five point-of-care methods) and the resulting platelet concentrates were evaluated with regard to platelet, leukocyte, and growth factor levels. The platelet counts in both whole blood and PRP were generally higher in women than in men; no differences were observed with regard to age. Statistical analysis of platelet-derived growth factor AB (PDGF-AB) and transforming growth factor β1 (TGF-β1) showed no differences with regard to age or gender. Platelet counts and TGF-β1 concentration correlated closely, as did platelet counts and PDGF-AB levels. There were only rare correlations between leukocyte counts and PDGF-AB levels, but comparison of leukocyte counts and PDGF-AB levels demonstrated certain parallel tendencies. TGF-β1 levels derive in substantial part from platelets and emphasize the role of leukocytes, in addition to that of platelets, as a source of growth factors in PRP. All methods of producing PRP showed high variability in platelet counts and growth factor levels. The highest growth factor levels were found in the PRP prepared using the Platelet Concentrate Collection System manufactured by Biomet 3i.

  18. A conceptual guide to detection probability for point counts and other count-based survey methods

    Treesearch

    D. Archibald McCallum

    2005-01-01

    Accurate and precise estimates of numbers of animals are vitally needed both to assess population status and to evaluate management decisions. Various methods exist for counting birds, but most of those used with territorial landbirds yield only indices, not true estimates of population size. The need for valid density estimates has spawned a number of models for...

  19. Validation of analytical methods in GMP: the disposable Fast Read 102® device, an alternative practical approach for cell counting.

    PubMed

    Gunetti, Monica; Castiglia, Sara; Rustichelli, Deborah; Mareschi, Katia; Sanavio, Fiorella; Muraro, Michela; Signorino, Elena; Castello, Laura; Ferrero, Ivana; Fagioli, Franca

    2012-05-31

    The quality and safety of advanced therapy products must be maintained throughout their production and quality control cycle to ensure their final use in patients. We validated the cell count method according to the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use and European Pharmacopoeia, considering the tests' accuracy, precision, repeatability, linearity and range. As the cell count is a potency test, we checked accuracy, precision, and linearity, according to ICH Q2. Briefly our experimental approach was first to evaluate the accuracy of Fast Read 102® compared to the Bürker chamber. Once the accuracy of the alternative method was demonstrated, we checked the precision and linearity test only using Fast Read 102®. The data were statistically analyzed by average, standard deviation and coefficient of variation percentages inter and intra operator. All the tests performed met the established acceptance criteria of a coefficient of variation of less than ten percent. For the cell count, the precision reached by each operator had a coefficient of variation of less than ten percent (total cells) and under five percent (viable cells). The best range of dilution, to obtain a slope line value very similar to 1, was between 1:8 and 1:128. Our data demonstrated that the Fast Read 102® count method is accurate, precise and ensures the linearity of the results obtained in a range of cell dilution. Under our standard method procedures, this assay may thus be considered a good quality control method for the cell count as a batch release quality control test. Moreover, the Fast Read 102® chamber is a plastic, disposable device that allows a number of samples to be counted in the same chamber. Last but not least, it overcomes the problem of chamber washing after use and so allows a cell count in a clean environment such as that in a Cell Factory. 
    In a good manufacturing practice setting, disposable cell counting devices allow a single use of the counting chamber, which can then be thrown away, thus avoiding the waste disposal of vital dye (e.g. Trypan Blue) or lysing solution (e.g. Tuerk solution).
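
    The acceptance criterion used throughout the validation is the coefficient of variation. A minimal sketch of the computation (sample standard deviation over the mean, expressed as a percentage):

```python
import statistics

def cv_percent(counts):
    # Coefficient of variation (%): sample standard deviation over the mean.
    # The validation accepted results with CV below 10% (total cells)
    # and below 5% (viable cells).
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)
```

    For example, replicate counts of 90, 100 and 110 give a mean of 100 with a sample standard deviation of 10, i.e. a CV of exactly 10%, right at the total-cell acceptance limit.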

  20. Image-based spectral distortion correction for photon-counting x-ray detectors

    PubMed Central

    Ding, Huanjun; Molloi, Sabee

    2012-01-01

    Purpose: To investigate the feasibility of using an image-based method to correct for distortions induced by various artifacts in the x-ray spectrum recorded with photon-counting detectors for their application in breast computed tomography (CT). Methods: The polyenergetic incident spectrum was simulated with the tungsten anode spectral model using the interpolating polynomials (TASMIP) code and carefully calibrated to match the x-ray tube in this study. Experiments were performed on a Cadmium-Zinc-Telluride (CZT) photon-counting detector with five energy thresholds. Energy bins were adjusted to evenly distribute the recorded counts above the noise floor. BR12 phantoms of various thicknesses were used for calibration. A nonlinear function was selected to fit the count correlation between the simulated and the measured spectra in the calibration process. To evaluate the proposed spectral distortion correction method, an empirical fitting derived from the calibration process was applied on the raw images recorded for polymethyl methacrylate (PMMA) phantoms of 8.7, 48.8, and 100.0 mm. Both the corrected counts and the effective attenuation coefficient were compared to the simulated values for each of the five energy bins. The feasibility of applying the proposed method to quantitative material decomposition was tested using a dual-energy imaging technique with a three-material phantom that consisted of water, lipid, and protein. The performance of the spectral distortion correction method was quantified using the relative root-mean-square (RMS) error with respect to the expected values from simulations or areal analysis of the decomposition phantom. Results: The implementation of the proposed method reduced the relative RMS error of the output counts in the five energy bins with respect to the simulated incident counts from 23.0%, 33.0%, and 54.0% to 1.2%, 1.8%, and 7.7% for 8.7, 48.8, and 100.0 mm PMMA phantoms, respectively. 
The accuracy of the effective attenuation coefficient of PMMA estimate was also improved with the proposed spectral distortion correction. Finally, the relative RMS error of water, lipid, and protein decompositions in dual-energy imaging was significantly reduced from 53.4% to 6.8% after correction was applied. Conclusions: The study demonstrated that dramatic distortions in the recorded raw image yielded from a photon-counting detector could be expected, which presents great challenges for applying the quantitative material decomposition method in spectral CT. The proposed semi-empirical correction method can effectively reduce these errors caused by various artifacts, including pulse pileup and charge sharing effects. Furthermore, rather than detector-specific simulation packages, the method requires a relatively simple calibration process and knowledge about the incident spectrum. Therefore, it may be used as a generalized procedure for the spectral distortion correction of different photon-counting detectors in clinical breast CT systems. PMID:22482608
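
    The figure of merit quoted above, the relative root-mean-square error of recorded bin counts against the expected (simulated) values, is straightforward to compute. A minimal sketch in Python:

```python
import math

def relative_rms_error(measured, expected):
    # Relative root-mean-square error (%) of measured energy-bin counts
    # against the expected incident counts: the per-bin relative errors
    # are squared, averaged, and square-rooted.
    sq = [((m - e) / e) ** 2 for m, e in zip(measured, expected)]
    return 100.0 * math.sqrt(sum(sq) / len(sq))
```

    For instance, two bins measured at 110 and 90 against expected values of 100 each have relative errors of +10% and -10%, giving a relative RMS error of 10%.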

  1. A critical evaluation of a flow cytometer used for detecting enterococci in recreational waters.

    PubMed

    King, Dawn N; Brenner, Kristen P; Rodgers, Mark R

    2007-06-01

    The current U. S. Environmental Protection Agency-approved method for enterococci (Method 1600) in recreational water is a membrane filter (MF) method that takes 24 hours to obtain results. If the recreational water is not in compliance with the standard, the risk of exposure to enteric pathogens may occur before the water is identified as hazardous. Because flow cytometry combined with specific fluorescent antibodies has the potential to be used as a rapid detection method for microorganisms, this technology was evaluated as a rapid, same-day method to detect enterococci in bathing beach waters. The flow cytometer chosen for this study was a laser microbial detection system designed to detect labeled antibodies. A comparison of MF counts with flow cytometry counts of enterococci in phosphate buffer and sterile-filtered recreational water showed good agreement between the two methods. However, when flow cytometry was used, the counts were several orders of magnitude higher than the MF counts with no correlation to Enterococcus spike concentrations. The unspiked sample controls frequently had higher counts than the samples spiked with enterococci. Particles within the spiked water samples were probably counted as target cells by the flow cytometer because of autofluorescence or non-specific adsorption of antibody and carryover to subsequent samples. For these reasons, this technology may not be suitable for enterococci detection in recreational waters. Improvements in research and instrument design that will eliminate high background and carryover may make this a viable technology in the

  2. Application of the backward extrapolation method to pulsed neutron sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talamo, Alberto; Gohar, Yousry

    Particle detectors operated in pulse mode are subject to the dead-time effect. When the average detector count rate is constant over time, correcting for the dead-time effect is simple and can be accomplished by analytical formulas. However, when the average detector count rate changes over time, it is more difficult to take the dead-time effect into account. When a subcritical nuclear assembly is driven by a pulsed neutron source, simple analytical formulas cannot be applied to the measured detector counts to correct for the dead-time effect because of the sharp change in the detector counts over time. This work addresses this issue by using the backward extrapolation method, which can be applied not only to a continuous external neutron source (e.g., californium) but also to a pulsed external neutron source (e.g., a particle accelerator) driving a subcritical nuclear assembly. Finally, the backward extrapolation method makes it possible to obtain from the measured detector counts both the dead-time value and the real detector counts.

  3. Application of the backward extrapolation method to pulsed neutron sources

    DOE PAGES

    Talamo, Alberto; Gohar, Yousry

    2017-09-23

    Particle detectors operated in pulse mode are subject to the dead-time effect. When the average detector count rate is constant over time, correcting for the dead-time effect is simple and can be accomplished by analytical formulas. However, when the average detector count rate changes over time, it is more difficult to take the dead-time effect into account. When a subcritical nuclear assembly is driven by a pulsed neutron source, simple analytical formulas cannot be applied to the measured detector counts to correct for the dead-time effect because of the sharp change in the detector counts over time. This work addresses this issue by using the backward extrapolation method, which can be applied not only to a continuous external neutron source (e.g., californium) but also to a pulsed external neutron source (e.g., a particle accelerator) driving a subcritical nuclear assembly. Finally, the backward extrapolation method makes it possible to obtain from the measured detector counts both the dead-time value and the real detector counts.
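
    For contrast with the pulsed-source case the paper addresses, the simple analytical correction that works when the count rate is constant can be sketched as follows. This is the standard non-paralyzable dead-time formula, not the backward extrapolation method itself:

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    # Standard non-paralyzable dead-time correction, n = m / (1 - m * tau),
    # where m is the measured rate and tau the dead time. It is valid only
    # when the count rate is constant in time; for a pulsed source the rate
    # changes sharply, which is why a method such as backward extrapolation
    # is needed instead.
    return measured_rate / (1.0 - measured_rate * dead_time)
```

    For example, a measured rate of 10⁵ counts/s with a 1 μs dead time implies the detector was busy 10% of the time, so the corrected rate is 10⁵/0.9 ≈ 1.11 × 10⁵ counts/s.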

  4. Clustering method for counting passengers getting in a bus with single camera

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Zhang, Yanning; Shao, Dapei; Li, Ying

    2010-03-01

    Automatic counting of passengers is very important for both business and security applications. We present a single-camera-based vision system that is able to count passengers in a highly crowded situation at the entrance of a traffic bus. The unique characteristics of the proposed system include the following. First, a novel feature-point-tracking and online-clustering-based passenger counting framework, which performs much better than background-modeling- and foreground-blob-tracking-based methods. Second, a simple and highly accurate clustering algorithm is developed that projects the high-dimensional feature point trajectories into a 2-D feature space by their appearance and disappearance times and counts the number of people through online clustering. Finally, all test video sequences in the experiment are captured from a real traffic bus in Shanghai, China. The results show that the system can process two 320×240 video sequences at a frame rate of 25 fps simultaneously, and can count passengers reliably in various difficult scenarios with complex interaction and occlusion among people. The method achieves high accuracy rates up to 96.5%.
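    The core idea of the second contribution can be sketched briefly: each tracked feature point is reduced to its (appearance, disappearance) time pair, and nearby pairs are grouped as one passenger. A minimal illustration, assuming a greedy online clustering (the tolerance and data are illustrative, not the paper's algorithm):

    ```python
    # Hedged sketch: cluster feature-point trajectories by their
    # (appearance, disappearance) times; each cluster ~ one passenger.
    def count_passengers(trajectories, tol=0.5):
        """trajectories: list of (t_appear, t_disappear) pairs in seconds.
        Greedy online clustering: a trajectory joins the first cluster whose
        centroid is within `tol` seconds in both coordinates."""
        clusters = []  # each cluster stores [sum_appear, sum_disappear, n]
        for a, d in trajectories:
            for c in clusters:
                if abs(c[0] / c[2] - a) < tol and abs(c[1] / c[2] - d) < tol:
                    c[0] += a; c[1] += d; c[2] += 1
                    break
            else:
                clusters.append([a, d, 1])
        return len(clusters)

    # Two people, three tracked feature points each:
    pts = [(1.0, 3.0), (1.1, 3.1), (0.9, 2.9),
           (5.0, 7.0), (5.2, 7.1), (4.9, 6.8)]
    print(count_passengers(pts))  # 2
    ```
    
    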

  5. Tutorial on Using Regression Models with Count Outcomes Using R

    ERIC Educational Resources Information Center

    Beaujean, A. Alexander; Morgan, Grant B.

    2016-01-01

    Education researchers often study count variables, such as times a student reached a goal, discipline referrals, and absences. Most researchers that study these variables use typical regression methods (i.e., ordinary least-squares) either with or without transforming the count variables. In either case, using typical regression for count data can…

  6. Sources and magnitude of sampling error in redd counts for bull trout

    Treesearch

    Jason B. Dunham; Bruce Rieman

    2001-01-01

    Monitoring of salmonid populations often involves annual redd counts, but the validity of this method has seldom been evaluated. We conducted redd counts of bull trout Salvelinus confluentus in two streams in northern Idaho to address four issues: (1) relationships between adult escapements and redd counts; (2) interobserver variability in redd...

  7. CORNAS: coverage-dependent RNA-Seq analysis of gene expression data without biological replicates.

    PubMed

    Low, Joel Z B; Khang, Tsung Fei; Tammi, Martti T

    2017-12-28

    In current statistical methods for calling differentially expressed genes in RNA-Seq experiments, the assumption is that an adjusted observed gene count represents an unknown true gene count. This adjustment usually consists of a normalization step to account for heterogeneous sample library sizes, and then the resulting normalized gene counts are used as input for parametric or non-parametric differential gene expression tests. A distribution of true gene counts, each with a different probability, can result in the same observed gene count. Importantly, sequencing coverage information is currently not explicitly incorporated into any of the statistical models used for RNA-Seq analysis. We developed a fast Bayesian method which uses the sequencing coverage information determined from the concentration of an RNA sample to estimate the posterior distribution of a true gene count. Our method has better or comparable performance compared to NOISeq and GFOLD, according to the results from simulations and experiments with real unreplicated data. We incorporated a previously unused sequencing coverage parameter into a procedure for differential gene expression analysis with RNA-Seq data. Our results suggest that our method can be used to overcome analytical bottlenecks in experiments with a limited number of replicates and low sequencing coverage. The method is implemented in CORNAS (Coverage-dependent RNA-Seq), and is available at https://github.com/joel-lzb/CORNAS .
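    The central idea above can be made concrete with a toy model (this is an illustrative sketch, not the CORNAS model): if each true transcript copy is sequenced with probability p (the coverage), an observed count k follows Binomial(n, p), and a flat prior over the true count n yields a simple posterior.

    ```python
    # Hedged sketch: posterior over the true count n given an observed
    # count k, assuming k ~ Binomial(n, p) and a flat prior on n.
    # Parameters and truncation n_max are illustrative.
    from math import comb

    def posterior_true_count(k, p, n_max=200):
        """Normalized P(n | k) for n = k..n_max."""
        weights = {n: comb(n, k) * p**k * (1 - p)**(n - k)
                   for n in range(k, n_max + 1)}
        total = sum(weights.values())
        return {n: w / total for n, w in weights.items()}

    post = posterior_true_count(k=10, p=0.3, n_max=150)
    print(max(post, key=post.get))  # posterior mode of the true count
    ```

    Low coverage p spreads this posterior widely, which is exactly why an observed count alone understates the uncertainty in the true count.
    
    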

  8. Method and apparatus to debug an integrated circuit chip via synchronous clock stop and scan

    DOEpatents

    Bellofatto, Ralph E [Ridgefield, CT; Ellavsky, Matthew R [Rochester, MN; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Gooding, Thomas M [Rochester, MN; Haring, Rudolf A [Cortlandt Manor, NY; Hehenberger, Lance G [Leander, TX; Ohmacht, Martin [Yorktown Heights, NY

    2012-03-20

    An apparatus and method for evaluating a state of an electronic or integrated circuit (IC), each IC including one or more processor elements for controlling operations of IC sub-units, and each IC supporting multiple frequency clock domains. The method comprises: generating a synchronized set of enable signals in correspondence with one or more IC sub-units for starting operation of one or more IC sub-units according to a determined timing configuration; counting, in response to one signal of the synchronized set of enable signals, a number of main processor IC clock cycles; and, upon attaining a desired clock cycle number, generating a stop signal for each unique frequency clock domain to synchronously stop a functional clock for each respective frequency clock domain; and, upon synchronously stopping all on-chip functional clocks on all frequency clock domains in a deterministic fashion, scanning out data values at a desired IC chip state. The apparatus and methodology enable construction of a cycle-by-cycle view of any part of the state of a running IC chip, using a combination of on-chip circuitry and software.

  9. Queries over Unstructured Data: Probabilistic Methods to the Rescue

    NASA Astrophysics Data System (ADS)

    Sarawagi, Sunita

    Unstructured data like emails, addresses, invoices, call transcripts, reviews, and press releases are now an integral part of any large enterprise. A challenge of modern business intelligence applications is analyzing and querying data seamlessly across structured and unstructured sources. This requires the development of automated techniques for extracting structured records from text sources and resolving entity mentions in data from various sources. The success of any automated method for extraction and integration depends on how effectively it unifies diverse clues in the unstructured source and in existing structured databases. We argue that statistical learning techniques like Conditional Random Fields (CRFs) provide an accurate, elegant, and principled framework for tackling these tasks. Given the inherent noise in real-world sources, it is important to capture the uncertainty of the above operations via imprecise data models. CRFs provide a sound probability distribution over extractions but are not easy to represent and query in a relational framework. We present methods of approximating this distribution to query-friendly row and column uncertainty models. Finally, we present models for representing the uncertainty of de-duplication and algorithms for various Top-K count queries on imprecise duplicates.

  10. Transition-Edge Sensor Pixel Parameter Design of the Microcalorimeter Array for the X-Ray Integral Field Unit on Athena

    NASA Technical Reports Server (NTRS)

    Smith, S. J.; Adams, J. S.; Bandler, S. R.; Betancourt-Martinez, G. L.; Chervenak, J. A.; Chiao, M. P.; Eckart, M. E.; Finkbeiner, F. M.; Kelley, R. L.; Kilbourne, C. A.; hide

    2016-01-01

    The focal plane of the X-ray integral field unit (X-IFU) for ESA's Athena X-ray observatory will consist of approximately 4000 transition edge sensor (TES) x-ray microcalorimeters optimized for the energy range of 0.2 to 12 kiloelectronvolts. The instrument will provide unprecedented spectral resolution of approximately 2.5 electronvolts at energies of up to 7 kiloelectronvolts and will accommodate photon fluxes of 1 milliCrab (90 counts per second) for point source observations. The baseline configuration is a uniform large pixel array (LPA) of 4.28-arcsecond pixels that is read out using frequency domain multiplexing (FDM). However, an alternative configuration under study incorporates an 18 × 18 small pixel array (SPA) of 2-arcsecond pixels in the central approximately 36-arcsecond region. This hybrid array configuration could be designed to accommodate higher fluxes of up to 10 milliCrabs (900 counts per second) or alternately for improved spectral performance (less than 1.5 electronvolts) at low count rates. In this paper we report on the TES pixel designs that are being optimized to meet these proposed LPA and SPA configurations. In particular we describe details of how important TES parameters are chosen to meet the specific mission criteria such as energy resolution, count rate, and quantum efficiency, and highlight performance trade-offs between designs. The basis of the pixel parameter selection is discussed in the context of existing TES arrays that are being developed for solar and x-ray astronomy applications. We describe the latest results on DC biased diagnostic arrays as well as large format kilo-pixel arrays and discuss the technical challenges associated with integrating different array types onto a single detector die.

  11. A gravimetric simplified method for nucleated marrow cell counting using an injection needle.

    PubMed

    Saitoh, Toshiki; Fang, Liu; Matsumoto, Kiyoshi

    2005-08-01

    A simplified gravimetric marrow cell counting method for rats is proposed as a regular screening method. After fresh bone marrow was aspirated with an injection needle, the marrow cells were suspended in carbonate-buffered saline. The nucleated marrow cell count (NMC) was measured by an automated multi-blood-cell analyzer. When this gravimetric method was applied to rats, the NMC of the left and right femurs had essentially identical values owing to careful handling. The NMC at 4 to 10 weeks of age in male and female Crj:CD(SD)IGS rats was 2.72 to 1.96 and 2.75 to 1.98 (×10⁶ counts/mg), respectively. More useful information for evaluation could be obtained by using this gravimetric method in addition to myelogram examination. However, some difficulties with this method include low NMC due to blood contamination and variation of NMC due to handling. Therefore, the utility of this gravimetric method for screening will be clarified by the accumulation of data from myelotoxicity studies with this method.

  12. Ozone and Interdisciplinary Science Teaching--Learning to Address the Things That Count Most.

    ERIC Educational Resources Information Center

    Hobson, Art

    1993-01-01

    Presents the ozone depletion story as an excellent case study for the integration of science-related social issues into the college science curriculum. Describes the history of ozone depletion and efforts to remedy the problem. Provides a lecture outline on ozone depletion. Discusses integrating other science-related interdisciplinary topics in…

  13. Three-dimensional computed tomographic volumetry precisely predicts the postoperative pulmonary function.

    PubMed

    Kobayashi, Keisuke; Saeki, Yusuke; Kitazawa, Shinsuke; Kobayashi, Naohiro; Kikuchi, Shinji; Goto, Yukinobu; Sakai, Mitsuaki; Sato, Yukio

    2017-11-01

    It is important to accurately predict the patient's postoperative pulmonary function. The aim of this study was to compare the accuracy of predictions of the postoperative residual pulmonary function obtained with three-dimensional computed tomographic (3D-CT) volumetry with that of predictions obtained with the conventional segment-counting method. Fifty-three patients scheduled to undergo lung cancer resection, pulmonary function tests, and computed tomography were enrolled in this study. The postoperative residual pulmonary function was predicted based on the segment-counting and 3D-CT volumetry methods. The predicted postoperative values were compared with the results of postoperative pulmonary function tests. Regarding the linear correlation coefficients between the predicted postoperative values and the measured values, those obtained using the 3D-CT volumetry method tended to be higher than those acquired using the segment-counting method. In addition, the variations between the predicted and measured values were smaller with the 3D-CT volumetry method than with the segment-counting method. These results were more obvious in COPD patients than in non-COPD patients. Our findings suggested that the 3D-CT volumetry was able to predict the residual pulmonary function more accurately than the segment-counting method, especially in patients with COPD. This method might lead to the selection of appropriate candidates for surgery among patients with a marginal pulmonary function.
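    The conventional segment-counting prediction referred to above is often written as predicted postoperative function = preoperative function × (1 − resected segments / total segments), with 19 the usual count of functioning lung segments. A minimal sketch of this commonly cited form (not the paper's 3D-CT volumetry method):

    ```python
    # Hedged sketch of the conventional segment-counting prediction,
    # used here as the comparator to 3D-CT volumetry. Values illustrative.
    def predicted_postop_fev1(preop_fev1, segments_resected, total_segments=19):
        """Predicted postoperative FEV1 (liters) after resecting
        `segments_resected` of `total_segments` functioning segments."""
        return preop_fev1 * (1 - segments_resected / total_segments)

    # Example: right upper lobectomy removes 3 segments
    print(round(predicted_postop_fev1(2.50, 3), 3))  # 2.105
    ```
    
    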

  14. Enumeration of total aerobic microorganisms in foods by SimPlate Total Plate Count-Color Indicator methods and conventional culture methods: collaborative study.

    PubMed

    Feldsine, Philip T; Leung, Stephanie C; Lienau, Andrew H; Mui, Linda A; Townsend, David E

    2003-01-01

    The relative efficacy of the SimPlate Total Plate Count-Color Indicator (TPC-CI) method (SimPlate 35 degrees C) was compared with the AOAC Official Method 966.23 (AOAC 35 degrees C) for enumeration of total aerobic microorganisms in foods. The SimPlate TPC-CI method, incubated at 30 degrees C (SimPlate 30 degrees C), was also compared with the International Organization for Standardization (ISO) 4833 method (ISO 30 degrees C). Six food types were analyzed: ground black pepper, flour, nut meats, frozen hamburger patties, frozen fruits, and fresh vegetables. All foods tested were naturally contaminated. Nineteen laboratories throughout North America and Europe participated in the study. Three method comparisons were conducted. In general, there was <0.3 mean log count difference in recovery among the SimPlate methods and their corresponding reference methods. Mean log counts between the 2 reference methods were also very similar. Repeatability (Sr) and reproducibility (SR) standard deviations were similar among the 3 method comparisons. The SimPlate method (35 degrees C) and the AOAC method were comparable for enumerating total aerobic microorganisms in foods. Similarly, the SimPlate method (30 degrees C) was comparable to the ISO method when samples were prepared and incubated according to the ISO method.

  15. Automated cell counts on CSF samples: A multicenter performance evaluation of the GloCyte system.

    PubMed

    Hod, E A; Brugnara, C; Pilichowska, M; Sandhaus, L M; Luu, H S; Forest, S K; Netterwald, J C; Reynafarje, G M; Kratz, A

    2018-02-01

    Automated cell counters have replaced manual enumeration of cells in blood and most body fluids. However, due to the unreliability of automated methods at very low cell counts, most laboratories continue to perform labor-intensive manual counts on many or all cerebrospinal fluid (CSF) samples. This multicenter clinical trial investigated if the GloCyte System (Advanced Instruments, Norwood, MA), a recently FDA-approved automated cell counter, which concentrates and enumerates red blood cells (RBCs) and total nucleated cells (TNCs), is sufficiently accurate and precise at very low cell counts to replace all manual CSF counts. The GloCyte System concentrates CSF and stains RBCs with fluorochrome-labeled antibodies and TNCs with nucleic acid dyes. RBCs and TNCs are then counted by digital image analysis. Residual adult and pediatric CSF samples obtained for clinical analysis at five different medical centers were used for the study. Cell counts were performed by the manual hemocytometer method and with the GloCyte System following the same protocol at all sites. The limits of the blank, detection, and quantitation, as well as precision and accuracy of the GloCyte, were determined. The GloCyte detected as few as 1 TNC/μL and 1 RBC/μL, and reliably counted as low as 3 TNCs/μL and 2 RBCs/μL. The total coefficient of variation was less than 20%. Comparison with cell counts obtained with a hemocytometer showed good correlation (>97%) between the GloCyte and the hemocytometer, including at very low cell counts. The GloCyte instrument is a precise, accurate, and stable system to obtain red cell and nucleated cell counts in CSF samples. It allows for the automated enumeration of even very low cell numbers, which is crucial for CSF analysis. These results suggest that GloCyte is an acceptable alternative to the manual method for all CSF samples, including those with normal cell counts. © 2017 John Wiley & Sons Ltd.

  16. Dynamic time-correlated single-photon counting laser ranging

    NASA Astrophysics Data System (ADS)

    Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang

    2018-03-01

    We demonstrate a photon counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10⁷ cps, which makes distance measurement possible even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily distill the signal when fast-moving targets are submerged in a strong background. We propose a dynamic TCSPC method for measuring fast-moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected with this method at an echo rate of 20% and background counts of more than 1.2×10⁷ cps.
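    The coincidence-window idea can be sketched as follows (an illustrative simplification of the abstract's method; the window width, delays, and update rule are assumptions): detections are kept only if they fall inside a window that tracks the predicted round-trip time, which is updated pulse-to-pulse for a moving target.

    ```python
    # Hedged sketch: coincidence filtering of single-photon detections
    # around a predicted echo delay. Numbers are illustrative.
    def coincidence_filter(arrival_times, predicted, window):
        """arrival_times: detection timestamps within one pulse period (s);
        predicted: expected echo delay (s); window: half-width (s).
        For a moving target, `predicted` is updated between pulses."""
        return [t for t in arrival_times if abs(t - predicted) <= window]

    # Background hits at 1.2 us and 7.5 us are rejected; echoes near
    # the predicted 3.0 us delay survive.
    hits = [1.2e-6, 3.0e-6, 3.02e-6, 7.5e-6]
    print(coincidence_filter(hits, predicted=3.0e-6, window=5e-8))
    ```
    
    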

  17. Steam Versus Hot-Water Scalding in Reducing Bacterial Loads on the Skin of Commercially Processed Poultry

    PubMed Central

    Patrick, Thomas E.; Goodwin, T. L.; Collins, J. A.; Wyche, R. C.; Love, B. E.

    1972-01-01

    A comparison of two types of scalders was conducted to determine their effectiveness in reducing bacterial contamination of poultry carcasses. A conventional hot-water scalder and a prototype model of a steam scalder were tested under commercial conditions. Total plate counts from steam-scalded birds were significantly lower than the counts of water-scalded birds immediately after scalding and again after picking. No differences in the two methods could be found after chilling. Coliform counts from steam-scalded birds were significantly lower than the counts from water-scalded birds immediately after scalding. No significant differences in coliform counts were detected when the two scald methods were compared after defeathering and chilling. PMID:4553146

  18. Evaluation of ICT filariasis card test using whole capillary blood: comparison with Knott's concentration and counting chamber methods.

    PubMed

    Njenga, S M; Wamae, C N

    2001-10-01

    An immunochromatographic card test (ICT) that uses fingerprick whole blood instead of serum for diagnosis of bancroftian filariasis has recently been developed. The card test was validated in the field in Kenya by comparing its sensitivity to the combined sensitivity of Knott's concentration and counting chamber methods. A total of 102 (14.6%) and 117 (16.7%) persons were found to be microfilaremic by Knott's concentration and counting chamber methods, respectively. The geometric mean intensities (GMI) were 74.6 microfilariae (mf)/ml and 256.5 mf/ml by Knott's concentration and counting chamber methods, respectively. All infected individuals detected by both Knott's concentration and counting chamber methods were also antigen positive by the ICT filariasis card test (100% sensitivity). Further, of 97 parasitologically amicrofilaremic persons, 24 (24.7%) were antigen positive by the ICT. The overall prevalence of antigenemia was 37.3%. Of 100 nonendemic area control persons, none was found to be filarial antigen positive (100% specificity). The results show that the new version of the ICT filariasis card test is a simple, sensitive, specific, and rapid test that is convenient in field settings.
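    The sensitivity and specificity figures quoted above follow from simple proportions of the reference counts, which can be sketched directly:

    ```python
    # Sensitivity/specificity as simple proportions, using the counts
    # reported in the abstract above.
    def sensitivity(true_pos, false_neg):
        return true_pos / (true_pos + false_neg)

    def specificity(true_neg, false_pos):
        return true_neg / (true_neg + false_pos)

    # All 117 counting-chamber microfilaremics were ICT-positive,
    # and none of the 100 nonendemic controls was positive:
    print(sensitivity(117, 0))   # 1.0
    print(specificity(100, 0))   # 1.0
    ```
    
    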

  19. Inspection Methods in Programming.

    DTIC Science & Technology

    1981-06-01

    Counting is a specialization of Iterative-generation in which the generating function is Oneplus and the initial input is 1. From the recoverable portion of the scanned abstract, the plan definition reads: TemporalPlan counting; specialization: iterative-generation; roles: .action (a function), .tail (counting); constraints: .action.op = oneplus and .action.input = 1.

  20. Performance Equivalence and Validation of the Soleris Automated System for Quantitative Microbial Content Testing Using Pure Suspension Cultures.

    PubMed

    Limberg, Brian J; Johnstone, Kevin; Filloon, Thomas; Catrenich, Carl

    2016-09-01

    Using United States Pharmacopeia-National Formulary (USP-NF) general method <1223> guidance, the Soleris® automated system and reagents (Nonfermenting Total Viable Count for bacteria and Direct Yeast and Mold for yeast and mold) were validated, using a performance equivalence approach, as an alternative to plate counting for total microbial content analysis using five representative microbes: Staphylococcus aureus, Bacillus subtilis, Pseudomonas aeruginosa, Candida albicans, and Aspergillus brasiliensis. Detection times (DTs) in the alternative automated system were linearly correlated to CFU/sample (R² = 0.94–0.97) with ≥70% accuracy per USP General Chapter <1223> guidance. The LOD and LOQ of the automated system were statistically similar to the traditional plate count method. This system was significantly more precise than plate counting (RSD 1.2–2.9% for DT, 7.8–40.6% for plate counts), was statistically comparable to plate counting with respect to variations in analyst, vial lots, and instruments, and was robust when variations in the operating detection thresholds (dTs; ±2 units) were used. The automated system produced accurate results, was more precise and less labor-intensive, and met or exceeded criteria for a valid alternative quantitative method, consistent with USP-NF general method <1223> guidance.

  1. Remodeling census population with spatial information from Landsat TM imagery

    USGS Publications Warehouse

    Yuan, Y.; Smith, R.M.; Limp, W.F.

    1997-01-01

    In geographic information systems (GIS) studies there has been some difficulty integrating socioeconomic and physiogeographic data. One important type of socioeconomic data, census data, offers a wide range of socioeconomic information, but is aggregated within arbitrary enumeration districts (EDs). Values reflect either raw counts or, when standardized, the mean densities in the EDs. On the other hand, remote sensing imagery, an important type of physiogeographic data, provides large quantities of information with more spatial detail than census data. Based on the dasymetric mapping principle, this study applies multivariable regression to examine the correlation between population counts from the census and land cover types. The land cover map is classified from Landsat TM imagery. The correlation is high. Census population counts are remodeled to a GIS raster layer based on the discovered correlations coupled with scaling techniques, which offset influences from other than land cover types. The GIS raster layer depicts the population distribution with much more spatial detail than census data offer. The resulting GIS raster layer is ready to be analyzed or integrated with other GIS data. © 1998 Elsevier Science Ltd. All rights reserved.
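    The regression step described above can be sketched in a few lines: regress ED population counts on the land-cover areas within each ED, then use the fitted per-class densities to redistribute population across raster cells. A minimal illustration with invented data (the districts, cover classes, and counts below are assumptions, not the study's data):

    ```python
    # Hedged sketch of the dasymetric-regression idea with toy data.
    import numpy as np

    # rows = enumeration districts; columns = area (km^2) of each
    # land-cover class inside that district (e.g. residential,
    # commercial, forest) -- illustrative values only
    areas = np.array([[4.0, 1.0, 0.5],
                      [2.0, 0.5, 3.0],
                      [6.0, 2.0, 1.0],
                      [1.0, 0.2, 5.0]])
    population = np.array([4200.0, 2100.0, 6500.0, 1100.0])

    # least-squares densities (people per km^2 of each cover class)
    density, *_ = np.linalg.lstsq(areas, population, rcond=None)
    predicted = areas @ density
    print(density)    # fitted per-class densities
    print(predicted)  # modeled ED populations
    ```

    In the study proper, scaling techniques then adjust these modeled values so each ED's total matches the census count.
    
    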

  2. Counted Sb donors in Si quantum dots

    NASA Astrophysics Data System (ADS)

    Singh, Meenakshi; Pacheco, Jose; Bielejec, Edward; Perry, Daniel; Ten Eyck, Gregory; Bishop, Nathaniel; Wendt, Joel; Luhman, Dwight; Carroll, Malcolm; Lilly, Michael

    2015-03-01

    Deterministic control over the location and number of donors is critical for donor spin qubits in semiconductor based quantum computing. We have developed techniques using a focused ion beam and a diode detector integrated next to a silicon MOS single electron transistor to gain such control. With the diode detector operating in linear mode, the numbers of ions implanted have been counted and single ion implants have been detected. Poisson statistics in the number of ions implanted have been observed. Transport measurements performed on samples with counted number of implants have been performed and regular coulomb blockade and charge offsets observed. The capacitances to various gates are found to be in agreement with QCAD simulations for an electrostatically defined dot. This work was performed, in part, at the Center for Integrated Nanotechnologies, a U.S. DOE Office of Basic Energy Sciences user facility. The work was supported by Sandia National Laboratories Directed Research and Development Program. Sandia National Laboratories is a multi-program laboratory operated by Sandia Corporation, a Lockheed-Martin Company, for the U. S. Department of Energy under Contract No. DE-AC04-94AL85000.

  3. Integrated Arrival and Departure Schedule Optimization Under Uncertainty

    NASA Technical Reports Server (NTRS)

    Xue, Min; Zelinski, Shannon

    2014-01-01

    In terminal airspace, integrating arrivals and departures with shared waypoints provides the potential of improving operational efficiency by allowing direct routes when possible. Incorporating stochastic evaluation as a post-analysis process of deterministic optimization, and imposing a safety buffer in deterministic optimization, are two ways to learn and alleviate the impact of uncertainty and to avoid unexpected outcomes. This work presents a third and direct way to take uncertainty into consideration during the optimization. The impact of uncertainty was incorporated into cost evaluations when searching for the optimal solutions. The controller intervention count was computed using a heuristic model and served as another stochastic cost besides total delay. Costs under uncertainty were evaluated using Monte Carlo simulations. The Pareto fronts that contain a set of solutions were identified and the trade-off between delays and controller intervention count was shown. Solutions that shared similar delays but had different intervention counts were investigated. The results showed that optimization under uncertainty could identify compromise solutions on Pareto fronts, which is better than deterministic optimization with extra safety buffers. It helps decision-makers reduce controller intervention while achieving low delays.
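    The Pareto-front extraction underlying this trade-off can be sketched simply: a solution survives if no other solution is at least as good in both objectives (total delay and intervention count, both minimized). The data below are illustrative, not from the study:

    ```python
    # Hedged sketch: extract the Pareto front from (delay, interventions)
    # pairs, both to be minimized. Toy values for illustration.
    def pareto_front(solutions):
        """Return the pairs not dominated by any other pair
        (another pair smaller-or-equal in both objectives)."""
        front = []
        for s in solutions:
            dominated = any(o[0] <= s[0] and o[1] <= s[1] and o != s
                            for o in solutions)
            if not dominated:
                front.append(s)
        return sorted(front)

    sols = [(10, 5), (12, 3), (11, 6), (15, 2), (10, 7)]
    print(pareto_front(sols))  # [(10, 5), (12, 3), (15, 2)]
    ```

    In the study, each pair's costs would themselves be Monte Carlo estimates rather than fixed numbers.
    
    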

  4. Label-free enumeration, collection and downstream cytological and cytogenetic analysis of circulating tumor cells.

    PubMed

    Dhar, Manjima; Pao, Edward; Renier, Corinne; Go, Derek E; Che, James; Montoya, Rosita; Conrad, Rachel; Matsumoto, Melissa; Heirich, Kyra; Triboulet, Melanie; Rao, Jianyu; Jeffrey, Stefanie S; Garon, Edward B; Goldman, Jonathan; Rao, Nagesh P; Kulkarni, Rajan; Sollier-Christen, Elodie; Di Carlo, Dino

    2016-10-14

    Circulating tumor cells (CTCs) have great potential as indicators of metastatic disease that may help physicians improve cancer prognostication, treatment and patient outcomes. Heterogeneous marker expression as well as the complexity of current antibody-based isolation and analysis systems highlights the need for alternative methods. In this work, we use a microfluidic Vortex device that can selectively isolate potential tumor cells from blood independent of cell surface expression. This system was adapted to interface with three protein-marker-free analysis techniques: (i) an in-flow automated image processing system to enumerate cells released, (ii) cytological analysis using Papanicolaou (Pap) staining and (iii) fluorescence in situ hybridization (FISH) targeting the ALK rearrangement. In-flow counting enables a rapid assessment of the cancer-associated large circulating cells in a sample within minutes to determine whether standard downstream assays such as cytological and cytogenetic analyses that are more time consuming and costly are warranted. Using our platform integrated with these workflows, we analyzed 32 non-small cell lung cancer (NSCLC) and 22 breast cancer patient samples; a cell count above the healthy threshold was found in 60 to 100% of the cancer patients, depending on the detection method used: 77.8% for automated enumeration, 60–100% for cytology, and 80% for immunostaining-based enumeration.

  5. Label-free enumeration, collection and downstream cytological and cytogenetic analysis of circulating tumor cells

    PubMed Central

    Dhar, Manjima; Pao, Edward; Renier, Corinne; Go, Derek E.; Che, James; Montoya, Rosita; Conrad, Rachel; Matsumoto, Melissa; Heirich, Kyra; Triboulet, Melanie; Rao, Jianyu; Jeffrey, Stefanie S.; Garon, Edward B.; Goldman, Jonathan; Rao, Nagesh P.; Kulkarni, Rajan; Sollier-Christen, Elodie; Di Carlo, Dino

    2016-01-01

    Circulating tumor cells (CTCs) have great potential as indicators of metastatic disease that may help physicians improve cancer prognostication, treatment and patient outcomes. Heterogeneous marker expression as well as the complexity of current antibody-based isolation and analysis systems highlights the need for alternative methods. In this work, we use a microfluidic Vortex device that can selectively isolate potential tumor cells from blood independent of cell surface expression. This system was adapted to interface with three protein-marker-free analysis techniques: (i) an in-flow automated image processing system to enumerate cells released, (ii) cytological analysis using Papanicolaou (Pap) staining and (iii) fluorescence in situ hybridization (FISH) targeting the ALK rearrangement. In-flow counting enables a rapid assessment of the cancer-associated large circulating cells in a sample within minutes to determine whether standard downstream assays such as cytological and cytogenetic analyses that are more time consuming and costly are warranted. Using our platform integrated with these workflows, we analyzed 32 non-small cell lung cancer (NSCLC) and 22 breast cancer patient samples; a cell count above the healthy threshold was found in 60 to 100% of the cancer patients, depending on the detection method used: 77.8% for automated enumeration, 60–100% for cytology, and 80% for immunostaining-based enumeration. PMID:27739521

  6. Real-time dynamic range and signal to noise enhancement in beam-scanning microscopy by integration of sensor characteristics, data acquisition hardware, and statistical methods

    NASA Astrophysics Data System (ADS)

    Kissick, David J.; Muir, Ryan D.; Sullivan, Shane Z.; Oglesbee, Robert A.; Simpson, Garth J.

    2013-02-01

    Despite the ubiquitous use of multi-photon and confocal microscopy measurements in biology, the core techniques typically suffer from fundamental compromises between signal to noise (S/N) and linear dynamic range (LDR). In this study, direct synchronous digitization of voltage transients coupled with statistical analysis is shown to allow S/N approaching the theoretical maximum throughout an LDR spanning more than 8 decades, limited only by the dark counts of the detector on the low end and by the intrinsic nonlinearities of the photomultiplier tube (PMT) detector on the high end. Synchronous digitization of each voltage transient represents a fundamental departure from established methods in confocal/multi-photon imaging, which are currently based on either photon counting or signal averaging. High information-density data acquisition (up to 3.2 GB/s of raw data) enables the smooth transition between the two modalities on a pixel-by-pixel basis and the ultimate writing of much smaller files (few kB/s). Modeling of the PMT response allows extraction of key sensor parameters from the histogram of voltage peak-heights. Applications in second harmonic generation (SHG) microscopy are described demonstrating S/N approaching the shot-noise limit of the detector over large dynamic ranges.
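    The pixel-by-pixel transition between photon counting and signal averaging that synchronous digitization enables can be sketched as follows. This is an illustrative simplification, not the authors' processing chain; the thresholds are assumptions:

    ```python
    # Hedged sketch: per-pixel switch between photon counting (low flux)
    # and averaging (high flux / pileup). Thresholds are illustrative.
    def pixel_signal(peak_heights, count_threshold=0.05, pileup_level=0.5):
        """peak_heights: digitized voltage-transient peaks for one pixel (V).
        Sparse events: count discrete photons above the dark-count
        threshold. Dense events (pileup): fall back to signal averaging."""
        mean_peak = sum(peak_heights) / len(peak_heights)
        if mean_peak < pileup_level:
            # counting regime: discrete events above threshold
            return sum(1 for v in peak_heights if v > count_threshold)
        # averaging regime: mean peak height as the signal estimate
        return mean_peak

    print(pixel_signal([0.0, 0.1, 0.0, 0.2]))  # 2 photons counted
    print(pixel_signal([0.8, 1.0]))            # mean peak (pileup regime)
    ```
    
    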

  7. ChIP-PaM: an algorithm to identify protein-DNA interaction using ChIP-Seq data.

    PubMed

    Wu, Song; Wang, Jianmin; Zhao, Wei; Pounds, Stanley; Cheng, Cheng

    2010-06-03

    ChIP-Seq is a powerful tool for identifying the interaction between genomic regulators and their bound DNAs, especially for locating transcription factor binding sites. However, high cost and high rate of false discovery of transcription factor binding sites identified from ChIP-Seq data significantly limit its application. Here we report a new algorithm, ChIP-PaM, for identifying transcription factor target regions in ChIP-Seq datasets. This algorithm makes full use of a protein-DNA binding pattern by capitalizing on three lines of evidence: 1) the tag count modelling at the peak position, 2) pattern matching of a specific tag count distribution, and 3) motif searching along the genome. A novel data-based two-step eFDR procedure is proposed to integrate the three lines of evidence to determine significantly enriched regions. Our algorithm requires no technical controls and efficiently discriminates falsely enriched regions from regions enriched by true transcription factor (TF) binding on the basis of ChIP-Seq data only. An analysis of real genomic data is presented to demonstrate our method. In a comparison with other existing methods, we found that our algorithm provides more accurate binding site discovery while maintaining comparable statistical power.

  8. Pixelized Device Control Actuators for Large Adaptive Optics

    NASA Technical Reports Server (NTRS)

    Knowles, Gareth J.; Bird, Ross W.; Shea, Brian; Chen, Peter

    2009-01-01

    A fully integrated, compact, adaptive space optic mirror assembly has been developed, incorporating new advances in ultralight, high-performance composite mirrors. The composite mirrors use Q-switch matrix architecture-based pixelized control (PMN-PT) actuators, which achieve high-performance, large adaptive optic capability, while reducing the weight of present adaptive optic systems. The self-contained, fully assembled, 11x11x4-in. (approx. 28x28x10 cm) unit integrates a very-high-performance 8-in. (approx. 20 cm) optic, and has 8-kHz true bandwidth. The assembled unit weighs less than 15 pounds (approx. 6.8 kg), including all mechanical assemblies, power electronics, control electronics, drive electronics, face sheet, wiring, and cabling. It requires just three wires to be attached (power, ground, and signal) for full-function systems integration, and uses a steel frame and epoxied electronics. The three main innovations are: 1. Ultralightweight composite optics: A new replication method for fabrication of very thin composite 20-cm-diameter laminate face sheets with good as-fabricated optical figure was developed. The approach is a new mandrel resin surface deposition onto previously fabricated thin composite laminates. 2. Matrix (regenerative) power topology: Waveform correction can be achieved across an entire face sheet at 6 kHz, even for large actuator counts. In practice, it was found to be better to develop a quadrant drive, that is, four quadrants of 169 actuators behind the face sheet. Each quadrant has a single, small, regenerative power supply driving all 169 actuators at 8 kHz in effective parallel. 3. Q-switch drive architecture: The Q-switch innovation is at the heart of the matrix architecture, and allows for a very fast current draw into a desired actuator element in 120 counts of a MHz clock without any actuator coupling.

  9. SURVIVAL OF SALMONELLA SPECIES IN RIVER WATER.

    EPA Science Inventory

    The survival of four Salmonella strains in river water microcosms was monitored using culturing techniques, direct counts, whole cell hybridization, scanning electron microscopy, and resuscitation techniques via the direct viable count method and flow cytometry. Plate counts of...

  10. Counting Dots.

    ERIC Educational Resources Information Center

    Repine, Tom; Hemler, Deb; Lane, Duane

    2003-01-01

    Presents a problem-solving investigation on coal mining that integrates science and mathematics with geology. Engages students in a scenario in which they play the roles of geologists and mining engineers. (NB)

  11. Smart fast blood counting of trace volumes of body fluids from various mammalian species using a compact custom-built microscope cytometer (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Smith, Zachary J.; Gao, Tingjuan; Lin, Tzu-Yin; Carrade-Holt, Danielle; Lane, Stephen M.; Matthews, Dennis L.; Dwyre, Denis M.; Wachsmann-Hogiu, Sebastian

    2016-03-01

    Cell counting in human body fluids such as blood, urine, and CSF is a critical step in the diagnostic process for many diseases. Current automated methods for cell counting are based on flow cytometry systems. However, these automated methods are bulky, costly, require significant user expertise, and are not well suited to counting cells in fluids other than blood. Therefore, their use is limited to large central laboratories that process enough volume of blood to recoup the significant capital investment these instruments require. We present in this talk a combination of (1) a low-cost microscope system, (2) a simple sample preparation method, and (3) fully automated analysis designed for providing cell counts in blood and body fluids. We show results on humans as well as companion and farm animals, showing that red cell, white cell, and platelet counts, as well as hemoglobin concentration, can be accurately obtained in blood, along with a 3-part white cell differential in human samples. We can also accurately count red and white cells in body fluids with a limit of detection ~3 orders of magnitude smaller than current automated instruments. This method uses less than 1 microliter of blood, and less than 5 microliters of body fluids to make its measurements, making it highly compatible with finger-stick style collections, as well as appropriate for small animals such as laboratory mice where larger volume blood collections are dangerous to the animal's health.

  12. Integrated micro-optofluidic platform for real-time detection of airborne microorganisms

    NASA Astrophysics Data System (ADS)

    Choi, Jeongan; Kang, Miran; Jung, Jae Hee

    2015-11-01

    We demonstrate an integrated micro-optofluidic platform for real-time, continuous detection and quantification of airborne microorganisms. Measurements of the fluorescence and light scattering from single particles in a microfluidic channel are used to determine the total particle number concentration and the microorganism number concentration in real-time. The system performance is examined by evaluating standard particle measurements with various sample flow rates and the ratios of fluorescent to non-fluorescent particles. To apply this method to real-time detection of airborne microorganisms, airborne Escherichia coli, Bacillus subtilis, and Staphylococcus epidermidis cells were introduced into the micro-optofluidic platform via bioaerosol generation, and a liquid-type particle collection setup was used. We demonstrate successful discrimination of SYTO82-dyed fluorescent bacterial cells from other residue particles in a continuous and real-time manner. In comparison with traditional microscopy cell counting and colony culture methods, this micro-optofluidic platform is not only more accurate in terms of the detection efficiency for airborne microorganisms but it also provides additional information on the total particle number concentration.

  13. Integrated micro-optofluidic platform for real-time detection of airborne microorganisms

    PubMed Central

    Choi, Jeongan; Kang, Miran; Jung, Jae Hee

    2015-01-01

    We demonstrate an integrated micro-optofluidic platform for real-time, continuous detection and quantification of airborne microorganisms. Measurements of the fluorescence and light scattering from single particles in a microfluidic channel are used to determine the total particle number concentration and the microorganism number concentration in real-time. The system performance is examined by evaluating standard particle measurements with various sample flow rates and the ratios of fluorescent to non-fluorescent particles. To apply this method to real-time detection of airborne microorganisms, airborne Escherichia coli, Bacillus subtilis, and Staphylococcus epidermidis cells were introduced into the micro-optofluidic platform via bioaerosol generation, and a liquid-type particle collection setup was used. We demonstrate successful discrimination of SYTO82-dyed fluorescent bacterial cells from other residue particles in a continuous and real-time manner. In comparison with traditional microscopy cell counting and colony culture methods, this micro-optofluidic platform is not only more accurate in terms of the detection efficiency for airborne microorganisms but it also provides additional information on the total particle number concentration. PMID:26522006

  14. Precision measurement of the photon detection efficiency of silicon photomultipliers using two integrating spheres.

    PubMed

    Yang, Seul Ki; Lee, J; Kim, Sug-Whan; Lee, Hye-Young; Jeon, Jin-A; Park, I H; Yoon, Jae-Ryong; Baek, Yang-Sik

    2014-01-13

    We report a new and improved photon counting method for precision photon detection efficiency (PDE) measurement of SiPM detectors, utilizing two integrating spheres connected in series and calibrated reference detectors. First, using a ray tracing simulation and irradiance measurements with a reference photodiode, we investigated the irradiance characteristics of the measurement instrument and analyzed the dominant systematic uncertainties in PDE measurement. Two SiPM detectors were then used for PDE measurements at wavelengths between 368 and 850 nm and at bias voltages around 70 V. The resulting PDEs of the SiPMs show good agreement with those from other studies, yet with an improved accuracy of 1.57% (1σ). This was achieved by simultaneous measurement with the NIST-calibrated reference detectors, which suppressed the time-dependent variation of the source light. The technical details of the instrumentation, the measurement results, and the uncertainty analysis are reported together with their implications.
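The core arithmetic of a PDE measurement of this kind is the ratio of the dark-corrected count rate to the incident photon rate inferred from the calibrated reference detector. A minimal sketch with hypothetical numbers (function names are mine; crosstalk and afterpulsing corrections, which a real measurement would need, are ignored):

```python
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s

def photon_rate(power_w, wavelength_m):
    """Incident photon rate implied by the optical power measured on a
    calibrated reference photodiode at a given wavelength: P / (h*c/lambda)."""
    return power_w * wavelength_m / (H * C)

def pde(counted_rate_hz, dark_rate_hz, power_w, wavelength_m):
    """Photon detection efficiency: dark-corrected SiPM count rate divided
    by the incident photon rate at the SiPM port."""
    return (counted_rate_hz - dark_rate_hz) / photon_rate(power_w, wavelength_m)

# e.g. 1 fW at 500 nm is roughly 2.5e3 photons/s:
rate = photon_rate(1e-15, 500e-9)
eff = pde(counted_rate_hz=350.0, dark_rate_hz=100.0,
          power_w=1e-15, wavelength_m=500e-9)
```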

  15. Method selection and adaptation for distributed monitoring of infectious diseases for syndromic surveillance.

    PubMed

    Xing, Jian; Burkom, Howard; Tokars, Jerome

    2011-12-01

    Automated surveillance systems require statistical methods to recognize increases in visit counts that might indicate an outbreak. In prior work we presented methods to enhance the sensitivity of C2, a commonly used time series method. In this study, we compared the enhanced C2 method with five regression models. We used emergency department chief complaint data from the US CDC BioSense surveillance system, aggregated by city (a total of 206 hospitals in 16 cities) during 5/2008-4/2009. Data for six syndromes (asthma, gastrointestinal, nausea and vomiting, rash, respiratory, and influenza-like illness) were used and were stratified by mean count (1-19, 20-49, ≥50 per day) into 14 syndrome-count categories. We compared the sensitivity for detecting single-day artificially added increases in syndrome counts. Four modifications of the C2 time series method and five regression models (two linear and three Poisson) were tested. A constant alert rate of 1% was used for all methods. Among the regression models tested, we found that a Poisson model controlling for the logarithm of total visits (i.e., visits both meeting and not meeting a syndrome definition), day of week, and 14-day time period was best. Among the 14 syndrome-count categories, the time series and regression methods produced approximately the same sensitivity (<5% difference) in six; in six others, the regression method had higher sensitivity (6-14% improvement), and in the remaining two the time series method had higher sensitivity. When automated data are aggregated to the city level, a Poisson regression model that controls for total visits produces the best overall sensitivity for detecting artificially added visit counts. This improvement was achieved without increasing the alert rate, which was held constant at 1% for all methods. These findings will improve our ability to detect outbreaks in automated surveillance system data. Published by Elsevier Inc.
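The abstract does not spell out the enhanced C2 variant, but the baseline it builds on (the EARS C2 statistic) is simple: compare today's count with the mean and standard deviation of a 7-day baseline separated from today by a 2-day guard band, and alert when the standardized excess is large. A minimal sketch (the window sizes and the 0.5 standard-deviation floor are conventional EARS choices, not parameters taken from this study):

```python
import statistics

def c2_statistic(counts, t, baseline=7, lag=2):
    """EARS C2 test statistic for day t: today's count compared with the
    mean and standard deviation of a 7-day baseline ending 2 days ago."""
    window = counts[t - lag - baseline : t - lag]
    mu = statistics.mean(window)
    sigma = max(statistics.stdev(window), 0.5)  # floor avoids divide-by-zero
    return (counts[t] - mu) / sigma

# A flat series of 10 visits/day, then a single-day jump to 25:
series = [10] * 9 + [25]
stat = c2_statistic(series, 9)   # large positive value -> alert
```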

  16. Comparison of Kato-Katz thick-smear and McMaster egg counting method for the assessment of drug efficacy against soil-transmitted helminthiasis in school children in Jimma Town, Ethiopia.

    PubMed

    Bekana, Teshome; Mekonnen, Zeleke; Zeynudin, Ahmed; Ayana, Mio; Getachew, Mestawet; Vercruysse, Jozef; Levecke, Bruno

    2015-10-01

    There is a paucity of studies that compare drug efficacy estimates obtained by different diagnostic methods. We compared the efficacy of a single oral dose of albendazole (400 mg), measured as egg reduction rate, against soil-transmitted helminth infections in 210 school children (Jimma Town, Ethiopia) using both the Kato-Katz thick smear and the McMaster egg counting method. Our results indicate that differences in sensitivity and faecal egg counts did not imply a significant difference in egg reduction rate estimates. The choice of a diagnostic method to assess drug efficacy should not be based on sensitivity and faecal egg counts only. © The Author 2015. Published by Oxford University Press on behalf of Royal Society of Tropical Medicine and Hygiene. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
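The efficacy measure used here, the egg reduction rate, is usually computed from group mean egg counts before and after treatment. A minimal sketch with made-up counts (the arithmetic-mean formulation commonly recommended for such trials; the study's exact estimator may differ):

```python
def egg_reduction_rate(pre_epg, post_epg):
    """Group-level drug efficacy as the egg reduction rate (%):
    100 * (1 - mean post-treatment EPG / mean pre-treatment EPG)."""
    pre_mean = sum(pre_epg) / len(pre_epg)
    post_mean = sum(post_epg) / len(post_epg)
    return 100.0 * (1.0 - post_mean / pre_mean)

# Hypothetical counts (eggs per gram) before and after albendazole:
err = egg_reduction_rate([800, 1200, 400, 1600], [40, 120, 0, 80])
# mean 1000 EPG falls to 60 EPG, a 94% reduction
```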

  17. Quick counting method for estimating the number of viable microbes on food and food processing equipment.

    PubMed

    Winter, F H; York, G K; el-Nakhal, H

    1971-07-01

    A rapid method for estimating the extent of microbial contamination on food and on food processing equipment is described. Microbial cells are rinsed from food or swab samples with sterile diluent and concentrated on the surface of membrane filters. The filters are incubated on a suitable bacteriological medium for 4 hr at 30 C, heated at 105 C for 5 min, and stained. The membranes are then dried at 60 C for 15 min, rendered transparent with immersion oil, and examined microscopically. Data obtained by the rapid method were compared with counts of the same samples determined by the standard plate count method. Over 60 comparisons resulted in a correlation coefficient of 0.906. Because the rapid technique can provide reliable microbiological count information in extremely short times, it can be a most useful tool in the routine evaluation of microbial contamination of food processing facilities and for some foods.

  18. Reduction in Male Suicide Mortality Following the 2006 Russian Alcohol Policy: An Interrupted Time Series Analysis

    PubMed Central

    Chamlin, Mitchell B.; Andreev, Evgeny

    2013-01-01

    Objectives. We took advantage of a natural experiment to assess the impact on suicide mortality of a suite of Russian alcohol policies. Methods. We obtained suicide counts from anonymous death records collected by the Russian Federal State Statistics Service. We used autoregressive integrated moving average (ARIMA) interrupted time series techniques to model the effect of the alcohol policy (implemented in January 2006) on monthly male and female suicide counts between January 2000 and December 2010. Results. Monthly male and female suicide counts decreased during the period under study. Although the ARIMA analysis showed no impact of the policy on female suicide mortality, the results revealed an immediate and permanent reduction of about 9% in male suicides (Ln ω0 = −0.096; P = .01). Conclusions. Despite a recent decrease in mortality, rates of alcohol consumption and suicide in Russia remain among the highest in the world. Our analysis revealed that the 2006 alcohol policy in Russia led to a 9% reduction in male suicide mortality, meaning the policy was responsible for saving 4000 male lives annually that would otherwise have been lost to suicide. Together with recent similar findings elsewhere, our results suggest an important role for public health and other population level interventions, including alcohol policy, in reducing alcohol-related harm. PMID:24028249
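The reported intervention coefficient is on the log scale, so the percent change it implies comes from exponentiating. A quick check of the ~9% figure:

```python
import math

omega = -0.096                            # Ln omega_0 reported in the abstract
pct_change = (math.exp(omega) - 1) * 100  # percent change in monthly counts
# exp(-0.096) is about 0.908, i.e. roughly a 9% immediate, permanent
# reduction in male suicide counts
```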

  19. Establishment of HPC(R2A) for regrowth control in non-chlorinated distribution systems.

    PubMed

    Uhl, Wolfgang; Schaule, Gabriela

    2004-05-01

    Drinking water distributed without disinfection and without regrowth problems for many years may show bacterial regrowth when the residence time and/or temperature in the distribution system increases or when substrate and/or bacterial concentration in the treated water increases. An example of a regrowth event in a major German city is discussed. Regrowth of HPC bacteria occurred unexpectedly at the end of a very hot summer. No pathogenic or potentially pathogenic bacteria were identified. Increased residence times in the distribution system and temperatures up to 25 degrees C were identified as most probable causes and the regrowth event was successfully overcome by changing flow regimes and decreasing residence times. Standard plate counts of HPC bacteria using the spread plate technique on nutrient rich agar according to German Drinking Water Regulations (GDWR) had proven to be a very good indicator of hygienically safe drinking water and to demonstrate the effectiveness of water treatment. However, the method proved insensitive for early regrowth detection. Regrowth experiments in the lab and sampling of the distribution system during two summers showed that spread plate counts on nutrient-poor R2A agar after 7-day incubation yielded 100 to 200 times higher counts. Counts on R2A after 3-day incubation were three times less than after 7 days. As the precision of plate count methods is very poor for counts less than 10 cfu/plate, a method yielding higher counts is better suited to detect upcoming regrowth than a method yielding low counts. It is shown that for the identification of regrowth events HPC(R2A) gives a further margin of about 2 weeks for reaction before HPC(GDWR). Copyright 2003 Elsevier B.V.

  20. SURVIVAL OF SALMONELLA SPECIES IN RIVER WATER

    EPA Science Inventory

    The survival of four Salmonella strains in river water microcosms was monitored by culturing techniques, direct counts, whole-cell hybridization, scanning electron microscopy, and resuscitation techniques via the direct viable count method and flow cytometry. Plate counts of bact...

  1. Methods for evaluating the effects of environmental chemicals on human sperm production

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wyrobek, A.J.

    1982-04-20

    Sperm tests provide a direct and effective way of identifying chemical agents that induce spermatogenic damage in man. Four human sperm tests are available: sperm count, motility, morphology (seminal cytology), and the Y-body test. These sperm tests have numerous advantages over other approaches for assessing spermatogenic damage, and they have already been used to assess the effects of at least 85 different occupational, environmental, and drug-related chemical exposures. When carefully controlled, seminal cytology appears to be statistically more sensitive than the other human sperm tests and should be considered an integral part of semen analysis when assessing induced spermatogenic damage.

  2. Low-cost optical interconnect module for parallel optical data links

    NASA Astrophysics Data System (ADS)

    Noddings, Chad; Hirsch, Tom J.; Olla, M.; Spooner, C.; Yu, Jason J.

    1995-04-01

    We have designed, fabricated, and tested a prototype parallel ten-channel unidirectional optical data link. When scaled to production, we project that this technology will satisfy the following market penetration requirements: (1) up to 70 meters transmission distance, (2) at least 1 gigabyte/second data rate, and (3) 0.35 to 0.50 MByte/second volume selling price. These goals can be achieved by means of the assembly innovations described in this paper: a novel alignment method that is integrated with low-cost, few chip module packaging techniques, yielding high coupling and reducing the component count. Furthermore, high coupling efficiency increases projected reliability by reducing the driver's power requirements.

  3. Photon counting detector for the personal radiography inspection system "SIBSCAN"

    NASA Astrophysics Data System (ADS)

    Babichev, E. A.; Baru, S. E.; Grigoriev, D. N.; Leonov, V. V.; Oleynikov, V. P.; Porosev, V. V.; Savinov, G. A.

    2017-02-01

    X-ray detectors operating in the energy integrating mode are successfully used in many different applications. Nevertheless, direct photon counting detectors, which have superior parameters compared with integrating ones, are still rarely used. One of the reasons for this is the low value of the electrical signal generated by a detected photon. Silicon photomultiplier (SiPM) based scintillation counters have high detection efficiency, high electronic gain, and compact dimensions. This makes them a very attractive candidate to replace routinely used detectors in many fields. More than 10 years ago, a digital scanning radiography system based on a multistrip ionization chamber (MIC) was suggested at the Budker Institute of Nuclear Physics. The detector demonstrates excellent radiation resistance and parameter stability after 5 years of operation and imaging of up to 1000 persons per day. Currently, the installations operate at several Russian airports and at subway stations in some cities. At present we are designing a new detector operating in the photon counting mode, based on scintillator-SiPM assemblies, with parameters superior to those of the gas detector. This detector has close to zero noise, higher quantum efficiency, and a count rate capability of more than 5 MHz per channel (20% losses), which leads to better image quality and improved detection capability. The suggested detector technology could be extended to medical applications.

  4. The Ndynamics package—Numerical analysis of dynamical systems and the fractal dimension of boundaries

    NASA Astrophysics Data System (ADS)

    Avellar, J.; Duarte, L. G. S.; da Mota, L. A. C. P.; de Melo, N.; Skea, J. E. F.

    2012-09-01

    A set of Maple routines is presented, fully compatible with the new releases of Maple (14 and higher). The package deals with the numerical evolution of dynamical systems and provides flexible plotting of the results. The package also brings an initial conditions generator, a numerical solver manager, and a focusing set of routines that allow for better analysis of the graphical display of the results. The novelty of the previous version, an optional C interface, is maintained. This allows for fast numerical integration, even for the totally inexperienced Maple user, without any C expertise being required. Finally, the package provides the routines to calculate the fractal dimension of boundaries (via box counting). New version program summary. Program title: Ndynamics. Licensing provisions: none. Programming language: Maple, C. Computer: Intel(R) Core(TM) i3 CPU M330 @ 2.13 GHz. Operating system: Windows 7. RAM: 3.0 GB. Keywords: dynamical systems, box counting, fractal dimension, symbolic computation, differential equations, Maple. Classification: 4.3. Catalogue identifier of previous version: ADKH_v1_0. Journal reference of previous version: Comput. Phys. Commun. 119 (1999) 256. Does the new version supersede the previous version?: Yes. Nature of problem: Computation and plotting of numerical solutions of dynamical systems and the determination of the fractal dimension of the boundaries. Solution method: The default method of integration is a fifth-order Runge-Kutta scheme, but any method of integration present in the Maple system is available via an argument when calling the routine. A box counting [1] method is used to calculate the fractal dimension [2] of the boundaries. Reasons for the new version: The Ndynamics package met a demand of our research community for a flexible and friendly environment for analyzing dynamical systems. 
All the user has to do is create his/her own Maple session, with the system to be studied, and use the commands in the package to (for instance) calculate the fractal dimension of a certain boundary, without knowing or worrying about a single line of C programming. So the package combines the flexibility and friendly aspect of Maple with the fast and robust numerical integration of compiled code (for example C). The package is old, but the problems it was designed to deal with are still there. Since Maple evolved, the package stopped working, and we felt compelled to produce this version, fully compatible with the latest release of Maple, to make it available to the Maple user again. Summary of revisions: Deprecated Maple packages and commands: Paraphrasing the Maple built-in help files, "Some Maple commands and packages are deprecated. A command (or package) is deprecated when its functionality has been replaced by an improved implementation. The newer command is said to supersede the older one, and use of the newer command is strongly recommended". We therefore examined our code for occurrences that could be dangerous for it. For example, the "readlib" command is unnecessary, and we have removed its occurrences from our code. We have checked and changed all the necessary commands in order to be safe from this source of danger. Another change concerns the tools we implemented to perform the numerical integration externally in C, via the Maple command "ssystem". In the past we used the DJGPP system for the external C integration; now we present the package with the (free) Borland distribution. The compilation commands have changed slightly: for example, to compile only, we had used "gcc -c"; now we use "bcc32 -c", etc. The Borland installation is explained in a "README" file submitted here to help the potential user. 
Restrictions: Besides the inherent restrictions of numerical integration methods, this version of the package only deals with systems of first-order differential equations. Unusual features: This package provides user-friendly software tools for analyzing the character of a dynamical system, whether it displays chaotic behaviour, and so on. Options within the package allow the user to specify characteristics that separate the trajectories into families of curves. In conjunction with the facilities for altering the user's viewpoint, this provides a graphical interface for the speedy and easy identification of regions with interesting dynamics. An unusual characteristic of the package is its interface for performing the numerical integrations in C using a fifth-order Runge-Kutta method (default). This potentially improves the speed of the numerical integration by some orders of magnitude and, in cases where it is necessary to calculate thousands of graphs in regions of difficult integration, this feature is very desirable. Besides that tool, more experienced users can produce their own C integrator and, using the commands available in the package, use it in place of the one provided, as long as the new integrator manages input and output in the same format as the default one. Running time: This depends strongly on the dynamical system. With an Intel® Core™ i3 CPU M330 @ 2.13 GHz, the integration of 50 graphs for a system of two first-order equations typically takes less than a second (with the C integration interface); without the C interface, it takes a few seconds. To calculate the fractal dimension, where we typically use 10,000 points to integrate, the C interface takes 20 to 30 s. Without the C interface the calculation becomes impractical, sometimes taking almost an hour for the same case, and many hours for some cases.
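The box-counting estimate of a fractal dimension used by the package can be sketched independently of Maple: cover the point set with boxes of side eps, count the occupied boxes N(eps) at several scales, and fit the slope of log N(eps) against log(1/eps). A minimal NumPy sketch (function names are mine, not the package's routines):

```python
import numpy as np

def box_counting_dimension(points, epsilons):
    """Estimate the box-counting dimension of a set of 2-D points: count
    occupied boxes N(eps) at several box sizes eps and fit the slope of
    log N(eps) against log(1/eps)."""
    points = np.asarray(points, dtype=float)
    counts = []
    for eps in epsilons:
        boxes = np.unique(np.floor(points / eps), axis=0)  # occupied boxes
        counts.append(len(boxes))
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(epsilons)), np.log(counts), 1)
    return slope

# A filled unit square sampled on a grid should give a dimension near 2:
step = 1.0 / 64
xs, ys = np.meshgrid(np.arange(0, 1, step), np.arange(0, 1, step))
pts = np.column_stack([xs.ravel(), ys.ravel()])
dim = box_counting_dimension(pts, [0.5, 0.25, 0.125, 0.0625])
```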

  5. Conversion from Engineering Units to Telemetry Counts on Dryden Flight Simulators

    NASA Technical Reports Server (NTRS)

    Fantini, Jay A.

    1998-01-01

    Dryden real-time flight simulators encompass the simulation of pulse code modulation (PCM) telemetry signals. This paper presents a new method whereby the calibration polynomial (from first to sixth order), representing the conversion from counts to engineering units (EU), is numerically inverted in real time. The result is an error of less than one count for valid EU inputs. The Newton-Raphson method is used to numerically invert the polynomial. A reverse linear interpolation between the EU limits is used to obtain an initial value for the desired telemetry count. The method presented here is not new. What is new is how classical numerical techniques are optimized to take advantage of modern computer power to perform the desired calculations in real time. This technique makes the method simple to understand and implement. There are no interpolation tables to store in memory as in traditional methods. The NASA F-15 simulation converts and transmits over 1000 parameters at 80 times/sec. This paper presents algorithm development, FORTRAN code, and performance results.
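The inversion described can be sketched in a few lines (Python stands in for the paper's FORTRAN, and the calibration coefficients and count range below are hypothetical): start from a reverse linear interpolation between the EU values at the count limits, then iterate Newton-Raphson until the residual corresponds to less than half a count. A monotonic calibration polynomial is assumed so the derivative never vanishes on the interval.

```python
def eu_to_counts(eu, cal, count_min=0, count_max=1023, tol=0.5, max_iter=20):
    """Invert a counts-to-EU calibration polynomial with Newton-Raphson.
    `cal` holds coefficients of cal[0] + cal[1]*x + cal[2]*x**2 + ...
    The initial guess is a reverse linear interpolation between the EU
    values at the count limits."""
    def poly(x):
        return sum(c * x**i for i, c in enumerate(cal))
    def dpoly(x):
        return sum(i * c * x**(i - 1) for i, c in enumerate(cal) if i > 0)

    eu_lo, eu_hi = poly(count_min), poly(count_max)
    x = count_min + (eu - eu_lo) * (count_max - count_min) / (eu_hi - eu_lo)
    for _ in range(max_iter):
        err = poly(x) - eu
        if abs(err) < tol * abs(dpoly(x)):   # residual under half a count
            break
        x -= err / dpoly(x)                  # Newton-Raphson step
    return round(x)

# Hypothetical quadratic calibration: EU = 1.0 + 0.05*counts + 1e-5*counts**2
counts = eu_to_counts(30.0, [1.0, 0.05, 1e-5])
```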

  6. Comparison of a new GIS-based technique and a manual method for determining sinkhole density: An example from Illinois' sinkhole plain

    USGS Publications Warehouse

    Angel, J.C.; Nelson, D.O.; Panno, S.V.

    2004-01-01

    A new Geographic Information System (GIS) method was developed as an alternative to the hand-counting of sinkholes on topographic maps for density and distribution studies. Sinkhole counts were prepared by hand and compared to those generated from USGS DLG data using ArcView 3.2 and the ArcInfo Workstation component of ArcGIS 8.1 software. The study area for this investigation, chosen for its great density of sinkholes, included the 42 public land survey sections that reside entirely within the Renault Quadrangle in southwestern Illinois. Differences between the sinkhole counts derived from the two methods for the Renault Quadrangle study area were negligible. Although the initial development and refinement of the GIS method required considerably more time than counting sinkholes by hand, the flexibility of the GIS method is expected to provide significant long-term benefits and time savings when mapping larger areas and expanding research efforts. © 2004 by The National Speleological Society.

  7. In situ DNA hybridized chain reaction (FISH-HCR) as a better method for quantification of bacteria and archaea within marine sediment

    NASA Astrophysics Data System (ADS)

    Buongiorno, J.; Lloyd, K. G.; Shumaker, A.; Schippers, A.; Webster, G.; Weightman, A.; Turner, S.

    2015-12-01

    Nearly 75% of the Earth's surface is covered by marine sediment that is home to an estimated 2.9 x 10^29 microbial cells. A substantial impediment to understanding the abundance and distribution of cells within marine sediment is the lack of a consistent and reliable method for their taxon-specific quantification. Catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) provides taxon-specific enumeration, but this process requires passing a large enzyme through cell membranes, decreasing its precision relative to general cell counts using a small DNA stain. In 2015, Yamaguchi et al. developed FISH hybridization chain reaction (FISH-HCR) as an in situ whole-cell detection method for environmental microorganisms. FISH-HCR amplifies the fluorescent signal, as does CARD-FISH, but it allows for milder cell permeation methods that might prevent yield loss. To compare FISH-HCR to CARD-FISH, we examined bacterial and archaeal cell counts within two sediment cores, Lille Belt (~78 meters deep) and Landsort Deep (90 meters deep), which were retrieved from the Baltic Sea Basin during IODP Expedition 347. Preliminary analysis shows that CARD-FISH counts are below the quantification limit for most depths across both cores. By contrast, quantification of cells was possible with FISH-HCR at all examined depths. When quantification with CARD-FISH was above the limit of detection, counts with FISH-HCR were up to 11-fold higher for Bacteria and 3-fold higher for Archaea from the same sediment sample. Further, FISH-HCR counts closely follow the trends of shipboard counts, indicating that FISH-HCR may better reflect the cellular abundance within marine sediment than other quantification methods, including qPCR. Using FISH-HCR, we found that archaeal cell counts were on average greater than bacterial cell counts, but within the same order of magnitude.

  8. Comparison of McMaster and FECPAKG2 methods for counting nematode eggs in the faeces of alpacas.

    PubMed

    Rashid, Mohammed H; Stevenson, Mark A; Waenga, Shea; Mirams, Greg; Campbell, Angus J D; Vaughan, Jane L; Jabbar, Abdul

    2018-05-02

This study aimed to compare the FECPAKG2 and McMaster techniques for counting gastrointestinal nematode eggs in the faeces of alpacas using two flotation solutions (saturated sodium chloride and saturated sucrose). Faecal egg counts from the two techniques were compared using Lin's concordance correlation coefficient and Bland–Altman statistics. Results showed moderate to good agreement between the two methods, with better agreement achieved when saturated sucrose was used as the flotation fluid, particularly when faecal egg counts were less than 1000 eggs per gram of faeces. To the best of our knowledge, this is the first study to assess agreement between the McMaster and FECPAKG2 methods for estimating faecal egg counts in South American camelids.
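Lin's concordance correlation coefficient used above is straightforward to compute from paired measurements; a minimal sketch in Python (the paired egg counts below are invented for illustration, not data from the study):

```python
from statistics import mean

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two paired
    measurement series (e.g. faecal egg counts from two techniques)."""
    n = len(x)
    mx, my = mean(x), mean(y)
    # population (biased) variances and covariance, as in Lin (1989)
    sx2 = sum((xi - mx) ** 2 for xi in x) / n
    sy2 = sum((yi - my) ** 2 for yi in y) / n
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / n
    return 2 * sxy / (sx2 + sy2 + (mx - my) ** 2)

# Hypothetical paired counts (eggs per gram) from two techniques
mcmaster = [100, 250, 400, 800, 950]
fecpak   = [120, 230, 420, 760, 900]
print(round(lins_ccc(mcmaster, fecpak), 3))  # → 0.995
```

Unlike Pearson's r, the CCC is penalized by the `(mx - my)**2` location-shift term, so it measures agreement (closeness to the 45° line), not just linear association.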

  9. The diabetes nutrition education study randomized controlled trial: A comparative effectiveness study of approaches to nutrition in diabetes self-management education.

    PubMed

    Bowen, Michael E; Cavanaugh, Kerri L; Wolff, Kathleen; Davis, Dianne; Gregory, Rebecca P; Shintani, Ayumi; Eden, Svetlana; Wallston, Ken; Elasy, Tom; Rothman, Russell L

    2016-08-01

    To compare the effectiveness of different approaches to nutrition education in diabetes self-management education and support (DSME/S). We randomized 150 adults with type 2 diabetes to either certified diabetes educator (CDE)-delivered DSME/S with carbohydrate gram counting or the modified plate method versus general health education. The primary outcome was change in HbA1C over 6 months. At 6 months, HbA1C improved within the plate method [-0.83% (-1.29, -0.33), P<0.001] and carbohydrate counting [-0.63% (-1.03, -0.18), P=0.04] groups but not the control group [P=0.34]. Change in HbA1C from baseline between the control and intervention groups was not significant at 6 months (carbohydrate counting, P=0.36; modified plate method, P=0.08). In a pre-specified subgroup analysis of patients with a baseline HbA1C 7-10%, change in HbA1C from baseline improved in the carbohydrate counting [-0.86% (-1.47, -0.26), P=0.006] and plate method groups [-0.76% (-1.33, -0.19), P=0.01] compared to controls. CDE-delivered DSME/S focused on carbohydrate counting or the modified plate method improved glycemic control in patients with an initial HbA1C between 7 and 10%. Both carbohydrate counting and the modified plate method improve glycemic control as part of DSME/S. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  10. Accurate measurement of peripheral blood mononuclear cell concentration using image cytometry to eliminate RBC-induced counting error.

    PubMed

    Chan, Leo Li-Ying; Laverty, Daniel J; Smith, Tim; Nejad, Parham; Hei, Hillary; Gandhi, Roopali; Kuksin, Dmitry; Qiu, Jean

    2013-02-28

Peripheral blood mononuclear cells (PBMCs) have been widely researched in the fields of immunology, infectious disease, oncology, transplantation, hematological malignancy, and vaccine development. Specifically, in immunology research, PBMCs have been utilized to monitor concentration, viability, proliferation, and cytokine production from immune cells, which are critical for both clinical trials and biomedical research. The viability and concentration of isolated PBMCs are traditionally measured by manual counting with trypan blue (TB) using a hemacytometer. One of the common issues of PBMC isolation is red blood cell (RBC) contamination, which can depend on the donor sample and/or the technical skill of the operator. RBC contamination in a PBMC sample can introduce error into the measured concentration, which can propagate to subsequent experimental assays performed on these cells. To resolve this issue, an RBC lysing protocol can be used to eliminate the potential error caused by RBC contamination. In recent years, a rapid fluorescence-based image cytometry system has been utilized for bright-field and fluorescence imaging analysis of cellular characteristics (Nexcelom Bioscience LLC, Lawrence, MA). The Cellometer image cytometry system has demonstrated the capability of automated concentration and viability detection, in disposable counting chambers, of unpurified mouse splenocytes and PBMCs stained with acridine orange (AO) and propidium iodide (PI) under fluorescence detection.
In this work, we demonstrate the ability of the Cellometer image cytometry system to accurately measure PBMC concentration, despite RBC contamination, by comparing five different total PBMC counting methods: (1) manual counting of trypan blue-stained PBMCs in a hemacytometer, (2) manual counting of PBMCs in bright-field images, (3) manual counting of TB-stained PBMCs after acetic acid lysing of RBCs, (4) automated counting of PI-stained PBMCs after acetic acid lysing of RBCs, and (5) the AO/PI dual-staining method. The results show comparable total PBMC counts among all five methods, validating the AO/PI staining method for PBMC measurement by image cytometry. Copyright © 2012 Elsevier B.V. All rights reserved.
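The manual hemacytometer counts referenced above reduce to simple chamber arithmetic; a sketch assuming a standard improved Neubauer chamber (0.1 µL per large square) with hypothetical numbers, not figures from the paper:

```python
def hemacytometer_conc(total_cells, squares_counted, dilution_factor):
    """Cells/mL from a standard Neubauer hemacytometer: each large
    square holds 0.1 uL, so the mean count per square times 1e4 gives
    cells/mL, then multiply by the dilution factor."""
    mean_per_square = total_cells / squares_counted
    return mean_per_square * 1e4 * dilution_factor

# e.g. 180 cells counted over 4 large squares at a 1:2 trypan blue dilution
print(hemacytometer_conc(180, 4, 2))  # → 900000.0 cells/mL
```

RBC contamination inflates `total_cells` directly, which is why an unlysed, unstained manual count overestimates the true PBMC concentration.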

  11. Energy-correction photon counting pixel for photon energy extraction under pulse pile-up

    NASA Astrophysics Data System (ADS)

    Lee, Daehee; Park, Kyungjin; Lim, Kyung Taek; Cho, Gyuseong

    2017-06-01

A photon counting detector (PCD) has been proposed as an alternative to an energy-integrating detector (EID) in the medical imaging field due to its high resolution, high efficiency, and low noise. The PCD has expanded into a variety of fields such as spectral CT, k-edge imaging, and material decomposition owing to its capability to count incident photons and measure their energies. Nonetheless, pulse pile-up, which is a superimposition of pulses at the output of a charge sensitive amplifier (CSA) in each PC pixel, occurs more frequently as the X-ray flux increases due to the finite pulse processing time (PPT) of CSAs. Pulse pile-up induces not only a count loss but also distortion in the measured X-ray spectrum from each PC pixel, and thus it is a main constraint on the use of PCDs in high-flux X-ray applications. To minimize these effects, an energy-correction PC (ECPC) pixel is proposed to resolve pulse pile-up without cutting off the PPT by adding an energy correction logic (ECL) via a cross detection method (CDM). The ECPC pixel, with a size of 200×200 μm², was fabricated using a 6-metal 1-poly 0.18 μm CMOS process with a static power consumption of 7.2 μW/pixel. The maximum count rate of the ECPC pixel was extended to approximately three times that of a conventional PC pixel with a PPT of 500 ns. The X-ray spectrum of 90 kVp, filtered by a 3 mm Al filter, was measured as the X-ray tube current was increased using the CdTe and the ECPC pixel. As a result, the ECPC pixel dramatically reduced the energy spectrum distortion at 2 Mphotons/pixel/s when compared to that of a conventional PC pixel with the same 500 ns PPT.
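The count loss from pile-up at a finite pulse processing time is often approximated by the classic nonparalyzable dead-time relation m = n/(1 + nτ). A sketch using the 500 ns PPT quoted above; note the dead-time model is a textbook idealization of the count loss, not the ECPC correction itself:

```python
def observed_rate(true_rate, dead_time):
    """Mean recorded rate of an ideal nonparalyzable detector:
    m = n / (1 + n * tau). Captures count loss only; real pixels
    also suffer spectral distortion from summed pulse amplitudes."""
    return true_rate / (1 + true_rate * dead_time)

tau = 500e-9  # 500 ns pulse processing time, as in the abstract
for n in (1e5, 1e6, 2e6, 1e7):  # true photon rates (photons/pixel/s)
    m = observed_rate(n, tau)
    print(f"n={n:.0e}  m={m:.3e}  loss={1 - m / n:.1%}")
```

At 2 Mphotons/pixel/s with a 500 ns PPT, nτ = 1 and exactly half of the incident photons go unrecorded, which illustrates why this flux regime is the stress test chosen in the abstract.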

  12. Relationships among indoor, outdoor, and personal airborne Japanese cedar pollen counts.

    PubMed

    Yamamoto, Naomichi; Matsuki, Yuuki; Yokoyama, Hiromichi; Matsuki, Hideaki

    2015-01-01

Japanese cedar pollinosis (JCP) is an important illness caused by the inhalation of airborne allergenic cedar pollens, which are dispersed in the early spring throughout the Japanese islands. However, associations between pollen exposures and the prevalence or severity of allergic symptoms are largely unknown, due to a lack of understanding regarding personal pollen exposures in relation to indoor and outdoor concentrations. This study aims to examine the relationships among indoor, outdoor, and personal airborne Japanese cedar pollen counts. We conducted a 4-year monitoring campaign to quantify indoor, outdoor, and personal airborne cedar pollen counts, in which a personal passive settling sampler, previously validated against a volumetric sampler, was used to count airborne pollen grains. A total of 256 sets of indoor, outdoor, and personal samples (768 samples) were collected from 9 subjects. Medians of the seasonally-integrated indoor-to-outdoor, personal-to-outdoor, and personal-to-indoor ratios of airborne pollen counts measured for the 9 subjects were 0.08, 0.10, and 1.19, respectively. A stronger correlation was observed between the personal and indoor counts (r = 0.89) than between the personal and outdoor counts (r = 0.71), suggesting a potential inaccuracy in the use of outdoor counts as a basis for estimating personal exposures. The personal pollen counts differed substantially among the human subjects (49% geometric coefficient of variation), in part due to variability in the indoor counts, which were found to be major determinants of the personal pollen counts. The findings of this study highlight the need for pollen monitoring in proximity to human subjects to better understand the relationships between pollen exposures and the prevalence or severity of pollen allergy.

  13. Relationships among Indoor, Outdoor, and Personal Airborne Japanese Cedar Pollen Counts

    PubMed Central

    Yamamoto, Naomichi; Matsuki, Yuuki; Yokoyama, Hiromichi; Matsuki, Hideaki

    2015-01-01

Japanese cedar pollinosis (JCP) is an important illness caused by the inhalation of airborne allergenic cedar pollens, which are dispersed in the early spring throughout the Japanese islands. However, associations between pollen exposures and the prevalence or severity of allergic symptoms are largely unknown, due to a lack of understanding regarding personal pollen exposures in relation to indoor and outdoor concentrations. This study aims to examine the relationships among indoor, outdoor, and personal airborne Japanese cedar pollen counts. We conducted a 4-year monitoring campaign to quantify indoor, outdoor, and personal airborne cedar pollen counts, in which a personal passive settling sampler, previously validated against a volumetric sampler, was used to count airborne pollen grains. A total of 256 sets of indoor, outdoor, and personal samples (768 samples) were collected from 9 subjects. Medians of the seasonally-integrated indoor-to-outdoor, personal-to-outdoor, and personal-to-indoor ratios of airborne pollen counts measured for the 9 subjects were 0.08, 0.10, and 1.19, respectively. A stronger correlation was observed between the personal and indoor counts (r = 0.89) than between the personal and outdoor counts (r = 0.71), suggesting a potential inaccuracy in the use of outdoor counts as a basis for estimating personal exposures. The personal pollen counts differed substantially among the human subjects (49% geometric coefficient of variation), in part due to variability in the indoor counts, which were found to be major determinants of the personal pollen counts. The findings of this study highlight the need for pollen monitoring in proximity to human subjects to better understand the relationships between pollen exposures and the prevalence or severity of pollen allergy. PMID:26110813

  14. Special Issue on "Instanton Counting: Moduli Spaces, Representation Theory, and Integrable Systems"

    NASA Astrophysics Data System (ADS)

    Bruzzo, Ugo; Sala, Francesco

    2016-11-01

This special issue of the Journal of Geometry and Physics collects some papers that were presented during the workshop "Instanton Counting: Moduli Spaces, Representation Theory, and Integrable Systems" that took place at the Lorentz Center in Leiden, The Netherlands, from 16 to 20 June 2014. The workshop was supported by the Lorentz Center, the "Geometry and Quantum Theory" Cluster, Centre Européen pour les Mathématiques, la Physique et leurs Interactions (Lille, France), Laboratoire Angevin de Recherche en Mathématiques (Angers, France), SISSA (Trieste, Italy), and Foundation Compositio (Amsterdam, the Netherlands). We deeply thank all these institutions for making the workshop possible. We also thank the other organizers of the workshop, Professors Dimitri Markushevich, Vladimir Rubtsov and Sergey Shadrin, for their efforts and great collaboration.

  15. Count distribution for mixture of two exponentials as renewal process duration with applications

    NASA Astrophysics Data System (ADS)

    Low, Yeh Ching; Ong, Seng Huat

    2016-06-01

A count distribution is presented by considering a renewal process where the distribution of the duration is a finite mixture of exponential distributions. This distribution is able to model overdispersion, a feature often found in observed count data. The computation of the probabilities and renewal function (expected number of renewals) are examined. Parameter estimation by the method of maximum likelihood is considered, with applications of the count distribution to real frequency count data exhibiting overdispersion. It is shown that the mixture-of-exponentials count distribution fits overdispersed data better than the Poisson process and serves as an alternative to the gamma count distribution.
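The overdispersion property is easy to check by simulation: when renewal durations are a two-component mixture of exponentials (a hyperexponential), the duration's coefficient of variation exceeds 1 and the resulting counts are overdispersed relative to a Poisson process. A sketch with invented parameters, not values from the paper:

```python
import random

def hyperexp_sample(rng, p, lam1, lam2):
    """Duration drawn from a two-component mixture of exponentials:
    rate lam1 with probability p, rate lam2 otherwise."""
    lam = lam1 if rng.random() < p else lam2
    return rng.expovariate(lam)

def renewal_count(rng, t, p, lam1, lam2):
    """Number of renewals in (0, t] with hyperexponential durations."""
    count, elapsed = 0, 0.0
    while True:
        elapsed += hyperexp_sample(rng, p, lam1, lam2)
        if elapsed > t:
            return count
        count += 1

rng = random.Random(1)
counts = [renewal_count(rng, 10.0, 0.3, 0.2, 2.0) for _ in range(5000)]
m = sum(counts) / len(counts)
v = sum((c - m) ** 2 for c in counts) / len(counts)
print(m, v, v / m)  # dispersion index v/m > 1: overdispersed vs Poisson
```

For a Poisson process the dispersion index is exactly 1; here the mixture makes durations highly variable (CV² ≈ 3.5 for these parameters), which inflates the count variance well above the mean.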

  16. Using multiple data types and integrated population models to improve our knowledge of apex predator population dynamics.

    PubMed

    Bled, Florent; Belant, Jerrold L; Van Daele, Lawrence J; Svoboda, Nathan; Gustine, David; Hilderbrand, Grant; Barnes, Victor G

    2017-11-01

Current management of large carnivores is informed using a variety of parameters, methods, and metrics; however, these data are typically considered independently. Sharing information among data types based on the underlying ecological processes, while recognizing observation biases, can improve estimation of individual and global parameters. We present a general integrated population model (IPM), specifically designed for brown bears (Ursus arctos), using three common data types for bear (U. spp.) populations: repeated counts, capture-mark-recapture, and litter size. We considered factors affecting ecological and observation processes for these data. We assessed the practicality of this approach on a simulated population and compared estimates from our model to values used for simulation and results from count data only. We then present a practical application of this general approach adapted to the constraints of a case study using historical data available for brown bears on Kodiak Island, Alaska, USA. The IPM provided more accurate and precise estimates than models accounting for repeated count data only, with credible intervals including the true population 94% and 5% of the time, respectively. For the Kodiak population, we estimated annual average litter size (within one year after birth) to vary between 0.45 [95% credible interval: 0.43; 0.55] and 1.59 [1.55; 1.82]. We detected a positive relationship between salmon availability and adult survival, with survival probabilities greater for females than males. Survival probabilities increased from cubs to yearlings to dependent young ≥2 years old and decreased with litter size. Linking multiple information sources based on ecological and observation mechanisms can provide more accurate and precise estimates, to better inform management. IPMs can also reduce data collection efforts by sharing information among agencies and management units. 
Our approach responds to an increasing need in bear populations' management and can be readily adapted to other large carnivores.
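The core IPM idea (several data types informing one demographic parameter through a joint likelihood) can be illustrated with a deliberately tiny toy model; all numbers and the model form are invented for illustration and are not the authors' brown bear IPM:

```python
import math

# Two data types that both depend on annual survival phi:
# capture-mark-recapture (binomial) and repeated counts (Poisson).
marked, survivors = 40, 34          # CMR: 34 of 40 marked bears survived
count_prev, count_now = 120, 103    # counts: N_t ~ Poisson(phi * N_{t-1})

def binom_loglik(phi):
    return survivors * math.log(phi) + (marked - survivors) * math.log(1 - phi)

def joint_loglik(phi):
    lam = phi * count_prev
    pois = count_now * math.log(lam) - lam  # Poisson log-lik, constant dropped
    return binom_loglik(phi) + pois

# Grid-search MLE of phi under the joint likelihood
grid = [i / 1000 for i in range(500, 1000)]
mle = max(grid, key=joint_loglik)
print(round(mle, 3))
```

The CMR data alone give phi ≈ 0.85 and the counts alone give phi ≈ 0.86; the joint likelihood lands between them and, because it pools both samples, its curvature at the maximum is larger, i.e. the estimate is more precise, which is the qualitative benefit the abstract reports.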

  17. Using multiple data types and integrated population models to improve our knowledge of apex predator population dynamics

    USGS Publications Warehouse

    Bled, Florent; Belant, Jerrold L.; Van Daele, Lawrence J.; Svoboda, Nathan; Gustine, David D.; Hilderbrand, Grant V.; Barnes, Victor G.

    2017-01-01

Current management of large carnivores is informed using a variety of parameters, methods, and metrics; however, these data are typically considered independently. Sharing information among data types based on the underlying ecological processes, while recognizing observation biases, can improve estimation of individual and global parameters. We present a general integrated population model (IPM), specifically designed for brown bears (Ursus arctos), using three common data types for bear (U. spp.) populations: repeated counts, capture–mark–recapture, and litter size. We considered factors affecting ecological and observation processes for these data. We assessed the practicality of this approach on a simulated population and compared estimates from our model to values used for simulation and results from count data only. We then present a practical application of this general approach adapted to the constraints of a case study using historical data available for brown bears on Kodiak Island, Alaska, USA. The IPM provided more accurate and precise estimates than models accounting for repeated count data only, with credible intervals including the true population 94% and 5% of the time, respectively. For the Kodiak population, we estimated annual average litter size (within one year after birth) to vary between 0.45 [95% credible interval: 0.43; 0.55] and 1.59 [1.55; 1.82]. We detected a positive relationship between salmon availability and adult survival, with survival probabilities greater for females than males. Survival probabilities increased from cubs to yearlings to dependent young ≥2 years old and decreased with litter size. Linking multiple information sources based on ecological and observation mechanisms can provide more accurate and precise estimates, to better inform management. IPMs can also reduce data collection efforts by sharing information among agencies and management units. 
Our approach responds to an increasing need in bear populations’ management and can be readily adapted to other large carnivores.

  18. Fabrication of X-ray Microcalorimeter Focal Planes Composed of Two Distinct Pixel Types.

    PubMed

    Wassell, E J; Adams, J S; Bandler, S R; Betancourt-Martinez, G L; Chiao, M P; Chang, M P; Chervenak, J A; Datesman, A M; Eckart, M E; Ewin, A J; Finkbeiner, F M; Ha, J Y; Kelley, R; Kilbourne, C A; Miniussi, A R; Sakai, K; Porter, F; Sadleir, J E; Smith, S J; Wakeham, N A; Yoon, W

    2017-06-01

We are developing superconducting transition-edge sensor (TES) microcalorimeter focal planes for versatility in meeting specifications of X-ray imaging spectrometers including high count-rate, high energy resolution, and large field-of-view. In particular, a focal plane composed of two sub-arrays: one of fine-pitch, high count-rate devices and the other of slower, larger pixels with similar energy resolution, offers promise for the next generation of astrophysics instruments, such as the X-ray Integral Field Unit (X-IFU) instrument on the European Space Agency's Athena mission. We have based the sub-arrays of our current design on successful pixel designs that have been demonstrated separately. Pixels with an all gold X-ray absorber on 50 and 75 micron scales where the Mo/Au TES sits atop a thick metal heatsinking layer have shown high resolution and can accommodate high count-rates. The demonstrated larger pixels use a silicon nitride membrane for thermal isolation, thinner Au and an added bismuth layer in a 250 micron square absorber. To tune the parameters of each sub-array requires merging the fabrication processes of the two detector types. We present the fabrication process for dual production of different X-ray absorbers on the same substrate, thick Au on the small pixels and thinner Au with a Bi capping layer on the larger pixels to tune their heat capacities. The process requires multiple electroplating and etching steps, but the absorbers are defined in a single ion milling step. We demonstrate methods for integrating heatsinking of the two types of pixel into the same focal plane consistent with the requirements for each sub-array, including the limiting of thermal crosstalk. We also discuss fabrication process modifications for tuning the intrinsic transition temperature (Tc) of the bilayers for the different device types through variation of the bilayer thicknesses. The latest results on these "hybrid" arrays will be presented.

  19. Fabrication of X-ray Microcalorimeter Focal Planes Composed of Two Distinct Pixel Types

    PubMed Central

    Wassell, E. J.; Adams, J. S.; Bandler, S. R.; Betancourt-Martinez, G. L.; Chiao, M. P.; Chang, M. P.; Chervenak, J. A.; Datesman, A. M.; Eckart, M. E.; Ewin, A. J.; Finkbeiner, F. M.; Ha, J. Y.; Kelley, R.; Kilbourne, C. A.; Miniussi, A. R.; Sakai, K.; Porter, F.; Sadleir, J. E.; Smith, S. J.; Wakeham, N. A.; Yoon, W.

    2017-01-01

    We are developing superconducting transition-edge sensor (TES) microcalorimeter focal planes for versatility in meeting specifications of X-ray imaging spectrometers including high count-rate, high energy resolution, and large field-of-view. In particular, a focal plane composed of two sub-arrays: one of fine-pitch, high count-rate devices and the other of slower, larger pixels with similar energy resolution, offers promise for the next generation of astrophysics instruments, such as the X-ray Integral Field Unit (X-IFU) instrument on the European Space Agency’s Athena mission. We have based the sub-arrays of our current design on successful pixel designs that have been demonstrated separately. Pixels with an all gold X-ray absorber on 50 and 75 micron scales where the Mo/Au TES sits atop a thick metal heatsinking layer have shown high resolution and can accommodate high count-rates. The demonstrated larger pixels use a silicon nitride membrane for thermal isolation, thinner Au and an added bismuth layer in a 250 micron square absorber. To tune the parameters of each sub-array requires merging the fabrication processes of the two detector types. We present the fabrication process for dual production of different X-ray absorbers on the same substrate, thick Au on the small pixels and thinner Au with a Bi capping layer on the larger pixels to tune their heat capacities. The process requires multiple electroplating and etching steps, but the absorbers are defined in a single ion milling step. We demonstrate methods for integrating heatsinking of the two types of pixel into the same focal plane consistent with the requirements for each sub-array, including the limiting of thermal crosstalk. We also discuss fabrication process modifications for tuning the intrinsic transition temperature (Tc) of the bilayers for the different device types through variation of the bilayer thicknesses. The latest results on these “hybrid” arrays will be presented. PMID:28804229

  20. An integrated electrolysis - electrospray - ionization antimicrobial platform using Engineered Water Nanostructures (EWNS) for food safety applications.

    PubMed

    Vaze, Nachiket; Jiang, Yi; Mena, Lucas; Zhang, Yipei; Bello, Dhimiter; Leonard, Stephen S; Morris, Anna M; Eleftheriadou, Mary; Pyrgiotakis, Georgios; Demokritou, Philip

    2018-03-01

Engineered water nanostructures (EWNS), synthesized utilizing electrospray and ionization of water, have recently been shown to be an effective, green antimicrobial platform for surface and air disinfection, where reactive oxygen species (ROS), generated and encapsulated within the particles during synthesis, were found to be the main inactivation mechanism. Herein, the antimicrobial potency of the EWNS was further enhanced by integrating electrolysis, electrospray, and ionization of de-ionized water in the EWNS synthesis process. Detailed physicochemical characterization of these enhanced EWNS (eEWNS) was performed using state-of-the-art analytical methods and has shown that, while both size and charge remain similar to the EWNS (mean diameter of 13 nm and charge of 13 electrons), they possess a three times higher ROS content. The increase of the ROS content as a result of the addition of the electrolysis step before electrospray and ionization led to an increased antimicrobial ability, as verified by E. coli inactivation studies using stainless steel coupons. It was shown that a 45-minute exposure to eEWNS resulted in a 4-log reduction, as opposed to a 1.9-log reduction when exposed to EWNS. In addition, the eEWNS were assessed for their potency to inactivate natural microbiota (total viable and yeast and mold counts), as well as inoculated E. coli, on the surface of fresh organic blackberries. The results showed a 97% (1.5-log) inactivation of the total viable count, a 99% (2-log) reduction in the yeast and mold count, and a 2.5-log reduction of the inoculated E. coli after 45 minutes of exposure, without any visual changes to the fruit. This enhanced antimicrobial activity further underpins the EWNS platform as an effective, dry, and chemical-free approach suitable for a variety of food safety applications, and it could be ideal for delicate fresh produce that cannot withstand the classical wet disinfection treatments.

  1. Fabrication of X-ray Microcalorimeter Focal Planes Composed of Two Distinct Pixel Types

    NASA Technical Reports Server (NTRS)

    Wassell, Edward J.; Adams, Joseph S.; Bandler, Simon R.; Betancourt-Martinez, Gabriele L.; Chiao, Meng P.; Chang, Meng Ping; Chervenak, James A.; Datesman, Aaron M.; Eckart, Megan E.; Ewin, Audrey J.

    2016-01-01

We develop superconducting transition-edge sensor (TES) microcalorimeter focal planes for versatility in meeting the specifications of X-ray imaging spectrometers, including high count rate, high energy resolution, and large field of view. In particular, a focal plane composed of two subarrays: one of fine pitch, high count-rate devices and the other of slower, larger pixels with similar energy resolution, offers promise for the next generation of astrophysics instruments, such as the X-ray Integral Field Unit instrument on the European Space Agency's Athena mission. We have based the subarrays of our current design on successful pixel designs that have been demonstrated separately. Pixels with an all-gold X-ray absorber on 50 and 75 micron pitch, where the Mo/Au TES sits atop a thick metal heatsinking layer, have shown high resolution and can accommodate high count rates. The demonstrated larger pixels use a silicon nitride membrane for thermal isolation, thinner Au, and an added bismuth layer in a 250 micron square absorber. To tune the parameters of each subarray requires merging the fabrication processes of the two detector types. We present the fabrication process for dual production of different X-ray absorbers on the same substrate, thick Au on the small pixels and thinner Au with a Bi capping layer on the larger pixels to tune their heat capacities. The process requires multiple electroplating and etching steps, but the absorbers are defined in a single ion milling step. We demonstrate methods for integrating the heatsinking of the two types of pixel into the same focal plane consistent with the requirements for each subarray, including the limiting of thermal crosstalk. We also discuss fabrication process modifications for tuning the intrinsic transition temperature (T(sub c)) of the bilayers for the different device types through variation of the bilayer thicknesses. The latest results on these 'hybrid' arrays will be presented.

  2. Rain volume estimation over areas using satellite and radar data

    NASA Technical Reports Server (NTRS)

    Doneaud, A. A.; Vonderhaar, T. H.

    1985-01-01

An investigation of the feasibility of rain volume estimation using satellite data, following a technique recently developed with radar data called the Area Time Integral (ATI), was undertaken. Case studies were selected on the basis of existing radar and satellite data sets which match in space and time. Four multicell clusters were analyzed. Routines for navigation, remapping, and smoothing of satellite images were performed. Visible counts were normalized for solar zenith angle. A radar sector of interest was defined to delineate specific radar echo clusters for each radar time throughout the radar echo cluster lifetime. A satellite sector of interest was defined by applying small adjustments to the radar sector using a manual processing technique. The radar echo area, the IR maximum counts, and the IR counts matching radar echo areas were found to evolve similarly, except for the decaying phase of the cluster, where the cirrus debris keeps the IR counts high.
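The Area Time Integral technique rests on rain volume being roughly proportional to the time-integrated echo area. A back-of-envelope sketch with made-up echo areas and an assumed climatological mean rain rate (the proportionality is the technique's premise; the numbers are purely illustrative):

```python
# Area-Time Integral: rain volume over a cluster lifetime is estimated
# as V ≈ R_mean * sum(A_i * dt), with a fixed climatological rain rate.
# All values below are hypothetical, not from the study.

echo_areas_km2 = [120.0, 340.0, 610.0, 480.0, 150.0]  # area per radar scan
scan_interval_h = 0.25        # 15-minute radar scans
mean_rain_rate_mm_h = 3.5     # assumed climatological mean rain rate

ati_km2_h = sum(a * scan_interval_h for a in echo_areas_km2)
# convert mm * km^2 to m^3: 1 mm of rain over 1 km^2 = 1000 m^3
volume_m3 = mean_rain_rate_mm_h * ati_km2_h * 1000.0
print(ati_km2_h, volume_m3)  # → 425.0 km^2·h, 1487500.0 m^3
```

The satellite extension studied here would substitute thresholded IR/visible cloud areas for the radar echo areas in the same integral, which is why the similar evolution of the two area series (outside the cirrus-debris phase) matters.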

  3. Photon-counting detector arrays based on microchannel array plates. [for image enhancement

    NASA Technical Reports Server (NTRS)

    Timothy, J. G.

    1975-01-01

    The recent development of the channel electron multiplier (CEM) and its miniaturization into the microchannel array plate (MCP) offers the possibility of fully combining the advantages of the photographic and photoelectric detection systems. The MCP has an image-intensifying capability and the potential of being developed to yield signal outputs superior to those of conventional photomultipliers. In particular, the MCP has a photon-counting capability with a negligible dark-count rate. Furthermore, the MCP can operate stably and efficiently at extreme-ultraviolet and soft X-ray wavelengths in a windowless configuration or can be integrated with a photo-cathode in a sealed tube for use at ultraviolet and visible wavelengths. The operation of one- and two-dimensional photon-counting detector arrays based on the MCP at extreme-ultraviolet wavelengths is described, and the design of sealed arrays for use at ultraviolet and visible wavelengths is briefly discussed.

  4. Towards whole-body ultra-weak photon counting and imaging with a focus on human beings: a review.

    PubMed

    Van Wijk, Roeland; Van Wijk, Eduard P A; van Wietmarschen, Herman A; van der Greef, Jan

    2014-10-05

For decades, the relationship between ultra-weak photon emission (UPE) and the health state of the body has been studied. With the advent of systems biology, attention shifted from the association between UPE and reactive oxygen species towards UPE as a reflection of changed metabolic networks. Essential for this shift in thinking is the development of novel photon count statistical methods that better reflect the dynamics of the system's organization. Additionally, efforts to combine and correlate UPE data with other types of measurements, such as metabolomics, will be key to understanding the complexity of the human body. This review describes the history and developments in the area of human UPE research from a technical-methodological perspective, an experimental perspective, and a theoretical perspective. There is ample evidence that human UPE research will allow a better understanding of the body as a complex dynamical system. The future lies in the further development of an integrated UPE and metabolomics platform for personalized monitoring of changes of the system towards health or disease. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Renormalization and additional degrees of freedom within the chiral effective theory for spin-1 resonances

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kampf, Karol; Institute of Particle and Nuclear Physics, Faculty of Mathematics and Physics, Charles University in Prague, V Holesovickach 2, 18000 Prague; Novotny, Jiri

    2010-06-01

We study in detail various aspects of the renormalization of the spin-1 resonance propagator in the effective field theory framework. First, we briefly review the formalisms for the description of spin-1 resonances in the path integral formulation with stress on the issue of propagating degrees of freedom. Then we calculate the one-loop 1^{--} meson self-energy within resonance chiral theory in the chiral limit using different methods for the description of spin-1 particles, namely, the Proca field, antisymmetric tensor field, and first-order formalisms. We discuss in detail technical aspects of the renormalization procedure which are inherent to the power-counting nonrenormalizable theory and give a formal prescription for the organization of both the counterterms and one-particle irreducible graphs. We also construct the corresponding propagators and investigate their properties. We show that additional poles corresponding to additional one-particle states are generated by loop corrections, some of which are negative norm ghosts or tachyons. We count the number of such additional poles and briefly discuss their physical meaning.

  6. Measurement of the amount and number of pollen particles of Cryptomeria japonica (taxodiaceae) by imaging with a photoacoustic microscope.

    PubMed

    Miyamoto, Katsuhiko; Hoshimiya, Tsutomu

    2006-03-01

    A photoacoustic microscope (PAM), which includes a condenser microphone and a pair of linear-motor-driven pulse stages, was specially designed for spectroscopic applications. The PAM was applied to measure the amount and number of pollen particles of Cryptomeria japonica (CJ), which is known for causing allergic reactions of the eyes and nose. The advantage of photoacoustic (PA) imaging is both its high sensitivity and its ability to count specimens up to high concentrations. The CJ pollen particles were fixed on a piece of adhesive tape or on albumen (egg white) on a glass slide set in a PA cell. The PA image showed the ability of this method to count CJ pollen from the several-hundred-milligram region down to even a single particle. The PA signal obtained was integrated over the specimen surface. The dependence of the PA signal on the amount or number of the pollen particles was measured. The resulting coefficients of correlation of the calibration curves for the amount and the number of pollen particles were 0.94 and 0.97, respectively.

  7. Neural Imaging Using Single-Photon Avalanche Diodes

    PubMed Central

    Karami, Mohammad Azim; Ansarian, Misagh

    2017-01-01

    Introduction: This paper analyses the ability of single-photon avalanche diodes (SPADs) for neural imaging. The current trend in the production of SPADs moves toward the minimum dark count rate (DCR) and maximum photon detection probability (PDP). Moreover, the jitter response, which is the main measurement characteristic for timing uncertainty, is progressing. Methods: The neural imaging process using SPADs can be performed by means of fluorescence lifetime imaging (FLIM), time-correlated single-photon counting (TCSPC), positron emission tomography (PET), and single-photon emission computed tomography (SPECT). Results: This trend will result in more precise neural imaging cameras. While achieving low-DCR SPADs is difficult in deep-submicron technologies because of the higher doping profiles used, higher PDPs are reported in the green and blue parts of the spectrum. Furthermore, the number of pixels integrated on the same chip is increasing with technology progress, which can result in higher imaging resolution. Conclusion: This study proposes SPADs implemented in deep-submicron technologies for use in neural imaging cameras, due to their small pixel size and higher timing accuracy. PMID:28446946

  8. Relationship between abstract thinking and eye gaze pattern in patients with schizophrenia

    PubMed Central

    2014-01-01

    Background Effective integration of visual information is necessary to utilize abstract thinking, but patients with schizophrenia have slow eye movement and usually explore limited visual information. This study examines the relationship between abstract thinking ability and the pattern of eye gaze in patients with schizophrenia using a novel theme identification task. Methods Twenty patients with schizophrenia and 22 healthy controls completed the theme identification task, in which subjects selected which word, out of a set of provided words, best described the theme of a picture. Eye gaze while performing the task was recorded by the eye tracker. Results Patients exhibited a significantly lower correct rate for theme identification and lower fixation and saccade counts than controls. The correct rate was significantly correlated with the fixation count in patients, but not in controls. Conclusions Patients with schizophrenia showed impaired abstract thinking and decreased quality of gaze, which were positively associated with each other. Theme identification and eye gaze appear to be useful as tools for the objective measurement of abstract thinking in patients with schizophrenia. PMID:24739356

  9. Risk factors associated with low CD4+ lymphocyte count among HIV-positive pregnant women in Nigeria.

    PubMed

    Abimiku, Alash'le; Villalba-Diebold, Pacha; Dadik, Jelpe; Okolo, Felicia; Mang, Edwina; Charurat, Man

    2009-09-01

    To determine the risk factors for CD4+ lymphocyte counts of 200 cells/mm(3) or lower in HIV-positive pregnant women in Nigeria. A cross-sectional data analysis from a prospective cohort of 515 HIV-positive women attending a prenatal clinic. Risk of a low CD4+ count was estimated using logistic regression analysis. CD4+ lymphocyte counts of 200 cells/mm(3) or lower (280+/-182 cells/mm(3)) were recorded in 187 (36.3%) out of 515 HIV-positive pregnant women included in the study. Low CD4+ count was associated with older age (adjusted odds ratio [aOR], 10.71; 95% confidence interval [CI], 1.20-95.53), lack of condom use (aOR, 5.16; 95% CI, 1.12-23.8), history of genital ulcers (aOR, 1.78; 95% CI, 1.12-2.82), and history of vaginal discharge (aOR, 1.62; 95% CI, 1.06-2.48). Over 35% of the HIV-positive pregnant women had low CD4+ counts, indicating the need for treatment. The findings underscore the need to integrate prevention of mother-to-child transmission with HIV treatment and care, particularly services for sexually transmitted infections.
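
    The adjusted odds ratios above come from a logistic regression: each aOR is the exponential of a fitted coefficient, and its 95% CI is the exponential of the coefficient plus or minus 1.96 standard errors. A minimal sketch (the coefficient and standard error below are illustrative values chosen to roughly reproduce the genital-ulcer aOR, not the study's actual fit):

    ```python
    import math

    def odds_ratio_ci(beta, se, z=1.96):
        """Adjusted odds ratio and 95% CI from a logistic-regression
        coefficient (beta) and its standard error (se):
        OR = exp(beta), CI = exp(beta -/+ z*se)."""
        return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)
    ```

    With beta = 0.577 and se = 0.236 this yields an OR of about 1.78 with a CI of roughly 1.12 to 2.83.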

  10. How to Quantify Penile Corpus Cavernosum Structures with Histomorphometry: Comparison of Two Methods

    PubMed Central

    Felix-Patrício, Bruno; De Souza, Diogo Benchimol; Gregório, Bianca Martins; Costa, Waldemar Silva; Sampaio, Francisco José

    2015-01-01

    The use of morphometrical tools in biomedical research permits the accurate comparison of specimens subjected to different conditions, and the surface density of structures is commonly used for this purpose. The traditional point-counting method is reliable but time-consuming, with computer-aided methods being proposed as an alternative. The aim of this study was to compare the surface density data of penile corpus cavernosum trabecular smooth muscle in different groups of rats, measured by two observers using the point-counting or color-based segmentation method. Ten normotensive and 10 hypertensive male rats were used in this study. Rat penises were processed to obtain smooth muscle immunostained histological slices and photomicrographs captured for analysis. The smooth muscle surface density was measured in both groups by two different observers by the point-counting method and by the color-based segmentation method. Hypertensive rats showed an increase in smooth muscle surface density by the two methods, and no difference was found between the results of the two observers. However, surface density values were higher by the point-counting method. The use of either method did not influence the final interpretation of the results, and both proved to have adequate reproducibility. However, as differences were found between the two methods, results obtained by either method should not be compared. PMID:26413547
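
    The traditional point-counting method estimates surface density as the fraction of regular grid test points that fall on the structure of interest. A minimal sketch on a binary mask (function name and grid size are illustrative, not from the paper):

    ```python
    import numpy as np

    def point_count_surface_density(mask, n_rows=10, n_cols=10):
        """Classical point counting: overlay a regular grid of test
        points on a binary mask of the structure (e.g. immunostained
        smooth muscle) and return the fraction of points that hit it."""
        h, w = mask.shape
        # Test points at the centers of an n_rows x n_cols tiling.
        ys = ((np.arange(n_rows) + 0.5) * h / n_rows).astype(int)
        xs = ((np.arange(n_cols) + 0.5) * w / n_cols).astype(int)
        hits = sum(mask[y, x] for y in ys for x in xs)
        return hits / (n_rows * n_cols)
    ```

    The color-based segmentation alternative instead classifies every pixel, which is why the two methods can give systematically different surface-density values, as the study reports.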

  11. How to Quantify Penile Corpus Cavernosum Structures with Histomorphometry: Comparison of Two Methods.

    PubMed

    Felix-Patrício, Bruno; De Souza, Diogo Benchimol; Gregório, Bianca Martins; Costa, Waldemar Silva; Sampaio, Francisco José

    2015-01-01

    The use of morphometrical tools in biomedical research permits the accurate comparison of specimens subjected to different conditions, and the surface density of structures is commonly used for this purpose. The traditional point-counting method is reliable but time-consuming, with computer-aided methods being proposed as an alternative. The aim of this study was to compare the surface density data of penile corpus cavernosum trabecular smooth muscle in different groups of rats, measured by two observers using the point-counting or color-based segmentation method. Ten normotensive and 10 hypertensive male rats were used in this study. Rat penises were processed to obtain smooth muscle immunostained histological slices and photomicrographs captured for analysis. The smooth muscle surface density was measured in both groups by two different observers by the point-counting method and by the color-based segmentation method. Hypertensive rats showed an increase in smooth muscle surface density by the two methods, and no difference was found between the results of the two observers. However, surface density values were higher by the point-counting method. The use of either method did not influence the final interpretation of the results, and both proved to have adequate reproducibility. However, as differences were found between the two methods, results obtained by either method should not be compared.

  12. Integral Airframe Structures (IAS): Validated Feasibility Study of Integrally Stiffened Metallic Fuselage Panels for Reducing Manufacturing Costs

    NASA Technical Reports Server (NTRS)

    Munroe, J.; Wilkins, K.; Gruber, M.; Domack, Marcia S. (Technical Monitor)

    2000-01-01

    The Integral Airframe Structures (IAS) program investigated the feasibility of using "integrally stiffened" construction for commercial transport fuselage structure. The objective of the program was to demonstrate structural performance and weight equal to current "built-up" structure with lower manufacturing cost. Testing evaluated mechanical properties, structural details, joint performance, repair, static compression, and two-bay crack residual strength panels. Alloys evaluated included 7050-T7451 plate, 7050-T74511 extrusion, 6013-T6511x extrusion, and 7475-T7351 plate. Structural performance was evaluated with a large 7475-T7351 pressure test that included the arrest of a two-bay longitudinal crack, and a measure of residual strength for a two-bay crack centered on a broken frame. Analysis predictions for the two-bay longitudinal crack panel correlated well with the test results. Analysis activity conducted by the IAS team strongly indicates that current analysis tools predict integral structural behavior as accurately as built-up structure. The cost study results indicated that, compared to built-up fabrication methods, high-speed machining structure from aluminum plate would yield a recurring cost savings of 61%. Part count dropped from 78 individual parts on a baseline panel to just 7 parts for machined IAS structure.

  13. Counting-loss correction for X-ray spectroscopy using unit impulse pulse shaping.

    PubMed

    Hong, Xu; Zhou, Jianbin; Ni, Shijun; Ma, Yingjie; Yao, Jianfeng; Zhou, Wei; Liu, Yi; Wang, Min

    2018-03-01

    High-precision measurement of X-ray spectra is affected by the statistical fluctuation of the X-ray beam under low-counting-rate conditions. It is also limited by counting loss resulting from the dead-time of the system and pulse pile-up effects, especially in a high-counting-rate environment. In this paper a detection system based on a FAST-SDD detector and a new kind of unit impulse pulse-shaping method is presented for counting-loss correction in X-ray spectroscopy. The unit impulse pulse-shaping method is obtained by inverse deviation of the pulse from a reset-type preamplifier and a C-R shaper. It is applied to obtain the true incoming rate of the system based on a general fast-slow channel processing model. The pulses in the fast channel are shaped to a unit impulse pulse shape, which has a small width and no undershoot. The counting rate in the fast channel is corrected by evaluating the dead-time of the fast channel before it is used to correct the counting loss in the slow channel.
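
    Correcting a counting rate for a known dead-time is commonly done with the nonparalyzable detector model, in which the measured rate m relates to the true rate n as m = n / (1 + n*tau). A hedged sketch of that inversion (a standard textbook correction; the paper's exact fast-channel correction may differ):

    ```python
    def true_rate_nonparalyzable(measured_rate, dead_time):
        """Invert m = n / (1 + n*tau) for a nonparalyzable counting
        system: n = m / (1 - m*tau).
        measured_rate in counts/s, dead_time tau in seconds."""
        loss = measured_rate * dead_time
        if loss >= 1.0:
            raise ValueError("measured rate inconsistent with dead time")
        return measured_rate / (1.0 - loss)
    ```

    For example, a true rate of 1e5 counts/s with tau = 1 us is measured as about 9.09e4 counts/s, and the correction recovers 1e5.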

  14. Systematic wavelength selection for improved multivariate spectral analysis

    DOEpatents

    Thomas, Edward V.; Robinson, Mark R.; Haaland, David M.

    1995-01-01

    Methods and apparatus for determining in a biological material one or more unknown values of at least one known characteristic (e.g. the concentration of an analyte such as glucose in blood or the concentration of one or more blood gas parameters) with a model based on a set of samples with known values of the known characteristics and a multivariate algorithm using several wavelength subsets. The method includes selecting multiple wavelength subsets, from the electromagnetic spectral region appropriate for determining the known characteristic, for use by an algorithm wherein the selection of wavelength subsets improves the model's fitness of the determination for the unknown values of the known characteristic. The selection process utilizes multivariate search methods that select both predictive and synergistic wavelengths within the range of wavelengths utilized. The fitness of the wavelength subsets is determined by the fitness function F = f(cost, performance). The method includes the steps of: (1) using one or more applications of a genetic algorithm to produce one or more count spectra, with multiple count spectra then combined to produce a combined count spectrum; (2) smoothing the count spectrum; (3) selecting a threshold count from a count spectrum to select those wavelength subsets which optimize the fitness function; and (4) eliminating a portion of the selected wavelength subsets. The determination of the unknown values can be made: (1) noninvasively and in vivo; (2) invasively and in vivo; or (3) in vitro.
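
    Steps (1)-(3) above can be sketched as follows, assuming the count spectra are combined by summation and smoothed with a moving average (both choices are assumptions for illustration; the patent does not fix them):

    ```python
    import numpy as np

    def select_wavelengths(count_spectra, threshold, window=3):
        """Combine per-run count spectra (step 1), smooth the combined
        spectrum with a moving average (step 2), and keep the wavelength
        indices whose smoothed count exceeds the threshold (step 3)."""
        combined = np.sum(count_spectra, axis=0)               # step 1
        kernel = np.ones(window) / window
        smoothed = np.convolve(combined, kernel, mode="same")  # step 2
        return np.where(smoothed > threshold)[0]               # step 3
    ```

    Step (4), eliminating a portion of the selected subsets, would then prune this index list according to the fitness function.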

  15. The standardization of urine particle counting in medical laboratories--a Polish experience with the EQA programme.

    PubMed

    Cwiklińska, Agnieszka; Kąkol, Judyta; Kuchta, Agnieszka; Kortas-Stempak, Barbara; Pacanis, Anastasis; Rogulski, Jerzy; Wróblewska, Małgorzata

    2012-02-01

    Given the common problems with the standardization of urine particle counting methods and the great variability in the results obtained by Polish laboratories under the international Labquality External Quality Assessment (EQA) scheme, we initiated educational recovery activities. Detailed instructions on how to perform the standardized examination were sent to EQA participants, as were questionnaire forms that enabled information to be gathered about the procedures being applied. Laboratory results were grouped according to the method declared on the EQA 'Result' form or according to a manual examination procedure established on the basis of the questionnaire. The between-laboratory CVs for leukocyte and erythrocyte counts were calculated for each group and compared using the Mann-Whitney test. Significantly lower between-laboratory CVs (p = 0.03) were achieved for leukocyte counting among the laboratories that analysed control specimens in accordance with standardized procedures as compared with those which used non-standardized procedures. We also observed visibly lower variability for erythrocyte counting. Unfortunately, despite our activities, only a few of the Polish laboratories applied the standardized examination procedures, and only 29% of the results could be considered standardized (16% manual methods, 13% automated systems). The standardization of urine particle counting methods continues to be a significant problem in medical laboratories and requires further recovery activities, which can be conducted using the EQA scheme.
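
    The between-laboratory CV compared here is the usual coefficient of variation: sample standard deviation over the mean of the counts the participating laboratories report for one control specimen, expressed in percent. A minimal sketch:

    ```python
    import statistics

    def between_lab_cv(counts):
        """Between-laboratory coefficient of variation (%) for one
        control specimen: 100 * sample SD / mean of the particle
        counts reported by the participating laboratories."""
        return 100.0 * statistics.stdev(counts) / statistics.mean(counts)
    ```

    For example, laboratories reporting 8, 10 and 12 leukocytes/uL for the same specimen give a between-laboratory CV of 20%.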

  16. Evaluation of position-estimation methods applied to CZT-based photon-counting detectors for dedicated breast CT

    PubMed Central

    Makeev, Andrey; Clajus, Martin; Snyder, Scott; Wang, Xiaolang; Glick, Stephen J.

    2015-01-01

    Semiconductor photon-counting detectors based on high atomic number, high density materials [cadmium zinc telluride (CZT)/cadmium telluride (CdTe)] for x-ray computed tomography (CT) provide advantages over conventional energy-integrating detectors, including reduced electronic and Swank noise, wider dynamic range, capability of spectral CT, and improved signal-to-noise ratio. Certain CT applications require high spatial resolution. In breast CT, for example, visualization of microcalcifications and assessment of tumor microvasculature after contrast enhancement require resolution on the order of 100 μm. A straightforward approach to increasing spatial resolution of pixellated CZT-based radiation detectors by merely decreasing the pixel size leads to two problems: (1) fabricating circuitry with small pixels becomes costly and (2) inter-pixel charge spreading can obviate any improvement in spatial resolution. We have used computer simulations to investigate position estimation algorithms that utilize charge sharing to achieve subpixel position resolution. To study these algorithms, we model a simple detector geometry with a 5×5 array of 200 μm pixels, and use a conditional probability function to model charge transport in CZT. We used COMSOL finite element method software to map the distribution of charge pulses and the Monte Carlo package PENELOPE for simulating fluorescent radiation. Performance of two x-ray interaction position estimation algorithms was evaluated: the method of maximum-likelihood estimation and a fast, practical algorithm that can be implemented in a readout application-specific integrated circuit and allows for identification of a quadrant of the pixel in which the interaction occurred. Both methods demonstrate good subpixel resolution; however, their actual efficiency is limited by the presence of fluorescent K-escape photons. Current experimental breast CT systems typically use detectors with a pixel size of 194 μm, with 2×2 binning during the acquisition giving an effective pixel size of 388 μm. Thus, it would be expected that the position estimate accuracy reported in this study would improve detection and visualization of microcalcifications as compared to that with conventional detectors. PMID:26158095
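
    Subpixel position estimation from shared charge can be illustrated with a simple charge-weighted centroid over the 3×3 neighborhood of the pixel with the largest signal. This is a generic stand-in, not the paper's maximum-likelihood or ASIC estimator; the signs of the centroid offset pick one of the four quadrants of the central pixel:

    ```python
    import numpy as np

    def quadrant_from_charge(charges):
        """charges: 3x3 array of induced signal centered on the pixel
        with the maximum charge. A charge-weighted centroid gives a
        subpixel offset (dx, dy) in pixel units; its signs identify
        the quadrant of the central pixel."""
        charges = np.asarray(charges, dtype=float)
        ys, xs = np.mgrid[-1:2, -1:2]
        total = charges.sum()
        dx = (xs * charges).sum() / total
        dy = (ys * charges).sum() / total
        return ("top" if dy < 0 else "bottom") + "-" + ("left" if dx < 0 else "right")
    ```

    In practice, K-escape photons deposit charge far from the interaction site, which is why the paper reports that fluorescence limits the efficiency of any such estimator.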

  17. Evaluation of position-estimation methods applied to CZT-based photon-counting detectors for dedicated breast CT.

    PubMed

    Makeev, Andrey; Clajus, Martin; Snyder, Scott; Wang, Xiaolang; Glick, Stephen J

    2015-04-01

    Semiconductor photon-counting detectors based on high atomic number, high density materials [cadmium zinc telluride (CZT)/cadmium telluride (CdTe)] for x-ray computed tomography (CT) provide advantages over conventional energy-integrating detectors, including reduced electronic and Swank noise, wider dynamic range, capability of spectral CT, and improved signal-to-noise ratio. Certain CT applications require high spatial resolution. In breast CT, for example, visualization of microcalcifications and assessment of tumor microvasculature after contrast enhancement require resolution on the order of 100 μm. A straightforward approach to increasing spatial resolution of pixellated CZT-based radiation detectors by merely decreasing the pixel size leads to two problems: (1) fabricating circuitry with small pixels becomes costly and (2) inter-pixel charge spreading can obviate any improvement in spatial resolution. We have used computer simulations to investigate position estimation algorithms that utilize charge sharing to achieve subpixel position resolution. To study these algorithms, we model a simple detector geometry with a 5×5 array of 200 μm pixels, and use a conditional probability function to model charge transport in CZT. We used COMSOL finite element method software to map the distribution of charge pulses and the Monte Carlo package PENELOPE for simulating fluorescent radiation. Performance of two x-ray interaction position estimation algorithms was evaluated: the method of maximum-likelihood estimation and a fast, practical algorithm that can be implemented in a readout application-specific integrated circuit and allows for identification of a quadrant of the pixel in which the interaction occurred. Both methods demonstrate good subpixel resolution; however, their actual efficiency is limited by the presence of fluorescent K-escape photons. Current experimental breast CT systems typically use detectors with a pixel size of 194 μm, with 2×2 binning during the acquisition giving an effective pixel size of 388 μm. Thus, it would be expected that the position estimate accuracy reported in this study would improve detection and visualization of microcalcifications as compared to that with conventional detectors.

  18. A randomized approach to speed up the analysis of large-scale read-count data in the application of CNV detection.

    PubMed

    Wang, WeiBo; Sun, Wei; Wang, Wei; Szatkiewicz, Jin

    2018-03-01

    The application of high-throughput sequencing in a broad range of quantitative genomic assays (e.g., DNA-seq, ChIP-seq) has created a high demand for the analysis of large-scale read-count data. Typically, the genome is divided into tiling windows and windowed read-count data is generated for the entire genome, from which genomic signals are detected (e.g., copy number changes in DNA-seq, enrichment peaks in ChIP-seq). For accurate analysis of read-count data, many state-of-the-art statistical methods use generalized linear models (GLM) coupled with the negative-binomial (NB) distribution by leveraging its ability for simultaneous bias correction and signal detection. However, although statistically powerful, the GLM+NB method has a quadratic computational complexity and therefore suffers from slow running time when applied to large-scale windowed read-count data. In this study, we aimed to substantially speed up the GLM+NB method by using a randomized algorithm, and we demonstrate the utility of our approach in the application of detecting copy number variants (CNVs) using a real example. We propose an efficient estimator, the randomized GLM+NB coefficients estimator (RGE), for speeding up the GLM+NB method. RGE samples the read-count data and solves the estimation problem on a smaller scale. We first theoretically validated the consistency and the variance properties of RGE. We then applied RGE to GENSENG, a GLM+NB based method for detecting CNVs. We named the resulting method "R-GENSENG". Based on extensive evaluation using both simulated and empirical data, we concluded that R-GENSENG is ten times faster than the original GENSENG while maintaining GENSENG's accuracy in CNV detection. Our results suggest that the RGE strategy developed here could be applied to other GLM+NB based read-count analyses, e.g., ChIP-seq data analysis, to substantially improve their computational efficiency while preserving the analytic power.
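
    The core idea of the randomized estimator is to solve the coefficient-estimation problem on a uniformly sampled subset of the genomic windows rather than on all of them. A hedged sketch of that idea, with ordinary least squares standing in for the NB-GLM fit the paper actually uses (function name and sampling scheme are illustrative):

    ```python
    import numpy as np

    def subsampled_coefficients(X, y, fraction=0.1, seed=0):
        """Estimate regression coefficients on a uniformly sampled
        subset of the windowed data, illustrating the RGE strategy.
        X: (n_windows, n_covariates) design matrix; y: read counts.
        Least squares stands in for the paper's NB-GLM fit."""
        rng = np.random.default_rng(seed)
        n = len(y)
        idx = rng.choice(n, size=max(1, int(n * fraction)), replace=False)
        beta, *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)
        return beta
    ```

    Because the fit cost grows with the number of rows, working on a 10% sample is what yields the roughly tenfold speed-up reported, while the consistency analysis guarantees the subsampled coefficients converge to the full-data ones.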

  19. A PFM-based MWIR DROIC employing off-pixel fine conversion of photocharge to digital using integrated column ADCs

    NASA Astrophysics Data System (ADS)

    Abbasi, S.; Galioglu, A.; Shafique, A.; Ceylan, O.; Yazici, M.; Gurbuz, Y.

    2017-02-01

    A 32x32 prototype of a digital readout IC (DROIC) for medium-wave infrared focal plane arrays (MWIR IR-FPAs) is presented. The DROIC employs in-pixel photocurrent-to-digital conversion based on a pulse frequency modulation (PFM) loop and adds a novel feature of off-pixel residue conversion using 10-bit column SAR ADCs. The charge remaining at the end of integration in typical PFM-based digital pixel sensors is usually wasted. Previous works employing in-pixel extended counting methods use extra memory and counters to convert this left-over charge to digital, thereby performing fine conversion of the incident photocurrent. This results in low quantization noise and hence keeps the readout noise low. However, focal plane arrays (FPAs) with small pixel pitch are constrained in pixel area, which makes it difficult to benefit from in-pixel extended counting circuitry. Thus, in this work, a novel approach to measuring the residue outside the pixel using column-parallel SAR ADCs is proposed. Moreover, a modified version of the conventional PFM-based pixel has been designed to hold the residue charge and buffer it to the column ADC. In addition to the 2D array of pixels, the prototype consists of 32 SAR ADCs, a timing controller block and a memory block to buffer the residue data coming out of the ADCs. The prototype has been designed and fabricated in 90 nm CMOS.
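
    The coarse/fine scheme reconstructs the total integrated charge by combining the in-pixel PFM count (each reset corresponds to one full charge quantum) with the ADC code for the sub-quantum residue. A minimal sketch of that reconstruction (the function and parameter names are illustrative, not from the paper):

    ```python
    def reconstruct_charge(coarse_count, residue_code, q_reset, adc_bits=10):
        """Combine the in-pixel PFM coarse count with the column-ADC
        fine code: each PFM reset corresponds to q_reset coulombs, and
        the ADC digitizes the sub-q_reset residue left at the end of
        integration with adc_bits of resolution."""
        q_fine = residue_code / (2 ** adc_bits) * q_reset
        return coarse_count * q_reset + q_fine
    ```

    The fine term shrinks the quantization step from q_reset to q_reset / 2^10, which is the source of the low quantization noise claimed for extended-counting readouts.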

  20. Impact of advanced and basic carbohydrate counting methods on metabolic control in patients with type 1 diabetes.

    PubMed

    Souto, Débora Lopes; Zajdenverg, Lenita; Rodacki, Melanie; Rosado, Eliane Lopes

    2014-03-01

    Diets based on carbohydrate counting remain a key strategy for improving glycemic control in patients with type 1 diabetes. However, these diets may promote weight gain because of the flexibility in food choices. The aim of this study was to compare carbohydrate counting methods regarding anthropometric, biochemical, and dietary variables in individuals with type 1 diabetes, as well as to evaluate their knowledge about nutrition. Participants were allocated to basic or advanced carbohydrate counting groups. After 3 mo of nutritional counseling, dietary intake, anthropometric variables, lipemia, and glycemic control were compared between groups. A questionnaire regarding carbohydrate counting, sucrose intake, nutritional knowledge, and diabetes and nutrition taboos was also administered. Ten (30%) participants had already used advanced carbohydrate counting before the nutritional counseling, and these individuals had a higher body mass index (BMI) (P < 0.01) and waist circumference (WC) (P = 0.01) than the others (n = 23; 69.7%). After 3 mo of follow-up, although participants in the advanced group (n = 17; 51.52%) presented higher BMI (P < 0.01) and WC (P = 0.03), those in the basic group (n = 16; 48.48%) showed a higher fat intake (P < 0.01). The majority of participants reported no difficulty in following carbohydrate counting (62.5% and 88% for the basic and advanced groups, respectively) and a greater flexibility in terms of food choices (>90% with both methods). Advanced carbohydrate counting did not affect lipemic and glycemic control in individuals with type 1 diabetes; however, it may increase food intake, and consequently BMI and WC, when compared to basic carbohydrate counting. Furthermore, carbohydrate counting promoted greater food flexibility.
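
    Advanced carbohydrate counting typically computes the meal insulin dose from an individualized insulin-to-carbohydrate ratio (ICR) plus a correction based on an insulin sensitivity factor (ISF). A hedged sketch of that arithmetic (the numbers are illustrative only, not clinical guidance, and the exact scheme taught in the study is not specified in the abstract):

    ```python
    def meal_bolus(carb_grams, icr, bg, target_bg, isf):
        """Advanced carbohydrate counting arithmetic: meal insulin =
        carbs / ICR plus a correction of (current BG - target BG) / ISF,
        floored at zero. ICR in g/unit, BG values in mg/dL, ISF in
        mg/dL per unit. Illustrative only."""
        correction = max(0.0, (bg - target_bg) / isf)
        return carb_grams / icr + correction
    ```

    For example, a 60 g carbohydrate meal with an ICR of 15 g/unit, a glucose of 180 mg/dL against a 120 mg/dL target, and an ISF of 60 mg/dL per unit gives 4 + 1 = 5 units. Basic carbohydrate counting, by contrast, keeps the carbohydrate amount per meal fixed rather than adjusting the dose.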

  1. Optimal method for collection of umbilical cord blood: an Egyptian trial for a public cord blood bank.

    PubMed

    Bassiouny, M R; El-Chennawi, F; Mansour, A K; Yahia, S; Darwish, A

    2015-06-01

    Umbilical cord blood (UCB) contains stem cells and can be used as an alternative to bone marrow transplantation. Engraftment is dependent on the total nucleated cell (TNC) and CD34+ cell counts of the cord blood units. This study was designed to evaluate the effect of the method of collection of the UCB on the yield of the cord blood units. Informed consent was obtained from 100 eligible mothers for donation of cord blood. Both in utero and ex utero methods were used for collection. The cord blood volume was measured. The TNC and the CD34+ cell counts were enumerated. We have found that in utero collection gave significantly larger volumes of cord blood and higher TNC counts than ex utero collection. There was no significant difference between both methods regarding the CD34+ cell counts. This study revealed a significant correlation between the volume of the collected cord blood and both TNC and CD34+ cell counts. It is better to collect cord blood in utero before placental delivery to optimize the quality of the cord blood unit.

  2. Optimisation of nasal swab analysis by liquid scintillation counting.

    PubMed

    Dai, Xiongxin; Liblong, Aaron; Kramer-Tremblay, Sheila; Priest, Nicholas; Li, Chunsheng

    2012-06-01

    When responding to an emergency radiological incident, rapid methods are needed to provide physicians and radiation protection personnel with an early estimate of the possible internal dose resulting from the inhalation of radionuclides. This information is needed so that appropriate medical treatment and radiological protection control procedures can be implemented. Nasal swab analysis, which employs swabs swiped inside a nostril followed by liquid scintillation counting of alpha and beta activity on the swab, could provide valuable information to quickly identify contamination of the affected population. In this study, various parameters (such as alpha/beta discrimination, swab materials, counting time, and volume of scintillation cocktail) were evaluated in order to optimise the effectiveness of the nasal swab analysis method. An improved nasal swab procedure was developed by replacing cotton swabs with polyurethane-tipped swabs. Liquid scintillation counting was performed using a Hidex 300SL counter with alpha/beta pulse shape discrimination capability. Results show that the new method is more reliable than existing methods using cotton swabs and effectively meets the analysis requirements for screening personnel in an emergency situation. This swab analysis procedure is also applicable to wipe tests of surface contamination to minimise the source self-absorption effect on liquid scintillation counting.

  3. Integrating Child Health Information Systems

    PubMed Central

    Hinman, Alan R.; Eichwald, John; Linzer, Deborah; Saarlas, Kristin N.

    2005-01-01

    The Health Resources and Services Administration and All Kids Count (a national technical assistance center fostering development of integrated child health information systems) have been working together to foster development of integrated child health information systems. Activities have included: identification of key elements for successful integration of systems; development of principles and core functions for the systems; a survey of state and local integration efforts; and a conference to develop a common vision for child health information systems to meet medical care and public health needs. We provide 1 state (Utah) as an example that is well on the way to development of integrated child health information systems. PMID:16195524

  4. Evolving Postmortems as Teams Evolve Through TxP

    DTIC Science & Technology

    2014-12-01

    Instead of waiting for SEI to compile enough data to repeat this kind of analysis for the system integration test domain, a system integration test team...and stand up their Team Test Process (TTP). Some abilities, like planning on how many mistakes will be made by the team in producing a test procedure...can only be performed after the team has determined a) which mistakes count in the domain of system integration testing, b) what units to use to

  5. Testing the Model: A Phase 1/11 Randomized Double Blind Placebo Control Trial of Targeted Therapeutics: Liposomal Glutathione and Curcumin

    DTIC Science & Technology

    2016-10-01

    Can non-specific cellular immunity protect HIV-infected persons with very low CD4 counts? Presented at Conference on Integrating Psychology and...Under Review. 50. Nierenberg B, Cooper S, Feuer SJ, Broderick G. Applying Network Medicine to Chronic Illness: A Model for Integrating Psychology ...function in these subjects as compared to GW era sedentary healthy controls. We applied an integrative systems-based approach rooted in computational

  6. Testing the Model: A Phase 1/11 Randomized Double Blind Placebo Control Trial of Targeted Therapeutics: Liposomal Glutathione and Curcumin

    DTIC Science & Technology

    2017-10-01

    HIV-infected persons with very low CD4 counts? Presented at Conference on Integrating Psychology and Medicine, Waheki Island, Auckland, NZ, 10-12th...SJ, Broderick G. Applying Network Medicine to Chronic Illness: A Model for Integrating Psychology into Routine Care. Amer Psych, 2015. Under review...function in these subjects as compared to GW era sedentary healthy controls. We applied an integrative systems-based approach rooted in

  7. Monitoring trends in bird populations: addressing background levels of annual variability in counts

    Treesearch

    Jared Verner; Kathryn L. Purcell; Jennifer G. Turner

    1996-01-01

    Point counting has been widely accepted as a method for monitoring trends in bird populations. Using a rigorously standardized protocol at 210 counting stations at the San Joaquin Experimental Range, Madera Co., California, we have been studying sources of variability in point counts of birds. Vegetation types in the study area have not changed during the 11 years of...

  8. Eight-Channel Continuous Timer

    NASA Technical Reports Server (NTRS)

    Cole, Steven

    2004-01-01

    A custom laboratory electronic timer circuit measures the durations of successive cycles of nominally highly stable input clock signals in as many as eight channels, for the purpose of statistically quantifying the small instabilities of these signals. The measurement data generated by this timer are sent to a personal computer running software that integrates the measurements to form a phase residual for each channel and uses the phase residuals to compute Allan variances for each channel. (The Allan variance is a standard statistical measure of instability of a clock signal.) Like other laboratory clock-cycle-measuring circuits, this timer utilizes an externally generated reference clock signal having a known frequency (100 MHz) much higher than the frequencies of the input clock signals (between 100 and 120 Hz). It counts the number of reference-clock cycles that occur between successive rising edges of each input clock signal of interest, thereby affording a measurement of the input clock-signal period to within the duration (10 ns) of one reference clock cycle. Unlike typical prior laboratory clock-cycle-measuring circuits, this timer does not skip some cycles of the input clock signals. The non-cycle-skipping feature is an important advantage because in applications that involve integration of measurements over long times for characterizing nominally highly stable clock signals, skipping cycles can degrade accuracy. The timer includes a field-programmable gate array that functions as a 20-bit counter running at the reference clock rate of 100 MHz. The timer also includes eight 20-bit latching circuits - one for each channel - at the output terminals of the counter. Each transition of an input signal from low to high causes the corresponding latching circuit to latch the count at that instant. Each such transition also sets a status flip-flop circuit to indicate the presence of the latched count. 
A microcontroller reads the values of all eight status flip-flops and then reads the latched count for each channel for which the flip-flop indicates the presence of a count. Reading the count for a channel automatically resets that channel's flip-flop. The microcontroller places the counts in time order, identifies the channel number for each count, and transmits these data to the personal computer.
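The Allan-variance computation that the companion software performs on the integrated phase residuals can be sketched as follows. This is a minimal illustration of the standard non-overlapping estimator, not the instrument's actual software; the function name and sampling setup are assumptions:

```python
def allan_variance(phase, tau0, m=1):
    """Non-overlapping Allan variance at averaging time tau = m * tau0,
    from phase residuals phase[i] sampled every tau0 seconds:
    sigma_y^2(tau) = sum (x[i+2m] - 2 x[i+m] + x[i])^2 / (2 K tau^2)."""
    tau = m * tau0
    # Second differences of the phase at stride m (non-overlapping).
    diffs = [phase[i + 2 * m] - 2 * phase[i + m] + phase[i]
             for i in range(0, len(phase) - 2 * m, m)]
    return sum(d * d for d in diffs) / (2 * len(diffs) * tau * tau)
```

As a sanity check, a clock with a pure constant frequency offset produces a linear phase ramp, whose second differences vanish, so its Allan variance is zero.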

  9. Pill counts and pill rental: unintended entrepreneurial opportunities.

    PubMed

    Viscomi, Christopher M; Covington, Melissa; Christenson, Catherine

    2013-07-01

Prescription opioid diversion and abuse are becoming increasingly prevalent in many regions of the world, particularly the United States. One method advocated to assess compliance with opioid prescriptions is the occasional "pill count": shortly before a scheduled appointment, a patient is notified that they must bring in the unused portion of their opioid prescription. It has been assumed that a patient who has the correct number and strength of pills for that point in the prescription interval is unlikely to be selling or abusing their opioids. Two cases are presented in which patients describe short-term rental of opioids from illicit opioid dealers in order to circumvent pill counts. Pill renting appears to be an established method of circumventing pill counts. Pill counts do not assure non-diversion of opioids, and they provide additional cash flow to illicit opioid dealers.

  10. The Foraging Ecology of Royal and Sandwich Terns in North Carolina, USA

    USGS Publications Warehouse

    McGinnis, T.W.; Emslie, S.D.

    2001-01-01


  11. Determining population size of territorial red-winged blackbirds

    USGS Publications Warehouse

    Albers, P.H.

    1976-01-01

    Population sizes of territorial male red-winged blackbirds (Agelaius phoeniceus) were determined with counts of territorial males (area count) and a Petersen-Lincoln Index method for roadsides (roadside estimate). Weather conditions and time of day did not influence either method. Combined roadside estimates had smaller error bounds than the individual transect estimates and were not hindered by the problem of zero recaptures. Roadside estimates were usually one-half as large as the area counts, presumably due to an observer bias for marked birds. The roadside estimate provides only an index of major changes in populations of territorial male redwings. When the roadside estimate is employed, the area count should be used to determine the amount and nature of observer bias. For small population surveys, the area count is probably more reliable and accurate than the roadside estimate.

  12. Can Detectability Analysis Improve the Utility of Point Counts for Temperate Forest Raptors?

    EPA Science Inventory

    Temperate forest breeding raptors are poorly represented in typical point count surveys because these birds are cryptic and typically breed at low densities. In recent years, many new methods for estimating detectability during point counts have been developed, including distanc...

  13. Radioisotope Dating with Accelerators.

    ERIC Educational Resources Information Center

    Muller, Richard A.

    1979-01-01

    Explains a new method of detecting radioactive isotopes by counting their accelerated ions rather than the atoms that decay during the counting period. This method increases the sensitivity by several orders of magnitude, and allows one to find the ages of much older and smaller samples. (GA)

  14. HSCT Propulsion Airframe Integration Studies

    NASA Technical Reports Server (NTRS)

    Chaney, Steve

    1999-01-01

The Lockheed Martin spillage study was a substantial effort and is worthy of a separate paper. However, since a paper was not submitted, a few of the most pertinent results have been pulled out and included in this paper. The reader is urged to obtain a copy of the complete Boeing Configuration Aerodynamics final 1995 contract report for the complete Lockheed documentation of the spillage work. The supersonic cruise studies presented here focus on the bifurcated-versus-axisymmetric inlet drag delta. In the process of analyzing this delta, several test/CFD data correlation problems arose that led to a correction of the measured drag delta from 4.6 counts to 3.1 counts. This study also led to a much better understanding of the OVERFLOW gridding and solution process, and to increased accuracy of the force and moment data. Detailed observations of the CFD results led to the conclusion that the 3.1-count difference between the two inlet types could be reduced to approximately 2 counts, with an absolute lower bound of 1.2 counts due to friction drag and the bifurcated lip bevel.

  15. Designing and implementing a monitoring program and the standards for conducting point counts

    Treesearch

    C. John Ralph

    1993-01-01

    Choosing between the apparent plethora of methods for monitoring bird populations is a dilemma for a person contemplating beginning a monitoring program. Cooperrider et al. (1986) and Koskimies and Vaisanen (1991) describe many methods. In the Americas, three methods have been suggested as standard (Butcher 1992). They are: point counts for determining habitat...

  16. A variable circular-plot method for estimating bird numbers

    Treesearch

    R. T. Reynolds; J. M. Scott; R. A. Nussbaum

    1980-01-01

    A bird census method is presented that is designed for tall, structurally complex vegetation types, and rugged terrain. With this method the observer counts all birds seen or heard around a station, and estimates the horizontal distance from the station to each bird. Count periods at stations vary according to the avian community and structural complexity of the...

  17. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count large numbers of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at high throughput, which we used to characterize cell encapsulation and cell viability during incubation in droplets.
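Cell encapsulation in droplets is commonly modeled with Poisson statistics. As general background (an assumption here, not a detail taken from this abstract), the loading arithmetic can be sketched as:

```python
import math

def occupancy(lam, k):
    """Poisson probability that a droplet holds exactly k cells,
    given an average of lam cells per droplet."""
    return lam ** k * math.exp(-lam) / math.factorial(k)

def single_cell_fraction(lam):
    """Fraction of *occupied* droplets that hold exactly one cell."""
    return occupancy(lam, 1) / (1.0 - occupancy(lam, 0))
```

At lam = 0.1, roughly 95% of occupied droplets are single-cell, which is why dilute loading is the usual operating regime for single-cell studies.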

  18. Evaluation of absolute measurement using a 4π plastic scintillator for the 4πβ-γ coincidence counting method.

    PubMed

    Unno, Y; Sanami, T; Sasaki, S; Hagiwara, M; Yunoki, A

    2018-04-01

Absolute measurement by the 4πβ-γ coincidence counting method was conducted with two photomultipliers facing each other across a plastic scintillator, with a focus on β-ray counting efficiency. The detector was held within a through-hole-type NaI(Tl) detector. The results include the absolutely determined activity and its uncertainty, particularly the component arising from extrapolation. A comparison between the obtained and known activities showed agreement within their uncertainties. Copyright © 2017 Elsevier Ltd. All rights reserved.
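The textbook coincidence relation behind this technique (stated here as general background on the method, not as this paper's exact analysis) estimates the activity from the three background-corrected counting rates:

```python
def coincidence_activity(rate_beta, rate_gamma, rate_coinc):
    """Idealized 4pi beta-gamma coincidence estimate: with background-
    corrected rates N_b, N_g, N_c, the source activity is
    N0 = N_b * N_g / N_c, and the beta efficiency is eps_b = N_c / N_g."""
    activity = rate_beta * rate_gamma / rate_coinc
    eff_beta = rate_coinc / rate_gamma
    return activity, eff_beta
```

For example, rates of (950, 800, 760) counts per second imply an activity of 1000 decays per second at a beta efficiency of 0.95.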

  19. Deep 3 GHz number counts from a P(D) fluctuation analysis

    NASA Astrophysics Data System (ADS)

    Vernstrom, T.; Scott, Douglas; Wall, J. V.; Condon, J. J.; Cotton, W. D.; Fomalont, E. B.; Kellermann, K. I.; Miller, N.; Perley, R. A.

    2014-05-01

Radio source counts constrain galaxy populations and evolution, as well as the global star formation history. However, there is considerable disagreement among the published 1.4-GHz source counts below 100 μJy. Here, we present a statistical method for estimating the μJy and even sub-μJy source count using new deep wide-band 3-GHz data in the Lockman Hole from the Karl G. Jansky Very Large Array. We analysed the confusion amplitude distribution P(D), which provides a fresh approach in the form of a more robust model, with a comprehensive error analysis. We tested this method on a large-scale simulation, incorporating clustering and finite source sizes. We discuss in detail our statistical methods for fitting using Markov chain Monte Carlo, handling correlations, and systematic errors from the use of wide-band radio interferometric data. We demonstrated that the source count can be constrained down to 50 nJy, a factor of 20 below the rms confusion. We found the differential source count near 10 μJy to have a slope of -1.7, decreasing to about -1.4 at fainter flux densities. At 3 GHz, the rms confusion in an 8-arcsec full width at half-maximum beam is ~1.2 μJy beam^-1, and the radio background temperature is ~14 mK. Our counts are broadly consistent with published evolutionary models. With these results, we were also able to constrain the peak of the Euclidean normalized differential source count of any possible new radio populations that would contribute to the cosmic radio background down to 50 nJy.

  20. Hematology of healthy Florida manatees (Trichechus manatus)

    USGS Publications Warehouse

    Harvey, J.W.; Harr, K.E.; Murphy, D.; Walsh, M.T.; Nolan, E.C.; Bonde, R.K.; Pate, M.G.; Deutsch, C.J.; Edwards, H.H.; Clapp, W.L.

    2009-01-01

Background: Hematologic analysis is an important tool in evaluating the general health status of free-ranging manatees and in the diagnosis and monitoring of rehabilitating animals. Objectives: The purpose of this study was to evaluate diagnostically important hematologic analytes in healthy manatees (Trichechus manatus) and to assess variations with respect to location (free ranging vs captive), age class (small calves, large calves, subadults, and adults), and gender. Methods: Blood was collected from 55 free-ranging and 63 captive healthy manatees. Most analytes were measured using a CELL-DYN 3500R; automated reticulocytes were measured with an ADVIA 120. Standard manual methods were used for differential leukocyte counts, reticulocyte and Heinz body counts, and plasma protein and fibrinogen concentrations. Results: Rouleaux, slight polychromasia, stomatocytosis, and low numbers of schistocytes and nucleated RBCs (NRBCs) were seen often in stained blood films. Manual reticulocyte counts were higher than automated reticulocyte counts. Heinz bodies were present in erythrocytes of most manatees. Compared with free-ranging manatees, captive animals had slightly lower MCV, MCH, and eosinophil counts and slightly higher heterophil and NRBC counts, and fibrinogen concentration. Total leukocyte, heterophil, and monocyte counts tended to be lower in adults than in younger animals. Small calves tended to have higher reticulocyte counts and NRBC counts than older animals. Conclusions: Hematologic findings were generally similar between captive and free-ranging manatees. Higher manual reticulocyte counts suggest the ADVIA detects only reticulocytes containing large amounts of RNA. Higher reticulocyte and NRBC counts in young calves probably reflect an increased rate of erythropoiesis compared with older animals. © 2009 American Society for Veterinary Clinical Pathology.

  1. Unbiased estimation of chloroplast number in mesophyll cells: advantage of a genuine three-dimensional approach

    PubMed Central

    Kubínová, Zuzana

    2014-01-01

Chloroplast number per cell is a frequently examined quantitative anatomical parameter, often estimated by counting chloroplast profiles in two-dimensional (2D) sections of mesophyll cells. However, a mesophyll cell is a three-dimensional (3D) structure and this has to be taken into account when quantifying its internal structure. We compared 2D and 3D approaches to chloroplast counting from different points of view: (i) in practical measurements of mesophyll cells of Norway spruce needles, (ii) in a 3D model of a mesophyll cell with chloroplasts, and (iii) using a theoretical analysis. We applied, for the first time, the stereological method of an optical disector based on counting chloroplasts in stacks of spruce needle optical cross-sections acquired by confocal laser-scanning microscopy. This estimate was compared with counting chloroplast profiles in 2D sections from the same stacks of sections. Practical measurements of mesophyll cells, calculations performed in a 3D model of a cell with chloroplasts, and a theoretical analysis all showed that the 2D approach yielded biased results, with underestimation of up to 10-fold. We proved that the frequently used method for counting chloroplasts in a mesophyll cell by counting their profiles in 2D sections did not give correct results. We concluded that the present disector method can be efficiently used for unbiased estimation of chloroplast number per mesophyll cell. This should be the method of choice, especially in coniferous needles and leaves with mesophyll cells with lignified cell walls where maceration methods are difficult or impossible to use. PMID:24336344

  2. Enumeration of Vibrio cholerae O1 in Bangladesh waters by fluorescent-antibody direct viable count.

    PubMed Central

    Brayton, P R; Tamplin, M L; Huq, A; Colwell, R R

    1987-01-01

    A field trial to enumerate Vibrio cholerae O1 in aquatic environments in Bangladesh was conducted, comparing fluorescent-antibody direct viable count with culture detection by the most-probable-number index. Specificity of a monoclonal antibody prepared against the O1 antigen was assessed and incorporated into the fluorescence staining method. All pond and water samples yielded higher counts of viable V. cholerae O1 by fluorescent-antibody direct viable count than by the most-probable-number index. Fluorescence microscopy is a more sensitive detection system than culture methods because it allows the enumeration of both culturable and nonculturable cells and therefore provides more precise monitoring of microbiological water quality. PMID:3324967

  3. Counting Tree Growth Rings Moderately Difficult to Distinguish

    Treesearch

    C. B. Briscoe; M. Chudnoff

    1964-01-01

There is an extensive literature dealing with techniques and gadgets to facilitate counting tree growth rings. A relatively simple method is described below, satisfactory for species too difficult to count in the field, but not sufficiently difficult to require the preparation of microscope slides or staining techniques.

4. Bactericidal activity of sanitizers against Enterococcus faecium attached to stainless steel as determined by plate count and impedance methods.

    PubMed

    Andrade, N J; Bridgeman, T A; Zottola, E A

    1998-07-01

Enterococcus faecium attached to stainless steel chips (100 mm²) was treated with the following sanitizers: sodium hypochlorite, peracetic acid (PA), peracetic acid plus an organic acid (PAS), quaternary ammonium, organic acid, and anionic acid. The effectiveness of sanitizer solutions on planktonic cells (not attached) was evaluated by the Association of Official Analytical Chemists (AOAC) suspension test. The number of attached cells was determined by impedance measurement and plate count method after vortexing. The decimal reduction (DR) in numbers of the E. faecium population was determined for the three methods and was analyzed by analysis of variance (P < 0.05) using Statview software. The adhered cells were more resistant (P < 0.05) than nonadherent cells. The DR averages for all of the sanitizers for 30 s of exposure were 6.4, 2.2, and 2.5 for the AOAC suspension test, plate count method after vortexing, and impedance measurement, respectively. Plate count and impedance methods showed a difference (P < 0.05) after 30 s of sanitizer exposure but not after 2 min. The impedance measurement was the best method to measure adherent cells. Impedance measurement required the development of a quadratic regression. The equation developed from 82 samples is as follows: log CFU/chip = 0.2385T² - 0.96T + 9.35 (r² = 0.92, P < 0.05), where T = impedance detection time in hours. This method showed that the sanitizers PAS and PA were more effective against E. faecium than the other sanitizers. At 30 s, the impedance method recovered about 25 times more cells than the plate count method after vortexing. These data suggest that impedance measurement is the method of choice when evaluating the number of bacterial cells adhered to a surface.
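The reported calibration can be applied directly. A minimal sketch evaluating the quadratic above (function names are illustrative, not from the paper):

```python
def log_cfu_per_chip(t_hours):
    """Quadratic calibration reported above: log10(CFU/chip) as a
    function of impedance detection time T in hours."""
    return 0.2385 * t_hours ** 2 - 0.96 * t_hours + 9.35

def cfu_per_chip(t_hours):
    """Adherent cell count implied by a detection time."""
    return 10.0 ** log_cfu_per_chip(t_hours)
```

For example, a detection time of 2 h corresponds to log10(CFU/chip) = 8.384, i.e. roughly 2.4 x 10^8 cells per chip.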

  5. Anatomical-based partial volume correction for low-dose dedicated cardiac SPECT/CT

    NASA Astrophysics Data System (ADS)

    Liu, Hui; Chan, Chung; Grobshtein, Yariv; Ma, Tianyu; Liu, Yaqiang; Wang, Shi; Stacy, Mitchel R.; Sinusas, Albert J.; Liu, Chi

    2015-09-01

Due to the limited spatial resolution, partial volume effect has been a major degrading factor on quantitative accuracy in emission tomography systems. This study aims to investigate the performance of several anatomical-based partial volume correction (PVC) methods for a dedicated cardiac SPECT/CT system (GE Discovery NM/CT 570c) with focused field-of-view over a clinically relevant range of high and low count levels for two different radiotracer distributions. These PVC methods include perturbation geometry transfer matrix (pGTM), pGTM followed by multi-target correction (MTC), pGTM with known concentration in blood pool, the latter followed by MTC, and our newly proposed methods, which perform MTC iteratively, re-estimating and updating the mean values in all regions from the MTC-corrected images at each iteration. The NCAT phantom was simulated for cardiovascular imaging with 99mTc-tetrofosmin, a myocardial perfusion agent, and 99mTc-red blood cell (RBC), a pure intravascular imaging agent. Images were acquired at six different count levels to investigate the performance of PVC methods at both high and low count levels for low-dose applications. We performed two large animal in vivo cardiac imaging experiments following injection of 99mTc-RBC for evaluation of intramyocardial blood volume (IMBV). The simulation results showed our proposed iterative methods provide performance superior to that of other existing PVC methods in terms of image quality, quantitative accuracy, and reproducibility (standard deviation), particularly for low-count data. The iterative approaches are robust for both 99mTc-tetrofosmin perfusion imaging and 99mTc-RBC imaging of IMBV and blood pool activity even at low count levels. The animal study results indicated the effectiveness of PVC to correct the overestimation of IMBV due to blood pool contamination.
In conclusion, the iterative PVC methods can achieve more accurate quantification, particularly for low count cardiac SPECT studies, typically obtained from low-dose protocols, gated studies, and dynamic applications.
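The geometry-transfer-matrix idea underlying the pGTM variants can be illustrated with a toy two-region example. This is a generic GTM sketch under assumed spill-over fractions, not the authors' implementation:

```python
def gtm_correct_2x2(observed, G):
    """Solve o = G t for the true regional means t in a two-region
    geometry-transfer-matrix model, where G[i][j] is the fraction of
    region j's true signal observed in region i (closed form, 2x2)."""
    (a, b), (c, d) = G
    o1, o2 = observed
    det = a * d - b * c
    return ((d * o1 - b * o2) / det, (a * o2 - c * o1) / det)
```

With assumed spill-over fractions G = ((0.8, 0.2), (0.3, 0.7)) and observed regional means (8.4, 4.4), the corrected values are (10.0, 2.0), illustrating how cross-contamination between, say, myocardium and blood pool is undone by inverting the transfer matrix.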

  6. Test of a mosquito eggshell isolation method and subsampling procedure.

    PubMed

    Turner, P A; Streever, W J

    1997-03-01

Production of Aedes vigilax, the common salt-marsh mosquito, can be assessed by determining eggshell densities found in soil. In this study, 14 field-collected eggshell samples were used to test a subsampling technique and compare eggshell counts obtained with a flotation method to those obtained by direct examination of sediment (DES). Relative precision of the subsampling technique was assessed by determining the minimum number of subsamples required to estimate the true mean and confidence interval of a sample at a predetermined confidence level. A regression line was fitted to cube-root transformed eggshell counts obtained from flotation and DES and found to be significant (P < 0.001, r² = 0.97). The flotation method allowed processing of samples in about one-third of the time required by DES, but recovered an average of 44% of the eggshells present. Eggshells obtained with the flotation method can be used to predict those from DES using the following equation: DES count = [1.386 × (flotation count)^0.33 - 0.01]^3.
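The reported calibration is easy to apply in code (the function name is illustrative, not from the paper):

```python
def predicted_des_count(flotation_count):
    """Calibration reported above: predicted direct-examination (DES)
    eggshell count from a flotation-method count,
    DES = [1.386 * (flotation count)^0.33 - 0.01]^3."""
    return (1.386 * flotation_count ** 0.33 - 0.01) ** 3
```

A flotation count of 100 eggshells predicts roughly 253 by direct examination, consistent with flotation recovering well under half of the eggshells actually present.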

  7. Stratigraphic charcoal analysis on petrographic thin sections: Application to fire history in northwestern Minnesota

    NASA Astrophysics Data System (ADS)

    Clark, James S.

    1988-07-01

Results of stratigraphic charcoal analysis from thin sections of varved lake sediments have been compared with fire scars on red pine trees in northwestern Minnesota to determine if charcoal data accurately reflect fire regimes. Pollen and opaque-spherule analyses were completed from a short core to confirm that laminations were annual over the last 350 yr. A good correspondence was found between fossil-charcoal and fire-scar data. Individual fires could be identified as specific peaks in the charcoal curves, and times of reduced fire frequency were reflected in the charcoal data. Charcoal was absent during the fire-suppression era from 1920 A.D. to the present. Distinct charcoal maxima from 1864 to 1920 occurred at times of fire within the lake catchment. Fire was less frequent during the 19th century, and charcoal was substantially less abundant. Fire was frequent from 1760 to 1815, and charcoal was abundant continuously. Fire scars and fossil charcoal indicate that fires did not occur during 1730-1750 and 1670-1700. Several fires occurred from 1640 to 1670 and 1700 to 1730. Charcoal counts from pollen preparations in the area generally do not show this changing fire regime. Simulated "sampling" of the thin-section data in a fashion comparable to pollen-slide methods suggests that sampling alone is not sufficient to account for differences between the two methods. Integrating annual charcoal values in this fashion still produced much higher resolution than the pollen-slide method, and the postfire suppression decline of charcoal characteristic of my method (but not of pollen slides) is still evident. Consideration of the differences in size of fragments counted by the two methods is necessary to explain charcoal representation in lake sediments.

  8. A comparison of cosegregation analysis methods for the clinical setting.

    PubMed

    Rañola, John Michael O; Liu, Quanhui; Rosenthal, Elisabeth A; Shirts, Brian H

    2018-04-01

Quantitative cosegregation analysis can help evaluate the pathogenicity of genetic variants. However, genetics professionals without statistical training often use simple methods, reporting only qualitative findings. We evaluate the potential utility of quantitative cosegregation in the clinical setting by comparing three methods. One thousand pedigrees each were simulated for benign and pathogenic variants in BRCA1 and MLH1 using United States historical demographic data to produce pedigrees similar to those seen in the clinic. These pedigrees were analyzed using two robust methods, full likelihood Bayes factors (FLB) and cosegregation likelihood ratios (CSLR), and a simpler method, counting meioses. Both FLB and CSLR outperform counting meioses when dealing with pathogenic variants, though counting meioses is not far behind. For benign variants, FLB and CSLR greatly outperform counting meioses, which is unable to generate evidence in favor of benignity. Comparing FLB and CSLR, we find that the two methods perform similarly, indicating that quantitative results from either of these methods could be combined in multifactorial calculations. Combining quantitative information will be important as isolated use of cosegregation in single families will yield classification for less than 1% of variants. To encourage wider use of robust cosegregation analysis, we present a website (http://www.analyze.myvariant.org) which implements the CSLR, FLB, and Counting Meioses methods for ATM, BRCA1, BRCA2, CHEK2, MEN1, MLH1, MSH2, MSH6, and PMS2. We also present an R package, CoSeg, which performs the CSLR analysis on any gene with user supplied parameters. Future variant classification guidelines should allow nuanced inclusion of cosegregation evidence against pathogenicity.
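The counting-meioses summary mentioned above is, in its simplest textbook form, a one-liner. A hedged sketch of that simple scheme (not the FLB or CSLR computations, which are substantially more involved):

```python
def meiosis_evidence(n_informative):
    """Simple counting-meioses summary: the chance that a variant tracks
    with disease through n informative meioses by luck alone is (1/2)^n,
    so the cosegregation likelihood ratio for pathogenicity is 2^n."""
    return 2.0 ** n_informative, 0.5 ** n_informative
```

Three informative meioses give a likelihood ratio of 8 (chance probability 0.125). Note the scheme can only accumulate evidence *for* pathogenicity, which is one way to see why it cannot support benign classifications, as the comparison above finds.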

  9. Evaluation of counting methods for oceanic radium-228

    NASA Astrophysics Data System (ADS)

    Orr, James C.

    1988-07-01

Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting method(s), conventional 228Ra counting methods have been compared with some promising techniques which are currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (~30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (~12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation.
After allowing for ingrowth and then counting the 224Ra great-granddaughter, 228Ra could be back-calculated, thereby yielding a method with high efficiency in which no sample processing is required. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples with just 30 L of seawater, to better than 5% precision. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
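The figure-of-merit comparison described above can be sketched with the conventional counting-statistics conventions (FOM = efficiency squared over background; detection limit scaling as sqrt(background)/efficiency). These are standard conventions, assumed here rather than taken from the paper, which does not state its exact formulas:

```python
import math

def figure_of_merit(efficiency, background_cpm):
    """Conventional counter figure of merit, eps^2 / B; higher is better."""
    return efficiency ** 2 / background_cpm

def relative_detection_limit(efficiency, background_cpm):
    """Background-limited detection limit scales as sqrt(B) / eps;
    lower is better."""
    return math.sqrt(background_cpm) / efficiency
```

Plugging in the abstract's numbers, β-γ coincidence spectrometry (eps = 0.053, B = 0.0054 cpm) scores a FOM of about 0.52, versus about 0.014 for high-resolution γ spectrometry (eps = 0.048, B = 0.16 cpm), matching the text's preference for the coincidence counter.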

  10. Methods for assessing long-term mean pathogen count in drinking water and risk management implications.

    PubMed

    Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y

    2012-06-01

Recently, pathogen counts in drinking and source waters were shown theoretically to have the discrete Weibull (DW) or closely related discrete growth distribution (DGD). The result was demonstrated against nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed, such that available datasets seldom represent the rare but important high-count events, making estimation of the long-term mean difficult. In the current work, the methods and data-record length required to assess long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. Also, microbial count data were analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first-order model was shown to produce count series with a 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely monitored water quality indicators.
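To see why skewed DW-type count distributions demand long records, one can simulate draws and compare the sample mean against the analytic mean. This is a generic type-I discrete Weibull sampler written for illustration; the parameter values are arbitrary, not the paper's fitted values:

```python
import math
import random

def dw_sample(q, beta, rng):
    """One draw from the type-I discrete Weibull distribution with
    survival function P(X >= x) = q ** (x ** beta), x = 0, 1, 2, ...
    Inverse transform: X = floor((ln U / ln q) ** (1 / beta))."""
    u = 1.0 - rng.random()  # uniform on (0, 1], avoids log(0)
    return math.floor((math.log(u) / math.log(q)) ** (1.0 / beta))

def dw_mean(q, beta, terms=100000):
    """Analytic mean via E[X] = sum_{x >= 1} P(X >= x), truncated."""
    return sum(q ** (x ** beta) for x in range(1, terms))
```

With beta = 1 the distribution reduces to the geometric case (mean q/(1-q)); pushing beta below 1 fattens the tail, e.g. dw_mean(0.8, 0.7) is well above the geometric mean dw_mean(0.8, 1.0) = 4, which is exactly the heavy-tail behavior that makes small-sample mean estimates unreliable.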

  11. Direct quantification and distribution of tetracycline-resistant genes in meat samples by real-time polymerase chain reaction.

    PubMed

    Guarddon, Mónica; Miranda, Jose M; Vázquez, Beatriz I; Cepeda, Alberto; Franco, Carlos M

    2012-07-01

The evolution of antimicrobial-resistant bacteria has become a threat to food safety and methods to control them are necessary. Counts of tetracycline-resistant (TR) bacteria by microbiological methods were compared with those obtained by quantitative PCR (qPCR) in 80 meat samples. TR Enterobacteriaceae counts were similar between the count plate method and qPCR (P = 0.24), whereas TR aerobic mesophilic bacteria counts were significantly higher by the microbiological method (P < 0.001). The distribution of tetA and tetB genes was investigated in different types of meat. tetA was detected in chicken meat (40%), turkey meat (100%), pork (20%), and beef (40%) samples, whereas tetB was detected in chicken meat (45%), turkey meat (70%), pork (30%), and beef (35%) samples. The presence of tetracycline residues was also investigated by a receptor assay. This study offers an alternative and rapid method for monitoring the presence of TR bacteria in meat and furthers the understanding of the distribution of tetA and tetB genes. © 2012 Institute of Food Technologists®

  12. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: an example from a vertigo phase III study with longitudinal count data as primary endpoint.

    PubMed

    Adrion, Christine; Mansmann, Ulrich

    2012-09-10

A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. Writing a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on, and justification of, many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex, not easily implemented, and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score).
The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are appealing for inference and, notably, provide sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
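    The mean logarithmic score the authors rely on is easy to illustrate outside the GLMM setting. A toy sketch, assuming invented overdispersed counts and simple moment-matched fits (not the paper's INLA-based leave-one-out predictives), comparing Poisson and negative binomial predictive densities:

```python
import math

def poisson_logpmf(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

def nbinom_logpmf(k, r, p):
    # negative binomial parameterized with mean r * (1 - p) / p
    return (math.lgamma(k + r) - math.lgamma(r) - math.lgamma(k + 1)
            + r * math.log(p) + k * math.log(1 - p))

counts = [0, 0, 1, 0, 2, 7, 0, 1, 0, 12, 0, 3, 0, 0, 5, 1, 0, 0, 9, 2]
m = sum(counts) / len(counts)
v = sum((c - m) ** 2 for c in counts) / (len(counts) - 1)
p = m / v                                   # moment-matched NB (needs v > m)
r = m * m / (v - m)

score_pois = sum(poisson_logpmf(c, m) for c in counts) / len(counts)
score_nb = sum(nbinom_logpmf(c, r, p) for c in counts) / len(counts)
print(round(score_pois, 3), round(score_nb, 3))  # higher (less negative) wins
```

On overdispersed data like this, the negative binomial attains the higher mean logarithmic score, mirroring the paper's point that a conventional Poisson model is often inappropriate for real-life count data.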

  13. The piecewise-linear dynamic attenuator reduces the impact of count rate loss with photon-counting detectors

    NASA Astrophysics Data System (ADS)

    Hsieh, Scott S.; Pelc, Norbert J.

    2014-06-01

Photon counting x-ray detectors (PCXDs) offer several advantages compared to standard energy-integrating x-ray detectors, but also face significant challenges. One key challenge is the high count rates required in CT. At high count rates, PCXDs exhibit count rate loss and show reduced detective quantum efficiency in signal-rich (or high flux) measurements. In order to reduce count rate requirements, a dynamic beam-shaping filter can be used to redistribute flux incident on the patient. We study the piecewise-linear attenuator in conjunction with PCXDs without energy discrimination capabilities. We examined three detector models: the classic nonparalyzable and paralyzable detector models, and a ‘hybrid’ detector model, a weighted average of the two that approximates an existing, real detector (Taguchi et al 2011 Med. Phys. 38 1089-102). We derive analytic expressions for the variance of the CT measurements for these detectors. These expressions are used with raw data estimated from DICOM image files of an abdomen and a thorax to estimate variance in reconstructed images for both the dynamic attenuator and a static beam-shaping (‘bowtie’) filter. By redistributing flux, the dynamic attenuator reduces dose by 40% without increasing peak variance for the ideal detector. For non-ideal PCXDs, the impact of count rate loss is also reduced. The nonparalyzable detector shows little impact from count rate loss, but with the paralyzable model, count rate loss leads to noise streaks that can be controlled with the dynamic attenuator. With the hybrid model, the characteristic count rates required before noise streaks dominate the reconstruction are reduced by a factor of 2 to 3. We conclude that the piecewise-linear attenuator can reduce the count rate requirements of the PCXD in addition to improving dose efficiency. The magnitude of this reduction depends on the detector, with paralyzable detectors showing much greater benefit than nonparalyzable detectors.
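    The two classic detector models have simple closed forms for the observed rate at true rate n and dead time τ: n/(1 + nτ) for the nonparalyzable model and n·e^(−nτ) for the paralyzable model, with the hybrid model a weighted average of the two. A sketch with a hypothetical dead time and weight (the real detector's parameters are in Taguchi et al 2011):

```python
import math

def nonparalyzable(n, tau):
    """Observed rate for a nonparalyzable detector with dead time tau."""
    return n / (1.0 + n * tau)

def paralyzable(n, tau):
    """Observed rate for a paralyzable detector with dead time tau."""
    return n * math.exp(-n * tau)

def hybrid(n, tau, alpha):
    """Weighted average of the two classic models (alpha = nonparalyzable weight)."""
    return alpha * nonparalyzable(n, tau) + (1 - alpha) * paralyzable(n, tau)

tau = 100e-9                     # hypothetical 100 ns dead time
for n in (1e6, 1e7, 2e7):        # true incident rates (counts per second)
    print(f"{n:.0e}: non-par {nonparalyzable(n, tau):.3e}, "
          f"par {paralyzable(n, tau):.3e}, hybrid {hybrid(n, tau, 0.5):.3e}")
```

The paralyzable curve rolls over and decreases at high flux (observed rate peaks at n = 1/τ), which is why that model produces the photon-starved noise streaks the attenuator is designed to suppress, while the nonparalyzable curve only saturates.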

  14. WFIRST: Predicting the number density of Hα-emitting galaxies

    NASA Astrophysics Data System (ADS)

    Benson, Andrew; Merson, Alex; Wang, Yun; Faisst, Andreas; Masters, Daniel; Kiessling, Alina; Rhodes, Jason

    2018-01-01

The WFIRST mission will measure the clustering of Hα-emitting galaxies to help probe the nature of dark energy. Knowledge of the number density of such galaxies is therefore vital for forecasting the precision of these measurements and assessing the scientific impact of the WFIRST mission. In this poster we present predictions from a galaxy formation model, Galacticus, for the cumulative number counts of Hα-emitting galaxies. We couple Galacticus to three different dust attenuation methods and examine the counts using each method. A χ2 minimization approach is used to compare the model counts to observed galaxy counts and calibrate the dust parameters. With these calibrated dust methods, we find that the Hα luminosity function from Galacticus is broadly consistent with observed estimates. Finally we present forecasts for the redshift distributions and number counts for a WFIRST-like survey. Over a redshift range of 1 ≤ z ≤ 2 and with a blended flux limit of 1×10-16 erg s-1 cm-2, Galacticus predicts that WFIRST would observe between 10,400 and 15,200 Hα-emitting galaxies per square degree.
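    The χ2 calibration step can be sketched schematically. Everything below (the power-law counts model, its normalization and slope, the flux limits, the error bars) is invented for illustration; only the structure, a grid search over a dust attenuation parameter that minimizes χ2 against observed counts, mirrors the poster's approach:

```python
# toy observed cumulative counts (galaxies per deg^2) above three flux limits
flux_limits = [1e-16, 2e-16, 4e-16]           # erg / s / cm^2
observed = [12000.0, 5000.0, 1800.0]
sigma = [1200.0, 500.0, 200.0]

def model_counts(flux, a_halpha):
    """Schematic power-law counts dimmed by a_halpha magnitudes of dust."""
    eff_flux = flux * 10 ** (0.4 * a_halpha)  # dust raises the effective limit
    return 1.2e-15 / eff_flux ** 1.2          # hypothetical normalization/slope

def chi2(a_halpha):
    return sum((o - model_counts(f, a_halpha)) ** 2 / s ** 2
               for f, o, s in zip(flux_limits, observed, sigma))

grid = [i * 0.05 for i in range(41)]          # dust attenuation in [0, 2] mag
best = min(grid, key=chi2)
print(round(best, 2), round(chi2(best), 1))
```

In practice the calibration is done against measured luminosity functions with a proper minimizer rather than a grid, but the figure of merit is the same.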

  15. Usefulness of hemocytometer as a counting chamber in a computer assisted sperm analyzer (CASA)

    USGS Publications Warehouse

    Eljarah, A.; Chandler, J.; Jenkins, J.A.; Chenevert, J.; Alcanal, A.

    2013-01-01

Several methods are used to determine sperm cell concentration, such as the haemocytometer, spectrophotometer, electronic cell counter and computer-assisted semen analysers (CASA). The utility of CASA systems has been limited due to the lack of characterization of individual systems and the absence of standardization among laboratories. The aims of this study were to: 1) validate and establish setup conditions for the CASA system utilizing the haemocytometer as a counting chamber, and 2) compare the different methods used for the determination of sperm cell concentration in bull semen. Two ejaculates were collected and the sperm cell concentration was determined using the spectrophotometer and haemocytometer. For the Hamilton-Thorn method, the haemocytometer was used as a counting chamber. Sperm concentration was determined three times per ejaculate sample. No significant difference (P > 0.05) was found between the Hamilton-Thorn count and the haemocytometer count, or between the haemocytometer count and the spectrophotometer. Based on the results of this study, we concluded that the haemocytometer can be used in computerized semen analysis systems as a substitute for the commercially available disposable counting chambers, thereby avoiding higher costs and slower procedures.

  16. Algebraic geometry and Bethe ansatz. Part I. The quotient ring for BAE

    NASA Astrophysics Data System (ADS)

    Jiang, Yunfeng; Zhang, Yang

    2018-03-01

In this paper and upcoming ones, we initiate a systematic study of Bethe ansatz equations for integrable models using modern computational algebraic geometry. We show that algebraic geometry provides a natural mathematical language and powerful tools for understanding the structure of the solution space of Bethe ansatz equations. In particular, we find novel, efficient methods to count the number of solutions of Bethe ansatz equations based on Gröbner bases and the quotient ring. We also develop an analytical approach based on the companion matrix to perform the sum of on-shell quantities over all physical solutions without solving the Bethe ansatz equations explicitly. To demonstrate the power of our method, we revisit the completeness problem of the Bethe ansatz for the Heisenberg spin chain, and calculate the sum rules of OPE coefficients in planar N=4 super-Yang-Mills theory.
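    The companion-matrix trick is easiest to see in one variable: the sum of a function over all roots of a polynomial equals the trace of that function evaluated on its companion matrix, with no root-finding. A minimal sketch (the paper's quotient-ring construction generalizes this to multivariate Bethe ansatz equations):

```python
def companion(coeffs):
    """Companion matrix of the monic polynomial x^n + c[n-1] x^(n-1) + ... + c[0]."""
    n = len(coeffs)
    C = [[0.0] * n for _ in range(n)]
    for i in range(1, n):
        C[i][i - 1] = 1.0                      # sub-diagonal of ones
    for i in range(n):
        C[i][n - 1] = -coeffs[i]               # last column holds -coefficients
    return C

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def trace(A):
    return sum(A[i][i] for i in range(len(A)))

# p(x) = x^3 - 2x^2 - 5x + 6 = (x - 1)(x + 2)(x - 3), roots {1, -2, 3}
C = companion([6.0, -5.0, -2.0])
print(trace(C), trace(matmul(C, C)))   # sum of roots, sum of squared roots → 2.0 14.0
```

Here trace(C) = 1 + (−2) + 3 = 2 and trace(C²) = 1 + 4 + 9 = 14, recovered purely from the coefficients, which is exactly the mechanism that lets on-shell sums be computed without solving the equations.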

  17. An analysis of dependency of counting efficiency on worker anatomy for in vivo measurements: whole-body counting

    NASA Astrophysics Data System (ADS)

    Zhang, Binquan; Mille, Matthew; Xu, X. George

    2008-07-01

In vivo radiobioassay is integral to many health physics and radiological protection programs dealing with internal exposures. The Bottle Manikin Absorber (BOMAB) physical phantom has been widely used for whole-body counting calibrations. However, the shape of BOMAB phantoms—a collection of plastic, cylindrical shells which contain no bones or internal organs—does not represent realistic human anatomy. Furthermore, workers who come in contact with radioactive materials have rather different body shapes and sizes. To date, there is a lack of understanding about how the counting efficiency would change when the calibrated counter is applied to a worker with complicated internal organs or tissues. This paper presents a study on various in vivo counting efficiencies obtained from Monte Carlo simulations of two BOMAB phantoms and three tomographic image-based models (VIP-Man, NORMAN and CNMAN) for a scenario involving homogeneous whole-body radioactivity contamination. The results reveal that a phantom's counting efficiency is strongly dependent on the shape and size of a phantom. Contrary to what was expected, it was found that only small differences in efficiency were observed when the density and material composition of all internal organs and tissues of the tomographic phantoms were changed to water. The results of this study indicate that BOMAB phantoms with appropriately adjusted size and shape can be sufficient for whole-body counting calibrations when the internal contamination is homogeneous.

  18. Neutron counting with cameras

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo

    2015-07-01

A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transitions smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, as well as increased frame rate up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, as such allowing for discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras doesn't allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is next confronted with the first experimental results. (authors)
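    Per-frame discrimination then reduces to thresholding: conversion events deposit far more light than the noise floor, so pixels above a cut are counted as individual neutrons. A toy sketch with synthetic frame data, an arbitrary threshold, and single-pixel events (a real analysis would cluster neighboring pixels of one light spot):

```python
import random

random.seed(0)
# toy 8x8 "camera frame": Gaussian read noise plus two bright neutron events
frame = [[random.gauss(10.0, 2.0) for _ in range(8)] for _ in range(8)]
frame[2][3] += 400.0   # nuclear conversion deposits MeV-scale energy,
frame[6][1] += 350.0   # far above the keV-scale noise level

threshold = 100.0      # hypothetical discrimination level
events = [(i, j) for i in range(8) for j in range(8) if frame[i][j] > threshold]
print(len(events), events)   # → 2 [(2, 3), (6, 1)]
```

Because each event is localized and counted per frame, there is no dead time between frames, which is the smooth transition to continuous imaging the abstract describes.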

  19. Local box-counting dimensions of discrete quantum eigenvalue spectra: Analytical connection to quantum spectral statistics

    NASA Astrophysics Data System (ADS)

    Sakhr, Jamal; Nieminen, John M.

    2018-03-01

Two decades ago, Wang and Ong [Phys. Rev. A 55, 1522 (1997); 10.1103/PhysRevA.55.1522] hypothesized that the local box-counting dimension of a discrete quantum spectrum should depend exclusively on the nearest-neighbor spacing distribution (NNSD) of the spectrum. In this Rapid Communication, we validate their hypothesis by deriving an explicit formula for the local box-counting dimension of a countably-infinite discrete quantum spectrum. This formula expresses the local box-counting dimension of a spectrum in terms of single and double integrals of the NNSD of the spectrum. As applications, we derive an analytical formula for Poisson spectra and closed-form approximations to the local box-counting dimension for spectra having Gaussian orthogonal ensemble (GOE), Gaussian unitary ensemble (GUE), and Gaussian symplectic ensemble (GSE) spacing statistics. In the Poisson and GOE cases, we compare our theoretical formulas with the published numerical data of Wang and Ong and observe excellent agreement between their data and our theory. We also study numerically the local box-counting dimensions of the Riemann zeta function zeros and the alternate levels of GOE spectra, which are often used as numerical models of spectra possessing GUE and GSE spacing statistics, respectively. In each case, the corresponding theoretical formula is found to accurately describe the numerically computed local box-counting dimension.
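    A local box-counting dimension can be estimated numerically by covering the spectrum with boxes of width ε and reading off log N(ε)/log(L/ε). The sketch below uses uniform random levels (which have Poisson spacing statistics) and illustrative scales; it is a numerical estimator, not the paper's analytical NNSD formula:

```python
import math, random

def box_count(points, eps):
    """Number of boxes of width eps needed to cover a 1D point set."""
    return len({math.floor(p / eps) for p in points})

random.seed(1)
L = 1000.0
spectrum = [random.uniform(0.0, L) for _ in range(1000)]  # Poisson spacing statistics

for eps in (0.1, 1.0, 10.0):
    n = box_count(spectrum, eps)
    print(eps, n, round(math.log(n) / math.log(L / eps), 3))  # dimension estimate
```

At box widths large compared to the mean spacing the estimate approaches 1, while at widths comparable to or below the mean spacing empty boxes pull it below 1; it is this scale- and spacing-dependent behavior that the NNSD-based formula captures.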

  20. An interlaboratory comparison of sizing and counting of subvisible particles mimicking protein aggregates.

    PubMed

    Ripple, Dean C; Montgomery, Christopher B; Hu, Zhishang

    2015-02-01

Accurate counting and sizing of protein particles have been limited by discrepancies of counts obtained by different methods. To understand the bias and repeatability of techniques in common use in the biopharmaceutical community, the National Institute of Standards and Technology has conducted an interlaboratory comparison for sizing and counting subvisible particles from 1 to 25 μm. Twenty-three laboratories from industry, government, and academic institutions participated. The circulated samples consisted of a polydisperse suspension of abraded ethylene tetrafluoroethylene particles, which closely mimic the optical contrast and morphology of protein particles. For restricted data sets, agreement was reasonably good: relative standard deviations (RSDs) of approximately 25% for light obscuration counts with lower diameter limits from 1 to 5 μm, and approximately 30% for flow imaging with specified manufacturer and instrument setting. RSDs of the reported counts for unrestricted data sets were approximately 50% for both light obscuration and flow imaging. Differences between instrument manufacturers were not statistically significant for light obscuration but were significant for flow imaging. We also report a method for accounting for differences in the reported diameter for flow imaging and electrical sensing zone techniques; the method worked well for diameters greater than 15 μm. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  1. Academic Software Downloads from Google Code: Useful Usage Indicators?

    ERIC Educational Resources Information Center

    Thelwall, Mike; Kousha, Kayvan

    2016-01-01

    Introduction: Computer scientists and other researchers often make their programs freely available online. If this software makes a valuable contribution inside or outside of academia then its creators may want to demonstrate this with a suitable indicator, such as download counts. Methods: Download counts, citation counts, labels and licenses…

  2. Complexities of Counting.

    ERIC Educational Resources Information Center

    Stake, Bernadine Evans

    This document focuses on one child's skip counting methods. The pupil, a second grade student at Steuben School, in Kankakee, Illinois, was interviewed as she made several attempts at counting twenty-five poker chips on a circular piece of paper. The interview was part of a larger study of "Children's Conceptions of Number and Numeral,"…

  3. EM Adaptive LASSO—A Multilocus Modeling Strategy for Detecting SNPs Associated with Zero-inflated Count Phenotypes

    PubMed Central

    Mallick, Himel; Tiwari, Hemant K.

    2016-01-01

Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon that the phenotypes contain an enormous number of zeroes owing to excessive zero counts in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. 
Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely encountered in practice. PMID:27066062

  4. EM Adaptive LASSO-A Multilocus Modeling Strategy for Detecting SNPs Associated with Zero-inflated Count Phenotypes.

    PubMed

    Mallick, Himel; Tiwari, Hemant K

    2016-01-01

Count data are increasingly ubiquitous in genetic association studies, where it is possible to observe excess zero counts as compared to what is expected based on standard assumptions. For instance, in rheumatology, data are usually collected in multiple joints within a person or multiple sub-regions of a joint, and it is not uncommon that the phenotypes contain an enormous number of zeroes owing to excessive zero counts in the majority of patients. Most existing statistical methods assume that the count phenotypes follow one of these four distributions with appropriate dispersion-handling mechanisms: Poisson, Zero-inflated Poisson (ZIP), Negative Binomial, and Zero-inflated Negative Binomial (ZINB). However, little is known about their implications in genetic association studies. Also, there is a relative paucity of literature on their usefulness with respect to model misspecification and variable selection. In this article, we have investigated the performance of several state-of-the-art approaches for handling zero-inflated count data along with a novel penalized regression approach with an adaptive LASSO penalty, by simulating data under a variety of disease models and linkage disequilibrium patterns. By taking into account data-adaptive weights in the estimation procedure, the proposed method provides greater flexibility in multi-SNP modeling of zero-inflated count phenotypes. A fast coordinate descent algorithm nested within an EM (expectation-maximization) algorithm is implemented for estimating the model parameters and conducting variable selection simultaneously. Results show that the proposed method has optimal performance in the presence of multicollinearity, as measured by both prediction accuracy and empirical power, which is especially apparent as the sample size increases. 
Moreover, the Type I error rates become more or less uncontrollable for the competing methods when a model is misspecified, a phenomenon routinely encountered in practice.
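    The excess-zero diagnostic behind ZIP modeling is straightforward: compare the observed zero fraction with the e^(−λ̂) that a plain Poisson predicts. A toy sketch with invented counts and a crude moment-style grid fit of the inflation probability (the paper's EM/coordinate-descent machinery does much more, handling covariates and selection):

```python
import math

def zip_pmf(k, pi, lam):
    """Zero-inflated Poisson: structural zeros with probability pi."""
    pois = math.exp(-lam) * lam ** k / math.factorial(k)
    return (pi if k == 0 else 0.0) + (1 - pi) * pois

counts = [0] * 40 + [1] * 10 + [2] * 6 + [3] * 3 + [4] * 1
m = sum(counts) / len(counts)
p0 = counts.count(0) / len(counts)
print(round(p0, 2), round(math.exp(-m), 2))   # observed zeros vs Poisson prediction

# crude moment fit: pick pi so the implied ZIP reproduces the zero fraction
def zero_gap(pi):
    lam = m / (1 - pi)                         # ZIP mean is (1 - pi) * lam
    return abs(zip_pmf(0, pi, lam) - p0)

pi_hat = min((i / 100 for i in range(95)), key=zero_gap)
print(round(pi_hat, 2))
```

When the observed zero fraction clearly exceeds the Poisson prediction, as here, a zero-inflated model is the natural candidate, which is the situation the rheumatology example describes.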

  5. Semi-automated identification of cones in the human retina using circle Hough transform

    PubMed Central

    Bukowska, Danuta M.; Chew, Avenell L.; Huynh, Emily; Kashani, Irwin; Wan, Sue Ling; Wan, Pak Ming; Chen, Fred K

    2015-01-01

A large number of human retinal diseases are characterized by a progressive loss of cones, the photoreceptors critical for visual acuity and color perception. Adaptive Optics (AO) imaging presents a potential method to study these cells in vivo. However, AO imaging in ophthalmology is a relatively new phenomenon and quantitative analysis of these images remains difficult and tedious using manual methods. This paper illustrates a novel semi-automated quantitative technique enabling registration of AO images to macular landmarks, cone counting, and quantification of cone radius at specified distances from the foveal center. The new cone counting approach employs the circle Hough transform (cHT) and is compared to automated counting methods, as well as arbitrated manual cone identification. We explore the impact of varying the circle detection parameter on the validity of cHT cone counting and discuss the potential role of using this algorithm in detecting both cones and rods separately. PMID:26713186
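    The circle Hough transform votes each edge pixel onto the candidate centers one radius away and takes the accumulator peak. A minimal fixed-radius sketch on a synthetic circular "cone" (real implementations sweep a radius range and work on gradient images; the geometry and names here are illustrative):

```python
import math
from collections import Counter

def circle_hough(edge_points, radius, n_angles=64):
    """Fixed-radius circle Hough transform: each edge pixel votes for the
    centers of circles of the given radius passing through it."""
    acc = Counter()
    for (x, y) in edge_points:
        for k in range(n_angles):
            t = 2 * math.pi * k / n_angles
            acc[(round(x - radius * math.cos(t)),
                 round(y - radius * math.sin(t)))] += 1
    return acc

# synthetic "cone": edge pixels of a circle of radius 3 centered at (10, 10)
edges = {(round(10 + 3 * math.cos(a)), round(10 + 3 * math.sin(a)))
         for a in (2 * math.pi * k / 100 for k in range(100))}
acc = circle_hough(sorted(edges), 3)
center, votes = acc.most_common(1)[0]
print(center, votes)
```

The radius used for voting is the "circle detection parameter" whose tuning the paper examines: too small or too large a radius smears the votes and the accumulator peak weakens.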

  6. State-of-the-art report on non-traditional traffic counting methods

    DOT National Transportation Integrated Search

    2001-10-01

    The purpose of this report is to look at the state-of-the-art of non-traditional traffic counting methods. This is done through a three-fold approach that includes an assessment of currently available technology, a survey of State Department of Trans...

  7. A COMPARISON OF GALAXY COUNTING TECHNIQUES IN SPECTROSCOPICALLY UNDERSAMPLED REGIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Specian, Mike A.; Szalay, Alex S., E-mail: mspecia1@jhu.edu, E-mail: szalay@jhu.edu

    2016-11-01

Accurate measures of galactic overdensities are invaluable for precision cosmology. Obtaining these measurements is complicated when members of one’s galaxy sample lack radial depths, most commonly derived via spectroscopic redshifts. In this paper, we utilize the Sloan Digital Sky Survey’s Main Galaxy Sample to compare seven methods of counting galaxies in cells when many of those galaxies lack redshifts. These methods fall into three categories: assigning galaxies discrete redshifts, scaling the numbers counted using regions’ spectroscopic completeness properties, and employing probabilistic techniques. We split spectroscopically undersampled regions into three types—those inside the spectroscopic footprint, those outside but adjacent to it, and those distant from it. Through Monte Carlo simulations, we demonstrate that the preferred counting techniques are a function of region type, cell size, and redshift. We conclude by reporting optimal counting strategies under a variety of conditions.

  8. Probe classification of on-off type DNA microarray images with a nonlinear matching measure

    NASA Astrophysics Data System (ADS)

    Ryu, Munho; Kim, Jong Dae; Min, Byoung Goo; Kim, Jongwon; Kim, Y. Y.

    2006-01-01

We propose a nonlinear matching measure, called the counting measure, as a signal detection measure that is defined as the number of on pixels in the spot area. It is applied to classify probes for an on-off type DNA microarray, where each probe spot is classified as hybridized or not. The counting measure also incorporates the maximum response search method, where the expected signal is obtained by taking the maximum among the measured responses over the various positions and sizes of the spot template. The counting measure was compared to existing signal detection measures such as the normalized covariance and the median for 2390 patient samples tested on the human papillomavirus (HPV) DNA chip. The counting measure performed the best regardless of whether or not the maximum response search method was used. The experimental results showed that the counting measure combined with the positional search was the most preferable.
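    The counting measure and the maximum response search are simple to sketch: count above-threshold pixels inside a spot template, and take the maximum over candidate template positions and sizes. The toy spot image, threshold, and candidate grid below are all illustrative:

```python
def counting_measure(image, top, left, size, threshold):
    """Counting measure: number of 'on' pixels inside a square spot template."""
    return sum(1 for i in range(top, top + size)
                 for j in range(left, left + size)
                 if image[i][j] > threshold)

def max_response(image, positions, sizes, threshold):
    """Maximum response search: best counting measure over candidate
    template positions and sizes."""
    return max(counting_measure(image, i, j, s, threshold)
               for (i, j) in positions for s in sizes)

# toy 6x6 spot image with a 3x3 block of hybridized ('on') pixels
img = [[0] * 6 for _ in range(6)]
for i in range(2, 5):
    for j in range(1, 4):
        img[i][j] = 200

positions = [(i, j) for i in range(3) for j in range(3)]
print(max_response(img, positions, (2, 3), 100))   # → 9
```

Searching over positions makes the measure robust to spot misalignment, which is why the positional search variant performed best in the study.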

  9. Correlation Functions Quantify Super-Resolution Images and Estimate Apparent Clustering Due to Over-Counting

    PubMed Central

    Veatch, Sarah L.; Machta, Benjamin B.; Shelby, Sarah A.; Chiang, Ethan N.; Holowka, David A.; Baird, Barbara A.

    2012-01-01

    We present an analytical method using correlation functions to quantify clustering in super-resolution fluorescence localization images and electron microscopy images of static surfaces in two dimensions. We use this method to quantify how over-counting of labeled molecules contributes to apparent self-clustering and to calculate the effective lateral resolution of an image. This treatment applies to distributions of proteins and lipids in cell membranes, where there is significant interest in using electron microscopy and super-resolution fluorescence localization techniques to probe membrane heterogeneity. When images are quantified using pair auto-correlation functions, the magnitude of apparent clustering arising from over-counting varies inversely with the surface density of labeled molecules and does not depend on the number of times an average molecule is counted. In contrast, we demonstrate that over-counting does not give rise to apparent co-clustering in double label experiments when pair cross-correlation functions are measured. We apply our analytical method to quantify the distribution of the IgE receptor (FcεRI) on the plasma membranes of chemically fixed RBL-2H3 mast cells from images acquired using stochastic optical reconstruction microscopy (STORM/dSTORM) and scanning electron microscopy (SEM). We find that apparent clustering of FcεRI-bound IgE is dominated by over-counting labels on individual complexes when IgE is directly conjugated to organic fluorophores. We verify this observation by measuring pair cross-correlation functions between two distinguishably labeled pools of IgE-FcεRI on the cell surface using both imaging methods. After correcting for over-counting, we observe weak but significant self-clustering of IgE-FcεRI in fluorescence localization measurements, and no residual self-clustering as detected with SEM. 
We also apply this method to quantify IgE-FcεRI redistribution after deliberate clustering by crosslinking with two distinct trivalent ligands of defined architectures, and we evaluate contributions from both over-counting of labels and redistribution of proteins. PMID:22384026
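    The over-counting effect can be reproduced with a toy simulation: localize each molecule several times with jitter, and the pair auto-correlation g(r) rises above 1 at short range even though the underlying molecules are completely random. A naive estimator sketch with illustrative parameters throughout (a real analysis would use many more points and finer radial bins):

```python
import math, random

def pair_autocorrelation(points, box, r_edges):
    """Naive 2D pair auto-correlation g(r) with periodic boundaries."""
    n = len(points)
    rho = n / (box * box)
    counts = [0] * (len(r_edges) - 1)
    for a in range(n):
        xa, ya = points[a]
        for b in range(a + 1, n):
            dx = abs(points[b][0] - xa); dx = min(dx, box - dx)
            dy = abs(points[b][1] - ya); dy = min(dy, box - dy)
            d = math.hypot(dx, dy)
            for k in range(len(counts)):
                if r_edges[k] <= d < r_edges[k + 1]:
                    counts[k] += 1
                    break
    # normalize by the expected number of random pairs in each annulus
    return [2 * counts[k] / (n * rho * math.pi * (r_edges[k + 1] ** 2 - r_edges[k] ** 2))
            for k in range(len(counts))]

random.seed(3)
box = 100.0
molecules = [(random.uniform(0, box), random.uniform(0, box)) for _ in range(200)]
# over-counting: each molecule is localized 3 times with ~1-unit jitter
locs = [(x + random.gauss(0, 1.0), y + random.gauss(0, 1.0))
        for (x, y) in molecules for _ in range(3)]

g = pair_autocorrelation(locs, box, [0.0, 2.0, 20.0, 40.0])
print([round(v, 2) for v in g])   # short-range bin sits well above 1
```

The short-range excess here comes purely from repeated localizations of the same molecules, the apparent self-clustering the paper corrects for, while g(r) at larger separations stays near 1.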

  10. Tower counts

    USGS Publications Warehouse

    Woody, Carol Ann; Johnson, D.H.; Shrier, Brianna M.; O'Neal, Jennifer S.; Knutzen, John A.; Augerot, Xanthippe; O'Neal, Thomas A.; Pearsons, Todd N.

    2007-01-01

Counting towers provide an accurate, low-cost, low-maintenance, low-technology, and easily mobilized escapement estimation program compared to other methods (e.g., weirs, hydroacoustics, mark-recapture, and aerial surveys) (Thompson 1962; Siebel 1967; Cousens et al. 1982; Symons and Waldichuk 1984; Anderson 2000; Alaska Department of Fish and Game 2003). Counting tower data have been found to be consistent with digital video counts (Edwards 2005). Counting towers do not interfere with natural fish migration patterns, nor are fish handled or stressed; however, their use is generally limited to clear rivers that meet specific site selection criteria. The data provided by counting tower sampling allow fishery managers to determine reproductive population size, estimate total return (escapement + catch) and its uncertainty, evaluate population productivity and trends, set harvest rates, determine spawning escapement goals, and forecast future returns (Alaska Department of Fish and Game 1974-2000 and 1975-2004). The number of spawning fish is determined by subtracting subsistence, sport-caught fish, and prespawn mortality from the total estimated escapement. The methods outlined in this protocol for tower counts can be used to provide reasonable estimates (±6% to 10%) of reproductive salmon population size and run timing in clear rivers.

  11. Estimating the mass variance in neutron multiplicity counting-A comparison of approaches

    NASA Astrophysics Data System (ADS)

    Dubi, C.; Croft, S.; Favalli, A.; Ocherashvili, A.; Pedersen, B.

    2017-12-01

In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α, n) production and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data method. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC-laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.
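    The bootstrap variant is easy to sketch: resample the gate-triggered count distribution with replacement and recompute the factorial moments on each resample. The data below are invented; only the resampling structure reflects the method compared in the paper:

```python
import random

def factorial_moments(counts):
    """First three sampled factorial moments of the triggered count distribution."""
    n = len(counts)
    m1 = sum(counts) / n
    m2 = sum(c * (c - 1) for c in counts) / n
    m3 = sum(c * (c - 1) * (c - 2) for c in counts) / n
    return m1, m2, m3

def bootstrap_se(counts, stat, n_boot=500, seed=7):
    """Bootstrap standard error of a statistic of the count distribution."""
    rng = random.Random(seed)
    vals = [stat(rng.choices(counts, k=len(counts))) for _ in range(n_boot)]
    mean = sum(vals) / n_boot
    return (sum((v - mean) ** 2 for v in vals) / (n_boot - 1)) ** 0.5

# hypothetical multiplicity data: neutrons counted per trigger gate
random.seed(1)
gates = [random.choice([0, 0, 0, 1, 1, 2, 3]) for _ in range(2000)]

m1, m2, m3 = factorial_moments(gates)
se1 = bootstrap_se(gates, lambda c: factorial_moments(c)[0])
print(round(m1, 3), round(m2, 3), round(m3, 3), round(se1, 4))
```

In the actual analysis the uncertainty of interest is on the effective mass derived from the moments, so the resampled statistic would be the full moments-to-mass calculation rather than a single moment.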

  12. Estimating the mass variance in neutron multiplicity counting - A comparison of approaches

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dubi, C.; Croft, S.; Favalli, A.

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production, and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  13. Estimating the mass variance in neutron multiplicity counting - A comparison of approaches

    DOE PAGES

    Dubi, C.; Croft, S.; Favalli, A.; ...

    2017-09-14

    In the standard practice of neutron multiplicity counting, the first three sampled factorial moments of the event-triggered neutron count distribution are used to quantify the three main neutron source terms: the spontaneous fissile material effective mass, the relative (α,n) production, and the induced fission source responsible for multiplication. This study compares three methods to quantify the statistical uncertainty of the estimated mass: the bootstrap method, propagation of variance through moments, and statistical analysis of cycle data. Each of the three methods was implemented on a set of four different NMC measurements, held at the JRC laboratory in Ispra, Italy, sampling four different Pu samples in a standard Plutonium Scrap Multiplicity Counter (PSMC) well counter.

  14. The hypergraph regularity method and its applications

    PubMed Central

    Rödl, V.; Nagle, B.; Skokan, J.; Schacht, M.; Kohayakawa, Y.

    2005-01-01

    Szemerédi's regularity lemma asserts that every graph can be decomposed into relatively few random-like subgraphs. This random-like behavior enables one to find and enumerate subgraphs of a given isomorphism type, yielding the so-called counting lemma for graphs. The combined application of these two lemmas is known as the regularity method for graphs and has proved useful in graph theory, combinatorial geometry, combinatorial number theory, and theoretical computer science. Here, we report on recent advances in the regularity method for k-uniform hypergraphs, for arbitrary k ≥ 2. This method, purely combinatorial in nature, gives alternative proofs of density theorems originally due to E. Szemerédi, H. Furstenberg, and Y. Katznelson. Further results in extremal combinatorics also have been obtained with this approach. The two main components of the regularity method for k-uniform hypergraphs, the regularity lemma and the counting lemma, have been obtained recently: Rödl and Skokan (based on earlier work of Frankl and Rödl) generalized Szemerédi's regularity lemma to k-uniform hypergraphs, and Nagle, Rödl, and Schacht succeeded in proving a counting lemma accompanying the Rödl–Skokan hypergraph regularity lemma. The counting lemma is proved by reducing the counting problem to a simpler one previously investigated by Kohayakawa, Rödl, and Skokan. Similar results were obtained independently by W. T. Gowers, following a different approach. PMID:15919821

  15. Evaluation of accuracy and precision of a smartphone based automated parasite egg counting system in comparison to the McMaster and Mini-FLOTAC methods.

    PubMed

    Scare, J A; Slusarewicz, P; Noel, M L; Wielgus, K M; Nielsen, M K

    2017-11-30

    Fecal egg counts are emphasized for guiding equine helminth parasite control regimens due to the rise of anthelmintic resistance. This, however, poses further challenges, since egg counting results are prone to issues such as operator dependency, method variability, equipment requirements, and time commitment. The use of image analysis software for performing fecal egg counts is promoted in recent studies to reduce the operator dependency associated with manual counts. In an attempt to remove the operator dependency associated with current methods, we developed a diagnostic system that utilizes a smartphone and employs image analysis to generate automated egg counts. The aims of this study were (1) to determine the precision of the first smartphone prototype, the modified McMaster, and ImageJ; (2) to determine the precision, accuracy, sensitivity, and specificity of the second smartphone prototype, the modified McMaster, and Mini-FLOTAC techniques. Repeated counts on fecal samples naturally infected with equine strongyle eggs were performed using each technique to evaluate precision. Triplicate counts on 36 egg count negative samples and 36 samples spiked with strongyle eggs at 5, 50, 500, and 1000 eggs per gram were performed using the second smartphone system prototype, Mini-FLOTAC, and McMaster to determine technique accuracy. Precision across the techniques was evaluated using the coefficient of variation. Regarding the first aim of the study, the McMaster technique performed with significantly less variance than the first smartphone prototype and ImageJ (p<0.0001). The smartphone and ImageJ performed with equal variance. Regarding the second aim of the study, the second smartphone system prototype had significantly better precision than the McMaster (p<0.0001) and Mini-FLOTAC (p<0.0001) methods, and the Mini-FLOTAC was significantly more precise than the McMaster (p=0.0228). 
Mean accuracies for the Mini-FLOTAC, McMaster, and smartphone system were 64.51%, 21.67%, and 32.53%, respectively. The Mini-FLOTAC was significantly more accurate than the McMaster (p<0.0001) and the smartphone system (p<0.0001), while the smartphone and McMaster counts did not have statistically different accuracies. Overall, the smartphone system compared favorably to manual methods with regard to precision and reasonably with regard to accuracy. With further refinement, this system could become useful in veterinary practice. Copyright © 2017 Elsevier B.V. All rights reserved.
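    The two summary statistics used in this comparison, precision as a coefficient of variation across replicates and accuracy as recovery against a known spike level, can be sketched as follows (the triplicate counts are hypothetical, not the study's data):

```python
def coefficient_of_variation(replicates):
    """Precision of repeated egg counts: CV (%) = SD / mean * 100."""
    n = len(replicates)
    mean = sum(replicates) / n
    sd = (sum((x - mean) ** 2 for x in replicates) / (n - 1)) ** 0.5
    return sd / mean * 100

def recovery_accuracy(observed, spiked):
    """Accuracy (%) as recovery relative to the known spike level (EPG)."""
    return observed / spiked * 100

# Hypothetical triplicate counts on a sample spiked at 500 eggs per gram.
triplicate = [480, 510, 495]
print(round(coefficient_of_variation(triplicate), 2))  # 3.03
print(recovery_accuracy(sum(triplicate) / 3, 500))     # 99.0
```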

  16. Rapid assessment of viable but non-culturable Bacillus coagulans MTCC 5856 in commercial formulations using Flow cytometry.

    PubMed

    Majeed, Muhammed; Majeed, Shaheen; Nagabhushanam, Kalyanam; Punnapuzha, Ardra; Philip, Sheena; Mundkur, Lakshmi

    2018-01-01

    Accurate enumeration of the bacterial count in a probiotic formulation is imperative to ensure that the product adheres to regulatory standards and to the count cited on the consumer product label. Standard methods like the plate count can enumerate only the replicating bacterial population under selected culture conditions. Viable but non-culturable (VBNC) bacteria retain characteristics of living cells and can regain cultivability by a process known as resuscitation. This is a protective mechanism adopted by bacteria to evade stressful environmental conditions. B. coagulans MTCC 5856 (LactoSpore®) is a probiotic endospore which can survive for decades in hostile environments without dividing. In the present study, we explored the use of flow cytometry to enumerate the viable count of B. coagulans MTCC 5856 under acidic and alkaline conditions, high temperature, and in commercial formulations like compressed tablets and capsules. Flow cytometry (FCM) was comparable to the plate count method when the spores were counted under physiological conditions. We show that the VBNC state is induced in B. coagulans MTCC 5856 by high temperature and acidic pH. The cells are resuscitated under physiological conditions, and FCM was sensitive enough to detect the VBNC spores. Flow cytometry showed excellent ability to assess the viable spore count in commercial probiotic formulations of B. coagulans MTCC 5856. The results establish flow cytometry as a reliable method to count viable bacteria in commercial probiotic preparations. Sporulation as well as existence in the VBNC state could contribute to the extreme stability of B. coagulans MTCC 5856.

  17. Rapid assessment of viable but non-culturable Bacillus coagulans MTCC 5856 in commercial formulations using Flow cytometry

    PubMed Central

    Majeed, Muhammed; Majeed, Shaheen; Nagabhushanam, Kalyanam; Punnapuzha, Ardra; Philip, Sheena

    2018-01-01

    Accurate enumeration of the bacterial count in a probiotic formulation is imperative to ensure that the product adheres to regulatory standards and to the count cited on the consumer product label. Standard methods like the plate count can enumerate only the replicating bacterial population under selected culture conditions. Viable but non-culturable (VBNC) bacteria retain characteristics of living cells and can regain cultivability by a process known as resuscitation. This is a protective mechanism adopted by bacteria to evade stressful environmental conditions. B. coagulans MTCC 5856 (LactoSpore®) is a probiotic endospore which can survive for decades in hostile environments without dividing. In the present study, we explored the use of flow cytometry to enumerate the viable count of B. coagulans MTCC 5856 under acidic and alkaline conditions, high temperature, and in commercial formulations like compressed tablets and capsules. Flow cytometry (FCM) was comparable to the plate count method when the spores were counted under physiological conditions. We show that the VBNC state is induced in B. coagulans MTCC 5856 by high temperature and acidic pH. The cells are resuscitated under physiological conditions, and FCM was sensitive enough to detect the VBNC spores. Flow cytometry showed excellent ability to assess the viable spore count in commercial probiotic formulations of B. coagulans MTCC 5856. The results establish flow cytometry as a reliable method to count viable bacteria in commercial probiotic preparations. Sporulation as well as existence in the VBNC state could contribute to the extreme stability of B. coagulans MTCC 5856. PMID:29474436

  18. Comparison of visual-based helicopter and fixed-wing forward-looking infrared surveys for counting white-tailed deer Odocoileus virginianus

    USGS Publications Warehouse

    Storm, Daniel J.; Samuel, Michael D.; Van Deelen, Timothy R.; Malcolm, Karl D.; Rolley, Robert E.; Frost, Nancy A.; Bates, Donald P.; Richards, Bryan J.

    2011-01-01

    Aerial surveys using direct counts of animals are commonly used to estimate deer abundance. Forward-looking infrared (FLIR) technology is increasingly replacing traditional methods such as visual observation from helicopters. Our goals were to compare fixed-wing FLIR and visual, helicopter-based counts in terms of relative bias, influence of snow cover, and cost. We surveyed five plots: four 41.4 km^2 plots with free-ranging white-tailed deer Odocoileus virginianus populations in Wisconsin and a 5.3 km^2 plot with a white-tailed deer population contained by a high fence in Michigan. We surveyed plots using both fixed-wing FLIR and helicopters, with and without snow cover. Neither method counted more deer than the other when snow was present. Helicopter counts were lower in the absence of snow, but lack of snow cover did not apparently affect FLIR. Group sizes of observed deer were similar regardless of survey method or season. We found that FLIR counts were generally precise (CV = 0.089) when two or three replicate surveys were conducted within a few hours. However, at the plot level, FLIR counts differed greatly between seasons, suggesting that detection rates vary over larger time scales. Fixed-wing FLIR was more costly than visual observers in helicopters and was more restrictive in terms of acceptable survey conditions. Further research is needed to understand what factors influence the detection of deer during FLIR surveys.

  19. Total and Viable Legionella pneumophila Cells in Hot and Natural Waters as Measured by Immunofluorescence-Based Assays and Solid-Phase Cytometry

    PubMed Central

    Parthuisot, N.; Binet, M.; Touron-Bodilis, A.; Pougnard, C.; Lebaron, P.; Baudart, J.

    2011-01-01

    A new method was developed for the rapid and sensitive detection of viable Legionella pneumophila. The method combines specific immunofluorescence (IF) staining using monoclonal antibodies with a bacterial viability marker (ChemChrome V6 cellular esterase activity marker) by means of solid-phase cytometry (SPC). IF methods were applied to the detection and enumeration of both the total and viable L. pneumophila cells in water samples. The sensitivity of the IF methods coupled to SPC was 34 cells per liter, and the reproducibility was good, with the coefficient of variation generally falling below 30%. IF methods were applied to the enumeration of total and viable L. pneumophila cells in 46 domestic hot water samples as well as in cooling tower water and natural water samples, such as thermal spring water and freshwater samples. Comparison with standard plate counts showed that (i) the total direct counts were always higher than the plate counts and (ii) the viable counts were higher than or close to the plate counts. With domestic hot waters, when the IF assay was combined with the viability test, SPC detected up to 3.4 × 10^3 viable but nonculturable L. pneumophila cells per liter. These direct IF methods could be a powerful tool for high-frequency monitoring of domestic hot waters or for investigating the occurrence of viable L. pneumophila in both man-made water systems and environmental water samples. PMID:21742913

  20. A semi-automated technique for labeling and counting of apoptosing retinal cells

    PubMed Central

    2014-01-01

    Background Retinal ganglion cell (RGC) loss is one of the earliest and most important cellular changes in glaucoma. The DARC (Detection of Apoptosing Retinal Cells) technology enables in vivo real-time non-invasive imaging of single apoptosing retinal cells in animal models of glaucoma and Alzheimer's disease. To date, apoptosing RGCs imaged using DARC have been counted manually. This is time-consuming, labour-intensive, vulnerable to bias, and has considerable inter- and intra-operator variability. Results A semi-automated algorithm was developed which enabled automated identification of apoptosing RGCs labeled with fluorescent Annexin-5 on DARC images. Automated analysis included a pre-processing stage involving local-luminance and local-contrast "gain control", a "blob analysis" step to differentiate between cells, vessels and noise, and a method to exclude non-cell structures using specific combined 'size' and 'aspect' ratio criteria. Apoptosing retinal cells were counted by 3 masked operators, generating 'Gold-standard' mean manual cell counts, and were also counted using the newly developed automated algorithm. Comparison between automated cell counts and the mean manual cell counts on 66 DARC images showed significant correlation between the two methods (Pearson's correlation coefficient 0.978, p < 0.001; R squared = 0.956). The intraclass correlation coefficient was 0.986 (95% CI 0.977-0.991, p < 0.001), and Cronbach's alpha measure of consistency = 0.986, confirming excellent correlation and consistency. No significant difference (p = 0.922, 95% CI: -5.53 to 6.10) was detected between the cell counts of the two methods. Conclusions The novel automated algorithm enabled accurate quantification of apoptosing RGCs that is highly comparable to manual counting, and appears to minimise operator-bias, whilst being both fast and reproducible. 
This may prove to be a valuable method of quantifying apoptosing retinal cells, with particular relevance to translation in the clinic, where a Phase I clinical trial of DARC in glaucoma patients is due to start shortly. PMID:24902592
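    Agreement between automated and manual counts of the kind reported above is commonly summarized with a Pearson correlation over the paired counts; a minimal stdlib sketch (the paired counts below are hypothetical, not the study's data):

```python
def pearson_r(x, y):
    """Pearson correlation between paired automated and manual cell counts."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical paired counts: manual gold standard vs. automated algorithm.
manual = [12, 25, 40, 7, 33]
auto = [13, 24, 42, 6, 31]
print(pearson_r(manual, auto) > 0.98)  # True: near-perfect agreement
```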

  1. Rapid classification of hairtail fish and pork freshness using an electronic nose based on the PCA method.

    PubMed

    Tian, Xiu-Ying; Cai, Qiang; Zhang, Yong-Ming

    2012-01-01

    We report a method for building a simple and reproducible electronic nose based on commercially available metal oxide sensors (MOS) to monitor the freshness of hairtail fish and pork stored at 15, 10, and 5 °C. After assembly in the laboratory, the proposed product was tested by a manufacturer. Sample delivery was based on the dynamic headspace method, and two features were extracted from the transient response of each sensor using an unsupervised principal component analysis (PCA) method. The compensation method and pattern recognition based on PCA are discussed in the current paper. PCA compensation can be used for all storage temperatures; however, pattern recognition differs according to storage conditions. Total volatile basic nitrogen (TVBN) and aerobic bacterial counts of the samples were measured simultaneously as the standard indicators of hairtail fish and pork freshness. The PCA models based on TVBN and aerobic bacterial counts were used to classify hairtail fish samples as "fresh" (TVBN ≤ 25 g and microbial counts ≤ 10^6 cfu/g) or "spoiled" (TVBN ≥ 25 g and microbial counts ≥ 10^6 cfu/g) and pork samples also as "fresh" (TVBN ≤ 15 g and microbial counts ≤ 10^6 cfu/g) or "spoiled" (TVBN ≥ 15 g and microbial counts ≥ 10^6 cfu/g). Good correlation coefficients between the responses of the electronic nose and the TVBN and aerobic bacterial counts of the samples were obtained. For hairtail fish, the correlation coefficients were 0.97 and 0.91, and for pork, 0.81 and 0.88, respectively. Through laboratory simulation and field application, we were able to determine that the electronic nose could help ensure the shelf life of hairtail fish and pork, especially when an instrument is needed to take measurements rapidly. The results also showed that the electronic nose could analyze the process and level of spoilage for hairtail fish and pork.
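    For the two features extracted per sensor, the PCA step can be illustrated with the closed-form eigen-decomposition of a 2x2 sample covariance matrix; a stdlib sketch of the general technique, not the authors' implementation (the feature values are hypothetical):

```python
import math

def pca_2d(points):
    """Leading principal component of 2-D feature vectors via the analytic
    eigen-decomposition of the 2x2 sample covariance matrix."""
    n = len(points)
    mx = sum(p[0] for p in points) / n
    my = sum(p[1] for p in points) / n
    sxx = sum((p[0] - mx) ** 2 for p in points) / (n - 1)
    syy = sum((p[1] - my) ** 2 for p in points) / (n - 1)
    sxy = sum((p[0] - mx) * (p[1] - my) for p in points) / (n - 1)
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(max(tr * tr / 4 - det, 0.0))  # largest eigenvalue
    if abs(sxy) < 1e-12:                   # axis-aligned covariance
        v = (1.0, 0.0) if sxx >= syy else (0.0, 1.0)
    else:
        norm = math.hypot(sxy, lam - sxx)
        v = (sxy / norm, (lam - sxx) / norm)  # unit eigenvector for lam
    return lam, v

# Two hypothetical sensor features per sample, correlated along y = x.
lam, v = pca_2d([(0, 0), (1, 1), (2, 2), (3, 3)])
print(round(v[0], 3), round(v[1], 3))  # 0.707 0.707
```

Projecting each sample onto this direction gives the score used to separate "fresh" from "spoiled" clusters.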

  2. Single-Cell mRNA-Seq Using the Fluidigm C1 System and Integrated Fluidics Circuits.

    PubMed

    Gong, Haibiao; Do, Devin; Ramakrishnan, Ramesh

    2018-01-01

    Single-cell mRNA-seq is a valuable tool to dissect expression profiles and to understand the regulatory network of genes. Microfluidics is well suited for single-cell analysis owing to the small volume of the reaction chambers and the ease of automation. Here we describe the workflow of single-cell mRNA-seq using the C1 IFC, which can isolate and process up to 96 cells. Both the on-chip procedure (lysis, reverse transcription, and preamplification PCR) and the off-chip sequencing library preparation protocols are described. The workflow generates full-length mRNA information, which is more valuable than 3'-end counting methods for many applications.

  3. Image Accumulation in Pixel Detector Gated by Late External Trigger Signal and its Application in Imaging Activation Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakubek, J.; Cejnarova, A.; Platkevic, M.

    Single quantum counting pixel detectors of the Medipix type are starting to be used in various radiographic applications. Compared to standard devices for digital imaging (such as CCDs or CMOS sensors) they present significant advantages: direct conversion of radiation to electric signal, energy sensitivity, noiseless image integration, unlimited dynamic range, and absolute linearity. In this article we describe usage of the pixel device TimePix for image accumulation gated by a late trigger signal. Demonstration of the technique is given on imaging coincidence instrumental neutron activation analysis (Imaging CINAA). This method allows one to determine the concentration and distribution of a certain preselected element in an inspected sample.

  4. Exclusion-Based Capture and Enumeration of CD4+ T Cells from Whole Blood for Low-Resource Settings.

    PubMed

    Howard, Alexander L; Pezzi, Hannah M; Beebe, David J; Berry, Scott M

    2014-06-01

    In developing countries, demand exists for a cost-effective method to evaluate human immunodeficiency virus patients' CD4+ T-helper cell count. The TH (CD4) cell count is the current marker used to identify when an HIV patient has progressed to acquired immunodeficiency syndrome, which results when the immune system can no longer prevent certain opportunistic infections. A system to perform the TH count that obviates the use of costly flow cytometry will enable physicians to more closely follow patients' disease progression and response to therapy in areas where such advanced equipment is unavailable. Our system of two serially-operated immiscible phase exclusion-based cell isolations coupled with a rapid fluorescent readout enables exclusion-based isolation and accurate counting of T-helper cells at lower cost and from a smaller volume of blood than previous methods. TH cell isolation via immiscible filtration assisted by surface tension (IFAST) compares well against the established Dynal T4 Quant Kit and is sensitive at CD4 counts representative of immunocompromised patients (less than 200 TH cells per microliter of blood). Our technique retains use of open, simple-to-operate devices that enable IFAST as a high-throughput, automatable sample preparation method, improving throughput over previous low-resource methods. © 2013 Society for Laboratory Automation and Screening.

  5. CZT sensors for Computed Tomography: from crystal growth to image quality

    NASA Astrophysics Data System (ADS)

    Iniewski, K.

    2016-12-01

    Recent advances in Traveling Heater Method (THM) growth and device fabrication, which require additional processing steps, have made it possible to dramatically improve hole transport properties and reduce polarization effects in Cadmium Zinc Telluride (CZT) material. As a result, high-flux operation of CZT sensors at rates in excess of 200 Mcps/mm^2 is now possible and has enabled multiple medical imaging companies to start building prototype Computed Tomography (CT) scanners. CZT sensors are also finding new commercial applications in non-destructive testing (NDT) and baggage scanning. In order to prepare for high-volume commercial production we are moving from individual tile processing to whole-wafer processing using silicon methodologies, such as waxless processing and cassette-based/touchless wafer handling. We have been developing parametric-level screening at the wafer stage to ensure high wafer quality before detector fabrication in order to maximize production yields. These process improvements enable us, and other CZT manufacturers who pursue similar developments, to provide high-volume production for photon counting applications in an economically feasible manner. CZT sensors are capable of delivering both high count rates and high-resolution spectroscopic performance, although it is challenging to achieve both of these attributes simultaneously. The paper discusses material challenges, detector design trade-offs, and ASIC architectures required to build cost-effective CZT-based detection systems. Photon counting ASICs are an essential part of the integrated module platforms, as the charge-sensitive electronics needs to deal with charge-sharing and pile-up effects.
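    The pile-up and count-rate-loss behaviour mentioned above is often modelled with the nonparalyzable dead-time relation m = n/(1 + n·τ), which can be inverted to recover the true rate; a sketch with a hypothetical effective dead time (not a measured CZT parameter):

```python
def observed_rate(true_rate, tau):
    """Nonparalyzable dead-time model: measured rate m = n / (1 + n*tau)."""
    return true_rate / (1.0 + true_rate * tau)

def corrected_rate(measured, tau):
    """Invert the model: n = m / (1 - m*tau), valid while m < 1/tau."""
    return measured / (1.0 - measured * tau)

tau = 5e-9                    # hypothetical effective dead time (seconds)
m = observed_rate(2e8, tau)   # a 200 Mcps true rate is halved at this tau
print(m)                      # 100000000.0
print(corrected_rate(m, tau)) # 200000000.0
```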

  6. GO Explorer: A gene-ontology tool to aid in the interpretation of shotgun proteomics data.

    PubMed

    Carvalho, Paulo C; Fischer, Juliana Sg; Chen, Emily I; Domont, Gilberto B; Carvalho, Maria Gc; Degrave, Wim M; Yates, John R; Barbosa, Valmir C

    2009-02-24

    Spectral counting is a shotgun proteomics approach comprising the identification and relative quantitation of thousands of proteins in complex mixtures. However, this strategy generates bewildering amounts of data whose biological interpretation is a challenge. Here we present a new algorithm, termed GO Explorer (GOEx), that leverages the gene ontology (GO) to aid in the interpretation of proteomic data. GOEx stands out because it combines data from protein fold changes with GO over-representation statistics to help draw conclusions. Moreover, it is tightly integrated within the PatternLab for Proteomics project and, thus, lies within a complete computational environment that provides parsers and pattern recognition tools designed for spectral counting. GOEx offers three independent methods to query data: an interactive directed acyclic graph, a specialist mode where key words can be searched, and an automatic search. Its usefulness is demonstrated by applying it to help interpret the effects of perillyl alcohol, a natural chemotherapeutic agent, on glioblastoma multiforme cell lines (A172). We used a new multi-surfactant shotgun proteomic strategy and identified more than 2600 proteins; GOEx pinpointed key sets of differentially expressed proteins related to cell cycle, alcohol catabolism, the Ras pathway, apoptosis, and stress response, to name a few. GOEx facilitates organism-specific studies by leveraging GO and providing a rich graphical user interface. It is a simple-to-use tool, specialized for biologists who wish to analyze spectral counting data from shotgun proteomics. GOEx is available at http://pcarvalho.com/patternlab.
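    GO over-representation statistics of the kind GOEx combines with fold changes are typically one-sided hypergeometric tests; a stdlib sketch of the general technique (not GOEx's actual code; the toy numbers are hypothetical):

```python
from math import comb

def go_enrichment_p(k, n, K, N):
    """One-sided hypergeometric test P(X >= k): k term-annotated proteins
    among n differentially expressed, with K annotated in a background of N."""
    return sum(comb(K, i) * comb(N - K, n - i)
               for i in range(k, min(n, K) + 1)) / comb(N, n)

# Hypothetical toy numbers: 4 of 4 hits annotated, 5 of 10 in background.
print(round(go_enrichment_p(4, 4, 5, 10), 6))  # 0.02381 (= 5/210)
```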

  7. Comprehensive System-Based Architecture for an Integrated High Energy Laser Test Bed

    DTIC Science & Technology

    2015-03-01

    [Indexed excerpt is table-of-contents residue; recoverable headings: Comparison of Sensors; Transmission; Photometers; Flat Plate Target Boards, Ablatives, and Acrylite; Photon-Counting Sensors.]

  8. Agreement between pedometer and accelerometer in measuring physical activity in overweight and obese pregnant women

    PubMed Central

    2011-01-01

    Background Inexpensive, reliable objective methods are needed to measure physical activity (PA) in large scale trials. This study compared the number of pedometer step counts with accelerometer data in pregnant women in free-living conditions to assess agreement between these measures. Methods Pregnant women (n = 58) with body mass index ≥25 kg/m^2 at median 13 weeks' gestation wore a GT1M Actigraph accelerometer and a Yamax Digi-Walker CW-701 pedometer for four consecutive days. The Spearman rank correlation coefficients were determined between pedometer step counts and various accelerometer measures of PA. Total agreement between accelerometer and pedometer step counts was evaluated by determining the 95% limits of agreement estimated using a regression-based method. Agreement between the monitors in categorising participants as active or inactive was assessed by determining Kappa. Results Pedometer step counts correlated moderately (r = 0.36 to 0.54) with most accelerometer measures of PA. Overall step counts recorded by the pedometer and the accelerometer were not significantly different (medians 5961 vs. 5687 steps/day, p = 0.37). However, the 95% limits of agreement ranged from -2690 to 2656 steps/day for the mean step count value (6026 steps/day) and changed substantially over the range of values. Agreement between the monitors in categorising participants as active or inactive varied from moderate to good depending on the criteria adopted. Conclusions Despite statistically significant correlations and similar median step counts, the overall agreement between pedometer and accelerometer step counts was poor and varied with activity level. Pedometer and accelerometer steps cannot be used interchangeably in overweight and obese pregnant women. PMID:21703033
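    The classic Bland-Altman 95% limits of agreement (mean difference ± 1.96 SD of the paired differences) can be sketched as follows; note the study itself used a regression-based variant because the spread of differences changed over the range of values (the step counts below are hypothetical):

```python
def limits_of_agreement(a, b):
    """Classic Bland-Altman 95% limits: mean difference +/- 1.96 SD of the
    paired differences (constant limits, unlike the regression-based
    variant needed when the spread changes with activity level)."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    sd = (sum((d - mean) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return mean - 1.96 * sd, mean + 1.96 * sd

# Hypothetical daily step counts: pedometer vs. accelerometer.
pedometer = [5961, 7200, 4100, 6500]
accelerometer = [5687, 7400, 4300, 6100]
lo, hi = limits_of_agreement(pedometer, accelerometer)
print(lo < 0 < hi)  # True: zero difference lies inside the limits
```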

  9. A Bayesian Method for Identifying Contaminated Detectors in Low-Level Alpha Spectrometers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Maclellan, Jay A.; Strom, Daniel J.; Joyce, Kevin E.

    2011-11-02

    Analyses used for radiobioassay and other radiochemical tests are normally designed to meet specified quality objectives, such as relative bias, precision, and minimum detectable activity (MDA). In the case of radiobioassay analyses for alpha-emitting radionuclides, a major determiner of the process MDA is the instrument background. Alpha spectrometry detectors are often restricted to only a few counts over multi-day periods in order to meet required MDAs for nuclides such as plutonium-239 and americium-241. A detector background criterion is often set empirically based on experience, or frequentist or classical statistics are applied to the calculated background count necessary to meet a required MDA. An acceptance criterion for the detector background is set at the multiple of the estimated background standard deviation above the assumed mean that provides an acceptably small probability of observation if the mean and standard deviation estimate are correct. The major problem with this method is that the observed background counts used to estimate the mean, and thereby the standard deviation when a Poisson distribution is assumed, are often in the range of zero to three counts. At those expected count levels it is impossible to obtain a good estimate of the true mean from a single measurement. As an alternative, Bayesian statistical methods allow calculation of the expected detector background count distribution based on historical counts from new, uncontaminated detectors. This distribution can then be used to identify detectors showing an increased probability of contamination. The effect of varying the assumed range of background counts (i.e., the prior probability distribution) from new, uncontaminated detectors is discussed.
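    One common way to realize the Bayesian screening idea above: under a conjugate Gamma prior on the Poisson background rate (built from historical counts on uncontaminated detectors), the posterior predictive count distribution is negative binomial, and a detector is flagged when its observed count is improbably high. A hedged sketch under that assumption; the prior parameters are hypothetical, not the paper's:

```python
from math import exp, lgamma, log

def background_tail_prob(k_obs, alpha, beta, t=1.0):
    """P(K >= k_obs) under the gamma-Poisson posterior predictive: rate has a
    Gamma(alpha, beta) distribution from historical uncontaminated-detector
    data, K is the count over time t. Predictive is negative binomial with
    p = beta / (beta + t); pmf evaluated via log-gamma for stability."""
    p = beta / (beta + t)
    def pmf(k):
        return exp(lgamma(k + alpha) - lgamma(alpha) - lgamma(k + 1)
                   + alpha * log(p) + k * log(1.0 - p))
    return 1.0 - sum(pmf(k) for k in range(k_obs))

# Hypothetical prior: 3 total counts observed over 6 detector-days.
alpha, beta = 3.0, 6.0
flagged = background_tail_prob(8, alpha, beta) < 0.01  # improbably high count
print(flagged)  # True
```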

  10. Standardization of iodine-129 by the TDCR liquid scintillation method and 4π β-γ coincidence counting

    NASA Astrophysics Data System (ADS)

    Cassette, P.; Bouchard, J.; Chauvenet, B.

    1994-01-01

    Iodine-129 is a long-lived fission product, with physical and chemical properties that make it a good candidate for evaluating the environmental impact of the nuclear energy fuel cycle. To avoid solid source preparation problems, liquid scintillation has been used to standardize this nuclide for a EUROMET intercomparison. Two methods were used to measure the iodine-129 activity: triple-to-double-coincidence ratio liquid scintillation counting and 4π β-γ coincidence counting; the results are in good agreement.

  11. Nondestructive detection of total viable count changes of chilled pork in high oxygen storage condition based on hyperspectral technology

    NASA Astrophysics Data System (ADS)

    Zheng, Xiaochun; Peng, Yankun; Li, Yongyu; Chao, Kuanglin; Qin, Jianwei

    2017-05-01

    The plate count method is commonly used to detect the total viable count (TVC) of bacteria in pork, but it is time-consuming and destructive. It has also been used to study the changes of the TVC in pork under different storage conditions. In recent years, many scholars have explored non-destructive methods of detecting TVC by using visible/near-infrared (VIS/NIR) technology and hyperspectral technology. The TVC in chilled pork was monitored under a high-oxygen condition in this study by using hyperspectral technology in order to evaluate the changes of total bacterial count during storage, and then evaluate the advantages and disadvantages of the storage condition. The VIS/NIR hyperspectral images of samples stored in the high-oxygen condition were acquired by a hyperspectral system in the range of 400-1100 nm. The actual reference value of total bacteria was measured by the standard plate count method, and the results were obtained in 48 hours. The reflection spectra of the samples were extracted and used for the establishment of a prediction model for TVC. The spectral preprocessing methods of standard normal variate transformation (SNV), multiplicative scatter correction (MSC), and derivation were applied to the original reflectance spectra of the samples. Partial least squares regression (PLSR) of TVC was performed and optimized as the prediction model. The results show that near-infrared hyperspectral technology over 400-1100 nm combined with the PLSR model can describe the growth pattern of the total bacterial count of chilled pork under the high-oxygen condition clearly and rapidly. The results obtained in this study demonstrate that nondestructive detection of TVC based on NIR hyperspectral imaging has great potential in the monitoring of edible safety in the processing and storage of meat.
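    Of the preprocessing methods listed, SNV is the simplest to sketch: each spectrum is centred and scaled to unit standard deviation before the PLSR fit, suppressing multiplicative scatter differences between samples (illustrative only; the toy spectrum is hypothetical):

```python
def snv(spectrum):
    """Standard normal variate: centre one reflectance spectrum and scale it
    to unit standard deviation, suppressing multiplicative scatter effects."""
    n = len(spectrum)
    mean = sum(spectrum) / n
    sd = (sum((x - mean) ** 2 for x in spectrum) / (n - 1)) ** 0.5
    return [(x - mean) / sd for x in spectrum]

# Toy 5-band "spectrum"; a real one would span 400-1100 nm in many bands.
print(snv([0.2, 0.4, 0.6, 0.8, 1.0]))
```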

  12. ESTIMATION OF RADIOACTIVE CALCIUM-45 BY LIQUID SCINTILLATION COUNTING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lutwak, L.

    1959-03-01

    A liquid scintillation counting method is developed for determining radioactive calcium-45 in biological materials. The calcium-45 is extracted, concentrated, and dissolved in absolute ethyl alcohol, to which is added 0.4% diphenyloxazole in toluene. Counting efficiency is about 65 percent, with a standard deviation of 7.36 percent. (auth)

  13. Repeatability of paired counts.

    PubMed

    Alexander, Neal; Bethony, Jeff; Corrêa-Oliveira, Rodrigo; Rodrigues, Laura C; Hotez, Peter; Brooker, Simon

    2007-08-30

    The Bland and Altman technique is widely used to assess the variation between replicates of a method of clinical measurement. It yields the repeatability, i.e. the value within which 95 per cent of repeat measurements lie. The valid use of the technique requires that the variance is constant over the data range. This is not usually the case for counts of items such as CD4 cells or parasites, nor is the log transformation applicable to zero counts. We investigate the properties of generalized differences based on Box-Cox transformations. For an example, in a data set of hookworm eggs counted by the Kato-Katz method, the square root transformation is found to stabilize the variance. We show how to back-transform the repeatability on the square root scale to the repeatability of the counts themselves, as an increasing function of the square mean root egg count, i.e. the square of the average of square roots. As well as being more easily interpretable, the back-transformed results highlight the dependence of the repeatability on the sample volume used.
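    The back-transformation described above can be sketched in a few lines. This is a minimal illustration with simulated paired counts, not the hookworm data from the paper; the 1.96 multiplier is the usual Bland-Altman repeatability factor.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical paired egg counts: two Kato-Katz readings per subject.
x1 = rng.poisson(lam=50, size=200)
x2 = rng.poisson(lam=50, size=200)

# The square-root transform approximately stabilizes the variance of
# Poisson-like counts, so the Bland-Altman analysis applies on the sqrt scale.
d = np.sqrt(x1) - np.sqrt(x2)
r_sqrt = 1.96 * d.std(ddof=1)          # repeatability on the sqrt scale

def repeatability_counts(smr):
    """Back-transformed repeatability at square-mean-root count smr.

    If |sqrt(x1) - sqrt(x2)| <= r_sqrt, then
    |x1 - x2| = |sqrt(x1) - sqrt(x2)| * (sqrt(x1) + sqrt(x2))
              <= r_sqrt * 2 * sqrt(smr),
    an increasing function of smr, as the abstract states.
    """
    return 2.0 * r_sqrt * np.sqrt(smr)
```

On this scale the repeatability of the counts themselves grows like the square root of the square-mean-root count, which is why it must be reported as a function of count level rather than a single number.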

  14. PREDICT: Pattern Representation and Evaluation of Data through Integration, Correlation, and Transformation

    DTIC Science & Technology

    2015-02-01

    than 36°C; 2) heart rate > 90 beats/min; 3) respiratory rate > 20 breaths/min; and 4) white blood cell count > 12,000/mm3 or < 4,000/mm3 which more than...4) white blood cell count 5) heart rate variability 6) blood pressure The challenge is that once these criteria are met, it is often the case that...Figure 7) was not actually meant to be read (no individual variables or numbers). We believed that showing an increasing size (and color and

  15. Microradiography with Semiconductor Pixel Detectors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jakubek, Jan; Cejnarova, Andrea; Dammer, Jiri

    High resolution radiography (with X-rays, neutrons, heavy charged particles, ...), often also exploited in tomographic mode to provide 3D images, stands as a powerful imaging technique for instant and nondestructive visualization of the fine internal structure of objects. Novel types of semiconductor single-particle-counting pixel detectors offer many advantages for radiation imaging: high detection efficiency, energy discrimination or direct energy measurement, noiseless digital integration (counting), high frame rate, and virtually unlimited dynamic range. This article shows the application and potential of pixel detectors (such as Medipix2 or TimePix) in different fields of radiation imaging.

  16. Justice Can Further Improve Its Monitoring of Changes in State/Local Voting Laws.

    DTIC Science & Technology

    1983-12-19

    voter qualifications and eligibility; registration, balloting and vote counting procedures; and the eligibility or method of selecting candidates...voter qualifications and eligibility; registration, balloting, and vote counting procedures; and the eligibility or method of selecting candidates...reapportionments, annexations, method-of-election, and bilingual assistance to minority language groups. Forty-nine of the withdrawals occurred after the

  17. Alternate method of source preparation for alpha spectrometry: No electrodeposition, no hydrofluoric acid

    DOE PAGES

    Kurosaki, Hiromu; Mueller, Rebecca J.; Lambert, Susan B.; ...

    2016-07-15

    An alternate method of preparing actinide alpha counting sources was developed in place of electrodeposition or lanthanide fluoride micro-precipitation. The method uses lanthanide hydroxide micro-precipitation to avoid the use of hazardous hydrofluoric acid, and it provides a quicker, simpler, and safer way of preparing actinide alpha counting sources in routine, production-type laboratories that process many samples daily.

  18. Low-derivative operators of the Standard Model effective field theory via Hilbert series methods

    NASA Astrophysics Data System (ADS)

    Lehman, Landon; Martin, Adam

    2016-02-01

    In this work, we explore an extension of Hilbert series techniques to count operators that include derivatives. For sufficiently low-derivative operators, we conjecture an algorithm that gives the number of invariant operators, properly accounting for redundancies due to the equations of motion and integration by parts. Specifically, the conjectured technique can be applied whenever there is only one Lorentz invariant for a given partitioning of derivatives among the fields. At higher numbers of derivatives, equation of motion redundancies can be removed, but the increased number of Lorentz contractions spoils the subtraction of integration by parts redundancies. While restricted, this technique is sufficient to automatically recreate the complete set of invariant operators of the Standard Model effective field theory for dimensions 6 and 7 (for arbitrary numbers of flavors). At dimension 8, the algorithm does not automatically generate the complete operator set; however, it suffices for all but five classes of operators. For these remaining classes, there is a well-defined procedure to manually determine the number of invariants. Assuming our method is correct, we derive a set of 535 dimension-8 N_f = 1 operators.

  19. A model-based approach to wildland fire reconstruction using sediment charcoal records

    USGS Publications Warehouse

    Itter, Malcolm S.; Finley, Andrew O.; Hooten, Mevin B.; Higuera, Philip E.; Marlon, Jennifer R.; Kelly, Ryan; McLachlan, Jason S.

    2017-01-01

    Lake sediment charcoal records are used in paleoecological analyses to reconstruct fire history, including the identification of past wildland fires. One challenge of applying sediment charcoal records to infer fire history is the separation of charcoal associated with local fire occurrence and charcoal originating from regional fire activity. Despite a variety of methods to identify local fires from sediment charcoal records, an integrated statistical framework for fire reconstruction is lacking. We develop a Bayesian point process model to estimate the probability of fire associated with charcoal counts from individual-lake sediments and estimate mean fire return intervals. A multivariate extension of the model combines records from multiple lakes to reduce uncertainty in local fire identification and estimate a regional mean fire return interval. The univariate and multivariate models are applied to 13 lakes in the Yukon Flats region of Alaska. Both models resulted in similar mean fire return intervals (100–350 years) with reduced uncertainty under the multivariate model due to improved estimation of regional charcoal deposition. The point process model offers an integrated statistical framework for paleofire reconstruction and extends existing methods to infer regional fire history from multiple lake records with uncertainty following directly from posterior distributions.

  20. Photocounting distributions for exponentially decaying sources.

    PubMed

    Teich, M C; Card, H C

    1979-05-01

    Exact photocounting distributions are obtained for a pulse of light whose intensity is exponentially decaying in time, when the underlying photon statistics are Poisson. It is assumed that the starting time for the sampling interval (which is of arbitrary duration) is uniformly distributed. The probability of registering n counts in the fixed time T is given in terms of the incomplete gamma function for n >/= 1 and in terms of the exponential integral for n = 0. Simple closed-form expressions are obtained for the count mean and variance. The results are expected to be of interest in certain studies involving spontaneous emission, radiation damage in solids, and nuclear counting. They will also be useful in neurobiology and psychophysics, since habituation and sensitization processes may sometimes be characterized by the same stochastic model.
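    The mixed-Poisson structure described here (Poisson counts driven by an exponentially decaying intensity with a uniformly distributed start time) is easy to check by simulation. All numerical values below are illustrative assumptions, and the simulation is a sanity check, not the paper's closed-form result.

```python
import numpy as np

rng = np.random.default_rng(1)

I0, tau, T = 5.0, 1.0, 1.0     # peak rate, decay time, counting window (assumed)
S = 10.0                       # start times uniform on [0, S] (assumed)

s = rng.uniform(0.0, S, size=100_000)
# Integrated intensity over [s, s + T] for I(t) = I0 * exp(-t / tau):
lam = I0 * tau * (np.exp(-s / tau) - np.exp(-(s + T) / tau))
n = rng.poisson(lam)           # doubly stochastic photocounts

p0 = (n == 0).mean()           # empirical P(n = 0); compare against the
                               # exponential-integral expression in the paper
```

The empirical histogram of `n` can then be compared with the incomplete-gamma and exponential-integral expressions quoted in the abstract.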

  1. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Prasad, Mano K.; Snyderman, Neal J.; Rowland, Mark S.

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  2. LOW LEVEL COUNTING TECHNIQUES WITH SPECIAL REFERENCE TO BIOMEDICAL TRACER PROBLEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hosain, F.; Nag, B.D.

    1959-12-01

    Low-level counting techniques in tracer experiments are discussed with emphasis on the measurement of beta and gamma radiations with Geiger and scintillation counting methods. The basic principles of low-level counting are outlined. Screen-wall counters, internal gas counters, low-level beta counters, scintillation spectrometers, liquid scintillators, and big scintillation installations are described. Biomedical tracer investigations are discussed. Applications of low-level techniques in archaeological dating, biology, and other problems are listed. (M.C.G.)

  3. Absolute nuclear material assay using count distribution (LAMBDA) space

    DOEpatents

    Prasad, Manoj K [Pleasanton, CA; Snyderman, Neal J [Berkeley, CA; Rowland, Mark S [Alamo, CA

    2012-06-05

    A method of absolute nuclear material assay of an unknown source comprising counting neutrons from the unknown source and providing an absolute nuclear material assay utilizing a model to optimally compare to the measured count distributions. In one embodiment, the step of providing an absolute nuclear material assay comprises utilizing a random sampling of analytically computed fission chain distributions to generate a continuous time-evolving sequence of event-counts by spreading the fission chain distribution in time.

  4. Quartz-Seq2: a high-throughput single-cell RNA-sequencing method that effectively uses limited sequence reads.

    PubMed

    Sasagawa, Yohei; Danno, Hiroki; Takada, Hitomi; Ebisawa, Masashi; Tanaka, Kaori; Hayashi, Tetsutaro; Kurisaki, Akira; Nikaido, Itoshi

    2018-03-09

    High-throughput single-cell RNA-seq methods assign limited unique molecular identifier (UMI) counts as gene expression values to single cells from shallow sequence reads and detect limited gene counts. We thus developed a high-throughput single-cell RNA-seq method, Quartz-Seq2, to overcome these issues. Our improvements in the reaction steps make it possible to effectively convert initial reads to UMI counts, at a rate of 30-50%, and detect more genes. To demonstrate the power of Quartz-Seq2, we analyzed approximately 10,000 transcriptomes from in vitro embryonic stem cells and an in vivo stromal vascular fraction with a limited number of reads.

  5. Methods for analyzing matched designs with double controls: excess risk is easily estimated and misinterpreted when evaluating traffic deaths.

    PubMed

    Redelmeier, Donald A; Tibshirani, Robert J

    2018-06-01

    To demonstrate analytic approaches for matched studies where two controls are linked to each case and events are accumulating counts rather than binary outcomes. A secondary intent is to clarify the distinction between total risk and excess risk (unmatched vs. matched perspectives). We review past research testing whether elections can lead to increased traffic risks. The results are reinterpreted by analyzing both the total count of individuals in fatal crashes and the excess count of individuals in fatal crashes, each time accounting for the matched double controls. Overall, 1,546 individuals were in fatal crashes on the 10 election days (average = 155/d), and 2,593 individuals were in fatal crashes on the 20 control days (average = 130/d). Poisson regression of total counts yielded a relative risk of 1.19 (95% confidence interval: 1.12-1.27). Poisson regression of excess counts yielded a relative risk of 3.22 (95% confidence interval: 2.72-3.80). The discrepancy between analyses of total counts and excess counts replicated with alternative statistical models and was visualized in graphical displays. Available approaches provide methods for analyzing count data in matched designs with double controls and help clarify the distinction between increases in total risk and increases in excess risk. Copyright © 2018 Elsevier Inc. All rights reserved.
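    The total-count relative risk can be reproduced directly from the figures in the abstract; the excess-count analysis additionally requires the matched case-control structure, which the abstract does not give, so it is not attempted here.

```python
# Totals reported in the abstract.
cases, case_days = 1546, 10          # individuals in fatal crashes on election days
controls, control_days = 2593, 20    # individuals in fatal crashes on control days

# Relative risk from total counts: ratio of daily averages (155/d vs 130/d).
rr_total = (cases / case_days) / (controls / control_days)
print(round(rr_total, 2))            # 1.19, matching the Poisson regression of total counts
```

The gap between this 1.19 and the excess-count estimate of 3.22 is exactly the total-risk versus excess-risk distinction the paper is about.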

  6. Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions

    NASA Astrophysics Data System (ADS)

    Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong

    2018-01-01

    Layer count control and uniformity of two dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. To complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy spectrum (EELS) of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions with discrepancies within ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.

  7. Static and elevated pollen traps do not provide an accurate assessment of personal pollen exposure.

    PubMed

    Penel, V; Calleja, M; Pichot, C; Charpin, D

    2017-03-01

    Background. Volumetric pollen traps are commonly used to assess pollen exposure. These traps are well suited for estimating the regional mean airborne pollen concentration but are likely not to provide an accurate index of personal exposure. In this study, we tested the hypothesis that hair sampling may provide different pollen counts from those of pollen traps, especially when pollen exposure is diverse. Methods. We compared pollen counts in hair washes to counts provided by stationary volumetric and gravimetric pollen traps in two different settings: an urban setting, with volunteers living a short distance from one another and from the static trap, and a suburban setting, in which volunteers live in a scattered environment, quite far from the static trap. Results. Pollen counts in hair washes are in full agreement with trap counts for uniform pollen exposure. In contrast, for diverse pollen exposure, individual pollen counts in hair washes vary strongly in quantity and taxa composition between individuals and dates. These results demonstrate that the pollen counting method (hair washes vs. stationary pollen traps) may lead to different absolute and relative contributions of taxa to the total pollen count. Conclusions. In a geographic area with a high diversity of environmental exposure to pollen, static pollen traps, in contrast to hair washes, do not provide a reliable estimate of this higher diversity.

  8. Effects of isotretinoin on the platelet counts and the mean platelet volume in patients with acne vulgaris.

    PubMed

    Ataseven, Arzu; Ugur Bilgin, Aynur

    2014-01-01

    Aim. The aim of this study was to evaluate the platelet counts and the mean platelet volume in patients who received isotretinoin for the treatment of acne vulgaris. Method. A total of 110 patients were included in this retrospective study. Complete blood count parameters were recorded prior to and three-months following the treatment. Results. Both platelet counts and the mean platelet volume were significantly decreased following the treatment. No significant differences were noted on the levels of hemoglobin, hematocrit, and white blood cell count. Conclusion. Platelet counts and mean platelet volume significantly decreased following isotretinoin treatment. Since the decrease of platelet counts and the mean platelet volume was seen concomitantly, it is concluded that the effect of isotretinoin was through the suppression of bone marrow.

  9. Should the Standard Count Be Excluded from Neutron Probe Calibration?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Z. Fred

    About six decades after its introduction, the neutron probe remains one of the most accurate methods for indirect measurement of soil moisture content. Traditionally, the calibration of a neutron probe involves the ratio of the neutron count in the soil to a standard count, which is the neutron count in a fixed environment such as the probe shield or a specially designed calibration tank. The drawback of this count-ratio-based calibration is that the error in the standard count is carried through to all the measurements. An alternative calibration is to use the neutron counts only, not the ratio, with proper correction for radioactive decay and counting time. To evaluate both approaches, the shield counts of a neutron probe used for three decades were analyzed. The results show that the surrounding conditions have a substantial effect on the standard count. The error in the standard count also impacts the calculation of water storage and could indicate false consistency among replicates. The analysis of the shield counts indicates a negligible aging effect of the instrument over a period of 26 years. It is concluded that, by excluding the standard count, the count-based calibration is appropriate and sometimes even better than the ratio-based calibration. The count-based calibration is especially useful for historical data when the standard count was questionable or absent.
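    The decay and counting-time corrections needed by a count-based calibration can be sketched as follows. This assumes an Am-241/Be neutron source (half-life about 432.2 years, a common choice for neutron probes); the abstract does not name the source, so treat the half-life as an assumption.

```python
import math

T_HALF_YEARS = 432.2                     # Am-241 half-life (assumed source)
LAMBDA = math.log(2) / T_HALF_YEARS      # decay constant, 1/years

def decay_corrected_rate(raw_count, count_time_s, years_since_ref):
    """Count rate normalized to a per-second basis and to the reference date."""
    rate = raw_count / count_time_s          # counting-time correction
    return rate * math.exp(LAMBDA * years_since_ref)  # decay correction
```

Over the 26-year record analyzed in the abstract, the decay correction alone amounts to roughly a 4% adjustment (exp(lambda * 26) is about 1.04), which is why it cannot be ignored when the standard count is dropped.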

  10. Monitoring Butterfly Abundance: Beyond Pollard Walks

    PubMed Central

    Pellet, Jérôme; Bried, Jason T.; Parietti, David; Gander, Antoine; Heer, Patrick O.; Cherix, Daniel; Arlettaz, Raphaël

    2012-01-01

    Most butterfly monitoring protocols rely on counts along transects (Pollard walks) to generate species abundance indices and track population trends. It is still too often ignored that a population count results from two processes: the biological process (true abundance) and the statistical process (our ability to properly quantify abundance). Because individual detectability tends to vary in space (e.g., among sites) and time (e.g., among years), it remains unclear whether index counts truly reflect population sizes and trends. This study compares capture-mark-recapture (absolute abundance) and count-index (relative abundance) monitoring methods in three species (Maculinea nausithous and Iolana iolas: Lycaenidae; Minois dryas: Satyridae) in contrasted habitat types. We demonstrate that intraspecific variability in individual detectability under standard monitoring conditions is probably the rule rather than the exception, which questions the reliability of count-based indices to estimate and compare specific population abundance. Our results suggest that the accuracy of count-based methods depends heavily on the ecology and behavior of the target species, as well as on the type of habitat in which surveys take place. Monitoring programs designed to assess the abundance and trends in butterfly populations should incorporate a measure of detectability. We discuss the relative advantages and inconveniences of current monitoring methods and analytical approaches with respect to the characteristics of the species under scrutiny and resources availability. PMID:22859980

  11. A Rigorous Statistical Approach to Determine Solar Wind Composition from ACE/SWICS Data, and New Ne/O Ratios

    NASA Astrophysics Data System (ADS)

    Shearer, P.; Jawed, M. K.; Raines, J. M.; Lepri, S. T.; Gilbert, J. A.; von Steiger, R.; Zurbuchen, T.

    2013-12-01

    The SWICS instruments aboard ACE and Ulysses have performed in situ measurements of individual solar wind ions for a period spanning over two decades. Solar wind composition is determined by accumulating the measurements into an ion count histogram in which each species appears as a distinct peak. Assigning counts to the appropriate species is a challenging statistical problem because of the limited counts for some species and overlap between some peaks. We show that the most commonly used count assignment methods can suffer from significant bias when a highly abundant species overlaps with a much less abundant one. For ACE/SWICS data, this bias results in an overestimated Ne/O ratio. Bias is greatly reduced by switching to a rigorous maximum likelihood count assignment method, resulting in a 30-50% reduction in the estimated Ne abundance. We will discuss the new Ne/O values and put them in context with the solar system abundances for Ne derived from other techniques, such as in situ collection from Genesis and its heritage instrument, the Solar Foil experiment during the Apollo era. The new count assignment method is currently being applied to reanalyze the archived ACE and Ulysses data and obtain revised abundances of C, N, O, Ne, Mg, Si, S, and Fe, leading to revised datasets that will be made publicly available.

  12. A Method of Recording and Predicting the Pollen Count.

    ERIC Educational Resources Information Center

    Buck, M.

    1985-01-01

    A hair dryer, plastic funnel, and microscope slide can be used for predicting pollen counts on a day-to-day basis. Materials, methods for assembly, collection technique, meteorological influences, and daily patterns are discussed. Data collected using the apparatus suggest that airborne grass products other than pollen also affect hay fever…

  13. Enumerating Small Sudoku Puzzles in a First Abstract Algebra Course

    ERIC Educational Resources Information Center

    Lorch, Crystal; Lorch, John

    2008-01-01

    Two methods are presented for counting small "essentially different" sudoku puzzles using elementary group theory: one method (due to Jarvis and Russell) uses Burnside's counting formula, while the other employs an invariant property of sudoku puzzles. Ideas are included for incorporating this material into an introductory abstract algebra course.…

  14. COMPARISON OF THREE METHODS FOR COUNTING HUMAN SPERMATOZOA

    EPA Science Inventory

    COMPARISON OF THREE METHODS FOR COUNTING HUMAN SPERMATOZOA SC Jeffay1, LF Strader1, RA Morris1, JE Schmid1, AF Olshan2, LW Lansdell2, SD Perreault1. 1US EPA/ORD, RTP, NC; 2UNC-CH, Chapel Hill, NC.
    The IDENT feature of the HTM-IVOS semen analyzer (Hamilton Thorne Research, Bev...

  15. Mu-Spec - A High Performance Ultra-Compact Photon Counting spectrometer for Space Submillimeter Astronomy

    NASA Technical Reports Server (NTRS)

    Moseley, H.; Hsieh, W.-T.; Stevenson, T.; Wollack, E.; Brown, A.; Benford, D.; Sadleir; U-Yen, I.; Ehsan, N.; Zmuidzinas, J.

    2011-01-01

    We have designed and are testing elements of a fully integrated submillimeter spectrometer based on superconducting microstrip technology. The instrument can offer a resolving power of approximately R = 1500, and its high-frequency cutoff is set by the gap of available high-performance superconductors. All functions of the spectrometer are integrated: light is coupled to the microstrip circuit with a planar antenna, spectral discrimination is achieved using a synthetic grating, orders are separated using a planar filter, and photons are detected using a photon-counting MKID detector. This spectrometer promises to revolutionize submillimeter spectroscopy from space, replacing instruments on the scale of 1 m with a spectrometer on a 10 cm Si wafer. The reduction in mass and volume promises a much higher-performance system within the available resources of a space mission. We will describe the system and the performance of the components that have been fabricated and tested.

  16. Counting Necklaces and Other Patterns.

    ERIC Educational Resources Information Center

    Houghton, Chris

    1990-01-01

    A method for helping students to find formulas involving symmetry under various conditions is explained. Necklace symmetries, orbit counting, tetrahedra and cubes, relationship patterns, and finding patterns are discussed. (CW)
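    The orbit-counting idea mentioned above, specialized to necklaces under rotation, fits in a few lines. This is the standard Burnside formula, not code from the article:

```python
from math import gcd

def necklaces(n, k):
    """Number of length-n necklaces over k colors, up to rotation.

    Burnside's lemma: average, over the n rotations, the number of
    colorings each rotation fixes; rotation by i fixes k**gcd(i, n).
    """
    return sum(k ** gcd(i, n) for i in range(n)) // n
```

For example, there are 6 essentially different two-color necklaces of length 4, a result students can verify by hand before trusting the formula.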

  17. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    NASA Technical Reports Server (NTRS)

    Lu, Wei; Krainak, Michael A.; Yang, Guangning; Sun, Xiaoli; Merritt, Scott

    2016-01-01

    We present performance data for a low-noise, free-running, high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the two-detected-photon threshold and logic AND-gate coincidence methods. These methods mitigate dark count, after-pulsing, and background noise effects without resorting to time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (greater than 50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application to space communications and ranging. We developed and compared their performances using both the two-detected-photon threshold and coincidence methods.

  18. Low-Noise Free-Running High-Rate Photon-Counting for Space Communication and Ranging

    NASA Technical Reports Server (NTRS)

    Lu, Wei; Krainak, Michael A.; Yang, Guan; Sun, Xiaoli; Merritt, Scott

    2016-01-01

    We present performance data for a low-noise, free-running, high-rate photon counting method for space optical communication and ranging. NASA GSFC is testing the performance of two types of novel photon-counting detectors: 1) a 2x8 mercury cadmium telluride (HgCdTe) avalanche array made by DRS Inc., and 2) a commercial 2880-element silicon avalanche photodiode (APD) array. We successfully measured real-time communication performance using both the two-detected-photon threshold and logic AND-gate coincidence methods. These methods mitigate dark count, after-pulsing, and background noise effects without resorting to time gating. The HgCdTe APD array routinely demonstrated very high photon detection efficiencies (greater than 50%) at near-infrared wavelengths. The commercial silicon APD array exhibited a fast output with rise times of 300 ps and pulse widths of 600 ps. On-chip individually filtered signals from the entire array were multiplexed onto a single fast output. NASA GSFC has tested both detectors for their potential application to space communications and ranging. We developed and compared their performances using both the two-detected-photon threshold and coincidence methods.

  19. Marginalized zero-altered models for longitudinal count data.

    PubMed

    Tabb, Loni Philip; Tchetgen, Eric J Tchetgen; Wellenius, Greg A; Coull, Brent A

    2016-10-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias.
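    For reference, the zero-inflated Poisson mass function underlying these models can be written down directly. This is the textbook two-component form, not the authors' marginalized parameterization:

```python
import math

def zip_pmf(y, lam, pi):
    """Zero-inflated Poisson: a structural zero with probability pi,
    otherwise an ordinary Poisson(lam) count."""
    poisson = math.exp(-lam) * lam ** y / math.factorial(y)
    return pi * (y == 0) + (1 - pi) * poisson
```

Note that the ZIP mean is (1 - pi) * lam, so the Poisson-component coefficients do not describe the marginal mean of the outcome; that interpretability gap is precisely what the marginalized model in this paper addresses.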

  20. Marginalized zero-altered models for longitudinal count data

    PubMed Central

    Tabb, Loni Philip; Tchetgen, Eric J. Tchetgen; Wellenius, Greg A.; Coull, Brent A.

    2015-01-01

    Count data often exhibit more zeros than predicted by common count distributions like the Poisson or negative binomial. In recent years, there has been considerable interest in methods for analyzing zero-inflated count data in longitudinal or other correlated data settings. A common approach has been to extend zero-inflated Poisson models to include random effects that account for correlation among observations. However, these models have been shown to have a few drawbacks, including interpretability of regression coefficients and numerical instability of fitting algorithms even when the data arise from the assumed model. To address these issues, we propose a model that parameterizes the marginal associations between the count outcome and the covariates as easily interpretable log relative rates, while including random effects to account for correlation among observations. One of the main advantages of this marginal model is that it allows a basis upon which we can directly compare the performance of standard methods that ignore zero inflation with that of a method that explicitly takes zero inflation into account. We present simulations of these various model formulations in terms of bias and variance estimation. Finally, we apply the proposed approach to analyze toxicological data of the effect of emissions on cardiac arrhythmias. PMID:27867423

  1. Conclusions on measurement uncertainty in microbiology.

    PubMed

    Forster, Lynne I

    2009-01-01

    Since its first issue in 1999, testing laboratories wishing to comply with all the requirements of ISO/IEC 17025 have been collecting data for estimating uncertainty of measurement for quantitative determinations. In the microbiological field of testing, some debate has arisen as to whether uncertainty needs to be estimated for each method performed in the laboratory for each type of sample matrix tested. Queries also arise concerning the estimation of uncertainty when plate/membrane filter colony counts are below recommended method counting range limits. A selection of water samples (with low to high contamination) was tested in replicate with the associated uncertainty of measurement being estimated from the analytical results obtained. The analyses performed on the water samples included total coliforms, fecal coliforms, fecal streptococci by membrane filtration, and heterotrophic plate counts by the pour plate technique. For those samples where plate/membrane filter colony counts were >= 20, uncertainty estimates at a 95% confidence level were very similar for the methods, being estimated as 0.13, 0.14, 0.14, and 0.12, respectively. For those samples where plate/membrane filter colony counts were < 20, estimated uncertainty values for each sample showed close agreement with published confidence limits established using a Poisson distribution approach.
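    A rough Poisson-based check on such estimates, on the log10 scale commonly used for microbial counts, can be sketched as follows. This is an illustration of the counting-statistics floor only; the paper's values come from replicate testing, not from this formula.

```python
import math

def log10_expanded_uncertainty(count, k=2):
    """Approximate expanded uncertainty (coverage factor k) of a single
    colony count on the log10 scale, assuming pure Poisson statistics:
    u(log10 c) ~ 1 / (ln(10) * sqrt(c))."""
    return k / (math.log(10) * math.sqrt(count))
```

At the often-quoted lower counting limit of 20 colonies this gives about 0.19, and it shrinks as counts grow, which is consistent with Poisson-based confidence limits mattering most for the low-count samples discussed in the abstract.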

  2. Alternative statistical methods for interpreting airborne Alder (Alnus glutinosa (L.) Gaertner) pollen concentrations.

    PubMed

    González Parrado, Zulima; Valencia Barrera, Rosa M; Fuertes Rodríguez, Carmen R; Vega Maray, Ana M; Pérez Romero, Rafael; Fraile, Roberto; Fernández González, Delia

    2009-01-01

    This paper reports on the behaviour of Alnus glutinosa (alder) pollen grains in the atmosphere of Ponferrada (León, NW Spain) from 1995 to 2006. The study, which sought to determine the effects of various weather-related parameters on Alnus pollen counts, was performed using a volumetric method. The main pollination period for this taxon is January-February. Alder pollen is one of the eight major airborne pollen allergens found in the study area. An analysis was made of the correlation between pollen counts and major weather-related parameters over each period. In general, the strongest positive correlation was with temperature, particularly maximum temperature. During each period, peak pollen counts occurred when the maximum temperature fell within the range 9-14 °C. Finally, multivariate analysis showed that the parameter exerting the greatest influence was temperature, a finding confirmed by Spearman correlation tests. Principal components analysis suggested that periods with high pollen counts were characterised by high maximum temperature, low rainfall and an absolute humidity of around 6 g m⁻³. Use of this type of analysis in conjunction with other methods is essential for obtaining an accurate record of pollen-count variations over a given period.

  3. High redshift galaxies in the ALHAMBRA survey . I. Selection method and number counts based on redshift PDFs

    NASA Astrophysics Data System (ADS)

    Viironen, K.; Marín-Franch, A.; López-Sanjuan, C.; Varela, J.; Chaves-Montero, J.; Cristóbal-Hornillos, D.; Molino, A.; Fernández-Soto, A.; Vilella-Rojo, G.; Ascaso, B.; Cenarro, A. J.; Cerviño, M.; Cepa, J.; Ederoclite, A.; Márquez, I.; Masegosa, J.; Moles, M.; Oteo, I.; Pović, M.; Aguerri, J. A. L.; Alfaro, E.; Aparicio-Villegas, T.; Benítez, N.; Broadhurst, T.; Cabrera-Caño, J.; Castander, J. F.; Del Olmo, A.; González Delgado, R. M.; Husillos, C.; Infante, L.; Martínez, V. J.; Perea, J.; Prada, F.; Quintana, J. M.

    2015-04-01

    Context. Most observational results on the high redshift restframe UV-bright galaxies are based on samples pinpointed using the so-called dropout technique or Ly-α selection. However, the availability of multifilter data now allows the dropout selections to be replaced by direct methods based on photometric redshifts. In this paper we present the methodology to select and study the population of high redshift galaxies in the ALHAMBRA survey data. Aims: Our aim is to develop a less biased methodology than the traditional dropout technique to study the high redshift galaxies in ALHAMBRA and other multifilter data. Thanks to the wide area ALHAMBRA covers, we especially aim at contributing to the study of the brightest, least frequent, high redshift galaxies. Methods: The methodology is based on redshift probability distribution functions (zPDFs). It is shown how a clean galaxy sample can be obtained by selecting the galaxies with high integrated probability of being within a given redshift interval. However, reaching both a complete and clean sample with this method is challenging. Hence, a method to derive statistical properties by summing the zPDFs of all the galaxies in the redshift bin of interest is introduced. Results: Using this methodology we derive the galaxy rest frame UV number counts in five redshift bins centred at z = 2.5,3.0,3.5,4.0, and 4.5, being complete up to the limiting magnitude at mUV(AB) = 24, where mUV refers to the first ALHAMBRA filter redwards of the Ly-α line. With the wide field ALHAMBRA data we especially contribute to the study of the brightest ends of these counts, accurately sampling the surface densities down to mUV(AB) = 21-22. Conclusions: We show that using the zPDFs it is easy to select a very clean sample of high redshift galaxies. 
We also show that it is better to do statistical analysis of the properties of galaxies using a probabilistic approach, which takes into account both the incompleteness and contamination issues in a natural way. Based on observations collected at the German-Spanish Astronomical Center, Calar Alto, jointly operated by the Max-Planck-Institut für Astronomie (MPIA) at Heidelberg and the Instituto de Astrofísica de Andalucía (CSIC).
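The zPDF-summing idea can be sketched in a few lines (grid, helper names, and the toy PDF are all illustrative; the actual ALHAMBRA pipeline is far more involved): instead of hard-assigning each galaxy to one redshift, integrate its redshift PDF over the bin and sum those probabilities.

```python
def counts_in_bin(zpdfs, z_grid, z_lo, z_hi):
    """Effective galaxy count in [z_lo, z_hi]: for each galaxy, integrate
    its redshift PDF over the bin (trapezoidal rule on a common grid)
    and sum those probabilities instead of hard-assigning each galaxy."""
    total = 0.0
    for pdf in zpdfs:
        prob = 0.0
        for i in range(len(z_grid) - 1):
            # count only grid intervals fully inside the bin
            if z_lo <= z_grid[i] and z_grid[i + 1] <= z_hi:
                prob += 0.5 * (pdf[i] + pdf[i + 1]) * (z_grid[i + 1] - z_grid[i])
        total += prob
    return total

# toy example: a galaxy whose zPDF is uniform on z = 2..3
z_grid = [i / 10 for i in range(51)]                       # z = 0.0 .. 5.0
flat_pdf = [1.0 if 2.0 <= z <= 3.0 else 0.0 for z in z_grid]
```

A galaxy whose PDF straddles a bin edge then contributes fractionally to both bins, which is how incompleteness and contamination enter "in a natural way".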

  4. Method of detecting and counting bacteria

    NASA Technical Reports Server (NTRS)

    Picciolo, G. L.; Chappelle, E. W. (Inventor)

    1976-01-01

    An improved method is provided for determining bacterial levels, especially in samples of aqueous physiological fluids. The method depends on the quantitative determination of bacterial adenosine triphosphate (ATP) in the presence of nonbacterial ATP. The bacterial ATP is released by cell rupture and is measured by an enzymatic bioluminescent assay. A concentration technique is included to make the method more sensitive. It is particularly useful where the fluid to be measured contains an unknown or low bacteria count.

  5. Barriers in adopting blended learning in a private university of Pakistan and East Africa: faculty members’ perspective

    PubMed Central

    Gulzar, Saleema; Nicholas, Wachira; Nkoroi, Beatrice

    2017-01-01

    Background Education methods have undergone transformation over the centuries. Use of technology is the cornerstone for innovation in teaching methods. Hence, blended learning, which combines face-to-face and online modalities, is being increasingly explored as an effective method for learning. This pilot study determines the perceptions of faculty members in a private international university on barriers influencing adoption of technology for teaching and learning. Methods A cross-sectional survey was conducted through a self-reported questionnaire using ‘survey monkey’. The data were entered and analyzed using the Statistical Package for the Social Sciences (SPSS version 20). Frequencies and proportions are reported. Results Findings indicated that 51.6% of faculty members perceived the importance of integrating technology in their teaching. Around 54% of the participants recognized that they possess the ability and accessibility to integrate information communication technology (ICT) in teaching and learning, but there is a need to hone basic information technology (IT) skills to initiate technology-driven teaching. Findings revealed that 55% of faculty members acknowledged the constraint of not getting protected time to develop and deliver technology-driven courses. Further, results showed that 45% of faculty members perceived that their teaching innovation efforts, such as blended learning, do not count towards their professional promotion or recognition, as priority is usually given to research over teaching innovation. The findings also indicated that 54.5% of participants asserted that the university lacks mentorship in the field of blended learning. Conclusions The study therefore suggests that universities should provide adequate mentorship programmes for faculty members to enhance their skills in integrating technology into their teaching. PMID:28567414

  6. THE USE OF QUENCHING IN A LIQUID SCINTILLATION COUNTER FOR QUANTITATIVE ANALYSIS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Foster, G.V.

    1963-01-01

    Quenching was used to quantitatively determine the amount of quenching agent present. A sealed promethium-147 source was prepared to be used for the count rate determinations. Two methods to determine the amount of quenching agent present in a sample were developed. One method related the count rate of a sample containing a quenching agent to the amount of quenching agent present. Calibration curves were plotted using both color and chemical quenchers. The quenching agents used were: F.D.C. Orange No. 2, F.D.C. Yellow No. 3, F.D.C. Yellow No. 4, Scarlet Red, acetone, benzaldehyde, and carbon tetrachloride. The color quenchers gave a linear relationship, while the chemical quenchers gave a non-linear relationship. Quantities of the color quenchers between about 0.008 mg and 0.100 mg can be determined with an error less than 5%. The calibration curves were found to be usable over a long period of time. The other method related the change in the ratio of the count rates in two voltage windows to the amount of quenching agent present. The quenchers mentioned above were used. Calibration curves were plotted for both the color and chemical quenchers. The relationships of ratio versus amount of quencher were non-linear in each case. It was shown that the reproducibility of the count rate and the ratio was independent of the amount of quencher present but was dependent on the count rate. At count rates above 10,000 counts per minute the reproducibility was better than 1%. (TCO)
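The second (channel-ratio) method amounts to reading a quencher amount off a calibration curve. A minimal sketch with hypothetical calibration points (the values below are invented for illustration, not taken from the report):

```python
def amount_from_ratio(ratio, calibration):
    """Read the quencher amount (mg) off a calibration curve of
    (channel_ratio, amount_mg) points via piecewise-linear interpolation."""
    pts = sorted(calibration)
    if ratio <= pts[0][0]:
        return pts[0][1]
    for (r0, a0), (r1, a1) in zip(pts, pts[1:]):
        if r0 <= ratio <= r1:
            t = (ratio - r0) / (r1 - r0)
            return a0 + t * (a1 - a0)
    return pts[-1][1]

# hypothetical calibration points: more quencher -> lower window ratio
calibration = [(0.40, 0.100), (0.55, 0.050), (0.70, 0.020), (0.85, 0.008)]
```

Since the report found the ratio-versus-amount relationship non-linear, a piecewise-linear table lookup between measured calibration points is a natural fit; a spline could be substituted without changing the interface.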

  7. Neutronic analysis of the 1D and 1E banks reflux detection system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blanchard, A.

    1999-12-21

    Two H Canyon neutron monitoring systems for early detection of postulated abnormal reflux conditions in the Second Uranium Cycle 1E and 1D Mixer-Settler Banks have been designed and built. Monte Carlo neutron transport simulations using the general purpose, general geometry, n-particle MCNP code have been performed to model the expected response of the monitoring systems to varying conditions. The confirmatory studies documented herein conclude that the 1E and 1D neutron monitoring systems are able to achieve adequate neutron count rates for various neutron source and detector configurations, thereby eliminating excessive integration count time. Neutron count rate sensitivity studies are also performed. Conversely, the transport studies concluded that the neutron count rates are statistically insensitive to nitric acid content in the aqueous region and to the transition region length. These studies conclude that the 1E and 1D neutron monitoring systems are able to predict the postulated reflux conditions for all examined perturbations in the neutron source and detector configurations. In the cases examined, the relative change in the neutron count rates due to postulated transitions from normal ²³⁵U concentration levels to reflux levels remains satisfactorily detectable.
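The link between count rate and integration time follows directly from Poisson counting statistics. A small illustration (generic formulas, not taken from the report): the relative uncertainty of a rate R measured for time t is 1/sqrt(R*t), so higher count rates shorten the integration time needed for a given precision.

```python
import math

def integration_time(rate_cps, rel_unc):
    """Counting time (s) needed so the Poisson relative uncertainty of a
    measured rate, sigma_R / R = 1 / sqrt(R * t), falls to rel_unc."""
    return 1.0 / (rel_unc ** 2 * rate_cps)

def detectable_fractional_change(rate_cps, t, n_sigma=3.0):
    """Smallest fractional change in rate distinguishable from Poisson
    fluctuations at n_sigma, given a total count N = R * t."""
    return n_sigma / math.sqrt(rate_cps * t)
```

For example, a 100 counts/s detector needs 100 s to reach 1% relative uncertainty, at which point a 3% rate change is detectable at 3 sigma.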

  8. Validation of a low cost computer-based method for quantification of immunohistochemistry-stained sections.

    PubMed

    Montgomery, Jill D; Hensler, Heather R; Jacobson, Lisa P; Jenkins, Frank J

    2008-07-01

    The aim of the present study was to determine if the Alpha DigiDoc RT system would be an effective method of quantifying immunohistochemical staining as compared with a manual counting method, which is considered the gold standard. Two readers were used to count 31 samples by both methods. The results obtained using the Bland-Altman for concordance deemed no statistical difference between the 2 methods. Thus, the Alpha DigiDoc RT system is an effective, low cost method to quantify immunohistochemical data.
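The Bland-Altman concordance check used here reduces to computing the bias (mean difference) and 95% limits of agreement between the two methods; a minimal sketch (the sample values are illustrative, not the study's data):

```python
import math

def bland_altman(method_a, method_b):
    """Bias (mean difference) and 95% limits of agreement between two
    measurement methods, after Bland and Altman."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

Two methods "agree" in this sense when the bias is near zero and the limits of agreement are narrow enough to be clinically unimportant.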

  9. TOTAL LYMPHOCYTE COUNT AND SERUM ALBUMIN AS PREDICTORS OF NUTRITIONAL RISK IN SURGICAL PATIENTS

    PubMed Central

    ROCHA, Naruna Pereira; FORTES, Renata Costa

    2015-01-01

    Background: Early detection of changes in nutritional status is important for a better approach to the surgical patient. There are several nutritional measures in clinical practice, but no single complete method for determining nutritional status, so health professionals must choose the best method to use. Aim: To evaluate the total lymphocyte count and serum albumin as predictors of nutritional risk in surgical patients. Methods: A prospective longitudinal study was conducted with 69 patients undergoing surgery of the gastrointestinal tract. Nutritional status was evaluated by objective methods (anthropometry and biochemical tests) and subjective methods (subjective global assessment). Results: All parameters used in the nutritional assessment detected a high prevalence of malnutrition, with the exception of BMI, which detected only 7.2% (n=5). Albumin (p=0.01), the total lymphocyte count (p=0.02), the percentage of adequacy of skinfolds (p<0.002) and the subjective global assessment (p<0.001) proved useful as predictors of risk of postoperative complications: lower albumin and lymphocyte count values and higher subjective global assessment scores were associated with higher risk of surgical complications. Conclusions: A high prevalence of malnutrition was found by all measures except BMI. Albumin and total lymphocyte count were good predictors of the risk of postoperative complications and, when used with other methods of assessing nutritional status, such as the subjective global assessment and the percentage of adequacy of skinfolds, can be useful for identification of nutritional risk and postoperative complications. PMID:26537145

  10. A comparison of 2 techniques for estimating deer density

    USGS Publications Warehouse

    Storm, G.L.; Cottam, D.F.; Yahner, R.H.; Nichols, J.D.

    1977-01-01

    We applied mark-resight and area-conversion methods to estimate deer abundance at a 2,862-ha area in and surrounding the Gettysburg National Military Park and Eisenhower National Historic Site during 1987-1991. One observer in each of 11 compartments counted marked and unmarked deer during 65-75 minutes at dusk during 3 counts in each of April and November. Use of radio-collars and vinyl collars provided a complete inventory of marked deer in the population prior to the counts. We sighted 54% of the marked deer during April 1987 and 1988, and 43% of the marked deer during November 1987 and 1988. Mean number of deer counted increased from 427 in April 1987 to 582 in April 1991, and from 467 in November 1987 to 662 in November 1990. Herd size during April, based on the mark-resight method, increased from approximately 700 in 1987 to 1,400 in 1991, whereas the estimates for November indicated an increase from 983 for 1987 to 1,592 for 1990. Given the large proportion of open area and the extensive road system throughout the study area, we concluded that the sighting probability for marked and unmarked deer was fairly similar. We believe that the mark-resight method was better suited to our study than the area-conversion method because deer were not evenly distributed between areas suitable and unsuitable for sighting within open and forested areas; the assumption of equal distribution is required by the area-conversion method. Deer marked for the mark-resight method also helped reduce double counting during the dusk surveys.
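The mark-resight estimate is essentially a Lincoln-Petersen calculation. In the sketch below, the 54% sighting rate and the 427 mean count come from the abstract, but the marked-deer total of 100 is hypothetical, chosen only to make the arithmetic concrete:

```python
def mark_resight_estimate(marked_total, total_counted, marked_counted):
    """Lincoln-Petersen style mark-resight abundance estimate: assuming
    marked and unmarked deer share the same sighting probability,
    N_hat = total_counted / (marked_counted / marked_total)."""
    sighting_prob = marked_counted / marked_total
    return total_counted / sighting_prob

# illustrative April numbers: 54 of 100 (hypothetical) marked deer seen,
# 427 deer counted in total
estimate = mark_resight_estimate(100, 427, 54)
```

Scaling the 427 counted deer by the 54% sighting probability gives an estimate near 790, consistent in magnitude with the herd sizes reported for April.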

  11. Exploring the spatio-temporal relationship between two key aeroallergens and meteorological variables in the United Kingdom

    NASA Astrophysics Data System (ADS)

    Khwarahm, Nabaz; Dash, Jadunandan; Atkinson, Peter M.; Newnham, R. M.; Skjøth, C. A.; Adams-Groom, B.; Caulton, Eric; Head, K.

    2014-05-01

    Constructing accurate predictive models for grass and birch pollen in the air, the two most important aeroallergens, for areas with variable climate conditions such as the United Kingdom, requires better understanding of the relationships between pollen count in the air and meteorological variables. Variations in daily birch and grass pollen counts and their relationship with daily meteorological variables were investigated for nine pollen monitoring sites for the period 2000-2010 in the United Kingdom. An active pollen count sampling method was employed at each of the monitoring stations to sample pollen from the atmosphere. The mechanism of this method is based on the volumetric spore traps of Hirst design (Hirst in Ann Appl Biol 39(2):257-265, 1952). The pollen season (start date, finish date) for grass and birch were determined using a first derivative method. Meteorological variables such as daily rainfall; maximum, minimum and average temperatures; cumulative sunshine duration; wind speed; and relative humidity were related to the grass and birch pollen counts for the pre-peak and post-peak periods and the entire pollen season. The meteorological variables were correlated with the pollen count data for the following temporal supports: same-day, 1-day prior, 1-day mean prior, 3-day mean prior, 7-day mean prior. The direction of influence (positive/negative) of meteorological variables on pollen count varied for birch and grass, and also varied when the pollen season was treated as a whole season, or was segmented into the pre-peak and post-peak seasons. Maximum temperature, sunshine duration and rainfall were the most important variables influencing the count of grass pollen in the atmosphere. Both maximum temperature (pre-peak) and sunshine produced a strong positive correlation, and rain produced a strong negative correlation with grass pollen count in the air. 
Similarly, average temperature, wind speed and rainfall were the most important variables influencing the count of birch pollen in the air. Both wind speed and rain produced a negative correlation with birch pollen count in the air and average temperature produced a positive correlation.
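The lagged "temporal supports" (1-day prior, 3-day mean prior, ...) pair each day's pollen count with a summary of preceding weather, and the association is then scored with a Spearman rank correlation. A generic sketch of both pieces (not the study's code):

```python
def rank_data(xs):
    """Average ranks, with ties sharing the mean rank."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        for k in range(i, j + 1):
            ranks[order[k]] = (i + j) / 2 + 1
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = rank_data(x), rank_data(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

def lagged_mean(series, lag_days):
    """Pair day i with the mean of the preceding lag_days values,
    e.g. the '3-day mean prior' temperature for a same-day pollen count."""
    return [sum(series[i - lag_days:i]) / lag_days
            for i in range(lag_days, len(series))]
```

To correlate pollen with a 3-day-mean-prior temperature, one would trim the first three days off the pollen series and pass `lagged_mean(temperature, 3)` as the second argument.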

  12. Investigation of contrast-enhanced subtracted breast CT images with MAP-EM based on projection-based weighting imaging.

    PubMed

    Zhou, Zhengdong; Guan, Shaolin; Xin, Runchao; Li, Jianbo

    2018-06-01

    Contrast-enhanced subtracted breast computer tomography (CESBCT) images acquired using an energy-resolved photon counting detector can be helpful to enhance the visibility of breast tumors. In such technology, one challenge is the limited number of photons in each energy bin, thereby possibly leading to high noise in separate images from each energy bin, the projection-based weighted image, and the subtracted image. In conventional low-dose CT imaging, iterative image reconstruction provides a superior signal-to-noise ratio compared with the filtered back projection (FBP) algorithm. In this paper, maximum a posteriori expectation maximization (MAP-EM) based on projection-based weighting imaging for reconstruction of CESBCT images acquired using an energy-resolving photon counting detector is proposed, and its performance was investigated in terms of contrast-to-noise ratio (CNR). The simulation study shows that MAP-EM based on projection-based weighting imaging can improve the CNR in CESBCT images by 117.7%-121.2% compared with the FBP-based projection-based weighting imaging method. When compared with energy-integrating imaging that uses the MAP-EM algorithm, projection-based weighting imaging that uses the MAP-EM algorithm can improve the CNR of CESBCT images by 10.5%-13.3%. In conclusion, MAP-EM based on projection-based weighting imaging shows significant improvement in the CNR of the CESBCT image compared with FBP based on projection-based weighting imaging, and MAP-EM based on projection-based weighting imaging outperforms MAP-EM based on energy-integrating imaging for CESBCT imaging.
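The CNR figure of merit used to compare the reconstructions is a simple ratio of region-of-interest statistics; a generic sketch (ROI means and standard deviation would come from the reconstructed images):

```python
def cnr(signal_mean, background_mean, background_std):
    """Contrast-to-noise ratio of a signal ROI against a background ROI:
    |mean(signal) - mean(background)| / std(background)."""
    return abs(signal_mean - background_mean) / background_std

def improvement_pct(cnr_new, cnr_ref):
    """Percentage CNR improvement of one method over a reference."""
    return 100.0 * (cnr_new - cnr_ref) / cnr_ref
```

The percentage improvements quoted in the abstract (e.g. 10.5%-13.3%) are of the `improvement_pct` form, comparing CNR values from the two reconstruction pipelines on the same phantom.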

  13. The Safety of Acupuncture in Patients with Cancer Therapy–Related Thrombocytopenia

    PubMed Central

    Cybularz, Paul A.; Brothers, Karen; Singh, Gurneet M.; Feingold, Jennifer L.; Niesley, Michelle L.

    2015-01-01

    Abstract Background: Acceptance of acupuncture as an efficacious integrative modality for oncology-related side-effect management is rapidly expanding. It is imperative that guidelines regarding safe treatment supported by clinical experience are established. Oncology patients frequently experience thrombocytopenia as a side-effect of chemotherapy or radiation. However, safety data for acupuncture in adult patients with cancer who are thrombocytopenic is lacking. Materials and Methods: The medical records of 684 patients who received acupuncture treatments in an established acupuncture program at a private cancer treatment hospital were reviewed for adverse events occurring within the context of thrombocytopenia. Results: Of 2135 visits eligible for evaluation, 98 individual acupuncture visits occurred in patients with platelet counts <100,000/μL, including nine visits in which platelet counts were <50,000/μL. No adverse events of increased bruising or bleeding were noted. Medications and nutritional supplements or botanicals that may influence coagulation were also tabulated, with no apparent adverse events in this patient population. Conclusions: Discrepancies in the literature highlight the need to create cohesive safety guidelines backed by clinical research, specifically for groups at higher risk for adverse events. The preliminary evidence put forth in this study lays the foundation that supports the notion that acupuncture can be used safely with a high-need oncology population within an integrated model of care. In this descriptive retrospective case series of adult oncology patients with thrombocytopenia, no adverse events of increased bruising or bleeding were documented. Prospective trials are needed to confirm these initial observations. PMID:26401193

  14. Direct measurement of carbon-14 in carbon dioxide by liquid scintillation counting

    NASA Technical Reports Server (NTRS)

    Horrocks, D. L.

    1969-01-01

    Liquid scintillation counting technique is applied to the direct measurement of carbon-14 in carbon dioxide. This method has high counting efficiency and eliminates many of the basic problems encountered with previous techniques. The technique can be used to achieve a percent substitution reaction and is of interest as an analytical technique.

  15. Monitoring bird populations by point counts

    Treesearch

    C. John Ralph; John R. Sauer; Sam Droege

    1995-01-01

    This volume contains in part papers presented at the Symposium on Monitoring Bird Population Trends by Point Counts, which was held November 6-7, 1991, in Beltsville, Md., in response to the need for standardization of methods to monitor bird populations by point counts. Data from various investigators working under a wide variety of conditions are presented, and...

  16. 16 CFR 500.7 - Net quantity of contents, method of expression.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... expression. The net quantity of contents shall be expressed in terms of weight or mass, measure, numerical count, or a combination of numerical count and weight or mass, size, or measure so as to give accurate... measure, numerical count, and/or size, or (as in the case of lawn and plant care products) by cubic...

  17. 16 CFR 500.7 - Net quantity of contents, method of expression.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... expression. The net quantity of contents shall be expressed in terms of weight or mass, measure, numerical count, or a combination of numerical count and weight or mass, size, or measure so as to give accurate... measure, numerical count, and/or size, or (as in the case of lawn and plant care products) by cubic...

  18. 16 CFR 500.7 - Net quantity of contents, method of expression.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... expression. The net quantity of contents shall be expressed in terms of weight or mass, measure, numerical count, or a combination of numerical count and weight or mass, size, or measure so as to give accurate... measure, numerical count, and/or size, or (as in the case of lawn and plant care products) by cubic...

  19. 16 CFR 500.7 - Net quantity of contents, method of expression.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... expression. The net quantity of contents shall be expressed in terms of weight or mass, measure, numerical count, or a combination of numerical count and weight or mass, size, or measure so as to give accurate... measure, numerical count, and/or size, or (as in the case of lawn and plant care products) by cubic...

  20. Application of Digital Image Analysis to Determine Pancreatic Islet Mass and Purity in Clinical Islet Isolation and Transplantation

    PubMed Central

    Wang, Ling-jia; Kissler, Hermann J; Wang, Xiaojun; Cochet, Olivia; Krzystyniak, Adam; Misawa, Ryosuke; Golab, Karolina; Tibudan, Martin; Grzanka, Jakub; Savari, Omid; Grose, Randall; Kaufman, Dixon B; Millis, Michael; Witkowski, Piotr

    2015-01-01

    Pancreatic islet mass, represented by islet equivalents (IEQ), is the most important parameter in decision making for clinical islet transplantation. To obtain IEQ, a sample of islets is routinely counted manually under a microscope and discarded thereafter. Islet purity, another parameter in islet processing, is routinely acquired by estimation only. In this study, we validated our digital image analysis (DIA) system, developed using the Image Pro Plus software, for islet mass and purity assessment. Application of DIA allows better compliance with current good manufacturing practice (cGMP) standards. Human islet samples were captured as calibrated digital images for the permanent record. Five trained technicians participated in determination of IEQ and purity by the manual counting method and DIA. IEQ count showed statistically significant correlations between the manual method and DIA in all sample comparisons (r > 0.819 and p < 0.0001). A statistically significant difference in IEQ between the two methods was found only in the high-purity 100 μL sample group (p = 0.029). As for purity determination, statistically significant differences between manual assessment and DIA measurement were found in the high- and low-purity 100 μL samples (p < 0.005). In addition, islet particle number (IPN) and the IEQ/IPN ratio did not differ statistically between the manual counting method and DIA. In conclusion, the DIA used in this study is a reliable technique for determination of IEQ and purity. Islet samples preserved as digital images and results produced by DIA can be permanently stored for verification, technical training, and islet information exchange between different islet centers. Therefore, DIA complies better with cGMP requirements than the manual counting method. We propose DIA as a quality control tool to supplement the established standard manual method for islet counting and purity estimation. PMID:24806436
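IEQ normalizes islet mass to a standard 150 μm diameter islet by volume. The sketch below is a simplified per-islet volume scaling, not the standard 50 μm diameter-bin conversion table used in clinical counting, and the diameters are illustrative:

```python
def islet_equivalents(diameters_um):
    """Approximate IEQ: each islet contributes its volume relative to a
    standard 150 um diameter islet, i.e. (d / 150) ** 3. This is a
    simplified per-islet version of the usual 50 um bin conversion table."""
    return sum((d / 150.0) ** 3 for d in diameters_um)
```

So two 150 μm islets count as 2 IEQ, while a single 300 μm islet counts as 8 IEQ, which is why large islets dominate the mass estimate even when the particle number (IPN) is small.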
