Ammersbach, Mélanie; Beaufrère, Hugues; Gionet Rollick, Annick; Tully, Thomas
2015-03-01
While hematologic reference intervals (RI) are available for multiple raptorial species of the orders Accipitriformes and Falconiformes, there is a lack of valuable hematologic information in Strigiformes that can be used for diagnostic and health monitoring purposes. The objective was to report RI in Strigiformes for hematologic variables and to assess agreement between manual cell counting techniques. A multi-center prospective study was designed to assess hematologic RI and blood cell morphology in owl species. Samples were collected from individuals representing 13 Strigiformes species, including Great Horned Owl, Snowy Owl, Eurasian Eagle Owl, Barred Owl, Great Gray Owl, Ural Owl, Northern Saw-Whet Owl, Northern Hawk Owl, Spectacled Owl, Barn Owl, Eastern Screech Owl, Long-Eared Owl, and Short-Eared Owl. Red blood cell count was determined manually using a hemocytometer. White blood cell count was determined using 3 manual counting techniques: (1) phloxine B technique, (2) Natt and Herrick technique, and (3) estimation from the smear. Differential counts and blood cell morphology were determined on smears. Reference intervals were determined and agreement between methods was calculated. Important species-specific differences were observed in blood cell counts and granulocyte morphology. Differences in WBC count between species did not appear to be predictable based on phylogenetic relationships. Overall, most boreal owl species exhibited a lower WBC count than other species. Important disagreements were found between the different manual WBC counting techniques, suggesting that technique-specific RI should be used in Strigiformes. © 2015 American Society for Veterinary Clinical Pathology.
Poisson and negative binomial item count techniques for surveys with sensitive question.
Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin
2017-04-01
Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely, the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
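To make the design concrete, the following is a minimal sketch of a moment-based analysis of a Poisson item count survey: controls report a Poisson count Y, treatment respondents report Y plus their sensitive indicator, and the difference of group means estimates the sensitive proportion. This is an illustration under those assumptions, not the authors' exact estimator; all names and numbers are invented.

```python
import numpy as np

def poisson_ict_estimate(treatment, control, alpha=0.05):
    """Moment-based estimate of the sensitive proportion pi for a Poisson
    item count design: controls report Y ~ Poisson(lam), treatment
    respondents report Y + S with P(S = 1) = pi.
    (Generic difference-in-means sketch, not the paper's exact method.)"""
    t, c = np.asarray(treatment, float), np.asarray(control, float)
    pi_hat = t.mean() - c.mean()
    # Wald variance of the difference of two independent sample means.
    var = t.var(ddof=1) / len(t) + c.var(ddof=1) / len(c)
    z = 1.959963984540054  # ~97.5th percentile of N(0, 1)
    lo, hi = pi_hat - z * np.sqrt(var), pi_hat + z * np.sqrt(var)
    # Truncate the estimate and interval to the admissible range [0, 1].
    return float(np.clip(pi_hat, 0, 1)), (max(lo, 0.0), min(hi, 1.0))

rng = np.random.default_rng(1)
lam, pi = 2.0, 0.3
control = rng.poisson(lam, 500)
treatment = rng.poisson(lam, 500) + (rng.random(500) < pi)
print(poisson_ict_estimate(treatment, control))
```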
Direct measurement of carbon-14 in carbon dioxide by liquid scintillation counting
NASA Technical Reports Server (NTRS)
Horrocks, D. L.
1969-01-01
Liquid scintillation counting technique is applied to the direct measurement of carbon-14 in carbon dioxide. This method has a high counting efficiency and eliminates many of the basic problems encountered with previous techniques. It can also be used to follow percent substitution in a reaction and is of interest as an analytical method.
A miniaturized counting technique for anaerobic bacteria.
Sharpe, A N; Pettipher, G L; Lloyd, G R
1976-12-01
A miniaturized counting technique gave results as good as the pour-plate and Most Probable Number (MPN) techniques for enumeration of Clostridium spp. and anaerobic isolates from the gut. Highest counts were obtained when ascorbic acid (1%) and dithiothreitol (0.015%) were added to the reinforced clostridial medium used for counting. This minimized the effect of exposure to air before incubation. The miniature technique allowed up to 40 samples to be plated and incubated in one McIntosh and Fildes'-type anaerobic jar, compared with 3 or 4 by the normal pour-plate technique.
NASA Astrophysics Data System (ADS)
Takiue, Makoto; Fujii, Haruo; Ishikawa, Hiroaki
1984-12-01
2,5-Diphenyloxazole (PPO) has been proposed as a wavelength shifter for Cherenkov counting. Since PPO is not soluble in water, we have introduced the fluor into water in micellar form using a PPO-ethanol system. This technique makes it possible to obtain a high Cherenkov counting efficiency under stable sample conditions, attributed to the favorable spectrometric features of the PPO. The 32P Cherenkov counting efficiency (68.4%) obtained with this technique is larger than that measured with a conventional Cherenkov technique.
LOW LEVEL COUNTING TECHNIQUES WITH SPECIAL REFERENCE TO BIOMEDICAL TRACER PROBLEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hosain, F.; Nag, B.D.
1959-12-01
Low-level counting techniques in tracer experiments are discussed with emphasis on the measurement of beta and gamma radiations with Geiger and scintillation counting methods. The basic principles of low-level counting are outlined. Screen-wall counters, internal gas counters, low-level beta counters, scintillation spectrometers, liquid scintillators, and big scintillation installations are described. Biomedical tracer investigations are discussed. Applications of low-level techniques in archaeological dating, biology, and other problems are listed. (M.C.G.)
SURVIVAL OF SALMONELLA SPECIES IN RIVER WATER
The survival of four Salmonella strains in river water microcosms was monitored by culturing techniques, direct counts, whole-cell hybridization, scanning electron microscopy, and resuscitation techniques via the direct viable count method and flow cytometry. Plate counts of bact...
Can reliable sage-grouse lek counts be obtained using aerial infrared technology?
Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.
2013-01-01
More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-wing aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative to conventional ground-based methods, but further research is needed. We discuss multiple advantages of aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that frees time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.
Parallel image logical operations using cross correlation
NASA Technical Reports Server (NTRS)
Strong, J. P., III
1972-01-01
Methods are presented for counting areas in an image in a parallel manner using noncoherent optical techniques. The techniques presented include the Levialdi algorithm for counting, optical techniques for binary operations, and cross-correlation.
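For reference, here is a hedged software sketch of Levialdi's parallel shrinking algorithm simulated serially in Python: each pass applies the local operator everywhere at once, components shrink to isolated pixels, and isolated pixels are counted just before they vanish. Orientation conventions differ between presentations of the algorithm; this is one consistent choice, not a reconstruction of the optical implementation.

```python
import numpy as np

def levialdi_count(img):
    """Count 8-connected components with Levialdi's shrinking operator:
    per pass, a 1 whose west, south, and southwest neighbours are all 0
    is erased, and a 0 whose west and south neighbours are both 1 is
    set. Isolated pixels are counted just before they vanish."""
    b = np.pad(np.asarray(img, dtype=np.uint8), 1)  # zero border
    count = 0
    while b.any():
        # Count isolated 1-pixels (all 8 neighbours zero).
        nb = sum(np.roll(np.roll(b, di, 0), dj, 1)
                 for di in (-1, 0, 1) for dj in (-1, 0, 1)
                 if (di, dj) != (0, 0))
        count += int(((b == 1) & (nb == 0)).sum())
        W = np.roll(b, 1, axis=1)    # west neighbour
        S = np.roll(b, -1, axis=0)   # south neighbour
        SW = np.roll(S, 1, axis=1)   # southwest neighbour
        # Simultaneous ("parallel") update computed from the old image.
        b = np.where(b == 1, ((W | S | SW) > 0).astype(np.uint8),
                     (W & S).astype(np.uint8))
    return count

grid = np.zeros((8, 8), np.uint8)
grid[1:3, 1:3] = 1   # one 2x2 blob
grid[5, 5] = 1       # one isolated pixel
print(levialdi_count(grid))  # -> 2
```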
Validation of FFM PD counts for screening personality pathology and psychopathy in adolescence.
Decuyper, Mieke; De Clercq, Barbara; De Bolle, Marleen; De Fruyt, Filip
2009-12-01
Miller and colleagues (Miller, Bagby, Pilkonis, Reynolds, & Lynam, 2005) recently developed a Five-Factor Model (FFM) personality disorder (PD) count technique for describing and diagnosing PDs and psychopathy in adulthood. This technique conceptualizes PDs relying on general trait models and uses facets from the expert-generated PD prototypes to score the FFM PDs. The present study builds on the study of Miller and colleagues (2005) and investigates in Study 1 whether the PD count technique shows discriminant validity to describe PDs in adolescence. Study 2 extends this objective to psychopathy. Results suggest that the FFM PD count technique is equally successful in adolescence as in adulthood in describing PD symptoms, supporting the use of this descriptive method in adolescence. The normative data and accompanying PD count benchmarks enable the use of FFM scores for PD screening purposes in adolescence.
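Mechanically, a PD count is a sum of prototypical facet scores, with prototypically low facets reverse-scored. The sketch below illustrates only that arithmetic; the facet list is an invented placeholder, not the published Miller et al. prototype.

```python
# Minimal sketch of an FFM PD count: sum the facet scores an expert
# prototype flags as prototypically high, plus reverse-scored facets
# flagged as prototypically low. Facet names and this "borderline"
# prototype are illustrative placeholders, not the published lists.
FACET_MAX = 5  # assume facets scored 1-5 for illustration

BORDERLINE_PROTO = {"high": ["anxiety", "angry_hostility", "depression",
                             "impulsiveness", "vulnerability"],
                    "low": ["compliance", "deliberation"]}

def pd_count(facets, proto):
    high = sum(facets[f] for f in proto["high"])
    low = sum((FACET_MAX + 1) - facets[f] for f in proto["low"])  # reverse-score
    return high + low

scores = {"anxiety": 4, "angry_hostility": 5, "depression": 4,
          "impulsiveness": 5, "vulnerability": 4, "compliance": 2,
          "deliberation": 1}
print(pd_count(scores, BORDERLINE_PROTO))  # compare against normative benchmarks
```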
An Automated Statistical Technique for Counting Distinct Multiple Sclerosis Lesions.
Dworkin, J D; Linn, K A; Oguz, I; Fleishman, G M; Bakshi, R; Nair, G; Calabresi, P A; Henry, R G; Oh, J; Papinutto, N; Pelletier, D; Rooney, W; Stern, W; Sicotte, N L; Reich, D S; Shinohara, R T
2018-04-01
Lesion load is a common biomarker in multiple sclerosis, yet it has historically shown modest association with clinical outcome. Lesion count, which encapsulates the natural history of lesion formation and is thought to provide complementary information, is difficult to assess in patients with confluent (ie, spatially overlapping) lesions. We introduce a statistical technique for cross-sectionally counting pathologically distinct lesions. MR imaging was used to assess the probability of a lesion at each location. The texture of this map was quantified using a novel technique, and clusters resembling the center of a lesion were counted. Validity compared with a criterion standard count was demonstrated in 60 subjects observed longitudinally, and reliability was determined using 14 scans of a clinically stable subject acquired at 7 sites. The proposed count and the criterion standard count were highly correlated (r = 0.97, P < .001) and not significantly different (t59 = -0.83, P = .41), and the variability of the proposed count across repeat scans was equivalent to that of lesion load. After accounting for lesion load and age, lesion count was negatively associated (t58 = -2.73, P < .01) with the Expanded Disability Status Scale. Average lesion size had a higher association with the Expanded Disability Status Scale (r = 0.35, P < .01) than lesion load (r = 0.10, P = .44) or lesion count (r = -0.12, P = .36) alone. This study introduces a novel technique for counting pathologically distinct lesions using cross-sectional data and demonstrates its ability to recover obscured longitudinal information. The proposed count allows more accurate estimation of lesion size, which correlated more closely with disability scores than either lesion load or lesion count alone. © 2018 by American Journal of Neuroradiology.
Counting Tree Growth Rings Moderately Difficult to Distinguish
C. B. Briscoe; M. Chudnoff
1964-01-01
There is an extensive literature dealing with techniques and gadgets to facilitate counting tree growth rings. A relatively simple method is described below, satisfactory for species too difficult to count in the field, but not sufficiently difficult to require the preparation of microscope slides or staining techniques.
Improved confidence intervals when the sample is counted an integer times longer than the blank.
Potter, William Edward; Strzelczyk, Jadwiga Jodi
2011-05-01
Past computer solutions for confidence intervals in paired counting are extended to the case where the ratio of the sample count time to the blank count time is taken to be an integer, IRR. Previously, confidence intervals have been named Neyman-Pearson confidence intervals; more correctly they should have been named Neyman confidence intervals or simply confidence intervals. The technique utilized mimics a technique used by Pearson and Hartley to tabulate confidence intervals for the expected value of the discrete Poisson and Binomial distributions. The blank count and the contribution of the sample to the gross count are assumed to be Poisson distributed. The expected value of the blank count, in the sample count time, is assumed known. The net count, OC, is taken to be the gross count minus the product of IRR with the blank count. The probability density function (PDF) for the net count can be determined in a straightforward manner.
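Under the stated assumptions (Poisson blank and gross counts, known blank expectation, integer time ratio IRR), the probability function of the net count can be written as a sum over possible blank counts. The following is a small numerical sketch of that PDF, not the paper's tabulation code; the parameter values are invented.

```python
from scipy.stats import poisson

def net_count_pmf(n, mu_s, mu_b, irr, bmax=200):
    """P(OC = n) for OC = G - IRR*B, where B ~ Poisson(mu_b) is the blank
    count and G ~ Poisson(mu_s + irr*mu_b) is the gross count (the blank
    contributes irr*mu_b during the irr-times-longer sample count time).
    Sketch following the stated assumptions; the sum truncates at bmax."""
    return sum(poisson.pmf(b, mu_b) * poisson.pmf(n + irr * b, mu_s + irr * mu_b)
               for b in range(bmax))

# The net count is negative whenever irr*B exceeds the gross count:
print(sum(net_count_pmf(n, mu_s=5.0, mu_b=2.0, irr=3)
          for n in range(-50, 200)))  # ~1.0, i.e. the PMF is normalized
```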
Considerations for monitoring raptor population trends based on counts of migrants
Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.
1989-01-01
Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Inconsistent data recording and missing data hamper the coding of data and their use with modern analytical techniques. Coefficients of variation among years in counts averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed, including regression, non-parametric rank correlation trend analysis, and moving averages.
NASA Astrophysics Data System (ADS)
Ding, Xuemei; Wang, Bingyuan; Liu, Dongyuan; Zhang, Yao; He, Jie; Zhao, Huijuan; Gao, Feng
2018-02-01
During the past two decades there has been a dramatic rise in the use of functional near-infrared spectroscopy (fNIRS) as a neuroimaging technique in cognitive neuroscience research. Diffuse optical tomography (DOT) and optical topography (OT) can be employed as the optical imaging techniques for brain activity investigation. However, most current imagers with analogue detection are limited by sensitivity and dynamic range. Although photon-counting detection can significantly improve detection sensitivity, the intrinsic nature of sequential excitations reduces temporal resolution. To improve temporal resolution, sensitivity, and dynamic range, we develop a multi-channel continuous-wave (CW) system for brain functional imaging based on a novel lock-in photon-counting technique. The system consists of 60 light-emitting diode (LED) sources at three wavelengths of 660 nm, 780 nm, and 830 nm, which are modulated by current-stabilized square-wave signals at different frequencies, and 12 photomultiplier tubes (PMTs) read out with the lock-in photon-counting technique. This design combines the ultra-high sensitivity of the photon-counting technique with the parallelism of the digital lock-in technique. We can therefore acquire the diffused light intensity for all the source-detector pairs (SD-pairs) in parallel. Performance assessments of the system, conducted in phantom experiments, demonstrate its excellent measurement linearity, negligible inter-channel crosstalk, strong noise robustness, and high temporal resolution.
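The parallelism claim is easy to picture in software: each source is square-wave modulated at its own frequency, and correlating the binned photon-count record with in-phase/quadrature references recovers every source's contribution from one record. The sketch below is a toy demonstration of that digital lock-in idea; all frequencies and rates are invented, and the real system's firmware is not reproduced here.

```python
import numpy as np

fs = 10_000.0                       # count-bin rate, bins per second
t = np.arange(int(fs)) / fs         # one second of bins
f_src = [317.0, 413.0, 511.0]       # per-source modulation frequencies, Hz
rates = [80.0, 40.0, 20.0]          # mean counts per bin while each source is on

rng = np.random.default_rng(0)
lam = 5.0 + sum(r * (np.sign(np.sin(2 * np.pi * f * t)) + 1) / 2
                for r, f in zip(rates, f_src))   # 5 counts/bin of background
counts = rng.poisson(lam)           # the photon-counting record

for f, r in zip(f_src, rates):      # digital lock-in: I/Q correlation per source
    i = np.mean(counts * np.sin(2 * np.pi * f * t))
    q = np.mean(counts * np.cos(2 * np.pi * f * t))
    # An on/off square wave of peak r has fundamental amplitude 2r/pi, and
    # the I/Q correlation halves it again, so r = pi * hypot(i, q).
    print(f"{f:5.0f} Hz: recovered {np.pi * np.hypot(i, q):5.1f} vs true {r}")
```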
A COMPARISON OF GALAXY COUNTING TECHNIQUES IN SPECTROSCOPICALLY UNDERSAMPLED REGIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specian, Mike A.; Szalay, Alex S., E-mail: mspecia1@jhu.edu, E-mail: szalay@jhu.edu
2016-11-01
Accurate measures of galactic overdensities are invaluable for precision cosmology. Obtaining these measurements is complicated when members of one's galaxy sample lack radial depths, most commonly derived via spectroscopic redshifts. In this paper, we utilize the Sloan Digital Sky Survey's Main Galaxy Sample to compare seven methods of counting galaxies in cells when many of those galaxies lack redshifts. These methods fall into three categories: assigning galaxies discrete redshifts, scaling the numbers counted using regions' spectroscopic completeness properties, and employing probabilistic techniques. We split spectroscopically undersampled regions into three types: those inside the spectroscopic footprint, those outside but adjacent to it, and those distant from it. Through Monte Carlo simulations, we demonstrate that the preferred counting techniques are a function of region type, cell size, and redshift. We conclude by reporting optimal counting strategies under a variety of conditions.
Triple-Label β Liquid Scintillation Counting
Bukowski, Thomas R.; Moffett, Tyler C.; Revkin, James H.; Ploger, James D.; Bassingthwaighte, James B.
2010-01-01
The detection of radioactive compounds by liquid scintillation has revolutionized modern biology, yet few investigators make full use of the power of this technique. Even though multiple isotope counting is considerably more difficult than single isotope counting, many experimental designs would benefit from using more than one isotope. The development of accurate isotope counting techniques enabling the simultaneous use of three β-emitting tracers has facilitated studies in our laboratory using the multiple tracer indicator dilution technique for assessing rates of transmembrane transport and cellular metabolism. The details of sample preparation, and of stabilizing the liquid scintillation spectra of the tracers, are critical to obtaining good accuracy. Reproducibility is enhanced by obtaining detailed efficiency/quench curves for each particular set of tracers and solvent media. The numerical methods for multiple-isotope quantitation depend on avoiding error propagation (inherent to successive subtraction techniques) by using matrix inversion. Experimental data obtained from triple-label β counting illustrate reproducibility and good accuracy even when the relative amounts of different tracers in samples of protein/electrolyte solutions, plasma, and blood are changed. PMID:1514684
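The matrix-inversion step the abstract alludes to is easy to picture: counts in each energy window are a linear combination of the three isotopes' activities weighted by window-specific efficiencies, so a single linear solve recovers all three activities at once instead of propagating errors through successive subtractions. A minimal sketch with invented efficiency values (a real assay would use measured, quench-corrected efficiencies):

```python
import numpy as np

# Counts-per-minute in three energy windows are modeled as E @ dpm = cpm,
# where E holds each window's counting efficiency for each beta emitter.
# The efficiency matrix and count values below are illustrative only.
E = np.array([[0.60, 0.10, 0.02],   # window A efficiency for (3H, 14C, 32P)
              [0.05, 0.55, 0.10],   # window B
              [0.00, 0.05, 0.70]])  # window C
cpm = np.array([1250.0, 2300.0, 3600.0])  # background-corrected counts/min
dpm = np.linalg.solve(E, cpm)             # activity of each tracer, dpm
print(dict(zip(["3H", "14C", "32P"], dpm.round(1))))
```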
ERIC Educational Resources Information Center
Walsh, Jeffrey A.; Braithwaite, Jeremy
2008-01-01
This work, drawing on the literature on alcohol consumption, sexual behavior, and researching sensitive topics, tests the efficacy of the unmatched-count technique (UCT) in establishing higher rates of truthful self-reporting when compared to traditional survey techniques. Traditional techniques grossly underestimate the scope of problems…
Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions
NASA Astrophysics Data System (ADS)
Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong
2018-01-01
Layer count control and uniformity of two-dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. To complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy (EELS) spectrum of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions, with discrepancies within ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, S.M.; Bayly, R.J.
1986-01-01
This book contains the following chapters: Some prerequisites to the use of radionuclides in haematology; Instrumentation and counting techniques; In vitro techniques; Cell labelling; Protein labelling; Autoradiography; Imaging and quantitative scanning; Whole body counting; Absorption and excretion studies; Blood volume studies; Plasma clearance studies; and Radionuclide blood cell survival studies.
ERIC Educational Resources Information Center
Denny, Paula J.; Test, David W.
1995-01-01
This study extended use of the One-More-Than technique by using a "cents-pile modification"; one-, five-, and ten-dollar bills; and mixed training of all dollar amounts. Three high school students with moderate mental retardation each learned to use the technique to count out nontrained amounts and to make community purchases. (Author/PB)
Probabilistic techniques for obtaining accurate patient counts in Clinical Data Warehouses
Myers, Risa B.; Herskovic, Jorge R.
2011-01-01
Proposal and execution of clinical trials, computation of quality measures, and discovery of correlations between medical phenomena are all applications where an accurate count of patients is needed. However, existing sources of this type of patient information, including Clinical Data Warehouses (CDW), may be incomplete or inaccurate. This research explores applying probabilistic techniques, supported by the MayBMS probabilistic database, to obtain accurate patient counts from a clinical data warehouse containing synthetic patient data. We present a synthetic clinical data warehouse (CDW), and populate it with simulated data using a custom patient data generation engine. We then implement, evaluate, and compare different techniques for obtaining patient counts. We model billing as a test for the presence of a condition. We compute billing's sensitivity and specificity both by conducting a “Simulated Expert Review” in which a representative sample of records is reviewed and labeled by experts, and by obtaining the ground truth for every record. We compute the posterior probability of a patient having a condition through a “Bayesian Chain”, using Bayes' Theorem to calculate the probability of a patient having a condition after each visit. The second method is a “one-shot” approach that computes the probability of a patient having a condition based on whether the patient is ever billed for the condition. Our results demonstrate the utility of probabilistic approaches, which improve on the accuracy of raw counts. In particular, the simulated review paired with a single application of Bayes' Theorem produces the best results, with an average error rate of 2.1% compared to 43.7% for the straightforward billing counts. Overall, this research demonstrates that Bayesian probabilistic approaches improve patient counts on simulated patient populations. We believe that total patient counts based on billing data are one of the many possible applications of our Bayesian framework. Use of these probabilistic techniques will enable more accurate patient counts and better results for applications requiring this metric. PMID:21986292
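The per-visit update described here is ordinary Bayes' theorem with billing treated as a diagnostic test of known sensitivity and specificity. A minimal sketch of that chain (prevalence, test characteristics, and the visit sequence are invented values, not the study's):

```python
def bayes_update(prior, billed, sens, spec):
    """One visit's update: treat a billing code as a diagnostic test with
    known sensitivity/specificity and apply Bayes' theorem."""
    if billed:
        like_pos, like_neg = sens, 1 - spec
    else:
        like_pos, like_neg = 1 - sens, spec
    num = like_pos * prior
    return num / (num + like_neg * (1 - prior))

# "Bayesian Chain" sketch: chain the update across visits, then count
# patients whose posterior clears a threshold. All numbers invented.
prevalence, sens, spec = 0.10, 0.80, 0.95
visits = [True, False, True]          # billed for the condition per visit
p = prevalence
for billed in visits:
    p = bayes_update(p, billed, sens, spec)
print(round(p, 3))                    # posterior after all visits
```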
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
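For intuition, here is a small self-contained sketch of the comparison on synthetic low-count spectra: both statistics are minimized with Powell's method (as in the abstract), using a toy exponential model. The model, seed, and values are invented; the point is only the contrast between the two objective functions in the low-count regime.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)
x = np.linspace(0, 10, 40)
model = lambda p: p[0] * np.exp(-x / p[1])        # toy spectral model
data = rng.poisson(model([5.0, 3.0]))             # few counts per bin

def chi2(p):
    m = np.clip(model(p), 1e-12, None)
    return np.sum((data - m) ** 2 / np.maximum(data, 1))  # variance floored at 1

def cash_c(p):
    m = np.clip(model(p), 1e-12, None)            # clip keeps the log finite
    return 2.0 * np.sum(m - data * np.log(m))     # Cash (1979) C statistic

for stat in (chi2, cash_c):
    fit = minimize(stat, x0=[4.0, 2.0], method="Powell")
    print(stat.__name__, np.round(fit.x, 2))
# In the low-count regime the C-statistic fit is typically the less biased one.
```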
For Mole Problems, Call Avogadro: 602-1023.
ERIC Educational Resources Information Center
Uthe, R. E.
2002-01-01
Describes techniques to help introductory students become familiar with Avogadro's number and mole calculations. Techniques involve estimating numbers of common objects then calculating the length of time needed to count large numbers of them. For example, the immense amount of time required to count a mole of sand grains at one grain per second…
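The classroom arithmetic is easy to reproduce; a two-line check of the "count a mole at one per second" estimate:

```python
AVOGADRO = 6.022e23                 # items per mole
SECONDS_PER_YEAR = 3.156e7
years = AVOGADRO / SECONDS_PER_YEAR  # counting one grain per second
print(f"{years:.1e} years")          # ~1.9e16 years, about a million
                                     # times the age of the universe
```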
Ryde, S J; al-Agel, F A; Evans, C J; Hancock, D A
2000-05-01
The use of a hydrogen internal standard to enable the estimation of absolute mass during measurement of total body nitrogen by in vivo neutron activation is an established technique. Central to the technique is a determination of the H prompt gamma ray counts arising from the subject. In practice, interference counts from other sources--e.g., neutron shielding--are included. This study reports use of the Monte Carlo computer code, MCNP-4A, to investigate the interference counts arising from shielding both with and without a phantom containing a urea solution. Over a range of phantom size (depth 5 to 30 cm, width 20 to 40 cm), the counts arising from shielding increased by between 4% and 32% compared with the counts without a phantom. For any given depth, the counts increased approximately linearly with width. For any given width, there was little increase for depths exceeding 15 centimeters. The shielding counts comprised between 15% and 26% of those arising from the urea phantom. These results, although specific to the Swansea apparatus, suggest that extraneous hydrogen counts can be considerable and depend strongly on the subject's size.
Pettipher, Graham L.; Mansell, Roderick; McKinnon, Charles H.; Cousins, Christina M.
1980-01-01
Membrane filtration and epifluorescent microscopy were used for the direct enumeration of bacteria in raw milk. Somatic cells were lysed by treatment with trypsin and Triton X-100 so that 2 ml of milk containing up to 5 × 10^6 somatic cells/ml could be filtered. The majority of the bacteria (ca. 80%) remained intact and were concentrated on the membrane. After being stained with acridine orange, the bacteria fluoresced under ultraviolet light and could easily be counted. The clump count of orange fluorescing cells on the membrane correlated well (r = 0.91) with the corresponding plate count for farm, tanker, and silo milks. Differences between counts obtained by different operators and between the membrane clump count and plate count were not significant. The technique is rapid, taking less than 25 min, inexpensive, costing less than 50 cents per sample, and is suitable for milks containing 5 × 10^3 to 5 × 10^8 bacteria per ml. PMID:16345515
Scare, J A; Slusarewicz, P; Noel, M L; Wielgus, K M; Nielsen, M K
2017-11-30
Fecal egg counts are emphasized for guiding equine helminth parasite control regimens due to the rise of anthelmintic resistance. This, however, poses further challenges, since egg counting results are prone to issues such as operator dependency, method variability, equipment requirements, and time commitment. The use of image analysis software for performing fecal egg counts is promoted in recent studies to reduce the operator dependency associated with manual counts. In an attempt to remove operator dependency associated with current methods, we developed a diagnostic system that utilizes a smartphone and employs image analysis to generate automated egg counts. The aims of this study were (1) to determine precision of the first smartphone prototype, the modified McMaster, and ImageJ; (2) to determine precision, accuracy, sensitivity, and specificity of the second smartphone prototype, the modified McMaster, and Mini-FLOTAC techniques. Repeated counts on fecal samples naturally infected with equine strongyle eggs were performed using each technique to evaluate precision. Triplicate counts on 36 egg count negative samples and 36 samples spiked with strongyle eggs at 5, 50, 500, and 1000 eggs per gram were performed using a second smartphone system prototype, Mini-FLOTAC, and McMaster to determine technique accuracy. Precision across the techniques was evaluated using the coefficient of variation. With regard to the first aim of the study, the McMaster technique performed with significantly less variance than the first smartphone prototype and ImageJ (p<0.0001). The smartphone and ImageJ performed with equal variance. With regard to the second aim of the study, the second smartphone system prototype had significantly better precision than the McMaster (p<0.0001) and Mini-FLOTAC (p<0.0001) methods, and the Mini-FLOTAC was significantly more precise than the McMaster (p=0.0228). Mean accuracies for the Mini-FLOTAC, McMaster, and smartphone system were 64.51%, 21.67%, and 32.53%, respectively. The Mini-FLOTAC was significantly more accurate than the McMaster (p<0.0001) and the smartphone system (p<0.0001), while the smartphone and McMaster counts did not have statistically different accuracies. Overall, the smartphone system compared favorably to manual methods with regard to precision, and reasonably with regard to accuracy. With further refinement, this system could become useful in veterinary practice. Copyright © 2017 Elsevier B.V. All rights reserved.
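Precision in this kind of comparison is typically summarized as the coefficient of variation across repeated counts of the same sample. A minimal illustration with invented replicate counts (not the study's data):

```python
import numpy as np

# Coefficient of variation (CV) of repeated egg-per-gram counts for each
# technique; lower CV means better precision. Replicate values invented.
replicates = {
    "McMaster":    [450, 600, 375, 525, 675],
    "Mini-FLOTAC": [510, 540, 480, 555, 495],
    "smartphone":  [500, 515, 490, 520, 505],
}
for name, counts in replicates.items():
    c = np.asarray(counts, float)
    cv = c.std(ddof=1) / c.mean() * 100
    print(f"{name:12s} CV = {cv:4.1f}%")
```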
Dittami, Gregory M; Sethi, Manju; Rabbitt, Richard D; Ayliffe, H Edward
2012-06-21
Particle and cell counting is used for a variety of applications including routine cell culture, hematological analysis, and industrial controls(1-5). A critical breakthrough in cell/particle counting technologies was the development of the Coulter technique by Wallace Coulter over 50 years ago. The technique involves the application of an electric field across a micron-sized aperture and hydrodynamically focusing single particles through the aperture. The resulting occlusion of the aperture by the particles yields a measurable change in electric impedance that can be directly and precisely correlated to cell size/volume. The recognition of the approach as the benchmark in cell/particle counting stems from the extraordinary precision and accuracy of its particle sizing and counts, particularly as compared to manual and imaging-based technologies (accuracies on the order of 98% for Coulter counters versus 75-80% for manual and vision-based systems). This can be attributed to the fact that, unlike imaging-based approaches to cell counting, the Coulter technique makes a true three-dimensional (3-D) measurement of cells/particles, which dramatically reduces count interference from debris and clustering by calculating precise volumetric information about the cells/particles. Overall this provides a means for enumerating and sizing cells that is more accurate, less tedious, less time-consuming, and less subjective than other counting techniques(6). Despite the prominence of the Coulter technique in cell counting, its widespread use in routine biological studies has been limited by the cost and size of traditional instruments. Although a less expensive Coulter-based instrument has been produced, it has limitations as compared to its more expensive counterparts in the correction for "coincidence events" in which two or more cells pass through the aperture and are measured simultaneously. Another limitation with existing Coulter technologies is the lack of metrics on the overall health of cell samples. Consequently, additional techniques must often be used in conjunction with Coulter counting to assess cell viability. This extends experimental setup time and cost, since the traditional methods of viability assessment require cell staining and/or use of expensive and cumbersome equipment such as a flow cytometer. The Moxi Z mini automated cell counter, described here, is an ultra-small benchtop instrument that combines the accuracy of the Coulter Principle with a thin-film sensor technology to enable precise sizing and counting of particles ranging from 3-25 microns, depending on the cell counting cassette used. The Type M cassette can be used to count particles with average diameters of 4-25 microns (dynamic range 2-34 microns), and the Type S cassette can be used to count particles with average diameters of 3-20 microns (dynamic range 2-26 microns). Since the system uses a volumetric measurement method, the 4-25 micron range corresponds to a cell volume range of 34-8,180 fL and the 3-20 micron range corresponds to a cell volume range of 14-4,200 fL, which is relevant when non-spherical particles are being measured. To perform mammalian cell counts using the Moxi Z, the cells to be counted are first diluted with ORFLO or similar diluent. A cell counting cassette is inserted into the instrument, and the sample is loaded into the port of the cassette. Thousands of cells are pulled, single file, through a "Cell Sensing Zone" (CSZ) in the thin-film membrane over 8-15 seconds.
Following the run, the instrument uses proprietary curve-fitting in conjunction with a proprietary software algorithm to provide coincidence event correction along with an assessment of overall culture health by determining the ratio of the number of cells in the population of interest to the total number of particles. The total particle counts include shrunken and broken down dead cells, as well as other debris and contaminants. The results are presented in histogram format with an automatic curve fit, with gates that can be adjusted manually as needed. Ultimately, the Moxi Z enables counting with a precision and accuracy comparable to a Coulter Z2, the current gold standard, while providing additional culture health information. Furthermore it achieves these results in less time, with a smaller footprint, with significantly easier operation and maintenance, and at a fraction of the cost of comparable technologies.
Van den Broeck, Joke; Rossi, Gina; De Clercq, Barbara; Dierckx, Eva; Bastiaansen, Leen
2013-01-01
Research on the applicability of the five-factor model (FFM) to capture personality pathology coincided with the development of a FFM personality disorder (PD) count technique, which has been validated in adolescent, young, and middle-aged samples. This study extends the literature by validating this technique in an older sample. Five alternative FFM PD counts based upon the Revised NEO Personality Inventory (NEO PI-R) are computed and evaluated in terms of both convergent and divergent validity with the Assessment of DSM-IV Personality Disorders Questionnaire (ADP-IV; DSM-IV: Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition). For the best working count for each PD, normative data are presented, from which cut-off scores are derived. The validity of these cut-offs and their usefulness as a screening tool is tested against both a categorical (i.e., the DSM-IV - Text Revision) and a dimensional (i.e., the Dimensional Assessment of Personality Pathology; DAPP) measure of personality pathology. All but the Antisocial and Obsessive-Compulsive counts exhibited adequate convergent and divergent validity, supporting the use of this method in older adults. Using the ADP-IV and the DAPP - Short Form as validation criteria, results corroborate the use of the FFM PD count technique to screen for PDs in older adults, in particular for the Paranoid, Borderline, Histrionic, Avoidant, and Dependent PDs. Given the age-neutrality of the NEO PI-R and the considerable lack of valid personality assessment tools, current findings appear to be promising for the assessment of pathology in older adults.
Using DNA to test the utility of pellet-group counts as an index of deer counts
T. J. Brinkman; D. K. Person; W. Smith; F. Stuart Chapin; K. McCoy; M. Leonawicz; K. Hundertmark
2013-01-01
Despite widespread use of fecal pellet-group counts as an index of ungulate density, techniques used to convert pellet-group numbers to ungulate numbers rarely are based on counts of known individuals, are seldom evaluated across spatial and temporal scales, and their precision is infrequently quantified. Using DNA from fecal pellets to identify individual deer, we evaluated the...
Measurement of total-body cobalt-57 vitamin B12 absorption with a gamma camera.
Cardarelli, J A; Slingerland, D W; Burrows, B A; Miller, A
1985-08-01
Previously described techniques for the measurement of the absorption of [57Co]vitamin B12 by total-body counting have required an iron room equipped with scanning or multiple detectors. The present study uses simplifying modifications which make the technique more available and include the use of static geometry, the measurement of body thickness to correct for attenuation, a simple formula to convert the capsule-in-air count to a 100% absorption count, and finally the use of an adequately shielded gamma camera obviating the need of an iron room.
Rhudy, Matthew B; Mahoney, Joseph M
2018-04-01
The goal of this work is to compare the differences between various step counting algorithms using both accelerometer and gyroscope measurements from wrist and ankle-mounted sensors. Participants completed four different conditions on a treadmill while wearing an accelerometer and gyroscope on the wrist and the ankle. Three different step counting techniques were applied to the data from each sensor type and mounting location. It was determined that using gyroscope measurements allowed for better performance than the typically used accelerometers, and that ankle-mounted sensors provided better performance than those mounted on the wrist.
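A common baseline among step counting techniques is band-pass filtering followed by peak detection; the sketch below is that generic approach (not necessarily any of the algorithms compared in the paper), with invented parameters and a synthetic gyroscope trace.

```python
import numpy as np
from scipy.signal import butter, filtfilt, find_peaks

def count_steps(signal, fs, min_step_hz=0.5, max_step_hz=3.0):
    """Generic peak-detection step counter: band-pass to the gait band,
    then count prominent peaks spaced no closer than the fastest
    plausible cadence allows."""
    b, a = butter(2, [min_step_hz, max_step_hz], "bandpass", fs=fs)
    filt = filtfilt(b, a, signal)
    peaks, _ = find_peaks(filt, prominence=filt.std(),
                          distance=int(fs / max_step_hz))
    return len(peaks)

# Synthetic 1.8 Hz "walking" gyroscope trace: 60 s sampled at 50 Hz.
fs = 50.0
rng = np.random.default_rng(3)
t = np.arange(0, 60, 1 / fs)
gyro = np.sin(2 * np.pi * 1.8 * t) + 0.3 * rng.standard_normal(t.size)
print(count_steps(gyro, fs))   # expect ~108 steps (1.8 steps/s * 60 s)
```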
Simple to complex modeling of breathing volume using a motion sensor.
John, Dinesh; Staudenmayer, John; Freedson, Patty
2013-06-01
To compare simple and complex modeling techniques to estimate categories of low, medium, and high ventilation (VE) from ActiGraph™ activity counts. Vertical-axis ActiGraph™ GT1M activity counts, oxygen consumption, and VE were measured during treadmill walking and running, sports, household chores, and labor-intensive employment activities. Categories of low (<19.3 l/min), medium (19.3 to 35.4 l/min), and high (>35.4 l/min) VE were derived from activity intensity classifications (light <2.9 METs, moderate 3.0 to 5.9 METs, and vigorous >6.0 METs). We examined the accuracy of two simple modeling techniques (multiple regression and activity count cut-point analyses) and one complex modeling technique (random forest) in predicting VE from activity counts. Prediction accuracy of the complex random forest technique was marginally better than that of the simple multiple regression method. Both techniques accurately predicted VE categories almost 80% of the time. The multiple regression and random forest techniques were more accurate (85 to 88%) in predicting medium VE. Both techniques predicted high VE (70 to 73%) with greater accuracy than low VE (57 to 60%). ActiGraph™ cut-points for low, medium, and high VE were <1381, 1381 to 3660, and >3660 cpm. There were minor differences in prediction accuracy between the multiple regression and the random forest technique. This study provides methods to objectively estimate VE categories using activity monitors that can easily be deployed in the field. Objective estimates of VE should provide a better understanding of the dose-response relationship between internal exposure to pollutants and disease. Copyright © 2013 Elsevier B.V. All rights reserved.
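The cut-point technique itself is a three-way threshold rule; applying the cut-points reported in the abstract looks like this (a trivial sketch for illustration):

```python
def ve_category(cpm):
    """Classify a vertical-axis ActiGraph count (counts/min) into the
    low/medium/high ventilation categories using the cut-points reported
    in the abstract (<1381, 1381-3660, >3660 cpm)."""
    if cpm < 1381:
        return "low VE (<19.3 L/min)"
    elif cpm <= 3660:
        return "medium VE (19.3-35.4 L/min)"
    return "high VE (>35.4 L/min)"

print([ve_category(c) for c in (800, 2500, 5200)])
```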
The contribution of simple random sampling to observed variations in faecal egg counts.
Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I
2012-09-10
It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasite diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well-mixed faecal sample. The Poisson processes that lead to this variability are described, with illustrative examples of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and betray ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
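The practical consequence is easy to demonstrate: an observed slide count is a Poisson draw, so the exact (Garwood) interval for the underlying mean, scaled by the technique's multiplication factor, gives the confidence interval for eggs per gram. A short sketch (the factor of 50 is a common McMaster multiplier, used here purely for illustration):

```python
from scipy.stats import chi2

def epg_confidence_interval(eggs_counted, multiplier=50, alpha=0.05):
    """Exact (Garwood) Poisson CI for a McMaster-type count: the n eggs
    seen on the slide are one Poisson draw, and the eggs-per-gram
    estimate is n times the technique's multiplication factor."""
    n = eggs_counted
    lo = 0.0 if n == 0 else chi2.ppf(alpha / 2, 2 * n) / 2
    hi = chi2.ppf(1 - alpha / 2, 2 * (n + 1)) / 2
    return n * multiplier, (lo * multiplier, hi * multiplier)

print(epg_confidence_interval(8))  # point estimate 400 EPG, but a wide 95% CI
```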
Comparison of McMaster and FECPAKG2 methods for counting nematode eggs in the faeces of alpacas.
Rashid, Mohammed H; Stevenson, Mark A; Waenga, Shea; Mirams, Greg; Campbell, Angus J D; Vaughan, Jane L; Jabbar, Abdul
2018-05-02
This study aimed to compare the FECPAK G2 and the McMaster techniques for counting gastrointestinal nematode eggs in the faeces of alpacas using two floatation solutions (saturated sodium chloride and sucrose solutions). Faecal egg counts from both techniques were compared using Lin's concordance correlation coefficient and Bland and Altman statistics. Results showed moderate to good agreement between the two methods, with better agreement achieved when saturated sucrose is used as the floatation fluid, particularly when faecal egg counts are less than 1000 eggs per gram of faeces. To the best of our knowledge this is the first study to assess agreement between the McMaster and FECPAK G2 methods for estimating faecal egg counts in South American camelids.
Repeatability of paired counts.
Alexander, Neal; Bethony, Jeff; Corrêa-Oliveira, Rodrigo; Rodrigues, Laura C; Hotez, Peter; Brooker, Simon
2007-08-30
The Bland and Altman technique is widely used to assess the variation between replicates of a method of clinical measurement. It yields the repeatability, i.e. the value within which 95 per cent of repeat measurements lie. The valid use of the technique requires that the variance is constant over the data range. This is not usually the case for counts of items such as CD4 cells or parasites, nor is the log transformation applicable to zero counts. We investigate the properties of generalized differences based on Box-Cox transformations. For an example, in a data set of hookworm eggs counted by the Kato-Katz method, the square root transformation is found to stabilize the variance. We show how to back-transform the repeatability on the square root scale to the repeatability of the counts themselves, as an increasing function of the square mean root egg count, i.e. the square of the average of square roots. As well as being more easily interpretable, the back-transformed results highlight the dependence of the repeatability on the sample volume used.
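A sketch of that workflow under the paper's assumptions: compute the 95% repeatability on square-root-transformed paired counts (where Poisson-like variance is roughly stabilized), then back-transform it to the count scale as a function of the square mean root, using the identity |x1 - x2| = |s1 - s2|(s1 + s2) for s = sqrt(x). The data below are simulated, not the hookworm dataset.

```python
import numpy as np

def sqrt_scale_repeatability(x1, x2):
    """Repeatability (1.96 * SD of paired differences) computed on
    square-root-transformed counts, where Poisson-like variance is
    approximately constant."""
    d = np.sqrt(x1) - np.sqrt(x2)
    return 1.96 * d.std(ddof=1)

def back_transform(r, smr):
    """Repeatability on the count scale at square mean root count
    smr = ((sqrt(x1) + sqrt(x2)) / 2)**2: if |s1 - s2| <= r on the root
    scale, then |x1 - x2| = |s1 - s2|(s1 + s2) <= 2 * r * sqrt(smr)."""
    return 2 * r * np.sqrt(smr)

rng = np.random.default_rng(4)
mu = rng.uniform(5, 500, 200)              # true per-sample egg densities
x1, x2 = rng.poisson(mu), rng.poisson(mu)  # paired replicate counts
r = sqrt_scale_repeatability(x1, x2)
print(round(r, 2), round(back_transform(r, smr=100.0), 1))
# The count-scale limits widen as the square mean root count increases.
```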
Bowser, Jacquelyn E; Costa, Lais R R; Rodil, Alba U; Lopp, Christine T; Johnson, Melanie E; Wills, Robert W; Swiderski, Cyprianna E
2018-03-01
OBJECTIVE To evaluate the effect of 2 bronchoalveolar lavage (BAL) sampling techniques and the use of N-butylscopolammonium bromide (NBB) on the quantity and quality of BAL fluid (BALF) samples obtained from horses with the summer pasture endophenotype of equine asthma. ANIMALS 8 horses with the summer pasture endophenotype of equine asthma. PROCEDURES BAL was performed bilaterally (right and left lung sites) with a flexible videoendoscope passed through the left or right nasal passage. During lavage of the first lung site, a BALF sample was collected by means of either gentle syringe aspiration or mechanical suction with a pressure-regulated wall-mounted suction pump. The endoscope was then maneuvered into the contralateral lung site, and lavage was performed with the alternate fluid retrieval technique. For each horse, BAL was performed bilaterally once with and once without premedication with NBB (21-day interval). The BALF samples retrieved were evaluated for volume, total cell count, differential cell count, RBC count, and total protein concentration. RESULTS Use of syringe aspiration significantly increased total BALF volume (mean volume increase, 40 mL [approx 7.5% yield]) and decreased total RBC count (mean decrease, 142 cells/μL), compared with use of mechanical suction. The BALF nucleated cell count and differential cell count did not differ between BAL procedures. Use of NBB had no effect on BALF retrieval. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that retrieval of BALF by syringe aspiration may increase yield and reduce barotrauma in horses at increased risk of bronchoconstriction and bronchiolar collapse. Further studies to determine the usefulness of NBB and other bronchodilators during BAL procedures in horses are warranted.
Grizelle Gonzalez; Elianid Espinoza; Zhigang Liu; Xiaoming Zou
2006-01-01
We used a fluorescence technique to mark and re-count the invasive earthworm, Pontoscolex corethrurus from PVC tubes established in a forest and a recently abandoned pasture in Puerto Rico to test the effects of the labeling treatment on earthworm population survival over time. A fluorescent marker was injected into the earthworms in the middle third section of the...
Counting conformal correlators
NASA Astrophysics Data System (ADS)
Kravchuk, Petr; Simmons-Duffin, David
2018-02-01
We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.
Reliable enumeration of malaria parasites in thick blood films using digital image analysis.
Frean, John A
2009-09-23
Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. The requirements of a digital microscope camera, personal computer and good quality staining of slides are potentially reasonably easy to meet.
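The core of such a pipeline (threshold, label connected components, then filter by the expected parasite size range) can be expressed in a few lines. The generic scipy sketch below illustrates the idea only; it is not the authors' tuned ImageJ workflow, and all thresholds and sizes are invented.

```python
import numpy as np
from scipy import ndimage

def count_particles(image, threshold, min_px, max_px):
    """Generic particle-counting sketch: threshold, label connected
    components, and keep only objects whose pixel area falls inside the
    expected parasite size window (the size filter suppresses debris)."""
    mask = image > threshold
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.sum((sizes >= min_px) & (sizes <= max_px)))

rng = np.random.default_rng(5)
img = rng.normal(10, 2, (200, 200))             # noisy background
for r, c in [(40, 40), (100, 150), (160, 60)]:  # three fake "parasites"
    img[r:r + 4, c:c + 4] += 30
print(count_particles(img, threshold=20, min_px=6, max_px=100))  # -> 3
```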
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.
Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K
2016-07-20
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
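A rough illustration of the PSW idea on synthetic data: fit Gaussians to the 0- and 1-photon peaks of a photon counting histogram; the read noise in electrons is then the fitted peak width divided by the peak separation (the single-photon gain). The gain, rates, and fit window below are invented, and this is a simplification of the published method.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(6)
gain, read_noise_e = 20.0, 0.25                  # DN/photon, e- rms (invented)
photons = rng.poisson(0.8, 100_000)
signal = gain * photons + rng.normal(0, read_noise_e * gain, photons.size)

hist, edges = np.histogram(signal, bins=200)
centers = (edges[:-1] + edges[1:]) / 2
mask = centers < 1.5 * gain                      # fit only the 0/1-photon region

def two_peaks(x, a0, a1, mu, sep, sig):
    g = lambda c: np.exp(-(x - c) ** 2 / (2 * sig ** 2))
    return a0 * g(mu) + a1 * g(mu + sep)         # two equal-width peaks

p, _ = curve_fit(two_peaks, centers[mask], hist[mask],
                 p0=[2000, 2000, 0.0, 15.0, 3.0])
print("read noise ~", round(abs(p[4] / p[3]), 3), "e-")  # roughly 0.25
```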
Neutron Detection With Ultra-Fast Digitizer and Pulse Identification Techniques on DIII-D
NASA Astrophysics Data System (ADS)
Zhu, Y. B.; Heidbrink, W. W.; Piglowski, D. A.
2013-10-01
A prototype system for neutron detection with an ultra-fast digitizer and pulse identification techniques has been implemented on the DIII-D tokamak. The system consists of a cylindrical neutron fission chamber, a charge sensitive amplifier, and a GaGe Octopus 12-bit CompuScope digitizer card installed in a Linux computer. Digital pulse identification techniques have been successfully performed at maximum data acquisition rate of 50 MSPS with on-board memory of 2 GS. Compared to the traditional approach with fast nuclear electronics for pulse counting, this straightforward digital solution has many advantages, including reduced expense, improved accuracy, higher counting rate, and easier maintenance. The system also provides the capability of neutron-gamma pulse shape discrimination and pulse height analysis. Plans for the upgrade of the old DIII-D neutron counting system with these techniques will be presented. Work supported by the US Department of Energy under SC-G903402, and DE-FC02-04ER54698.
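For a flavor of digital pulse identification on a digitized charge-amplifier trace, here is a minimal threshold-crossing sketch with a dead-time guard and pulse-height capture. It is a generic illustration, not the DIII-D implementation; all numbers are invented.

```python
import numpy as np

def find_pulses(wave, threshold, dead_samples=20):
    """Minimal digital pulse identification: detect upward threshold
    crossings, enforce a dead time so each pulse is counted once, and
    record pulse heights for later pulse-height analysis."""
    crossings = np.flatnonzero((wave[1:] >= threshold) & (wave[:-1] < threshold))
    pulses, last = [], -dead_samples
    for i in crossings:
        if i - last >= dead_samples:
            pulses.append(wave[i:i + dead_samples].max())  # pulse height
            last = i
    return pulses

# Toy trace: noise plus three exponential-tail pulses; the second and
# third overlap (pile-up) and are resolved as a single taller pulse.
rng = np.random.default_rng(7)
wave = rng.normal(0, 0.05, 5000)
for start, amp in [(500, 1.0), (2100, 0.6), (2115, 0.8)]:
    n = np.arange(300)
    wave[start:start + 300] += amp * np.exp(-n / 60.0)
heights = find_pulses(wave, threshold=0.3)
print(len(heights), np.round(heights, 2))
```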
Measuring Transmission Efficiencies Of Mass Spectrometers
NASA Technical Reports Server (NTRS)
Srivastava, Santosh K.
1989-01-01
Coincidence counts yield absolute efficiencies. The system measures mass-dependent transmission efficiencies of mass spectrometers, using coincidence-counting techniques reminiscent of those used for many years in the calibration of detectors for subatomic particles. Coincidences between detected ions and the electrons producing them are counted during operation of the mass spectrometer. Under certain assumptions regarding inelastic scattering of electrons, the electron/ion-coincidence count is a direct measure of the transmission efficiency of the spectrometer. When fully developed, the system will be compact, portable, and usable routinely to calibrate mass spectrometers.
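The arithmetic behind the claim is compact: every true coincidence tags an ion that made it through, so the accidental-corrected coincidence rate divided by the electron singles rate estimates the ion transmission/detection efficiency. A sketch with invented rates:

```python
# Efficiency from coincidence counting, under the abstract's assumptions,
# with a crude accidental-coincidence correction from uncorrelated rates.
# All numbers below are illustrative, not measured values.
electron_rate = 5.0e4      # electron singles, counts/s
ion_rate = 1.2e3           # ion singles, counts/s
coinc_rate = 9.5e2         # measured coincidences, counts/s
gate = 100e-9              # coincidence window, s

accidentals = electron_rate * ion_rate * gate
efficiency = (coinc_rate - accidentals) / electron_rate
print(f"accidentals ~ {accidentals:.1f} /s, efficiency ~ {efficiency:.2e}")
```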
Sheffield, L.M.; Gall, Adrian E.; Roby, D.D.; Irons, D.B.; Dugger, K.M.
2006-01-01
Least Auklets (Aethia pusilla (Pallas, 1811)) are the most abundant species of seabird in the Bering Sea and offer a relatively efficient means of monitoring secondary productivity in the marine environment. Counting auklets on surface plots is the primary method used to track changes in numbers of these crevice-nesters, but counts can be highly variable and may not be representative of the number of nesting individuals. We compared average maximum counts of Least Auklets on surface plots with density estimates based on mark–resight data at a colony on St. Lawrence Island, Alaska, during 2001–2004. Estimates of breeding auklet abundance from mark–resight averaged 8 times greater than those from maximum surface counts. Our results also indicate that average maximum surface counts are poor indicators of breeding auklet abundance and do not vary consistently with auklet nesting density across the breeding colony. Estimates of Least Auklet abundance from mark–resight were sufficiently precise to meet management goals for tracking changes in seabird populations. We recommend establishing multiple permanent banding plots for mark–resight studies on colonies selected for intensive long-term monitoring. Mark–resight is more likely to detect biologically significant changes in size of auklet breeding colonies than traditional surface count techniques.
Hayes, Robert B; Peña, Adan M; Goff, Thomas E
2005-08-01
This paper demonstrates the utility of a portable alpha Continuous Air Monitor (CAM) as a benchtop scaler counter for multiple sample types. These include using the CAM to count fixed air sample filters and radiological smears. In counting radiological smears, the CAM is used very much like a gas flow proportional counter (GFPC), albeit with a lower efficiency. Due to the typically low background in this configuration, the minimum detectable activity for a 5-min count should be in the range of about 10 dpm, which is acceptably below the 20 dpm limit for transuranic isotopes. When counting fixed air sample filters, the CAM algorithm along with other measurable characteristics can be used to identify and quantify the presence of transuranic isotopes in the samples. When the radiological control technician wants to take credit for naturally occurring radioactive material contributions from radon progeny producing higher energy peaks (as in the case of a fixed air sample filter), then more elaborate techniques are required. The techniques presented here will generate a decision level of about 43 dpm for such applications. The calibration for this application should alternatively be done using the default values of channels 92-126 for region of interest 1. This can be done within 10 to 15 min, resulting in a method to rapidly evaluate air filters for transuranic activity. When compared to the previously described 1-h count technique, the present work demonstrates a technique whereby more than two thirds of samples can be rapidly shown (within 10 to 15 min) to be within regulatory limits. In both cases, however, spectral quality checks are required to ensure that sample self-attenuation is not a significant bias in the activity estimates. This will allow the same level of confidence when using these techniques for activity quantification as is presently available for air monitoring activity quantification using CAMs.
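Detection figures like these follow from the standard Currie formulation; the sketch below applies the common approximation MDA = (2.71 + 4.65*sqrt(B)) / (eff * t). The background, time, and efficiency values are invented for illustration and happen to land near the ~10 dpm figure quoted above; the paper's CAM-specific numbers depend on its own calibration.

```python
import math

def currie_mda_dpm(background_counts, count_minutes, efficiency):
    """Currie-style minimum detectable activity, in dpm:
    MDA = (2.71 + 4.65 * sqrt(B)) / (eff * t), with t in minutes."""
    return (2.71 + 4.65 * math.sqrt(background_counts)) / (efficiency * count_minutes)

# e.g. 2 background counts in a 5-min count at 15% counting efficiency:
print(round(currie_mda_dpm(2, 5, 0.15), 1), "dpm")  # ~12 dpm
```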
Reliability of a rapid hematology stain for sputum cytology
Gonçalves, Jéssica; Pizzichini, Emilio; Pizzichini, Marcia Margaret Menezes; Steidle, Leila John Marques; Rocha, Cristiane Cinara; Ferreira, Samira Cardoso; Zimmermann, Célia Tânia
2014-01-01
Objective: To determine the reliability of a rapid hematology stain for the cytological analysis of induced sputum samples. Methods: This was a cross-sectional study comparing the standard technique (May-Grünwald-Giemsa stain) with a rapid hematology stain (Diff-Quik). Of the 50 subjects included in the study, 21 had asthma, 19 had COPD, and 10 were healthy (controls). From the induced sputum samples collected, we prepared four slides: two were stained with May-Grünwald-Giemsa, and two were stained with Diff-Quik. The slides were read independently by two trained researchers blinded to the identification of the slides. The reliability for cell counting using the two techniques was evaluated by determining the intraclass correlation coefficients (ICCs) for intraobserver and interobserver agreement. Agreement in the identification of neutrophilic and eosinophilic sputum between the observers and between the stains was evaluated with kappa statistics. Results: In our comparison of the two staining techniques, the ICCs indicated almost perfect interobserver agreement for neutrophil, eosinophil, and macrophage counts (ICC: 0.98-1.00), as well as substantial agreement for lymphocyte counts (ICC: 0.76-0.83). Intraobserver agreement was almost perfect for neutrophil, eosinophil, and macrophage counts (ICC: 0.96-0.99), whereas it was moderate to substantial for lymphocyte counts (ICC = 0.65 and 0.75 for the two observers, respectively). Interobserver agreement for the identification of eosinophilic and neutrophilic sputum using the two techniques ranged from substantial to almost perfect (kappa range: 0.91-1.00). Conclusions: The use of Diff-Quik can be considered a reliable alternative for the processing of sputum samples. PMID:25029648
NASA Astrophysics Data System (ADS)
Stogdale, Nick; Hollock, Steve; Johnson, Neil; Sumpter, Neil
2003-09-01
A 16x16 element uncooled pyroelectric detector array has been developed which, when allied with advanced tracking and detection algorithms, creates a universal detector with multiple applications. Low-cost manufacturing techniques are used to fabricate a hybrid detector intended for economic use in commercial markets. The detector has found extensive application in accurate people counting, detection, tracking, secure area protection, directional sensing and area violation; topics which are all pertinent to the provision of Homeland Security. The detection and tracking algorithms, when allied with interpolation techniques, have allowed a performance much higher than might be expected from a 16x16 array. This paper reviews the technology, with particular attention to the array structure, algorithms and interpolation techniques, and outlines its application in a number of challenging market areas. Viewed from above, moving people are seen as 'hot blobs' moving through the field of view of the detector; background clutter and stationary objects are not seen, and the detector works irrespective of lighting or environmental conditions. Advanced algorithms detect the people and extract size, shape, direction and velocity vectors, allowing the number of people to be detected and their trajectories of motion to be tracked. Provision of virtual lines in the scene allows bi-directional counting of people flowing in and out of an entrance or area. Definition of a virtual closed area in the scene allows counting of stationary people present within a defined area. Definition of 'counting lines' allows people to be counted, access control devices to be augmented by confirming a 'one swipe, one entry' judgement, and the flow and destination of moving people to be analysed; for example, passing the 'wrong way' up a denied passageway can be detected. Counting stationary people within a 'defined area' allows the behaviour and size of groups of stationary people to be analysed and counted; an alarm condition can also be generated when people stray into such areas.
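As a rough illustration of the virtual-line idea (not the vendor's algorithm), the sketch below counts bi-directional crossings from per-person centroid tracks that an upstream tracker is assumed to supply. The line geometry and the sample tracks are made up.

```python
from typing import Dict, List, Tuple

LINE_Y = 8.0  # virtual counting line across a 16x16 field of view

def count_crossings(tracks: Dict[int, List[Tuple[float, float]]]) -> Tuple[int, int]:
    """Return (entries, exits) for tracks crossing the virtual line."""
    entries = exits = 0
    for positions in tracks.values():
        for (x0, y0), (x1, y1) in zip(positions, positions[1:]):
            if y0 < LINE_Y <= y1:      # crossed the line going "in"
                entries += 1
            elif y1 < LINE_Y <= y0:    # crossed the line going "out"
                exits += 1
    return entries, exits

tracks = {1: [(4, 2), (5, 6), (5, 10)],     # one person walking in
          2: [(10, 12), (10, 9), (11, 5)]}  # one person walking out
print(count_crossings(tracks))  # -> (1, 1)
```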
Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry
NASA Technical Reports Server (NTRS)
Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul
2003-01-01
Single-photon-counting laser altimetry is a new measurement technique offering significant advantages in vertical resolution while reducing instrument size, mass, and power, as well as laser complexity, compared with analog or threshold-detection laser altimetry techniques. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft-based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.
Rain volume estimation over areas using satellite and radar data
NASA Technical Reports Server (NTRS)
Doneaud, A. A.; Vonderhaar, T. H.
1985-01-01
An investigation of the feasibility of rain volume estimation using satellite data, following a technique recently developed with radar data called the Area Time Integral (ATI), was undertaken. Case studies were selected on the basis of existing radar and satellite data sets which match in space and time. Four multicell clusters were analyzed. Routines for navigation, remapping and smoothing of satellite images were performed. Visible counts were normalized for solar zenith angle. A radar sector of interest was defined to delineate specific radar echo clusters for each radar time throughout the radar echo cluster lifetime. A satellite sector of interest was defined by applying small adjustments to the radar sector using a manual processing technique. The radar echo area, the IR maximum counts and the IR counts matching radar echo areas were found to evolve similarly, except for the decaying phase of the cluster, where the cirrus debris keeps the IR counts high.
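The essence of the Area Time Integral technique is that storm rain volume is nearly proportional to the time integral of echo area, so a single climatological rain-rate constant converts the integral into a volume. The sketch below illustrates the arithmetic; the rain-rate constant and the echo-area series are invented for illustration.

```python
import numpy as np

times_h = np.array([0.0, 0.5, 1.0, 1.5, 2.0])                # scan times (hours)
echo_area_km2 = np.array([50.0, 220.0, 400.0, 260.0, 30.0])  # echo area per scan

ati_km2_h = np.trapz(echo_area_km2, times_h)   # area-time integral
mean_rain_rate_mm_h = 3.7                      # assumed climatological value

# 1 mm of rain over 1 km^2 is 1e3 m^3 of water
rain_volume_m3 = ati_km2_h * mean_rain_rate_mm_h * 1e3
print(f"ATI = {ati_km2_h:.0f} km^2 h, volume ~ {rain_volume_m3:.2e} m^3")
```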
Photon Counting - One More Time
NASA Astrophysics Data System (ADS)
Stanton, Richard H.
2012-05-01
Photon counting has been around for more than 60 years, and has been available to amateurs for most of that time. In most cases single photons are detected using photomultiplier tubes, "old technology" that became available after the Second World War. But over the last couple of decades the perfection of CCD devices has given amateurs the ability to perform accurate photometry with modest telescopes. Is there any reason to still count photons? This paper discusses some of the strengths of current photon counting technology, particularly relating to the search for fast optical transients. Technology advances in counters and photomultiplier modules are briefly mentioned. Illustrative data are presented including FFT analysis of bright star photometry and a technique for finding optical pulses in a large file of noisy data. This latter technique is shown to enable the discovery of a possible optical flare on the polar variable AM Her.
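A minimal sketch of the kind of periodicity search mentioned above: bin the photon counts, take an FFT, and look for a spectral peak above the Poisson noise floor. The bin width, run length, and injected 2 Hz modulation are illustrative, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01                                       # 10 ms counting bins
t = np.arange(0.0, 60.0, dt)                    # a one-minute run
rate = 500 + 40 * np.sin(2 * np.pi * 2.0 * t)   # mean rate with 2 Hz modulation
counts = rng.poisson(rate * dt)                 # photon counts per bin

spectrum = np.abs(np.fft.rfft(counts - counts.mean())) ** 2
freqs = np.fft.rfftfreq(len(counts), d=dt)
peak = freqs[np.argmax(spectrum[1:]) + 1]       # skip the DC bin
print(f"strongest periodicity near {peak:.2f} Hz")
```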
Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method of statistical analysis for such data was ANOVA, which enabled not only the effects of individual factors to be estimated but also their interactions. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, which is appropriate for traditional plate-count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
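As an illustration of the analysis described, the sketch below runs a two-factor ANOVA with interaction using statsmodels; the factors, counts, and balanced layout are hypothetical, not the study's data.

```python
import pandas as pd
import statsmodels.api as sm
from statsmodels.formula.api import ols

# hypothetical plate counts for two microorganisms read by two analysts
df = pd.DataFrame({
    "count": [52, 48, 55, 61, 30, 28, 35, 33, 50, 47, 58, 60, 29, 31, 36, 34],
    "organism": ["A"] * 4 + ["B"] * 4 + ["A"] * 4 + ["B"] * 4,
    "analyst": (["x"] * 2 + ["y"] * 2) * 4,
})

model = ols("count ~ C(organism) * C(analyst)", data=df).fit()
print(sm.stats.anova_lm(model, typ=2))   # main effects and their interaction
```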
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors
Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.
2016-01-01
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
Chavali, Pooja; Uppin, Megha S; Uppin, Shantveer G; Challa, Sundaram
2017-01-01
The most reliable histological correlate of recurrence risk in meningiomas is increased mitotic activity. The proliferative index with Ki-67 immunostaining is a helpful adjunct to manual counting. However, both show considerable inter-observer variability. A new immunohistochemical method for counting mitotic figures, using an antibody against the phosphohistone H3 (PHH3) protein, was introduced. Similarly, computer-based automated counting for the Ki-67 labelling index (LI) is available. The aim was to study the use of these new techniques in the objective assessment of proliferation indices in meningiomas. This was a retrospective study of intracranial meningiomas diagnosed during the year 2013. The hematoxylin and eosin (H and E) sections and immunohistochemistry (IHC) with Ki-67 were reviewed by two pathologists. Photomicrographs of the representative areas were subjected to Ki-67 analysis by ImmunoRatio (IR) software. Mean Ki-67 LI, both manual and by IR, were calculated. IHC with PHH3 was performed. PHH3-positive nuclei were counted and mean values calculated. Data analysis was done using SPSS software. A total of 64 intracranial meningiomas were diagnosed. Evaluation on H and E, PHH3, and Ki-67 LI (both manual and IR) was done in 32 cases (22 grade I and 10 grade II meningiomas). A statistically significant correlation was seen between the mitotic count in each grade and PHH3 values, and also between the grade of the tumor and the values of Ki-67 and PHH3. Both the techniques used in the study offered advantages over, as well as correlated well with, the existing techniques and hence can be applied to routine use.
Radiation Discrimination in LiBaF3 Scintillator Using Digital Signal Processing Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aalseth, Craig E.; Bowyer, Sonya M.; Reeder, Paul L.
2002-11-01
The new scintillator material LiBaF3:Ce offers the possibility of measuring neutron or alpha count rates and energy spectra simultaneously while measuring gamma count rates and spectra using a single detector.
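The abstract does not spell out the algorithm, but a common digital approach to this kind of neutron/alpha versus gamma discrimination is charge-comparison pulse-shape analysis, sketched below on synthetic pulses. All pulse shapes, windows, and thresholds are illustrative assumptions.

```python
import numpy as np

def tail_total_ratio(waveform, baseline_n=20, tail_start=30):
    """Charge-comparison PSD: fraction of pulse charge in the slow tail."""
    w = waveform - waveform[:baseline_n].mean()   # subtract baseline
    total = w.sum()
    return w[tail_start:].sum() / total if total > 0 else 0.0

# synthetic digitized pulses: fast (gamma-like) vs slow (alpha/neutron-like)
t = np.arange(200, dtype=float)
fast = np.exp(-t / 10.0)
slow = 0.6 * np.exp(-t / 10.0) + 0.4 * np.exp(-t / 80.0)
for name, pulse in [("fast", fast), ("slow", slow)]:
    print(name, round(tail_total_ratio(np.r_[np.zeros(20), pulse]), 3))
```

Events cluster at distinct ratio values, so a simple cut on this quantity can separate the particle types while a single detector records both spectra.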
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faby, Sebastian, E-mail: sebastian.faby@dkfz.de; Kuchenbecker, Stefan; Sawall, Stefan
2015-07-15
Purpose: To study the performance of different dual energy computed tomography (DECT) techniques, which are available today, and future multi energy CT (MECT) employing novel photon counting detectors in an image-based material decomposition task. Methods: The material decomposition performance of different energy-resolved CT acquisition techniques is assessed and compared in a simulation study of virtual non-contrast imaging and iodine quantification. The material-specific images are obtained via a statistically optimal image-based material decomposition. A projection-based maximum likelihood approach was used for comparison with the authors’ image-based method. The different dedicated dual energy CT techniques are simulated employing realistic noise models and x-ray spectra. The authors compare dual source DECT with fast kV switching DECT and the dual layer sandwich detector DECT approach. Subsequent scanning and a subtraction method are studied as well. Further, the authors benchmark future MECT with novel photon counting detectors in a dedicated DECT application against the performance of today’s DECT using a realistic model. Additionally, possible dual source concepts employing photon counting detectors are studied. Results: The DECT comparison study shows that dual source DECT has the best performance, followed by the fast kV switching technique and the sandwich detector approach. Comparing DECT with future MECT, the authors found noticeable material image quality improvements for an ideal photon counting detector; however, a realistic detector model with multiple energy bins predicts a performance on the level of dual source DECT at 100 kV/Sn 140 kV. Employing photon counting detectors in dual source concepts can improve the performance again above the level of a single realistic photon counting detector and also above the level of dual source DECT. Conclusions: Substantial differences in the performance of today’s DECT approaches were found for the application of virtual non-contrast and iodine imaging. Future MECT with realistic photon counting detectors currently can only perform comparably to dual source DECT at 100 kV/Sn 140 kV. Dual source concepts with photon counting detectors could be a solution to this problem, promising a better performance.
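The image-based decomposition at the core of such comparisons reduces, in the simplest two-material case, to solving a small linear system per pixel: attenuation at each kV is modeled as a mix of two basis materials. The basis coefficients and toy images below are illustrative only, not the authors' statistically optimal method.

```python
import numpy as np

# columns: attenuation of (water, iodine) at [low kV, high kV] - toy values
A = np.array([[1.00, 30.0],
              [1.00, 15.0]])

low_kv = np.array([[1.0, 1.6], [1.0, 4.0]])    # toy 2x2 images (arbitrary units)
high_kv = np.array([[1.0, 1.3], [1.0, 2.5]])

pixels = np.stack([low_kv.ravel(), high_kv.ravel()])  # shape (2, n_pixels)
coeffs = np.linalg.solve(A, pixels)                   # per-pixel material mix
water_map = coeffs[0].reshape(low_kv.shape)           # virtual non-contrast idea
iodine_map = coeffs[1].reshape(low_kv.shape)
print(iodine_map)   # iodine-free pixels come out ~0
```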
Assessment of cell concentration and viability of isolated hepatocytes using flow cytometry.
Wigg, Alan J; Phillips, John W; Wheatland, Loretta; Berry, Michael N
2003-06-01
The assessment of cell concentration and viability of freshly isolated hepatocyte preparations has traditionally been performed using manual counting with a Neubauer counting chamber and staining for trypan blue exclusion. Despite the simple and rapid nature of this assessment, concerns about the accuracy of these methods exist. Simple flow cytometry techniques which determine cell concentration and viability are available, yet surprisingly have not been extensively used or validated with isolated hepatocyte preparations. We therefore investigated the use of flow cytometry using TRUCOUNT Tubes and propidium iodide staining to measure the cell concentration and viability of isolated rat hepatocytes in suspension. Analysis using TRUCOUNT Tubes provided more accurate and reproducible measurement of cell concentration than manual cell counting. Hepatocyte viability, assessed using propidium iodide, correlated more closely than did trypan blue exclusion with all indicators of hepatocyte integrity and function measured (lactate dehydrogenase leakage, cytochrome P450 content, cellular ATP concentration, ammonia and lactate removal, urea and albumin synthesis). We conclude that flow cytometry techniques can be used to measure the cell concentration and viability of isolated hepatocyte preparations. The techniques are simple, rapid, and more accurate than manual cell counting and trypan blue staining, and the results are not affected by protein-containing media.
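The bead-based counting principle behind TRUCOUNT Tubes is simple ratio arithmetic: a known number of beads is spiked into a known sample volume, so the cell concentration follows from the ratio of cell events to bead events. The event counts and volumes below are made-up values for illustration.

```python
cell_events = 24_000        # events in the hepatocyte gate
bead_events = 10_000        # events in the bead gate
beads_per_tube = 50_000     # lot-specific value printed on the tube
sample_volume_ul = 50.0     # volume of cell suspension added

cells_per_ul = (cell_events / bead_events) * (beads_per_tube / sample_volume_ul)
print(f"{cells_per_ul:.0f} cells/uL")    # -> 2400 cells/uL

# viability from a propidium iodide (PI) gate: PI-negative = intact membrane
pi_negative_events = 21_600
print(f"viability ~ {100 * pi_negative_events / cell_events:.0f}%")
```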
Characterization of the 2012-044C Briz-M Upper Stage Breakup
NASA Technical Reports Server (NTRS)
Hamilton, Joseph A.; Matney, Mark
2013-01-01
The NASA breakup model prediction was close to the observed population for catalog objects. The NASA breakup model predicted a larger population than was observed for objects under 10 cm. The stare technique produces low observation counts, but is readily comparable to model predictions. Customized stare parameters (Az, El, Range) were effective to increase the opportunities for HAX to observe the debris cloud. Other techniques to increase observation count will be considered for future breakup events.
Funk, Anna L; Boisson, Sophie; Clasen, Thomas; Ensink, Jeroen H J
2013-06-01
The Kato-Katz, conventional ethyl-acetate sedimentation, and Midi Parasep(®) methods for diagnosing infection with soil-transmitted helminths were compared. The Kato-Katz technique gave the best overall diagnostic performance with the highest results in all measures (prevalence, faecal egg count, sensitivity) followed by the conventional ethyl-acetate and then the Midi Parasep(®) technique. The Kato-Katz technique showed a significantly higher faecal egg count and sensitivity for both hookworm and Trichuris as compared to the Midi Parasep(®) technique. The conventional ethyl-acetate technique produced smaller pellets and showed lower pellet mobility as compared to the Midi Parasep(®). Copyright © 2013 Elsevier B.V. All rights reserved.
Hsu, Guo-Liang; Tang, Jung-Chang; Hwang, Wu-Yuin
2014-08-01
The one-more-than technique is an effective strategy for individuals with intellectual disabilities (ID) to use when making purchases. However, the heavy cognitive demands of money counting skills potentially limit how individuals with ID shop. This study employed a multiple-probe design across participants and settings, via the assistance of a mobile purchasing assistance system (MPAS), to assess the effectiveness of the one-more-than technique on independent purchases for items with prices beyond the participants' money counting skills. Results indicated that the techniques with the MPAS could effectively convert participants' initial money counting problems into useful advantages for successfully promoting the independent purchasing skills of three secondary school students with ID. Also noteworthy is the fact that mobile technologies could be a permanent prompt for those with ID to make purchases in their daily lives. The treatment effects could be maintained for eight weeks and generalized across three community settings. Implications for practice and future studies are provided. Copyright © 2014 Elsevier Ltd. All rights reserved.
Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M
2014-02-01
The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.
NASA Astrophysics Data System (ADS)
Akashi-Ronquest, M.; Amaudruz, P.-A.; Batygov, M.; Beltran, B.; Bodmer, M.; Boulay, M. G.; Broerman, B.; Buck, B.; Butcher, A.; Cai, B.; Caldwell, T.; Chen, M.; Chen, Y.; Cleveland, B.; Coakley, K.; Dering, K.; Duncan, F. A.; Formaggio, J. A.; Gagnon, R.; Gastler, D.; Giuliani, F.; Gold, M.; Golovko, V. V.; Gorel, P.; Graham, K.; Grace, E.; Guerrero, N.; Guiseppe, V.; Hallin, A. L.; Harvey, P.; Hearns, C.; Henning, R.; Hime, A.; Hofgartner, J.; Jaditz, S.; Jillings, C. J.; Kachulis, C.; Kearns, E.; Kelsey, J.; Klein, J. R.; Kuźniak, M.; LaTorre, A.; Lawson, I.; Li, O.; Lidgard, J. J.; Liimatainen, P.; Linden, S.; McFarlane, K.; McKinsey, D. N.; MacMullin, S.; Mastbaum, A.; Mathew, R.; McDonald, A. B.; Mei, D.-M.; Monroe, J.; Muir, A.; Nantais, C.; Nicolics, K.; Nikkel, J. A.; Noble, T.; O'Dwyer, E.; Olsen, K.; Orebi Gann, G. D.; Ouellet, C.; Palladino, K.; Pasuthip, P.; Perumpilly, G.; Pollmann, T.; Rau, P.; Retière, F.; Rielage, K.; Schnee, R.; Seibert, S.; Skensved, P.; Sonley, T.; Vázquez-Jáuregui, E.; Veloce, L.; Walding, J.; Wang, B.; Wang, J.; Ward, M.; Zhang, C.
2015-05-01
Many current and future dark matter and neutrino detectors are designed to measure scintillation light with a large array of photomultiplier tubes (PMTs). The energy resolution and particle identification capabilities of these detectors depend in part on the ability to accurately identify individual photoelectrons in PMT waveforms despite large variability in pulse amplitudes and pulse pileup. We describe a Bayesian technique that can identify the times of individual photoelectrons in a sampled PMT waveform without deconvolution, even when pileup is present. To demonstrate the technique, we apply it to the general problem of particle identification in single-phase liquid argon dark matter detectors. Using the output of the Bayesian photoelectron counting algorithm described in this paper, we construct several test statistics for rejection of backgrounds for dark matter searches in argon. Compared to simpler methods based on either observed charge or peak finding, the photoelectron counting technique improves both energy resolution and particle identification of low energy events in calibration data from the DEAP-1 detector and simulation of the larger MiniCLEAN dark matter detector.
42 CFR 493.1276 - Standard: Clinical cytogenetics.
Code of Federal Regulations, 2010 CFR
2010-10-01
... of accessioning, cell preparation, photographing or other image reproduction technique, photographic... records that document the following: (1) The media used, reactions observed, number of cells counted, number of cells karyotyped, number of chromosomes counted for each metaphase spread, and the quality of...
Validation of an automated colony counting system for group A Streptococcus.
Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R
2016-02-08
The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error-prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggests that the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria to enable a more efficient and accurate measurement of bacterial concentration in culture.
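A minimal sketch of the agreement criterion described above: per-plate percentage differences between manual and automated counts, summarized Bland-Altman style against a ±10% cut-off. The counts are invented, not study data.

```python
import numpy as np

manual = np.array([152, 98, 230, 187, 64, 301])
automated = np.array([149, 95, 238, 180, 66, 310])

means = (manual + automated) / 2.0
pct_diff = 100.0 * (automated - manual) / means

bias = pct_diff.mean()
loa = 1.96 * pct_diff.std(ddof=1)   # 95% limits of agreement
print(f"mean difference {bias:+.1f}% "
      f"(limits {bias - loa:+.1f}% .. {bias + loa:+.1f}%)")
print("within +/-10% criterion:", abs(bias) < 10)
```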
An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations
Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario
2016-01-01
Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identification of the graphlets that a vertex in an explored graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more different graphlets emerge and the time needed to find each graphlet also scales up. If it is not necessary to find each instance of each graphlet, and knowing the number of graphlets touching each node of the graph suffices, the problem is easier. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets needed, smaller graphlets are searched, as well as the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations existed only to count graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first generates the needed equations automatically. This eliminates the tedious work needed to do so manually each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering which is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
Ripple, Dean C; Montgomery, Christopher B; Hu, Zhishang
2015-02-01
Accurate counting and sizing of protein particles has been limited by discrepancies of counts obtained by different methods. To understand the bias and repeatability of techniques in common use in the biopharmaceutical community, the National Institute of Standards and Technology has conducted an interlaboratory comparison for sizing and counting subvisible particles from 1 to 25 μm. Twenty-three laboratories from industry, government, and academic institutions participated. The circulated samples consisted of a polydisperse suspension of abraded ethylene tetrafluoroethylene particles, which closely mimic the optical contrast and morphology of protein particles. For restricted data sets, agreement between data sets was reasonably good: relative standard deviations (RSDs) of approximately 25% for light obscuration counts with lower diameter limits from 1 to 5 μm, and approximately 30% for flow imaging with specified manufacturer and instrument setting. RSDs of the reported counts for unrestricted data sets were approximately 50% for both light obscuration and flow imaging. Differences between instrument manufacturers were not statistically significant for light obscuration but were significant for flow imaging. We also report a method for accounting for differences in the reported diameter for flow imaging and electrical sensing zone techniques; the method worked well for diameters greater than 15 μm. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Radionuclide counting technique for measuring wind velocity and direction
NASA Technical Reports Server (NTRS)
Singh, J. J. (Inventor)
1984-01-01
An anemometer utilizing a radionuclide counting technique for measuring both the velocity and the direction of wind is described. A pendulum consisting of a wire and a ball, with a source of radiation on the lower surface of the ball, is positioned by the wind. The detectors are located in a plane perpendicular to the undisturbed (no-wind) pendulum, on the circumference of a circle, equidistant from each other as well as from the undisturbed (no-wind) source-ball position.
Star counts and visual extinctions in dark nebulae
NASA Technical Reports Server (NTRS)
Dickman, R. L.
1978-01-01
Application of star count techniques to the determination of visual extinctions in compact, fairly high-extinction dark nebulae is discussed. Particular attention is devoted to the determination of visual extinctions for a cloud having a possibly anomalous ratio of total to selective extinction. The techniques discussed are illustrated in application at two colors to four well-known compact dust clouds or Bok globules: Barnard 92, B 133, B 134, and B 335. Minimum masses and lower limits to the central extinction of these objects are presented.
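The classical star-count relation behind such work: if cumulative counts follow log10 N(m) = a + b·m, dust that dims background stars by A magnitudes depresses the counts so that A = (1/b)·log10(N_reference/N_cloud). The slope and the counts below are assumed values for illustration, not figures from the paper.

```python
import math

def extinction_mag(n_reference, n_cloud, slope_b=0.35):
    """Extinction from star counts, assuming log10 N(m) = a + b*m."""
    return math.log10(n_reference / n_cloud) / slope_b

# e.g. 120 stars in a reference cell vs 9 in an equal-area cell on the globule
print(f"A_V ~ {extinction_mag(120, 9):.1f} mag")
```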
Night Sky Weather Monitoring System Using Fish-Eye CCD
NASA Astrophysics Data System (ADS)
Tomida, Takayuki; Saito, Yasunori; Nakamura, Ryo; Yamazaki, Katsuya
Telescope Array (TA) is an international joint experiment observing ultra-high energy cosmic rays. TA employs the fluorescence detection technique to observe cosmic rays. In this technique, the existence of cloud significantly affects the quality of the data; therefore, cloud monitoring provides important information. We are developing two new methods for evaluating night sky weather with pictures taken by a charge-coupled device (CCD) camera. One evaluates the amount of cloud from pixel brightness. The other counts the number of stars with a contour detection technique. The results of these methods show clear correlation, and we conclude that both analyses are reasonable methods for weather monitoring. We discuss the reliability of the star counting method.
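As a rough illustration of the star-counting method (not the authors' implementation), the sketch below thresholds a night-sky frame and counts small bright contours with OpenCV; the filename, threshold, and size limits are assumptions, and a real pipeline would add dark subtraction and moon masking.

```python
import cv2

frame = cv2.imread("allsky.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frame

blur = cv2.GaussianBlur(frame, (3, 3), 0)                # suppress hot pixels
_, binary = cv2.threshold(blur, 60, 255, cv2.THRESH_BINARY)
contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)

# keep small point-like blobs; large blobs are likely the moon or lights
stars = [c for c in contours if cv2.contourArea(c) <= 25.0]
print(f"{len(stars)} star candidates")   # fewer stars => more cloud
```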
NASA Astrophysics Data System (ADS)
Jiang, Xiao-Pan; Zhang, Zi-Liang; Qin, Xiu-Bo; Yu, Run-Sheng; Wang, Bao-Yi
2010-12-01
Positronium time of flight spectroscopy (Ps-TOF) is an effective technique for porous material research. It has advantages over other techniques for analyzing the porosity and pore tortuosity of materials. This paper describes a design for Ps-TOF apparatus based on the Beijing intense slow positron beam, supplying a new material characterization technique. In order to improve the time resolution and increase the count rate of the apparatus, the detector system is optimized. For 3 eV o-Ps, the time broadening is 7.66 ns and the count rate is 3 cps after correction.
A Novel In-Beam Delayed Neutron Counting Technique for Characterization of Special Nuclear Materials
NASA Astrophysics Data System (ADS)
Bentoumi, G.; Rogge, R. B.; Andrews, M. T.; Corcoran, E. C.; Dimayuga, I.; Kelly, D. G.; Li, L.; Sur, B.
2016-12-01
A delayed neutron counting (DNC) system, where the sample to be analyzed remains stationary in a thermal neutron beam outside of the reactor, has been developed at the National Research Universal (NRU) reactor of the Canadian Nuclear Laboratories (CNL) at Chalk River. The new in-beam DNC is a novel approach for non-destructive characterization of special nuclear materials (SNM) that could enable identification and quantification of fissile isotopes within a large and shielded sample. Despite the orders of magnitude reduction in neutron flux, the in-beam DNC method can be as informative as the conventional in-core DNC for most cases while offering practical advantages and mitigated risk when dealing with large radioactive samples of unknown origin. This paper addresses (1) the qualification of in-beam DNC using a monochromatic thermal neutron beam in conjunction with a proven counting apparatus designed originally for in-core DNC, and (2) application of in-beam DNC to an examination of large sealed capsules containing unknown radioactive materials. Initial results showed that the in-beam DNC setup permits non-destructive analysis of bulky and gamma shielded samples. The method does not lend itself to trace analysis, and at best could only reveal the presence of a few milligrams of 235U via the assay of in-beam DNC total counts. Through analysis of DNC count rates, the technique could be used in combination with other neutron or gamma techniques to quantify isotopes present within samples.
NASA Astrophysics Data System (ADS)
Hu, Jianwei; Tobin, Stephen J.; LaFleur, Adrienne M.; Menlove, Howard O.; Swinhoe, Martyn T.
2013-11-01
Self-Interrogation Neutron Resonance Densitometry (SINRD) is one of several nondestructive assay (NDA) techniques being integrated into systems to measure spent fuel as part of the Next Generation Safeguards Initiative (NGSI) Spent Fuel Project. The NGSI Spent Fuel Project is sponsored by the US Department of Energy's National Nuclear Security Administration to measure plutonium in, and detect diversion of fuel pins from, spent nuclear fuel assemblies. SINRD shows promising capability in determining the 239Pu and 235U content in spent fuel. SINRD is a relatively low-cost and lightweight instrument, and it is easy to implement in the field. The technique makes use of the passive neutron source existing in a spent fuel assembly, and it uses ratios between the count rates collected in fission chambers that are covered with different absorbing materials. These ratios are correlated to key attributes of the spent fuel assembly, such as the total mass of 239Pu and 235U. Using count rate ratios instead of absolute count rates makes SINRD less vulnerable to systematic uncertainties. Building upon the previous research, this work focuses on the underlying physics of the SINRD technique: quantifying the individual impacts on the count rate ratios of a few important nuclides using the perturbation method; examining new correlations between count rate ratio and mass quantities based on the results of the perturbation study; quantifying the impacts on the energy windows of the filtering materials that cover the fission chambers by tallying the neutron spectra before and after the neutrons go through the filters; and identifying the most important nuclides that cause cooling-time variations in the count rate ratios. The results of these studies show that 235U content has a major impact on the SINRD signal in addition to the 239Pu content. Plutonium-241 and 241Am are the two main nuclides responsible for the variation in the count rate ratio with cooling time. In short, this work provides insights into some of the main factors that affect the performance of SINRD, and it should help improve the hardware design and the algorithm used to interpret the signal for the SINRD technique. In addition, the modeling and simulation techniques used in this work can be easily adopted for analysis of other NDA systems, especially when complex systems like spent nuclear fuel are involved. These studies were conducted at Los Alamos National Laboratory.
Radioactivities of Long Duration Exposure Facility (LDEF) materials: Baggage and bonanzas
NASA Technical Reports Server (NTRS)
Smith, Alan R.; Hurley, Donna L.
1991-01-01
Radioactivities in materials onboard the returned Long Duration Exposure Facility (LDEF) satellite were studied by a variety of techniques. Among the most powerful is low background Ge semiconductor detector gamma ray spectrometry. The observed radioactivities are of two origins: radionuclides produced by nuclear reactions with the radiation field in orbit, and radionuclides present initially as contaminants in materials used for construction of the spacecraft and experimental assemblies. In the first category are experiment related monitor foils and tomato seeds, and such spacecraft materials as Al, stainless steel, and Ti. In the second category are Al, Be, Ti, Va, and some special glasses. Measured peak-area count rates from both categories range from a high value of about 1 count per minute down to less than 0.001 count per minute. Successful measurement of count rates toward the low end of this range can be achieved only through low background techniques, such as those used to obtain the results presented here.
Test of a mosquito eggshell isolation method and subsampling procedure.
Turner, P A; Streever, W J
1997-03-01
Production of Aedes vigilax, the common salt-marsh mosquito, can be assessed by determining eggshell densities found in soil. In this study, 14 field-collected eggshell samples were used to test a subsampling technique and compare eggshell counts obtained with a flotation method to those obtained by direct examination of sediment (DES). Relative precision of the subsampling technique was assessed by determining the minimum number of subsamples required to estimate the true mean and confidence interval of a sample at a predetermined confidence level. A regression line was fitted to cube-root transformed eggshell counts obtained from flotation and DES and found to be significant (P < 0.001, r2 = 0.97). The flotation method allowed processing of samples in about one-third of the time required by DES, but recovered an average of 44% of the eggshells present. Eggshells obtained with the flotation method can be used to predict those from DES using the following equation: DES count = [1.386 x (flotation count)0.33 - 0.01]3.
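The reported calibration can be applied directly; a minimal implementation of the published equation:

```python
def des_from_flotation(flotation_count: float) -> float:
    """Predict direct-examination (DES) count from a flotation count,
    using the cube-root regression reported above."""
    return (1.386 * flotation_count ** 0.33 - 0.01) ** 3

for n in (10, 50, 200):
    print(n, round(des_from_flotation(n)))
```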
Conversion from Engineering Units to Telemetry Counts on Dryden Flight Simulators
NASA Technical Reports Server (NTRS)
Fantini, Jay A.
1998-01-01
Dryden real-time flight simulators encompass the simulation of pulse code modulation (PCM) telemetry signals. This paper presents a new method whereby the calibration polynomial (from first to sixth order), representing the conversion from counts to engineering units (EU), is numerically inverted in real time. The result is less than one-count error for valid EU inputs. The Newton-Raphson method is used to numerically invert the polynomial. A reverse linear interpolation between the EU limits is used to obtain an initial value for the desired telemetry count. The method presented here is not new. What is new is how classical numerical techniques are optimized to take advantage of modern computer power to perform the desired calculations in real time. This technique makes the method simple to understand and implement. There are no interpolation tables to store in memory as in traditional methods. The NASA F-15 simulation converts and transmits over 1000 parameters at 80 times/sec. This paper presents algorithm development, FORTRAN code, and performance results.
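A minimal sketch of the described inversion, with a hypothetical second-order calibration: evaluate EU = p(counts), seed the telemetry count by reverse linear interpolation between the EU limits, then refine with Newton-Raphson. The coefficients and count range below are illustrative, not Dryden calibration values.

```python
def p(c, coeffs):                        # EU = p(counts)
    return sum(a * c ** i for i, a in enumerate(coeffs))

def dp(c, coeffs):                       # dEU/dcounts
    return sum(i * a * c ** (i - 1) for i, a in enumerate(coeffs) if i > 0)

def eu_to_counts(eu, coeffs, c_min=0.0, c_max=1023.0, iters=8):
    eu_min, eu_max = p(c_min, coeffs), p(c_max, coeffs)
    # initial guess: reverse linear interpolation between the EU limits
    c = c_min + (eu - eu_min) * (c_max - c_min) / (eu_max - eu_min)
    for _ in range(iters):               # Newton-Raphson refinement
        c -= (p(c, coeffs) - eu) / dp(c, coeffs)
    return round(c)

coeffs = [-5.0, 0.02, 3.0e-6]            # mildly nonlinear calibration (made up)
counts = eu_to_counts(12.5, coeffs)
print(counts, p(counts, coeffs))         # count value reproducing ~12.5 EU
```

Because the polynomial is monotonic over the valid count range, a handful of iterations drives the residual below one count, matching the error bound quoted above.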
NASA Astrophysics Data System (ADS)
Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher
2018-05-01
Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology, and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.
Fast distributed large-pixel-count hologram computation using a GPU cluster.
Pan, Yuechao; Xu, Xuewu; Liang, Xinan
2013-09-10
Large-pixel-count holograms are one essential part for big size holographic three-dimensional (3D) display, but the generation of such holograms is computationally demanding. In order to address this issue, we have built a graphics processing unit (GPU) cluster with 32.5 Tflop/s computing power and implemented distributed hologram computation on it with speed improvement techniques, such as shared memory on GPU, GPU level adaptive load balancing, and node level load distribution. Using these speed improvement techniques on the GPU cluster, we have achieved 71.4 times computation speed increase for 186M-pixel holograms. Furthermore, we have used the approaches of diffraction limits and subdivision of holograms to overcome the GPU memory limit in computing large-pixel-count holograms. 745M-pixel and 1.80G-pixel holograms were computed in 343 and 3326 s, respectively, for more than 2 million object points with RGB colors. Color 3D objects with 1.02M points were successfully reconstructed from 186M-pixel hologram computed in 8.82 s with all the above three speed improvement techniques. It is shown that distributed hologram computation using a GPU cluster is a promising approach to increase the computation speed of large-pixel-count holograms for large size holographic display.
Visual counts as an index of White-Tailed Prairie Dog density
Menkens, George E.; Biggins, Dean E.; Anderson, Stanley H.
1990-01-01
Black-footed ferrets (Mustela nigripes) depend on prairie dogs (Cynomys spp.) for food and shelter and were historically restricted to prairie dog towns (Anderson et al. 1986). Because ferrets and prairie dogs are closely associated, successful ferret management and conservation depend on successful prairie dog management. A critical component of any management program for ferrets will be monitoring prairie dog population dynamics on towns containing ferrets or on towns proposed as ferret reintroduction sites. Three techniques for estimating prairie dog population size and density are counts of plugged and reopened burrows (Tietjen and Matschke 1982), mark-recapture (Otis et al. 1978; Seber 1982, 1986; Menkens and Anderson 1989), and visual counts (Fagerstone and Biggins 1986, Knowles 1986). The technique of plugging burrows and counting the number reopened by prairie dogs is too time and labor intensive for population evaluation on a large number of towns or over large areas. Total burrow counts are not correlated with white-tailed prairie dog (C. leucurus) densities and thus cannot be used for population evaluation (Menkens et al. 1988). Mark-recapture requires trapping that is expensive and time and labor intensive. Monitoring a large number of prairie dog populations using mark-recapture would be difficult. Alternatively, a large number of populations could be monitored in short periods of time using the visual count technique (Fagerstone and Biggins 1986, Knowles 1986). However, the accuracy of visual counts has been evaluated in only a few locations; thus, it is not known whether the relationship between counts and prairie dog density is consistent throughout the prairie dog's range. Our objective was to evaluate the potential of using visual counts as a rapid means of estimating white-tailed prairie dog density in prairie dog towns throughout Wyoming. We studied 18 white-tailed prairie dog towns in 4 white-tailed prairie dog complexes in Wyoming near Laramie (105°40'W, 41°20'N, 3 grids), Pathfinder Reservoir (106°55'W, 42°30'N, 6 grids), Shirley Basin (106°10'W, 42°20'N, 6 grids), and Meeteetse (108°10'W, 44°10'N, 3 grids). All towns were dominated by grasses, forbs, and shrubs (details in Collins and Lichvar 1986). Topography of towns ranged from flat to gently rolling hills.
A radionuclide counting technique for measuring wind velocity. [drag force anemometers
NASA Technical Reports Server (NTRS)
Singh, J. J.; Khandelwal, G. S.; Mall, G. H.
1981-01-01
A technique for measuring wind velocities of meteorological interest is described. It is based on inverse-square-law variation of the counting rates as the radioactive source-to-counter distance is changed by wind drag on the source ball. Results of a feasibility study using a weak bismuth 207 radiation source and three Geiger-Muller radiation counters are reported. The use of the technique is not restricted to Martian or Mars-like environments. A description of the apparatus, typical results, and frequency response characteristics are included. A discussion of a double-pendulum arrangement is presented. Measurements reported herein indicate that the proposed technique may be suitable for measuring wind speeds up to 100 m/sec, which are either steady or whose rates of fluctuation are less than 1 kHz.
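The inverse-square relationship at the heart of the method lets the source-to-counter distance be recovered from the count-rate change relative to a no-wind calibration; converting that displacement into wind speed would then use the pendulum's drag calibration. The calibration numbers below are assumptions, not values from the feasibility study.

```python
import math

def source_distance_cm(rate_cps, r0_cm=10.0, rate0_cps=400.0):
    """Distance from count rate, assuming C ~ k / r^2 with a no-wind
    calibration point (rate0 at r0). Values are illustrative."""
    return r0_cm * math.sqrt(rate0_cps / rate_cps)

for rate in (400.0, 256.0, 100.0):
    print(f"{rate:5.0f} cps -> r = {source_distance_cm(rate):5.1f} cm")
```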
Effects of the frame acquisition rate on the sensitivity of gastro-oesophageal reflux scintigraphy
Codreanu, I; Chamroonrat, W; Edwards, K
2013-01-01
Objective: To compare the sensitivity of gastro-oesophageal reflux (GOR) scintigraphy at 5-s and 60-s frame acquisition rates. Methods: GOR scintigraphy of 50 subjects (1 month–20 years old, mean 42 months) were analysed concurrently using 5-s and 60-s acquisition frames. Reflux episodes were graded as low if activity was detected in the distal half of the oesophagus and high if activity was detected in its upper half or in the oral cavity. For comparison purposes, detected GOR in any number of 5-s frames corresponding to one 60-s frame was counted as one episode. Results: A total of 679 episodes of GOR to the upper oesophagus were counted using a 5-s acquisition technique. Only 183 of such episodes were detected on 60-s acquisition images. To the lower oesophagus, a total of 1749 GOR episodes were detected using a 5-s acquisition technique and only 1045 episodes using 60-s acquisition frames (these also included the high-level GOR on 5-s frames counted as low level on 60-s acquisition frames). 10 patients had high-level GOR episodes that were detected only using a 5-s acquisition technique, leading to a different diagnosis in these patients. No correlation between the number of reflux episodes and the gastric emptying rates was noted. Conclusion: The 5-s frame acquisition technique is more sensitive than the 60-s frame acquisition technique for detecting both high- and low-level GOR. Advances in knowledge: Brief GOR episodes with a relatively low number of radioactive counts are frequently indistinguishable from intense background activity on 60-s acquisition frames. PMID:23520226
Photographic techniques for characterizing streambed particle sizes
Whitman, Matthew S.; Moran, Edward H.; Ourso, Robert T.
2003-01-01
We developed photographic techniques to characterize coarse (>2-mm) and fine (≤2-mm) streambed particle sizes in 12 streams in Anchorage, Alaska. Results were compared with current sampling techniques to assess which provided greater sampling efficiency and accuracy. The streams sampled were wadeable and contained gravel—cobble streambeds. Gradients ranged from about 5% at the upstream sites to about 0.25% at the downstream sites. Mean particle sizes and size-frequency distributions resulting from digitized photographs differed significantly from those resulting from Wolman pebble counts for five sites in the analysis. Wolman counts were biased toward selecting larger particles. Photographic analysis also yielded a greater number of measured particles (mean = 989) than did the Wolman counts (mean = 328). Stream embeddedness ratings assigned from field and photographic observations were significantly different at 5 of the 12 sites, although both types of ratings showed a positive relationship with digitized surface fines. Visual estimates of embeddedness and digitized surface fines may both be useful indicators of benthic conditions, but digitizing surface fines produces quantitative rather than qualitative data. Benefits of the photographic techniques include reduced field time, minimal streambed disturbance, convenience of postfield processing, easy sample archiving, and improved accuracy and replication potential.
Comparison of line transects and point counts for monitoring spring migration in forested wetlands
Wilson, R.R.; Twedt, D.J.; Elliott, A.B.
2000-01-01
We compared the efficacy of 400-m line transects and sets of three point counts at detecting avian richness and abundance in bottomland hardwood forests and intensively managed cottonwood (Populus deltoides) plantations within the Mississippi Alluvial Valley. We detected more species and more individuals on line transects than on three point counts during 218 paired surveys conducted between 24 March and 3 June, 1996 and 1997. Line transects also yielded more birds per unit of time, even though point counts yielded higher estimates of relative bird density. In structurally more-complex bottomland hardwood forests, we detected more species and individuals on line transects, but in more-open cottonwood plantations, transects surpassed point counts only at detecting species within 50 m of the observer. Species richness and total abundance of Nearctic-Neotropical migrants and temperate migrants were greater on line transects within bottomland hardwood forests. Within cottonwood plantations, however, only species richness of Nearctic-Neotropical migrants and total abundance of temperate migrants were greater on line transects. Because we compared survey techniques using the same observer, within the same forest stand on a given day, we assumed that the technique yielding greater estimates of avian species richness and total abundance per unit of effort is superior. Thus, for monitoring migration within hardwood forests of the Mississippi Alluvial Valley, we recommend using line transects instead of point counts.
ERIC Educational Resources Information Center
Falter, H. Ellie
2011-01-01
How do teachers teach students to count rhythms? Teachers can choose from various techniques. Younger students may learn themed words (such as "pea," "carrot," or "avocado"), specific rhythm syllables (such as "ta" and "ti-ti"), or some other counting method to learn notation and internalize rhythms. As students grow musically, and especially when…
ERIC Educational Resources Information Center
Magnus, Brooke E.; Thissen, David
2017-01-01
Questionnaires that include items eliciting count responses are becoming increasingly common in psychology. This study proposes methodological techniques to overcome some of the challenges associated with analyzing multivariate item response data that exhibit zero inflation, maximum inflation, and heaping at preferred digits. The modeling…
Data indexing techniques for the EUVE all-sky survey
NASA Technical Reports Server (NTRS)
Lewis, J.; Saba, V.; Dobson, C.
1992-01-01
This poster describes techniques developed for manipulating large full-sky data sets for the Extreme Ultraviolet Explorer project. The authors have adapted the quadrilateralized spherical cube indexing algorithm to allow efficient storage and processing of several types of large data sets, such as full-sky maps of photon counts, exposure time, and count rates. A variation of this scheme is used to index sparser data such as individual photon events and viewing times for selected areas of the sky, which are eventually used to create EUVE source catalogs.
Dynamic time-correlated single-photon counting laser ranging
NASA Astrophysics Data System (ADS)
Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang
2018-03-01
We demonstrate a photon counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10^7 cps, which makes distance measurement possible even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily extract the signal when fast-moving targets are submerged in the strong background. We propose a dynamic TCSPC method for measuring fast-moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected with this method at an echo rate of 20% against background counts of more than 1.2×10^7 cps.
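A minimal sketch of the dynamic-window idea: instead of a fixed range gate, keep detection events whose time of flight tracks the predicted, moving target range. The target range, velocity, window width, and rates below are illustrative assumptions, not the experimental parameters.

```python
import numpy as np

C = 3.0e8   # speed of light, m/s

def in_dynamic_window(t_emit, t_detect, r0_m, v_mps, half_window_s=20e-9):
    """True for events whose time of flight matches a drifting target range."""
    predicted_tof = 2.0 * (r0_m + v_mps * t_emit) / C
    return np.abs((t_detect - t_emit) - predicted_tof) < half_window_s

rng = np.random.default_rng(1)
t_emit = np.sort(rng.uniform(0.0, 1.0, 10_000))          # pulse times over 1 s
tof_true = 2.0 * (150e3 + 5e3 * t_emit) / C              # 150 km target at 5 km/s
noise = rng.uniform(0.0, 1e-3, t_emit.size)              # background arrivals
is_echo = rng.random(t_emit.size) < 0.2                  # 20% echo rate
t_detect = t_emit + np.where(is_echo, tof_true, noise)

kept = in_dynamic_window(t_emit, t_detect, r0_m=150e3, v_mps=5e3)
print(f"kept {kept.sum()} events, "
      f"{100 * np.mean(is_echo[kept]):.0f}% are true echoes")
```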
Poka-yoke process controller: designed for individuals with cognitive impairments.
Erlandson, R F; Sant, D
1998-01-01
Poka-yoke is a Japanese term meaning "error proofing." Poka-yoke techniques were developed to achieve zero defects in manufacturing and assembly processes. The application of these techniques tends to reduce both the physical and cognitive demands of tasks and thereby make them more accessible. Poka-yoke interventions create a dialogue between the worker and the process, and this dialogue provides the feedback necessary for workers to prevent errors. For individuals with cognitive impairments, weighing and counting tasks can be difficult or impossible. Interventions that provide sufficient feedback to workers without disabilities tend to be too subtle for workers with cognitive impairments; hence, the feedback must be enhanced. The Poka-Yoke Controller (PYC) was designed to assist individuals with counting and weighing tasks. The PYC interfaces to an Ohaus CT6000 digital scale for weighing parts and for counting parts by weight. It also interfaces to sensors and switches for object counting tasks. The PYC interfaces to a variety of programmable voice output devices so that voice feedback or prompting can be provided at specific points in the weighing or counting process. The PYC can also be interfaced to conveyor systems, indexed turntables, and other material handling systems for coordinated counting and material handling operations. In all of our applications to date, we have observed improved worker performance, improved process quality, and greater worker independence. These observed benefits have also significantly reduced the need for staff intervention. The process controller is described and three applications are presented: a weighing task and two counting applications.
Wedge sampling for computing clustering coefficients and triangle counts on large graphs
Seshadhri, C.; Pinar, Ali; Kolda, Tamara G.
2014-05-08
Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of such graphs. Some of the most useful graph metrics are based on triangles, such as those measuring social cohesion. Despite the importance of these triadic measures, algorithms to compute them can be extremely expensive. We discuss the method of wedge sampling. This versatile technique allows for the fast and accurate approximation of various types of clustering coefficients and triangle counts. Furthermore, these techniques are extensible to counting directed triangles in digraphs. Our methods come with provable and practical time-approximation tradeoffs for all computations. We provide extensive results that show our methods are orders of magnitude faster than the state of the art, while providing nearly the accuracy of full enumeration.
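A minimal sketch of uniform wedge sampling: pick wedges (length-2 paths) with probability proportional to how many are centered at each vertex, and record the fraction closed by an edge; that fraction estimates the global clustering coefficient 3T/W. The toy graph and sample count are illustrative.

```python
import random
from collections import defaultdict

def clustering_by_wedge_sampling(edges, samples=100_000, seed=7):
    adj = defaultdict(set)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
    centers = [v for v in adj if len(adj[v]) >= 2]
    # number of wedges centered at v is d(d-1)/2; sample centers accordingly
    weights = [len(adj[v]) * (len(adj[v]) - 1) // 2 for v in centers]
    rng = random.Random(seed)
    closed = 0
    for _ in range(samples):
        v = rng.choices(centers, weights=weights)[0]
        a, b = rng.sample(sorted(adj[v]), 2)   # two distinct neighbors of v
        closed += b in adj[a]                  # wedge is closed -> triangle
    return closed / samples

edges = [(0, 1), (1, 2), (0, 2), (2, 3), (3, 4), (2, 4), (4, 5)]
print(round(clustering_by_wedge_sampling(edges), 3))   # exact value is 0.5
```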
Computer measurement of particle sizes in electron microscope images
NASA Technical Reports Server (NTRS)
Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.
1976-01-01
Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.
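A minimal modern sketch of the same idea: threshold the micrograph, label connected regions, and convert region areas to equivalent diameters. The synthetic image and the nm-per-pixel scale are illustrative stand-ins for real micrographs.

```python
import numpy as np
from scipy import ndimage

rng = np.random.default_rng(3)
image = rng.normal(100.0, 5.0, (256, 256))        # bright background
for _ in range(40):                               # paste dark "particles"
    r, c = rng.integers(10, 246, 2)
    yy, xx = np.ogrid[-5:6, -5:6]
    image[r - 5:r + 6, c - 5:c + 6][yy**2 + xx**2 <= 25] -= 60.0

binary = image < 80.0                             # particles are darker
labels, n = ndimage.label(binary)                 # connected-component labels
areas_px = ndimage.sum(binary, labels, index=range(1, n + 1))

nm_per_px = 2.0                                   # assumed magnification scale
diameters_nm = 2.0 * np.sqrt(areas_px / np.pi) * nm_per_px
print(n, "particles, mean diameter", round(diameters_nm.mean(), 1), "nm")
```

Note that touching particles merge into one label here; a practical pipeline would add a splitting step such as watershed segmentation, which manual counting does implicitly.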
Minimum Detectable Activity for Tomographic Gamma Scanning System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkataraman, Ram; Smith, Susan; Kirkpatrick, J. M.
2015-01-01
For any radiation measurement system, it is useful to explore and establish the detection limits and a minimum detectable activity (MDA) for the radionuclides of interest, even if the system is to be used at far higher values. The MDA serves as an important figure of merit, and often a system is optimized and configured so that it can meet the MDA requirements of a measurement campaign. The non-destructive assay (NDA) systems based on gamma ray analysis are no exception, and well established conventions, such as the Currie method, exist for estimating the detection limits and the MDA. However, the Tomographic Gamma Scanning (TGS) technique poses some challenges for the estimation of detection limits and MDAs. The TGS combines high resolution gamma ray spectrometry (HRGS) with low spatial resolution image reconstruction techniques. In non-imaging gamma ray based NDA techniques, measured counts in a full energy peak can be used to estimate the activity of a radionuclide, independently of other counting trials. However, in the case of the TGS each “view” is a full spectral grab (each a counting trial), and each scan consists of 150 spectral grabs in the transmission and emission scans per vertical layer of the item. The set of views in a complete scan is then used to solve for the radionuclide activities on a voxel by voxel basis, over 16 layers of a 10x10 voxel grid. Thus, the raw count data are not independent trials any more, but rather constitute input to a matrix solution for the emission image values at the various locations inside the item volume used in the reconstruction. So, the validity of the methods used to estimate MDA for an imaging technique such as TGS warrants close scrutiny, because the pair-counting concept of Currie is not directly applicable. One can also raise questions as to whether the TGS, along with other image reconstruction techniques which heavily intertwine data, is a suitable method if one expects to measure samples whose activities are at or just above MDA levels. The paper examines methods used to estimate MDAs for a TGS system, and explores possible solutions that can be rigorously defended.
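As context for the Currie convention the abstract contrasts with TGS, the standard single-measurement formulas are easy to state in code. This is a minimal sketch of the conventional (non-imaging) case with illustrative parameter values; it is not the TGS-specific estimator the paper examines.

```python
import math

def currie_mda(background_counts, count_time_s, efficiency, branching_ratio):
    """Currie critical level L_C and detection limit L_D (counts), plus the
    MDA (Bq), for a simple single-region gamma counting measurement."""
    l_c = 2.33 * math.sqrt(background_counts)         # decision threshold
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)  # detection limit
    mda = l_d / (efficiency * branching_ratio * count_time_s)
    return l_c, l_d, mda

# 400 background counts in 600 s, 5% efficiency, 85% branching ratio (assumed).
print(currie_mda(background_counts=400, count_time_s=600,
                 efficiency=0.05, branching_ratio=0.85))
```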
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.
2015-01-19
Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Our results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
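A toy version of the grid-based Bayesian comparison described above can be written in a few lines: Poisson log-likelihoods of the measured counts are evaluated for each candidate source position. The 1/r^2 response model, rates, and names are illustrative assumptions, not the authors' detector model.

```python
import numpy as np
from scipy.stats import poisson

def localize(detector_xy, measured_counts, grid_xy, strength, background):
    """Return the Poisson log-likelihood for each candidate source position."""
    log_like = np.zeros(len(grid_xy))
    for i, src in enumerate(grid_xy):
        d2 = np.sum((detector_xy - src) ** 2, axis=1) + 1.0  # avoid divide-by-0
        expected = background + strength / d2                # 1/r^2 falloff
        log_like[i] = poisson.logpmf(measured_counts, expected).sum()
    return log_like

# Usage: 5 measurements along a straight flight path, 3 candidate positions.
path = np.array([[0.0, 0], [10, 0], [20, 0], [30, 0], [40, 0]])
counts = np.array([12, 18, 40, 19, 11])
grid = np.array([[20.0, 5], [0.0, 5], [40.0, 5]])
ll = localize(path, counts, grid, strength=500.0, background=10.0)
print(grid[np.argmax(ll)])  # most likely candidate position
```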
A Method of Recording and Predicting the Pollen Count.
ERIC Educational Resources Information Center
Buck, M.
1985-01-01
A hair dryer, plastic funnel, and microscope slide can be used for predicting pollen counts on a day-to-day basis. Materials, methods for assembly, collection technique, meteorological influences, and daily patterns are discussed. Data collected using the apparatus suggest that airborne grass products other than pollen also affect hay fever…
Carbon fiber counting. [aircraft structures
NASA Technical Reports Server (NTRS)
Pride, R. A.
1980-01-01
A method was developed for characterizing the number and lengths of carbon fibers accidentally released by the burning of composite portions of civil aircraft structure in a jet fuel fire after an accident. Representative samplings of carbon fibers collected on transparent sticky film were counted from photographic enlargements with a computer aided technique which also provided fiber lengths.
Surpassing Humans and Computers with JellyBean: Crowd-Vision-Hybrid Counting Algorithms.
Sarma, Akash Das; Jain, Ayush; Nandi, Arnab; Parameswaran, Aditya; Widom, Jennifer
2015-11-01
Counting objects is a fundamental image processing primitive, and has many scientific, health, surveillance, security, and military applications. Existing supervised computer vision techniques typically require large quantities of labeled training data, and even with that, fail to return accurate results in all but the most stylized settings. Using vanilla crowd-sourcing, on the other hand, can lead to significant errors, especially on images with many objects. In this paper, we present our JellyBean suite of algorithms, which combines the best of crowds and computer vision to count objects in images, and uses judicious decomposition of images to greatly improve accuracy at low cost. Our algorithms have several desirable properties: (i) they are theoretically optimal or near-optimal, in that they ask as few questions as possible to humans (under certain intuitively reasonable assumptions that we justify in our paper experimentally); (ii) they operate under stand-alone or hybrid modes, in that they can either work independent of computer vision algorithms, or work in concert with them, depending on whether the computer vision techniques are available or useful for the given setting; (iii) they perform very well in practice, returning accurate counts on images that no individual worker or computer vision algorithm can count correctly, while not incurring a high cost.
Grabow, W O; du Preez, M
1979-01-01
Total coliform counts obtained by means of standard membrane filtration techniques, using MacConkey agar, m-Endo LES agar, Teepol agar, and pads saturated with Teepol broth as growth media, were compared. Various combinations of these media were used in tests on 490 samples of river water and city wastewater after different stages of conventional purification and reclamation processes, including lime treatment, filtration, active carbon treatment, ozonation, and chlorination. Endo agar yielded the highest average counts for all these samples. Teepol agar generally had higher counts than Teepol broth, whereas MacConkey agar had the lowest average counts. Identification of 871 positive isolates showed that Aeromonas hydrophila was the species most commonly detected. Species of Escherichia, Citrobacter, Klebsiella, and Enterobacter represented 55% of isolates which conformed to the definition of total coliforms on Endo agar, 54% on Teepol agar, and 45% on MacConkey agar. Selection for species on the media differed considerably. Evaluation of these data and literature on alternative tests, including most probable number methods, indicated that the technique of choice for routine analysis of total coliform bacteria in drinking water is membrane filtration using m-Endo LES agar as growth medium without enrichment procedures or a cytochrome oxidase restriction. PMID:394678
Detection of microbial concentration in ice-cream using the impedance technique.
Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B
2008-06-15
The detection of microbial concentration, essential for safe and high quality food products, is traditionally made with the plate count technique, which is reliable but also slow and not easily realized in automatic form, as required for direct use in industrial machines. To this purpose, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can be easily realized in automatic form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered and no nutrient media have been used to dilute the samples. A measurement set-up has been realized using benchtop instruments for impedance measurements on samples whose bacteria concentration was independently measured by means of standard plate counts. The obtained results clearly indicate that impedance measurement represents a feasible and reliable technique to detect total microbial concentration in ice-cream, suitable to be implemented as an embedded system for industrial machines.
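The usual way such impedance results are turned into concentrations is a calibration line relating detection time to the log of the plate count; the sketch below shows that step with made-up data (the paper's calibration values are not reproduced here).

```python
import numpy as np

# Illustrative calibration data: impedance detection times (hours) for
# samples whose concentrations were fixed by standard plate counts.
log_cfu = np.array([3.0, 4.0, 5.0, 6.0, 7.0])         # log10 CFU/mL
detect_time_h = np.array([11.8, 9.9, 8.1, 6.0, 4.2])  # detection times

slope, intercept = np.polyfit(detect_time_h, log_cfu, 1)

def estimate_log_cfu(detection_time_h):
    """Predict log10 concentration from an impedance detection time."""
    return slope * detection_time_h + intercept

print(round(estimate_log_cfu(7.0), 2))  # concentration for a 7 h detection
```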
Subnuclear foci quantification using high-throughput 3D image cytometry
NASA Astrophysics Data System (ADS)
Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.
2015-07-01
Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the site of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification. Hence they are limited to counting a low number of foci per cell (5 foci per nucleus), as the quantification process is extremely labour intensive. Therefore we have developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house developed high throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended maxima transform based algorithm. Our results suggest that, while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
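For readers unfamiliar with the extended maxima step, a minimal 3D foci-counting sketch using scikit-image is shown below; the h parameter and the synthetic volume are illustrative assumptions, not the authors' pipeline settings.

```python
import numpy as np
from skimage.morphology import h_maxima
from skimage.measure import label

def count_foci_3d(volume, h=50):
    """Count connected regions of h-maxima (extended maxima) in a 3D volume."""
    maxima = h_maxima(volume, h)   # binary mask of maxima taller than h
    return label(maxima).max()     # number of connected foci

# Synthetic nucleus with two bright foci on a dark background.
vol = np.zeros((20, 20, 20), dtype=np.uint16)
vol[5, 5, 5] = 200
vol[14, 14, 14] = 180
print(count_foci_3d(vol))  # -> 2
```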
A New Method for Estimating Bacterial Abundances in Natural Samples using Sublimation
NASA Technical Reports Server (NTRS)
Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.
2004-01-01
We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert were heated to a temperature of 500 C for several seconds under reduced pressure. The sublimate was collected on a cold finger and the amount of adenine released from the samples then determined by high performance liquid chromatography (HPLC) with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10^5 to 10^9 E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI staining. The simplicity and robustness of the sublimation technique compared to the DAPI staining method makes this approach particularly attractive for use by spacecraft instrumentation. NASA is currently planning to send a lander to Mars in 2009 in order to assess whether or not organic compounds, especially those that might be associated with life, are present in Martian surface samples. Based on our analyses of the Atacama Desert soil samples, several million bacterial cells per gram of Martian soil should be detectable using this sublimation technique.
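The final conversion from recovered adenine to cell equivalents is simple arithmetic; the sketch below assumes an illustrative adenine-per-cell mass rather than the authors' calibration constant.

```python
ADENINE_G_PER_CELL = 5.0e-17  # assumed grams of adenine (DNA+RNA) per cell

def cell_equivalents_per_gram(adenine_grams, sample_grams):
    """Estimate E. coli cell equivalents per gram of sample."""
    return adenine_grams / ADENINE_G_PER_CELL / sample_grams

# 1 ng of adenine recovered from 1 g of soil -> ~2e7 cell equivalents/g.
print(f"{cell_equivalents_per_gram(1e-9, 1.0):.1e}")
```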
Lubow, Bruce C.; Ransom, Jason I.
2007-01-01
An aerial survey technique combining simultaneous double-count and sightability bias correction methodologies was used to estimate the population of wild horses inhabiting Adobe Town and Salt Wells Creek Herd Management Areas, Wyoming. Based on 5 surveys over 4 years, we conclude that the technique produced estimates consistent with the known number of horses removed between surveys and an annual population growth rate of 16.2 percent. Therefore, evidence from this series of surveys supports the validity of this survey method. Our results also indicate that the ability of aerial observers to see horse groups is very strongly dependent on the skill of the individual observer, the size of the horse group, and vegetation cover. It is also more modestly dependent on the ruggedness of the terrain and the position of the sun relative to the observer. We further conclude that censuses, or uncorrected raw counts, are inadequate estimates of population size for this herd. Such uncorrected counts were all undercounts in our trials, and varied in magnitude from year to year and observer to observer. As of April 2007, we estimate that the population of the Adobe Town/Salt Wells Creek complex is 906 horses with a 95 percent confidence interval ranging from 857 to 981 horses.
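A standard way to realize the simultaneous double-count idea is the Chapman-corrected Lincoln-Petersen estimator; the sketch below shows that form with invented observer tallies, and is not the authors' full sightability model (which also corrects for observer skill, group size, and cover).

```python
def chapman_estimate(n1, n2, both):
    """Two-observer double-count population estimate.

    n1, n2: groups seen by observer 1 / observer 2; both: seen by both.
    """
    return (n1 + 1) * (n2 + 1) / (both + 1) - 1

# Observer 1 saw 60 horse groups, observer 2 saw 55, and 45 were seen by both.
print(round(chapman_estimate(60, 55, 45)))  # ~73 groups present
```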
Design, development and manufacture of a breadboard radio frequency mass gauging system
NASA Technical Reports Server (NTRS)
1975-01-01
The feasibility of the RF gauging mode-counting technique was demonstrated for gauging liquid hydrogen and liquid oxygen under all attitude conditions. With LH2, it was also demonstrated under dynamic fluid conditions, in which the fluid assumes ever-changing positions within the tank, that the RF gauging technique on average provides a very good indication of mass. It is significant that the distribution of the mode count data at each fill level during dynamic LH2 and LOX orientation testing does approach a statistical normal distribution. Multiple space-diversity probes provide better coupling to the resonant modes than utilization of a single probe element. The variable sweep rate generator technique provides a more uniform mode versus time distribution for processing.
Hallas, Gary; Monis, Paul
2015-01-01
The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time and labour intensive, can vary between operators and also requires manual entry of results into laboratory information management systems, which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages:
• Improved measurement of uncertainty by using a standard and consistent counting methodology with less operator error.
• Efficiency for labour and time (reduced cost).
• Elimination of manual entry of data onto LIMS.
• Faster result reporting to customers.
Grabowski, Nils Th; Klein, Günter
2017-01-01
To increase the shelf life of edible insects, modern techniques (e.g. freeze-drying) add to the traditional methods (degutting, boiling, sun-drying or roasting). However, microorganisms become inactivated rather than being killed, and when rehydrated, many return to vegetative stadia. Crickets (Gryllus bimaculatus) and superworms (Zophobas atratus) were subjected to four different drying techniques (T1 = 10' cooking, 24 h drying at 60℃; T2 = 10' cooking, 24 h drying at 80℃; T3 = 30' cooking, 12 h drying at 80℃, and 12 h drying at 100℃; T4 = boiling T3-treated insects after five days) and analysed for total bacteria counts, Enterobacteriaceae, staphylococci, bacilli, yeast and mould counts, E. coli, salmonellae, and Listeria monocytogenes (the latter three being negative throughout). The microbial counts varied strongly, displaying species- and treatment-specific patterns. T3 was the most effective of the drying treatments tested at decreasing all counts but bacilli, for which T2 was more efficient. Still, total bacteria counts remained high (G. bimaculatus > Z. atratus). Other opportunistically pathogenic microorganisms (Bacillus thuringiensis, B. licheniformis, B. pumilis, Pseudomonas aeruginosa, and Cryptococcus neoformans) were also encountered. The tyndallisation-like T4 reduced all counts to below the detection limit, but nutrient leakage should be considered regarding food quality. In conclusion, species-specific drying procedures should be devised to ensure food safety. © The Author(s) 2016.
Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.
Klumpp, John; Brandl, Alexander
2015-03-01
A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources compared to analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or an airplane passing over a waste storage facility. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
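The advantage of time-interval data can be made concrete with a small sketch: under a Poisson model the gaps between counts are exponential, so each gap updates a log-likelihood ratio between background-only and background-plus-source rates. The rates, gaps, and threshold below are illustrative, not the authors' system parameters.

```python
import math

def log_lr_increment(gap_s, rate_bkg, rate_src):
    """Log-likelihood-ratio increment for one exponential inter-event gap."""
    return math.log(rate_src / rate_bkg) - (rate_src - rate_bkg) * gap_s

llr = 0.0
threshold = math.log(100.0)  # illustrative decision threshold
for gap in [0.11, 0.06, 0.05, 0.08, 0.04, 0.07]:  # seconds between counts
    llr += log_lr_increment(gap, rate_bkg=5.0, rate_src=12.0)
print(f"log-LR = {llr:.2f}; declare a source if it exceeds {threshold:.2f}")
```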
Tutorial on X-ray photon counting detector characterization.
Ren, Liqiang; Zheng, Bin; Liu, Hong
2018-01-01
Recent advances in photon counting detection technology have led to significant research interest in X-ray imaging. As a tutorial level review, this paper covers a wide range of aspects related to X-ray photon counting detector characterization. The tutorial begins with a detailed description of the working principle and operating modes of a pixelated X-ray photon counting detector with basic architecture and detection mechanism. Currently available methods and techniques for characterizing major aspects including energy response, noise floor, energy resolution, count rate performance (detector efficiency), and the charge sharing effect of photon counting detectors are comprehensively reviewed. Other characterization aspects such as point spread function (PSF), line spread function (LSF), contrast transfer function (CTF), modulation transfer function (MTF), noise power spectrum (NPS), detective quantum efficiency (DQE), bias voltage, radiation damage, and polarization effect are also discussed. A cadmium telluride (CdTe) pixelated photon counting detector is employed for part of the characterization demonstration and the results are presented. This review can serve as a tutorial for X-ray imaging researchers and investigators to understand, operate, characterize, and optimize photon counting detectors for a variety of applications.
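One count-rate-performance calculation that often appears in such characterizations is dead-time correction; the sketch below inverts the standard paralyzable model m = n·exp(-n·τ) by bisection, with an assumed τ rather than a value from the tutorial.

```python
import math

def observed_rate(n, tau):
    """Paralyzable dead-time model: observed rate for true rate n."""
    return n * math.exp(-n * tau)

def true_rate(m, tau):
    """Invert m = n*exp(-n*tau) on the low-rate branch by bisection."""
    lo, hi = 0.0, 1.0 / tau  # m(n) rises monotonically up to n = 1/tau
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if observed_rate(mid, tau) < m:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2.0

tau = 100e-9  # assumed 100 ns dead time
print(f"{true_rate(1.0e6, tau):.3e} cps true for 1.0e6 cps observed")
```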
A photographic technique for estimating egg density of the white pine weevil, Pissodes strobi (Peck)
Roger T. Zerillo
1975-01-01
Compares a photographic technique with visual and dissection techniques for estimating egg density of the white pine weevil, Pissodes strobi (Peck). The relatively high correlations (.67 and .79) between counts from photographs and those obtained by dissection indicate that the non-destructive photographic technique could be a useful tool for...
Laser-induced photo emission detection: data acquisition based on light intensity counting
NASA Astrophysics Data System (ADS)
Yulianto, N.; Yudasari, N.; Putri, K. Y.
2017-04-01
Laser Induced Breakdown Detection (LIBD) is one of the quantification techniques for colloids. There are two ways of detection in LIBD: optical detection and acoustic detection. LIBD is based on the detection of plasma emission due to the interaction between a particle and the laser beam. In this research, the change in light intensity during plasma formation was detected by a photodiode sensor. A photo emission data acquisition system was built to collect these signals and transform them into digital counts. The real-time system used a National Instruments DAQ 6009 data acquisition device and LabVIEW software. The system has been tested on distilled water and tap water samples. The results showed 99.8% accuracy for the counting technique in comparison to acoustic detection at a sample rate of 10 Hz; thus the acquisition system can be applied as an alternative method to the existing LIBD acquisition system.
NASA Technical Reports Server (NTRS)
Jenniskens, Peter; Crawford, Chris; Butow, Steven J.; Nugent, David; Koop, Mike; Holman, David; Houston, Jane; Jobse, Klaas; Kronk, Gary
2000-01-01
A new hybrid technique of visual and video meteor observations was developed to provide high precision near real-time flux measurements for satellite operators from airborne platforms. A total of 33,000 Leonids, recorded on video during the 1999 Leonid storm, were watched by a team of visual observers using a video head display and an automatic counting tool. The counts reveal that the activity profile of the Leonid storm is a Lorentz profile. By assuming a radial profile for the dust trail that is also a Lorentzian, we make predictions for future encounters. If that assumption is correct, we passed 0.0003 AU deeper into the 1899 trailet than expected during the storm of 1999, and future encounters with the 1866 trailet will be less intense than predicted elsewhere.
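Recovering the Lorentz profile from binned counts is a routine curve fit; the sketch below does it on simulated data with scipy, with all parameter values illustrative.

```python
import numpy as np
from scipy.optimize import curve_fit

def lorentz(t, peak, t0, fwhm):
    """Lorentz (Cauchy-shaped) activity profile."""
    return peak / (1.0 + (2.0 * (t - t0) / fwhm) ** 2)

t = np.linspace(-2, 2, 41)                       # hours from predicted peak
true = lorentz(t, peak=3000, t0=0.1, fwhm=0.75)  # illustrative profile
counts = np.random.default_rng(1).poisson(true)  # simulated binned counts

params, _ = curve_fit(lorentz, t, counts, p0=(2000, 0.0, 1.0))
print("fitted peak rate, peak time, FWHM:", np.round(params, 2))
```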
Methods of detecting and counting raptors: A review
Fuller, M.R.; Mosher, J.A.; Ralph, C. John; Scott, J. Michael
1981-01-01
Most raptors are wide-ranging, secretive, and occur at relatively low densities. These factors, in conjunction with the nocturnal activity of owls, cause the counting of raptors by most standard census and survey efforts to be very time consuming and expensive. This paper reviews the most common methods of detecting and counting raptors. It is hoped that it will be of use to the ever-increasing number of biologists, land-use planners, and managers who must determine the occurrence, density, or population dynamics of raptors. Road counts of fixed station or continuous transect design are often used to sample large areas. Detection of spontaneous or elicited vocalizations, especially those of owls, provides a means of detecting and estimating raptor numbers. Searches for nests are accomplished from foot surveys, observations from automobiles and boats, or from aircraft when nest structures are conspicuous (e.g., Osprey). Knowledge of nest habitat, historic records, and inquiries of local residents are useful for locating nests. Often several of these techniques are combined to help find nest sites. Aerial searches have also been used to locate or count large raptors (e.g., eagles), or those that may be conspicuous in open habitats (e.g., tundra). Counts of birds entering or leaving nest colonies or colonial roosts have been attempted on a limited basis. Results from Christmas Bird Counts have provided an index of the abundance of some species. Trapping and banding generally has proven to be an inefficient method of detecting raptors or estimating their populations. Concentrations of migrants at strategically located points around the world afford the best opportunity to count many raptors in a relatively short period of time, but the influence of many unquantified variables has inhibited extensive interpretation of these counts. Few data exist to demonstrate the effectiveness of these methods. We believe more research on sampling techniques, rather than complete counts or intensive searches, will provide adequate yet affordable estimates of raptor numbers in addition to providing methods for detecting the presence of raptors on areas of interest to researchers and managers.
Selective photon counter for digital x-ray mammography tomosynthesis
NASA Astrophysics Data System (ADS)
Goldan, Amir H.; Karim, Karim S.; Rowlands, J. A.
2006-03-01
Photon counting is an emerging detection technique that is promising for mammography tomosynthesis imagers. In photon counting systems, the value of each image pixel is equal to the number of photons that interact with the detector. In this research, we introduce the design and implementation of a low noise, novel selective photon counting pixel for digital mammography tomosynthesis in crystalline silicon CMOS (complementary metal oxide semiconductor) 0.18 micron technology. The design comprises a low noise charge amplifier (CA), two low offset voltage comparators, a decision-making unit (DMU), a mode selector, and a pseudo-random counter. Theoretical calculations and simulation results of linearity, gain, and noise of the photon counting pixel are presented.
A land manager's guide to point counts of birds in the Southeast
Hamel, P.B.; Smith, W.P.; Twedt, D.J.; Woehr, J.R.; Morris, E.; Hamilton, R.B.; Cooper, R.J.
1996-01-01
Current widespread concern for the status of neotropical migratory birds has sparked interest in techniques for inventorying and monitoring populations of these and other birds in southeastern forest habitats. The present guide gives detailed instructions for conducting point counts of birds. It further presents a detailed methodology for the design and conduct of inventorial and monitoring surveys based on point counts, including discussion of sample size determination, distribution of counts among habitats, cooperation among neighboring land managers, vegetation sampling, standard data format, and other topics. Appendices provide additional information, making this guide a stand-alone text for managers interested in developing inventories of bird populations on their lands.
Dermatoglyphics: A Diagnostic Aid?
Fuller, I. C.
1973-01-01
Dermatoglyphics of patients suffering from diabetes, schizophrenia, duodenal ulcer, asthma, and various cancers have been contrasted and significant differences in the digital ridge counts, maximum atd angles, and distal palmar loop ridge counts have been found. A discriminant analysis of the digital ridge counts was performed and the function was used to attempt differential diagnosis between these conditions on dermatoglyphic evidence alone. This diagnostic trial failed, and possible reasons for its failure are discussed. Attention is drawn to the possibility that prognostic implications of dermatoglyphics might be relevant to screening techniques. PMID:4714584
Automated Video-Based Traffic Count Analysis.
DOT National Transportation Integrated Search
2016-01-01
The goal of this effort has been to develop techniques that could be applied to the : detection and tracking of vehicles in overhead footage of intersections. To that end we : have developed and published techniques for vehicle tracking based on dete...
Intraosseous repair of the inferior alveolar nerve in rats: an experimental model.
Curtis, N J; Trickett, R I; Owen, E; Lanzetta, M
1998-08-01
A reliable method of exposure of the inferior alveolar nerve in Wistar rats has been developed, to allow intraosseous repair with two microsurgical techniques under halothane inhalational anaesthesia. The microsuturing technique involves anastomosis with 10-0 nylon sutures; a laser-weld technique uses an albumin-based solder containing indocyanine green, plus an infrared (810 nm wavelength) diode laser. Seven animals had left inferior alveolar nerve repairs performed with the microsuture and laser-weld techniques. Controls were provided by unoperated nerves in the repaired cases. Histochemical analysis was performed utilizing neuron counts and horseradish peroxidase tracer (HRP) uptake in the mandibular division of the trigeminal ganglion, following sacrifice and staining of frozen sections with cresyl violet and diaminobenzidine. The results of this analysis showed similar mean neuron counts and mean HRP uptake by neurons for the unoperated controls and both microsuture and laser-weld groups. This new technique of intraosseous exposure of the inferior alveolar nerve in rats is described. It allows reliable and reproducible microsurgical repairs using both microsuture and laser-weld techniques.
A technique for automatically extracting useful field of view and central field of view images.
Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar
2016-01-01
It is essential to ensure the uniform response of the single photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the Cobalt-57 flood source and the prespecified counts in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total counts, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
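The extraction step can be sketched compactly (the original work used MATLAB; Python is used here for illustration): threshold away the unexposed border, crop to the bounding box as the useful field of view (UFOV), and trim 12.5% per side for the central field of view (CFOV) in the usual NEMA-style convention. The threshold and names are assumptions, not the authors' exact procedure.

```python
import numpy as np

def extract_ufov_cfov(flood, bg_fraction=0.1):
    """Crop a flood image to its UFOV and a central 75% CFOV."""
    mask = flood > bg_fraction * flood.max()       # drop unexposed border
    rows = np.where(mask.any(axis=1))[0]
    cols = np.where(mask.any(axis=0))[0]
    ufov = flood[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]
    h, w = ufov.shape
    dh, dw = int(round(h * 0.125)), int(round(w * 0.125))  # 12.5% per side
    cfov = ufov[dh:h - dh, dw:w - dw]
    return ufov, cfov

flood = np.zeros((64, 64)); flood[8:56, 8:56] = 100.0  # synthetic flood image
ufov, cfov = extract_ufov_cfov(flood)
print(ufov.shape, cfov.shape)  # (48, 48) (36, 36)
```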
Kleine, Tilmann O; Nebe, C Thomas; Löwer, Christa; Lehmitz, Reinhard; Kruse, Rolf; Geilenkeuser, Wolf-Jochen; Dorn-Beineke, Alexandra
2009-08-01
Flow cytometry (FCM) is used with haematology analyzers (HAs) to count cells and differentiate leukocytes in cerebrospinal fluid (CSF). To evaluate the FCM techniques of HAs, 10 external DGKL trials with CSF controls were carried out from 2004 to 2008. Eight single-platform HAs with and without CSF equipment were evaluated with living blood leukocytes and erythrocytes in CSF-like DGKL controls: Coulter (LH750, 755), Abbott CD3200, CD3500, CD3700, CD4000, Sapphire, ADVIA 120(R) CSF assay, and Sysmex XE-2100(R). Results were compared with visual counting of native, unstained cells in a Fuchs-Rosenthal chamber, and with absolute values of leukocyte differentiation assayed by dual-platform analysis with immune-FCM (FACSCalibur, CD45, CD14) and the chamber counts. Reference values X were compared with HA values Y by statistical evaluation with Passing/Bablok (P/B) linear regression analysis to reveal conformity of both methods. The HAs studied produced no valid results with DGKL CSF controls, because P/B regression revealed no conformity with the reference values due to: (1) blank problems with impedance analysis; (2) leukocyte loss with preanalytical erythrocyte lysis procedures, especially of monocytes; and (3) inaccurate results with ADVIA cell sphering and cell differentiation with algorithms and enzyme activities (e.g., peroxidase). HA techniques have to be improved, e.g., by avoiding erythrocyte lysis and using CSF-adequate techniques, to examine CSF samples precisely and accurately. Copyright 2009 International Society for Advancement of Cytometry.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fliermans, C.B.; Dougherty, J.M.; Franck, M.M.
Effective in situ bioremediation strategies require an understanding of the effects pollutants and remediation techniques have on subsurface microbial communities. Therefore, detailed characterization of a site's microbial communities is important. Subsurface sediment borings and water samples were collected from a trichloroethylene (TCE) contaminated site, before and after horizontal well in situ air stripping and bioventing, as well as during methane injection for stimulation of methane-utilizing microorganisms. Subsamples were processed for heterotrophic plate counts, acridine orange direct counts (AODC), community diversity, direct fluorescent antibody (DFA) enumeration for several nitrogen-transforming bacteria, and Biolog® evaluation of enzyme activity in collected water samples. Plate counts were higher in near-surface depths than in the vadose zone sediment samples. During the in situ air stripping and bioventing, counts increased at or near the saturated zone, remained elevated throughout the aquifer, but did not change significantly after the air stripping. Sporadic increases in plate counts at different depths as well as increased diversity appeared to be linked to differing lithologies. AODCs were orders of magnitude higher than plate counts and remained relatively constant with depth except for slight increases near the surface depths and the capillary fringe. Nitrogen-transforming bacteria, as measured by serospecific DFA, were greatly affected both by the in situ air stripping and the methane injection. Biolog® activity appeared to increase with subsurface stimulation both by air and methane. The complexity of subsurface systems makes the use of selective monitoring tools imperative.
Lens-free microscopy of cerebrospinal fluid for the laboratory diagnosis of meningitis
NASA Astrophysics Data System (ADS)
Delacroix, Robin; Morel, Sophie Nhu An; Hervé, Lionel; Bordy, Thomas; Blandin, Pierre; Dinten, Jean-Marc; Drancourt, Michel; Allier, Cédric
2018-02-01
The cytology of the cerebrospinal fluid is traditionally performed by an operator (physician, biologist) by means of a conventional light microscope. The operator visually counts the leukocytes (white blood cells) present in a sample of cerebrospinal fluid (10 μl). It is a tedious job and the result is operator-dependent. Here, in order to circumvent the limitations of manual counting, we approach the question of numeration of erythrocytes and leukocytes for the cytological diagnosis of meningitis by means of lens-free microscopy. In a first step, a prospective count of leukocytes was performed by five different operators using conventional optical microscopy. The visual counting yielded an overall 16.7% misclassification of 72 cerebrospinal fluid specimens into meningitis/non-meningitis categories using a 10 leukocyte/μL cut-off. In a second step, the lens-free microscopy algorithm was adapted step-by-step for counting cerebrospinal fluid cells and discriminating leukocytes from erythrocytes. The optimization of the automatic lens-free counting was based on the prospective analysis of 215 cerebrospinal fluid specimens. The optimized algorithm yielded a 100% sensitivity and an 86% specificity compared to confirmed diagnoses. In a third step, a blind lens-free microscopic analysis of 116 cerebrospinal fluid specimens, including six cases of microbiologically confirmed infectious meningitis, yielded a 100% sensitivity and a 79% specificity. Adapted lens-free microscopy is thus emerging as an operator-independent technique for the rapid numeration of leukocytes and erythrocytes in cerebrospinal fluid. In particular, this technique is well suited to the rapid diagnosis of meningitis at point-of-care laboratories.
A review of costing methodologies in critical care studies.
Pines, Jesse M; Fager, Samuel S; Milzman, David P
2002-09-01
Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.
Beta/alpha continuous air monitor
Becker, Gregory K.; Martz, Dowell E.
1989-01-01
A single deep layer silicon detector in combination with a microcomputer, recording both alpha and beta activity and the energy of each pulse, distinguishing energy peaks using a novel curve fitting technique to reduce the natural alpha counts in the energy region where plutonium and other transuranic alpha emitters are present, and using a novel algorithm to strip out radon daughter contribution to actual beta counts.
ERIC Educational Resources Information Center
Bunde, Gary R.
A statistical comparison was made between two automated devices which were used to count data points (words, sentences, and syllables) needed in the Flesch Reading Ease Score to determine the reading grade level of written material. Determination of grade level of all Rate Training Manuals and Non-Resident Career Courses had been requested by the…
Estimation of U content in coffee samples by fission-track counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, P.K.; Lal, N.; Nagpaul, K.K.
1985-06-01
Because coffee is consumed in large quantities by humans, the authors undertook the study of the uranium content of coffee as a continuation of earlier work to estimate the U content of foodstuffs. Since literature on this subject is scarce, they decided to use the well-established fission-track-counting technique to determine the U content of coffee.
NASA Astrophysics Data System (ADS)
Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.
2009-07-01
Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens. In combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the use of the direct epifluorescent filter technique (DEFT) and aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique in detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, with increasing dose, DEFT counts remained similar regardless of irradiation, while APC counts decreased gradually. The difference between the two counts gradually increased with dose in all samples. It is suggested that a DEFT/APC difference over 2.0 log could serve as a criterion for judging whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
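The suggested screening rule reduces to a one-line comparison of log counts, sketched below with invented example values.

```python
import math

def likely_irradiated(deft_count, apc_count, cutoff_log=2.0):
    """Flag a sample when DEFT exceeds APC by more than cutoff_log log10 units."""
    return math.log10(deft_count) - math.log10(apc_count) > cutoff_log

# ~2.4 log10 difference between DEFT and APC counts -> flagged as irradiated.
print(likely_irradiated(5.0e6, 2.0e4))  # True
```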
Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J; Li, Ming; Tabb, David L
2012-09-01
Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables.
CALCIUM ABSORPTION IN MAN: BASED ON LARGE VOLUME LIQUID SCINTILLATION COUNTER STUDIES.
LUTWAK, L; SHAPIRO, J R
1964-05-29
A technique has been developed for the in vivo measurement of absorption of calcium in man after oral administration of 1 to 5 microcuries of calcium-47 and continuous counting of the radiation in the subject's arm with a large volume liquid scintillation counter. The maximum value for the arm counting technique is proportional to the absorption of tracer as measured by direct stool analysis. The rate of uptake by the arm is lower in subjects with either the malabsorption syndrome or hypoparathyroidism. The administration of vitamin D increases both the absorption rate and the maximum amount of calcium absorbed.
Sanchez-Cabeza, J A; Pujol, L
1995-05-01
The radiological examination of water requires a rapid screening technique that permits the determination of the gross alpha and beta activities of each sample in order to decide if further radiological analyses are necessary. In this work, the use of a low background liquid scintillation system (Quantulus 1220) is proposed to simultaneously determine the gross activities in water samples. Liquid scintillation is compared to more conventional techniques used in most monitoring laboratories. In order to determine the best counting configuration of the system, pulse shape discrimination was optimized for 6 scintillant/vial combinations. It was concluded that the best counting configuration was obtained with the scintillation cocktail Optiphase Hisafe 3 in Zinsser low diffusion vials. The detection limits achieved were 0.012 Bq L-1 and 0.14 Bq L-1 for gross alpha and beta activity respectively, after a 1:10 concentration process by simple evaporation and for a counting time of only 360 min. The proposed technique is rapid, gives spectral information, and is adequate to determine gross activities according to the World Health Organization (WHO) guideline values.
Relativistic Transformations of Light Power.
ERIC Educational Resources Information Center
McKinley, John M.
1979-01-01
Using a photon-counting technique, finds the angular distribution of emitted and detected power and the total radiated power of an arbitrary moving source, and uses the technique to verify the predicted effect of the earth's motion through the cosmic blackbody radiation. (Author/GA)
Using Pinochle to motivate the restricted combinations with repetitions problem
NASA Astrophysics Data System (ADS)
Gorman, Patrick S.; Kunkel, Jeffrey D.; Vasko, Francis J.
2011-07-01
A standard example used in introductory combinatorics courses is to count the number of five-card poker hands possible from a standard deck of 52 distinct cards. A more interesting problem is to count the number of distinct hands possible from a Pinochle deck, in which there are multiple, but obviously limited, copies of each type of card (two copies for single-deck, four for double-deck). This problem is more interesting because our only concern is to count the number of distinguishable hands that can be dealt. In this note, under various scenarios, we will discuss two combinatoric techniques for counting these hands; namely, the inclusion-exclusion principle and generating functions. We will then show that these Pinochle examples motivate a general counting formula for what are called 'regular' combinations by Riordan. Finally, we prove the correctness of this formula using generating functions.
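The generating-function technique mentioned above is easy to demonstrate: for a single Pinochle deck (24 distinct card types, 2 copies each) the number of distinguishable k-card hands is the coefficient of x^k in (1 + x + x^2)^24, which the sketch below extracts by naive polynomial multiplication.

```python
def hand_counts(copies=2, types=24):
    """Coefficients of (1 + x + ... + x^copies)^types; entry k counts
    the distinguishable k-card hands."""
    poly = [1]                      # the constant polynomial 1
    factor = [1] * (copies + 1)     # 1 + x + ... + x^copies
    for _ in range(types):
        new = [0] * (len(poly) + copies)
        for i, a in enumerate(poly):
            for j, b in enumerate(factor):
                new[i + j] += a * b
        poly = new
    return poly

counts = hand_counts()
print(counts[12])  # distinguishable 12-card single-deck Pinochle hands
```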
Evaluation of a Multicolor, Single-Tube Technique To Enumerate Lymphocyte Subpopulations▿
Colombo, F.; Cattaneo, A.; Lopa, R.; Portararo, P.; Rebulla, P.; Porretti, L.
2008-01-01
To evaluate the fully automated FACSCanto software, we compared lymphocyte subpopulation counts obtained using three-color FACSCalibur-CELLQuest and six-color FACSCanto-FACSCanto software techniques. High correlations were observed between data obtained with these techniques. Our study indicated that FACSCanto clinical software is accurate and sensitive in single-platform lymphocyte immunophenotyping. PMID:18448621
USDA-ARS?s Scientific Manuscript database
Different measurement techniques for aerosol characterization and quantification either directly or indirectly measure different aerosol properties (i.e. count, mass, speciation, etc.). Comparisons and combinations of multiple measurement techniques sampling the same aerosol can provide insight into...
Beta/alpha continuous air monitor
Becker, G.K.; Martz, D.E.
1988-06-27
A single deep layer silicon detector in combination with a microcomputer, recording both alpha and beta activity and the energy of each pulse, distinguishing energy peaks using a novel curve fitting technique to reduce the natural alpha counts in the energy region where plutonium and other transuranic alpha emitters are present, and using a novel algorithm to strip out radon daughter contribution to actual beta counts. 7 figs.
Habash, Marc; Johns, Robert
2009-10-01
This study compared an automated Escherichia coli and coliform detection system with the membrane filtration direct count technique for water testing. The automated instrument performed equal to or better than the membrane filtration test in analyzing E. coli-spiked samples and blind samples with interference from Proteus vulgaris or Aeromonas hydrophila.
Benítez, Francisco Moreno; Camacho, Antonio Letrán; del Cuvillo Bernal, Alfonso; de Medina, Pedro Lobatón Sánchez; García Cózar, Francisco J; Romeu, Marisa Espinazo
2014-01-01
There is an increase in the incidence of pollen-related allergy; thus, information on pollen schedules would be a great asset for physicians to improve the clinical care of patients. Cypress pollen sensitization shows a high prevalence among the causes of allergic rhinitis, and it is therefore of interest as a study model, distinguishing cypress pollen, pollen count, and allergenic load level. In this work, we use a flow cytometry based technique to obtain both the Cupressus arizonica pollen count and the allergenic load, using the specific rabbit polyclonal antibody Cup a1, and compare it with optical microscopy measurements. Airborne samples were collected with Burkard Spore-Trap and Burkard Cyclone samplers. Cupressus arizonica pollen was studied using the specific rabbit polyclonal antibody Cup a1, labeled with AlexaFluor® 488 or 750, and analysed by flow cytometry on both EPICS XL and Cyan ADP cytometers (Beckman Coulter®). The optical microscopy study was performed with a Leica optical microscope. Bland and Altman analysis was used to determine agreement between the two measurement techniques. We can identify three different populations based on rabbit polyclonal antibody Cup a1 staining. The main region (44.5%) had 97.3% recognition, a second region (25%) had 28%, and a third region (30.5%) had 68%, respectively. Immunofluorescence and confocal microscopy showed that the main region corresponds to whole pollen grains, the second region is pollen without exine, and the third region is constituted by smaller particles with allergenic properties. The pollen schedules measured by optical microscopy and flow cytometry show a high correlation for the pollen count, with a P-value of 0.0008 E(-2), and of 0.0002 with regard to smaller particles; the Bland and Altman measurement likewise showed good agreement between them (P-value: 0.0003). Determination of pollen count and allergenic load by flow cytometry represents an important tool in the determination of airborne respiratory allergens. We showed that not only whole pollen but also smaller particles could induce allergic sensitization. This is the first study where flow cytometry is used for calculating pollen counts and allergenic load. Copyright © 2013 Clinical Cytometry Society.
Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study
Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger
2015-01-01
Study design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to subtract out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their amount and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations to qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001) and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not show a statistically significant difference from those of healthy IVDs (63.55 ± 5.88 ms at month 1 and 62.61 ± 5.02 ms at month 3) at either time point. Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle puncture model. The mean NP T2-RT does not change significantly in needle-puncture induced degenerated IVDs. IVDs can be segmented into different tissue components according to their innate T2-RT. PMID:24384655
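The voxel-filtering idea reduces to thresholding a T2 map and reporting the count and mean of the surviving voxels; the sketch below uses an assumed T2 window, not the authors' calibrated cutoff.

```python
import numpy as np

def np_voxel_metrics(t2_map_ms, lo=50.0, hi=120.0):
    """Return (NP voxel count, mean T2 of NP voxels) for a T2 map in ms."""
    np_voxels = t2_map_ms[(t2_map_ms >= lo) & (t2_map_ms <= hi)]
    mean_t2 = float(np_voxels.mean()) if np_voxels.size else 0.0
    return np_voxels.size, mean_t2

# Synthetic 8-slice T2 map; only voxels in the assumed NP window survive.
t2 = np.random.default_rng(0).uniform(10, 140, size=(8, 32, 32))
count, mean_t2 = np_voxel_metrics(t2)
print(count, round(mean_t2, 1))
```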
Nisari, Mehtap; Ertekin, Tolga; Ozçelik, Ozlem; Cınar, Serife; Doğanay, Selim; Acer, Niyazi
2012-11-01
Brain development in early life is thought to be a critical period in neurodevelopmental disorders. Knowledge relating to this period is currently quite limited. This study aimed to evaluate the volume relation of the total brain (TB), cerebrum, cerebellum and bulbus+pons by the use of Archimedes' principle and a stereological (point-counting) method, and then to compare these approaches with each other in newborns. This study was carried out on five newborn cadavers with a mean weight of 2,220 ± 1,056 g and no signs of neuropathology. The mean (±SD) age of the subjects was 39.7 (±1.5) weeks. The volume and volume fraction of the total brain, cerebrum, cerebellum and bulbus+pons were determined on magnetic resonance (MR) images using the point-counting approach of stereological methods and by the use of the fluid displacement technique. The mean (±SD) TB, cerebrum, cerebellum and bulbus+pons volumes by fluid displacement were 271.48 ± 78.3, 256.6 ± 71.8, 12.16 ± 6.1 and 2.72 ± 1.6 cm3, respectively. By the Cavalieri principle (point-counting) using sagittal MRIs, they were 262.01 ± 74.9, 248.11 ± 68.03, 11.68 ± 6.1 and 2.21 ± 1.13 cm3, respectively. The mean (±SD) volumes by the point-counting technique using axial MR images were 288.06 ± 88.5, 275.2 ± 83.1, 19.75 ± 5.3 and 2.11 ± 0.7 cm3, respectively. There were no differences between fluid displacement and point-counting (using axial and sagittal images) for any structure (p > 0.05). This study presents basic data for studies of newborn brain volume fractions according to the two methods. Stereological (point-counting) estimation may be accepted as a beneficial new tool for in vivo neurological evaluation of the brain. Based on the techniques we introduce here, the clinician may evaluate the growth of the brain in a more efficient and precise manner.
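The Cavalieri point-counting estimate itself is one line of arithmetic: volume = (points hit) × (area per point) × (section spacing). The sketch below uses invented section counts and grid constants.

```python
def cavalieri_volume(points_per_section, point_area_cm2, section_spacing_cm):
    """Cavalieri volume estimate from point counts on serial sections."""
    return sum(points_per_section) * point_area_cm2 * section_spacing_cm

# 8 MR sections, 0.04 cm^2 per grid point, 0.5 cm apart (assumed values).
hits = [52, 88, 117, 130, 126, 101, 74, 41]
print(f"{cavalieri_volume(hits, 0.04, 0.5):.1f} cm^3")
```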
The impact of varicocelectomy on sperm parameters: a meta-analysis.
Schauer, Ingrid; Madersbacher, Stephan; Jost, Romy; Hübner, Wilhelm Alexander; Imhof, Martin
2012-05-01
We determined the impact of 3 surgical techniques (high ligation, inguinal varicocelectomy and the subinguinal approach) for varicocelectomy on sperm parameters (count and motility) and pregnancy rates. By searching the literature using MEDLINE and the Cochrane Library, with the last search performed in February 2011 and focusing on the last 20 years, a total of 94 articles published between 1975 and 2011 reporting on sperm parameters before and after varicocelectomy were identified. Inclusion criteria for this meta-analysis were at least 2 semen analyses (before and 3 or more months after the procedure), patient age older than 19 years, clinical subfertility and/or abnormal semen parameters, and a clinically palpable varicocele. To rule out skewing factors a bias analysis was performed, and statistical analysis was done with RevMan5(®) and SPSS 15.0(®). A total of 14 articles were included in the statistical analysis. All 3 surgical approaches led to significant or highly significant postoperative improvement of both parameters, with only slight numeric differences among the techniques; these differences did not reach statistical significance for sperm count (p = 0.973) or sperm motility (p = 0.372). After high ligation surgery sperm count increased by 10.85 million per ml (p = 0.006) and motility by 6.80% (p <0.00001) on average. Inguinal varicocelectomy led to an improvement in sperm count of 7.17 million per ml (p <0.0001) while motility changed by 9.44% (p = 0.001). Subinguinal varicocelectomy provided an increase in sperm count of 9.75 million per ml (p = 0.002) and sperm motility of 12.25% (p = 0.001). Inguinal varicocelectomy showed the highest pregnancy rate of 41.48% compared to 26.90% and 26.56% after high ligation and subinguinal varicocelectomy, respectively, and the difference was statistically significant (p = 0.035). This meta-analysis suggests that varicocelectomy leads to significant improvements in sperm count and motility regardless of surgical technique, with the inguinal approach offering the highest pregnancy rate. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
A Review of Statistical Disclosure Control Techniques Employed by Web-Based Data Query Systems.
Matthews, Gregory J; Harel, Ofer; Aseltine, Robert H
We systematically reviewed the statistical disclosure control techniques employed for releasing aggregate data in Web-based data query systems listed in the National Association for Public Health Statistics and Information Systems (NAPHSIS). Each Web-based data query system was examined to see whether (1) it employed any type of cell suppression, (2) it used secondary cell suppression, and (3) suppressed cell counts could be calculated. No more than 30 minutes was spent on each system. Of the 35 systems reviewed, no suppression was observed in more than half (n = 18); counts below the suppression threshold were displayed in 2 sites; and suppressed values were recoverable in 9 sites. Six sites effectively suppressed small counts. This inquiry has revealed substantial weaknesses in the protective measures used in data query systems containing sensitive public health data. Many systems utilized no disclosure control whatsoever, and the vast majority of those that did deployed it inconsistently or inadequately.
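To make the reviewed failure modes concrete, here is a minimal sketch (not drawn from any reviewed system) of primary suppression of small cells plus one naive secondary suppression per row; without the secondary step, a suppressed small count can be recovered by subtracting the visible cells from the row total:

    import numpy as np

    def suppress(table, threshold=5):
        """Blank (NaN) nonzero cells below threshold, plus one
        complementary cell per affected row (secondary suppression)."""
        t = table.astype(float)
        primary = (t > 0) & (t < threshold)
        out = np.where(primary, np.nan, t)
        for i, row in enumerate(primary):
            if row.any():
                visible = np.where(~row)[0]
                if visible.size:
                    j = visible[np.argmin(t[i, visible])]
                    out[i, j] = np.nan
        return out

    print(suppress(np.array([[12, 3, 40], [8, 9, 25]])))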
Compact SPAD-Based Pixel Architectures for Time-Resolved Image Sensors
Perenzoni, Matteo; Pancheri, Lucio; Stoppa, David
2016-01-01
This paper reviews the state of the art of single-photon avalanche diode (SPAD) image sensors for time-resolved imaging. The focus of the paper is on pixel architectures featuring small pixel size (<25 μm) and high fill factor (>20%) as a key enabling technology for the successful implementation of high spatial resolution SPAD-based image sensors. A summary of the main CMOS SPAD implementations, their characteristics and integration challenges, is provided from the perspective of targeting large pixel arrays, where one of the key drivers is the spatial uniformity. The main analog techniques aimed at time-gated photon counting and photon timestamping suitable for compact and low-power pixels are critically discussed. The main features of these solutions are the adoption of analog counting techniques and time-to-analog conversion, in NMOS-only pixels. Reliable quantum-limited single-photon counting, self-referenced analog-to-digital conversion, time gating down to 0.75 ns and timestamping with 368 ps jitter are achieved. PMID:27223284
Detection of bremsstrahlung radiation of 90Sr-90Y for emergency lung counting.
Ho, A; Hakmana Witharana, S S; Jonkmans, G; Li, L; Surette, R A; Dubeau, J; Dai, X
2012-09-01
This study explores the possibility of developing a field-deployable ⁹⁰Sr detector for rapid lung counting in emergency situations. The detection of the beta-emitters ⁹⁰Sr and its daughter ⁹⁰Y inside the human lung via bremsstrahlung radiation was performed using a 3″ × 3″ NaI(Tl) crystal detector and a polyethylene-encapsulated source to emulate human lung tissue. The simulation results show that this method is a viable technique for detecting ⁹⁰Sr with a minimum detectable activity (MDA) of 1.07 × 10⁴ Bq, using a realistic dual-shielded detector system in a 0.25 µGy h⁻¹ background field for a 100-s scan. The MDA is sufficiently sensitive to meet the requirement for emergency lung counting of Type S ⁹⁰Sr intake. The experimental data were verified using Monte Carlo calculations, including an estimate for internal bremsstrahlung, and an optimisation of the detector geometry was performed. Optimisations in background reduction techniques and in the electronic acquisition systems are suggested.
Impact of donor- and collection-related variables on product quality in ex utero cord blood banking.
Askari, Sabeen; Miller, John; Chrysler, Gayl; McCullough, Jeffrey
2005-02-01
Optimizing product quality is a current focus in cord blood banking. This study evaluates the role of selected donor- and collection-related variables. A retrospective review was performed of cord blood units (CBUs) collected ex utero between February 1, 2000, and February 28, 2002. Preprocessing volume and total nucleated cell (TNC) counts and postprocessing CD34+ cell counts were used as product quality indicators. Of 2084 CBUs, volume determinations and TNC counts were performed on 1628 and CD34+ counts on 1124 CBUs. Mean volume, TNC count, and CD34+ count were 85.2 mL, 118.9 × 10⁷, and 5.2 × 10⁶, respectively. In univariate analysis, placental weight of greater than 500 g and meconium in amniotic fluid correlated with better volume, TNC count, and CD34+ count. Greater than 40 weeks' gestation predicted enhanced volume and TNC count. Cesarean section, two- versus one-person collection, and not greater than 5 minutes between placental delivery and collection produced superior volume. Increased TNC count was also seen in Caucasian women, primigravidae, female newborns, and collection duration of more than 5 minutes. A time between delivery of newborn and placenta of not greater than 10 minutes predicted better volume and CD34+ count. By regression analysis, collection within not greater than 5 minutes of placental delivery produced superior volume and TNC count. Donor selection and collection technique modifications may improve product quality. TNC count appears to be more affected by different variables than CD34+ count.
Differential white cell count by centrifugal microfluidics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sommer, Gregory Jon; Tentori, Augusto M.; Schaff, Ulrich Y.
We present a method for counting white blood cells that is uniquely compatible with centrifugation-based microfluidics. Blood is deposited on top of one or more layers of density media within a microfluidic disk. Spinning the disk causes the cell populations within whole blood to settle through the media, reaching an equilibrium based on the density of each cell type. Separation and fluorescence measurement of cell types stained with a DNA dye is demonstrated using this technique. The integrated signal from bands of fluorescent microspheres is shown to be proportional to their initial concentration in suspension. Among the current generation of medical diagnostics are devices based on the principle of centrifuging a CD-sized disk functionalized with microfluidics. These portable 'lab on a disk' devices are capable of conducting multiple assays directly from a blood sample, embodied by platforms developed by Gyros, Samsung, and Abaxis. [1,2] However, no centrifugal platform to date includes a differential white blood cell count, which is an important metric complementary to diagnostic assays. Measuring the differential white blood cell count (the relative fraction of granulocytes, lymphocytes, and monocytes) is a standard medical diagnostic technique useful for identifying sepsis, leukemia, AIDS, radiation exposure, and a host of other conditions that affect the immune system. Several methods exist for measuring the relative white blood cell count, including flow cytometry, electrical impedance, and visual identification from a stained drop of blood under a microscope. However, none of these methods is easily incorporated into a centrifugal microfluidic diagnostic platform.
Allen, Robert C; John, Mallory G; Rutan, Sarah C; Filgueira, Marcelo R; Carr, Peter W
2012-09-07
A singular value decomposition-based background correction (SVD-BC) technique is proposed for the reduction of background contributions in online comprehensive two-dimensional liquid chromatography (LC×LC) data. The SVD-BC technique was compared to simply subtracting a blank chromatogram from a sample chromatogram and to a previously reported background correction technique for one-dimensional chromatography, which uses an asymmetric weighted least squares (AWLS) approach. AWLS was the only background correction technique to completely remove the background artifacts from the samples as evaluated by visual inspection. However, the SVD-BC technique greatly reduced or eliminated the background artifacts as well and preserved the peak intensity better than AWLS. The loss in peak intensity by AWLS resulted in lower peak counts at the detection thresholds established using standard samples. However, the SVD-BC technique was found to introduce noise which led to detection of false peaks at the lower detection thresholds. As a result, the AWLS technique gave more precise peak counts than the SVD-BC technique, particularly at the lower detection thresholds. While the AWLS technique resulted in more consistent percent residual standard deviation values, a statistical improvement in peak quantification after background correction was not found regardless of the background correction technique used. Copyright © 2012 Elsevier B.V. All rights reserved.
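A minimal sketch of the SVD-BC idea, under the assumption that slowly varying background dominates the leading singular component of the two-dimensional chromatogram matrix; the paper's actual implementation details (component selection, preprocessing) are not reproduced here:

    import numpy as np

    def svd_bc(chrom, n_background=1):
        """Remove the leading singular component(s) as background."""
        U, s, Vt = np.linalg.svd(chrom, full_matrices=False)
        background = (U[:, :n_background] * s[:n_background]) @ Vt[:n_background]
        return chrom - background

    rng = np.random.default_rng(0)
    data = np.outer(np.linspace(1.0, 2.0, 200), np.ones(150))  # drifting background
    data[80:85, 60:63] += 5.0                                  # synthetic analyte peak
    data += rng.normal(0.0, 0.01, data.shape)                  # detector noise
    corrected = svd_bc(data)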
Wild turkey poult survival in southcentral Iowa
Hubbard, M.W.; Garner, D.L.; Klaas, E.E.
1999-01-01
Poult survival is key to understanding annual change in wild turkey (Meleagris gallopavo) populations. Survival of eastern wild turkey poults (M. g. silvestris) 0-4 weeks posthatch was studied in southcentral Iowa during 1994-97. Survival estimates of poults were calculated based on biweekly flush counts and daily locations acquired via radiotelemetry. Poult survival averaged 0.52 ± 0.14 (mean ± SE) for telemetry counts and 0.40 ± 0.15 for flush counts. No within-year or across-year differences were detected between estimation techniques. More than 72% (n = 32) of documented poult mortality occurred ≤14 days posthatch, and mammalian predation accounted for 92.9% of documented mortality. If mortality agents are not of concern, we suggest biologists conduct 4-week flush counts to obtain poult survival estimates for use in population models and development of harvest recommendations.
Stochastic hybrid systems for studying biochemical processes.
Singh, Abhyudai; Hespanha, João P
2010-11-13
Many protein and mRNA species occur at low molecular counts within cells, and hence are subject to large stochastic fluctuations in copy numbers over time. Development of computationally tractable frameworks for modelling stochastic fluctuations in population counts is essential to understand how noise at the cellular level affects biological function and phenotype. We show that stochastic hybrid systems (SHSs) provide a convenient framework for modelling the time evolution of population counts of different chemical species involved in a set of biochemical reactions. We illustrate recently developed techniques that allow fast computations of the statistical moments of the population count, without having to run computationally expensive Monte Carlo simulations of the biochemical reactions. Finally, we review different examples from the literature that illustrate the benefits of using SHSs for modelling biochemical processes.
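As a toy illustration of moment computation without Monte Carlo (not the SHS machinery of the paper itself): for a birth-death process with constant production rate k and first-order decay rate g, the mean copy number obeys the closed ODE dE[x]/dt = k − g·E[x], which can be integrated directly.

    from scipy.integrate import solve_ivp

    k, g = 10.0, 0.5                      # production and decay rates
    sol = solve_ivp(lambda t, m: k - g * m, (0.0, 20.0), [0.0])
    print(sol.y[0, -1])                   # approaches the stationary mean k/g = 20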
A Next Generation Digital Counting System For Low-Level Tritium Studies (Project Report)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bowman, P.
2016-10-03
Since the early seventies, SRNL has pioneered low-level tritium analysis using various nuclear counting technologies and techniques. Since 1999, SRNL has successfully performed routine low-level tritium analyses with counting systems based on digital signal processor (DSP) modules developed in the late 1990s. Each of these counting systems is complex, unique to SRNL, and fully dedicated to performing routine tritium analyses of low-level environmental samples. It is time to modernize these systems due to a variety of issues including (1) age, (2) lack of direct replacement electronics modules and (3) advances in digital signal processing and computer technology. There has been considerable development in many areas associated with the enterprise of performing low-level tritium analyses. The objective of this LDRD project was to design, build, and demonstrate a Next Generation Tritium Counting System (NGTCS), while not disrupting the routine low-level tritium analyses underway in the facility on the legacy counting systems. The work involved (1) developing a test bed for building and testing new counting system hardware that does not interfere with our routine analyses, (2) testing a new counting system based on a modern state-of-the-art DSP module, and (3) evolving the low-level tritium counter design to reflect the state of the science.
Arraycount, an algorithm for automatic cell counting in microwell arrays.
Kachouie, Nezamoddin; Kang, Lifeng; Khademhosseini, Ali
2009-09-01
Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them for many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells that are seeded in microwells of a microarray system and then analyzing the images via computer to recognize the array and count cells inside each microwell. To start counting, green and red fluorescent images (representing live and dead cells, respectively) are extracted from the original image and processed separately. A template-matching algorithm is proposed in which pre-defined well and cell templates are matched against the red and green images to locate microwells and cells. Subsequently, local maxima in the correlation maps are determined and the local maxima maps are thresholded. At the end, the software records the cell counts for each detected microwell on the original image in high throughput. The automated counting was shown to be accurate compared with manual counting, with a difference of approximately 1-2 cells per microwell; based on cell concentration, the absolute difference between manual and automatic counting measurements was 2.5-13%.
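A minimal sketch of the template-matching and local-maxima steps described above, using scikit-image (an assumed toolchain; the paper's own implementation is not specified here). The correlation map is thresholded and its peaks counted as detections:

    from skimage.feature import match_template, peak_local_max

    def count_objects(channel_img, template, threshold=0.6):
        """Count template matches in one fluorescence channel."""
        corr = match_template(channel_img, template, pad_input=True)
        peaks = peak_local_max(corr, min_distance=3, threshold_abs=threshold)
        return len(peaks), peaks          # count and (row, col) positions

The same routine would be run once per channel, with the green-channel count giving live cells and the red-channel count dead cells.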
Evaluation of canoe surveys for anurans along the Rio Grande in Big Bend National Park, Texas
Jung, R.E.; Bonine, K.E.; Rosenshield, M.L.; de la Reza, A.; Raimondo, S.; Droege, S.
2002-01-01
Surveys for amphibians along large rivers pose monitoring and sampling problems. We used canoes at night to spotlight and listen for anurans along four stretches of the Rio Grande in Big Bend National Park, Texas, in 1998 and 1999. We explored temporal and spatial variation in amphibian counts and species richness and assessed relationships between amphibian counts and environmental variables, as well as amphibian-habitat associations along the banks of the Rio Grande. We documented seven anuran species, but Rio Grande leopard frogs (Rana berlandieri) accounted for 96% of the visual counts. Chorus surveys along the river detected similar or fewer numbers of species, but orders of magnitude fewer individuals compared to visual surveys. The number of species varied on average by 37% across monthly and nightly surveys. We found similar average coefficients of variation in counts of Rio Grande leopard frogs on monthly and nightly bases (CVs = 42-44%), suggesting that canoe surveys are a fairly precise technique for counts of this species. Numbers of Rio Grande leopard frogs observed were influenced by river gage levels and air and water temperatures, suggesting that surveys should be conducted under certain environmental conditions to maximize counts and maintain consistency. We found significant differences in species richness and bullfrog (Rana catesbeiana) counts among the four river stretches. Four rare anuran species were found along certain stretches but not others, which could represent either sampling error or unmeasured environmental or habitat differences among the river stretches. We found a greater association of Rio Grande leopard frogs with mud banks compared to rock or cliff (canyon) areas and with seepwillow and open areas compared to giant reed and other vegetation types. Canoe surveys appear to be a useful survey technique for anurans along the Rio Grande and may work for other large river systems as well.
Rapid enumeration of viable bacteria by image analysis
NASA Technical Reports Server (NTRS)
Singh, A.; Pyle, B. H.; McFeters, G. A.
1989-01-01
A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 °C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 µg ml⁻¹) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.
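In the image-analysis step, "enlarged" cells must be segmented and counted; a rough sketch of such a size-based count on a binarized epifluorescence image, using scikit-image (an assumed toolchain, not the system used in the study):

    from skimage.measure import label, regionprops

    def count_viable(binary_img, min_area_px=50):
        """Count connected objects at least min_area_px in size."""
        return sum(1 for r in regionprops(label(binary_img))
                   if r.area >= min_area_px)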
Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.
ERIC Educational Resources Information Center
Daley, Michael; Hillier, Douglas
1981-01-01
Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…
ERIC Educational Resources Information Center
Lockwood, Elise
2014-01-01
Formulas, problem types, keywords, and tricky techniques can certainly be valuable tools for successful counters. However, they can easily become substitutes for critical thinking about counting problems and for deep consideration of the set of outcomes. Formulas and techniques should serve as tools for students as they think critically about…
Acta Aeronautica et Astronautica Sinica,
1983-03-04
power spectrum and counting methods [1,2,3]. If the load-time history is stochastic (such as gusts of wind, random vibrations, etc.), then we can use the power spectrum technique, and we can also use the counting method. However, the ... simplification for treatment so that the differences in obtained results are very minute, and are also closest to the random spectrum. This then tells us
NASA Astrophysics Data System (ADS)
Rahnemoonfar, Maryam; Foster, Jamie; Starek, Michael J.
2017-05-01
Beef production is the main agricultural industry in Texas, and livestock are managed on pasture and rangeland that is usually huge in size and not easily accessible by vehicles. The current research method for livestock location identification and counting is visual observation, which is very time consuming and costly. For animals on large tracts of land, manned aircraft may be necessary to count animals, which is noisy, disturbs the animals, and may introduce a source of error in counts. Such manual approaches are expensive, slow and labor intensive. In this paper we study the combination of small unmanned aerial vehicle (sUAV) and machine vision technology as a valuable solution to manual animal surveying. A fixed-wing UAV fitted with GPS and a digital RGB camera for photogrammetry was flown at the Welder Wildlife Foundation in Sinton, TX. Over 600 acres were flown with four UAS flights, and the individual photographs were used to develop orthomosaic imagery. To detect animals in the UAV imagery, a fully automatic technique was developed based on the spatial and spectral characteristics of objects. This automatic technique can even detect small animals that are partially occluded by bushes. Experimental results in comparison to ground truth show the effectiveness of our algorithm.
Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources
DOE Office of Scientific and Technical Information (OSTI.GOV)
Klumpp, John
We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
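A minimal sketch of the decision step: the background count distribution is learned from past measurements (here simply as an empirical sample), and a new count is flagged when its tail probability under that distribution falls below a false-alarm target. This is an illustration only, not the multi-channel, multi-detector algorithm described above:

    import numpy as np

    def alarm(new_count, background_counts, false_alarm_rate=1e-3):
        """Flag a measurement whose empirical tail probability under the
        learned background distribution is below the false-alarm target."""
        bg = np.asarray(background_counts)
        p_tail = np.mean(bg >= new_count)      # empirical P(X >= new_count)
        return p_tail < false_alarm_rate

In "learning mode" the background_counts sample would keep growing, so the sampling distribution tracks drifts in the background over time.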
Ocular Biocompatibility of Nitinol Intraocular Clips
Velez-Montoya, Raul; Erlanger, Michael
2012-01-01
Purpose. To evaluate the tolerance and biocompatibility of a preformed nitinol intraocular clip in an animal model after anterior segment surgery. Methods. Yucatan mini-pigs were used. A 30-gauge prototype injector was used to attach a shape-memory nitinol clip to the iris of five pigs. Another five eyes received conventional polypropylene suture with a modified Seipser slip knot. The authors compared the surgical time of each technique. All eyes underwent standard full-field electroretinogram at baseline and 8 weeks after surgery. The animals were euthanized and eyes collected for histologic analysis 70 days (10 weeks) after surgery. The corneal thickness, corneal endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram parameters were compared between the groups. A two-sample t-test for means and a P value of 0.05 were used for assessing statistical differences between measurements. Results. The injection of the nitinol clip was 15 times faster than conventional suturing. There were no statistical differences between the groups for corneal thickness, endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram measurements. Conclusions. The nitinol clip prototype is well tolerated and showed no evidence of toxicity in the short term. The injectable delivery system was faster and technically less challenging than conventional suture techniques. PMID:22064995
On the use of positron counting for radio-assay in nuclear pharmaceutical production.
Maneuski, D; Giacomelli, F; Lemaire, C; Pimlott, S; Plenevaux, A; Owens, J; O'Shea, V; Luxen, A
2017-07-01
Current techniques for the measurement of radioactivity at various points during PET radiopharmaceutical production and R&D are based on the detection of the annihilation gamma rays from the radionuclide in the labelled compound. The detection systems used to measure these gamma rays are usually variations of NaI or CsF scintillation-based systems requiring costly and heavy lead shielding to reduce background noise. These detectors inherently suffer from low detection efficiency, high background noise and very poor linearity. They are also unable to provide any reasonably useful position information. A novel positron counting technique is proposed for the radioactivity assay during radiopharmaceutical manufacturing that overcomes these limitations. Detection of positrons instead of gammas offers an unprecedented level of position resolution of the radiation source (down to sub-mm) thanks to the nature of the positron interaction with matter. Counting capability instead of charge integration in the detector brings the sensitivity down to the statistical limits, at the same time offering very high dynamic range and linearity from zero to any arbitrarily high activity. This paper reports on a quantitative comparison between conventional detector systems and the proposed positron counting detector. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.
Vehicle counting system using real-time video processing
NASA Astrophysics Data System (ADS)
Crisóstomo-Romero, Pedro M.
2006-02-01
Transit studies are important for planning a road network with optimal vehicular flow, and a vehicular count is essential. This article presents a vehicle counting system based on video processing. An advantage of such a system is the greater detail it is possible to obtain, such as the shape, size and speed of vehicles. The system uses a video camera placed above the street to image transit in real time. The video camera must be placed at least 6 meters above the street level to achieve proper acquisition quality. Fast image processing algorithms and small image dimensions are used to allow real-time processing. Digital filters, mathematical morphology, segmentation and other techniques allow identifying and counting all vehicles in the image sequences. The system was implemented under Linux on a 1.8 GHz Pentium 4 computer. A successful count was obtained with frame rates of 15 frames per second for images of size 240×180 pixels and 24 frames per second for images of size 180×120 pixels, thus being able to count vehicles whose speeds do not exceed 150 km/h.
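A rough sketch of a frame loop of this kind in modern OpenCV (the original system predates these APIs, so this illustrates the pipeline, not the authors' code): background subtraction, morphological cleanup, then connected-component blobs as vehicle candidates.

    import cv2

    cap = cv2.VideoCapture("traffic.avi")      # hypothetical input video
    subtractor = cv2.createBackgroundSubtractorMOG2(history=200)
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        small = cv2.resize(frame, (240, 180))  # small frames for real-time speed
        mask = subtractor.apply(small)
        mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
        n, _, stats, _ = cv2.connectedComponentsWithStats(mask)
        blobs = [i for i in range(1, n)
                 if stats[i, cv2.CC_STAT_AREA] > 100]
        # blobs must still be tracked across frames before incrementing the count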
NASA Astrophysics Data System (ADS)
Grinyer, G. F.; Svensson, C. E.; Andreoiu, C.; Andreyev, A. N.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Chakrawarthy, R. S.; Finlay, P.; Garrett, P. E.; Hackman, G.; Hyland, B.; Kulp, W. D.; Leach, K. G.; Leslie, J. R.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Sarazin, F.; Schumaker, M. A.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Williams, S. J.; Wong, J.; Wood, J. L.; Zganjar, E. F.
2007-09-01
A general technique that corrects γ-ray gated β decay-curve data for detector pulse pile-up is presented. The method includes corrections for non-zero time-resolution and energy-threshold effects in addition to a special treatment of saturating events due to cosmic rays. This technique is verified through a Monte Carlo simulation and experimental data using radioactive beams of ²⁶Na implanted at the center of the 8π γ-ray spectrometer at the ISAC facility at TRIUMF in Vancouver, Canada. The β-decay half-life of ²⁶Na obtained from counting 1809-keV γ-ray photopeaks emitted by the daughter ²⁶Mg was determined to be T₁/₂ = 1.07167 ± 0.00055 s following a 27σ correction for detector pulse pile-up. This result is in excellent agreement with the result of a previous measurement that employed direct β counting and demonstrates the feasibility of high-precision β-decay half-life measurements through the use of high-purity germanium γ-ray detectors. The technique presented here, while motivated by superallowed-Fermi β decay studies, is general and can be used for all half-life determinations (e.g. α-, β-, X-ray, fission) in which a γ-ray photopeak is used to select the decays of a particular isotope.
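For context, extracting a half-life from γ-gated photopeak counts reduces to fitting an exponential decay; a generic sketch with synthetic data (the pile-up, dead-time and threshold corrections that are the point of the paper are deliberately omitted):

    import numpy as np
    from scipy.optimize import curve_fit

    def decay(t, n0, half_life, bkg):
        return n0 * np.exp(-np.log(2.0) * t / half_life) + bkg

    t = np.linspace(0.0, 10.0, 50)                     # s, hypothetical bins
    counts = decay(t, 5000.0, 1.07, 20.0)
    counts += np.random.default_rng(1).normal(0.0, np.sqrt(counts))
    popt, pcov = curve_fit(decay, t, counts, p0=(4000.0, 1.0, 10.0),
                           sigma=np.sqrt(counts), absolute_sigma=True)
    print("T1/2 = %.4f +/- %.4f s" % (popt[1], np.sqrt(pcov[1, 1])))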
Goddard, Braden; Croft, Stephen; Lousteau, Angela; ...
2016-05-25
Safeguarding nuclear material is an important and challenging task for the international community. One particular safeguards technique commonly used for uranium assay is active neutron correlation counting. This technique involves irradiating unused uranium with (α,n) neutrons from an Am-Li source and recording the resultant neutron pulse signal, which includes induced fission neutrons. Although this non-destructive technique is widely employed in safeguards applications, the neutron energy spectrum of an Am-Li source is not well known. Several measurements over the past few decades have been made to characterize this spectrum; however, little work has been done comparing the measured spectra of various Am-Li sources to each other. This paper examines fourteen different Am-Li spectra, focusing on how these spectra affect simulated neutron multiplicity results using the code Monte Carlo N-Particle eXtended (MCNPX). Two measurement and simulation campaigns were completed using Active Well Coincidence Counter (AWCC) detectors and uranium standards of varying enrichment. The results of this work indicate that for standard AWCC measurements, the fourteen Am-Li spectra produce similar doubles and triples count rates. The singles count rates varied by as much as 20% between the different spectra, although they are usually not used in quantitative analysis.
Forecasting in foodservice: model development, testing, and evaluation.
Miller, J L; Thompson, P A; Orabella, M M
1991-05-01
This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
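A minimal sketch of the two forecasting pieces named above: simple exponential smoothing for the customer count and a MAPE score for evaluation (the deseasonalizing step and the preference statistic are assumed, with hypothetical values):

    def ses_forecast(series, alpha=0.3):
        """One-step-ahead simple exponential smoothing forecast."""
        level = series[0]
        for x in series[1:]:
            level = alpha * x + (1 - alpha) * level
        return level

    def mape(actual, forecast):
        """Mean absolute percentage error."""
        return 100.0 * sum(abs(a - f) / a
                           for a, f in zip(actual, forecast)) / len(actual)

    counts = [412, 398, 431, 405, 420]        # hypothetical deseasonalized counts
    count_forecast = ses_forecast(counts)
    item_demand = count_forecast * 0.18       # count forecast x preference statistic
    print(count_forecast, item_demand, mape([410, 400], [405, 395]))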
NASA Technical Reports Server (NTRS)
Zimmerman, G. A.; Olsen, E. T.
1992-01-01
Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
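The threshold-and-count idea can be sketched as follows: each device counts, in a single pass, how many samples fall below its fixed threshold, and the threshold whose empirical fraction is closest to the target quantile serves as the order-statistic estimate; running several devices with staggered thresholds in parallel widens the dynamic range. (A minimal illustration, not the HRMS hardware design.)

    def fraction_below(samples, threshold):
        """Single-pass counter: fraction of samples below threshold."""
        return sum(1 for x in samples if x < threshold) / len(samples)

    def quantile_estimate(samples, thresholds, q=0.5):
        """Pick the threshold whose count fraction is nearest the quantile."""
        return min(thresholds, key=lambda t: abs(fraction_below(samples, t) - q))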
The effect of microchannel plate gain depression on PAPA photon counting cameras
NASA Astrophysics Data System (ADS)
Sams, Bruce J., III
1991-03-01
PAPA (precision analog photon address) cameras are photon counting imagers which employ microchannel plates (MCPs) for image intensification. They have been used extensively in astronomical speckle imaging. The PAPA camera can produce artifacts when light incident on its MCP is highly concentrated. The effect is exacerbated by adjusting the strobe detection level too low, so that the camera accepts very small MCP pulses. The artifacts can occur even at low total count rates if the image has a highly concentrated bright spot. This paper describes how to optimize PAPA camera electronics, and describes six techniques which can avoid or minimize addressing errors.
Garment Counting in a Textile Warehouse by Means of a Laser Imaging System
Martínez-Sala, Alejandro Santos; Sánchez-Aartnoutse, Juan Carlos; Egea-López, Esteban
2013-01-01
Textile logistic warehouses are highly automated mechanized places where control points are needed to count and validate the number of garments in each batch. This paper proposes and describes a low cost and small size automated system designed to count the number of garments by processing an image of the corresponding hanger hooks generated using an array of phototransistors sensors and a linear laser beam. The generated image is processed using computer vision techniques to infer the number of garment units. The system has been tested on two logistic warehouses with a mean error in the estimated number of hangers of 0.13%. PMID:23628760
Girard, Romuald; Zeineddine, Hussein A; Orsbon, Courtney; Tan, Huan; Moore, Thomas; Hobson, Nick; Shenkar, Robert; Lightle, Rhonda; Shi, Changbin; Fam, Maged D; Cao, Ying; Shen, Le; Neander, April I; Rorrer, Autumn; Gallione, Carol; Tang, Alan T; Kahn, Mark L; Marchuk, Douglas A; Luo, Zhe-Xi; Awad, Issam A
2016-09-15
Cerebral cavernous malformations (CCMs) are hemorrhagic brain lesions, where murine models allow major mechanistic discoveries, ushering in genetic manipulations and preclinical assessment of therapies. Histology for lesion counting and morphometry is essential yet tedious and time consuming. We herein describe the application and validation of X-ray micro-computed tomography (micro-CT), a non-destructive technique allowing three-dimensional CCM lesion counting and volumetric measurements, in transgenic murine brains. We describe a new contrast soaking technique not previously applied to murine models of CCM disease. The volumetric segmentation and image processing paradigm allowed for histologic correlations and quantitative validations not previously reported with the micro-CT technique in brain vascular disease. Twenty-two hyper-dense areas on micro-CT images, identified as CCM lesions, were matched by histology. The inter-rater reliability analysis showed strong consistency in CCM lesion identification and staging (K=0.89, p<0.0001) between the two techniques. Micro-CT revealed a 29% greater CCM lesion detection efficiency and an 80% improvement in time efficiency. Serial integrated lesional area by histology showed a strong positive correlation with micro-CT estimated volume (r²=0.84, p<0.0001). Micro-CT allows high-throughput assessment of lesion count and volume in pre-clinical murine models of CCM. This approach complements histology with improved accuracy and efficiency, and can be applied for lesion burden assessment in other brain diseases. Copyright © 2016 Elsevier B.V. All rights reserved.
Book review: Bird census techniques, Second edition
Sauer, John R.
2002-01-01
Conservation concerns, federal mandates to monitor birds, and citizen science programs have spawned a variety of surveys that collect information on bird populations. Unfortunately, all too frequently these surveys are poorly designed and use inappropriate counting methods. Some of the flawed approaches reflect a lack of understanding of statistical design; many ornithologists simply are not aware that many of our most entrenched counting methods (such as point counts) cannot appropriately be used in studies that compare densities of birds over space and time. It is likely that most of the readers of The Condor have participated in a bird population survey that has been criticized for poor sampling methods. For example, North American readers may be surprised to read in Bird Census Techniques that the North American Breeding Bird Survey 'is seriously flawed in its design,' and that 'Analysis of trends is impossible from points that are positioned along roads' (p. 109). Our conservation efforts are at risk if we do not acknowledge these concerns and improve our survey designs. Other surveys suffer from a lack of focus. In Bird Census Techniques, the authors emphasize that all surveys require clear statements of objectives and an understanding of appropriate survey designs to meet their objectives. Too often, we view survey design as the realm of ornithologists who know the life histories and logistical issues relevant to counting birds. This view reflects pure hubris: survey design is a collaboration between ornithologists, statisticians, and managers, in which goals based on management needs are met by applying statistical principles for design to the biological context of the species of interest. Poor survey design is often due to exclusion of some of these partners from survey development. Because ornithologists are too frequently unaware of these issues, books such as Bird Census Techniques take on added importance as manuals for educating ornithologists about the relevance of survey design and methods and the often subtle interdisciplinary nature of surveys. Review info: Bird Census Techniques, Second Edition. By Colin J. Bibby, Neil D. Burgess, David A. Hill, and Simon H. Mustoe. 2000. Academic Press, London, UK. xvii + 302 pp. ISBN 0-12-095831-7.
Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.
2007-01-01
This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r2 ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r2 ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than adequate for the majority of sedimentological applications, especially considering that the autocorrelation technique is estimated to be at least 100 times faster than traditional methods.
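The quantity underlying Rubin's (2004) algorithm is the image autocorrelation as a function of pixel lag, which decays faster with lag for finer sediment; calibration against sieved samples then converts the curve to grain size. A minimal sketch of the curve computation (the calibration step is not shown):

    import numpy as np

    def autocorrelation_curve(img, max_lag=30):
        """Mean horizontal autocorrelation of a grayscale image vs. lag."""
        z = (img - img.mean()) / img.std()
        return [float(np.mean(z[:, lag:] * z[:, :-lag]))
                for lag in range(1, max_lag + 1)]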
Lu, Hung-Yi; Chu, Yen; Wu, Yi-Cheng; Liu, Chien-Ying; Hsieh, Ming-Ju; Chao, Yin-Kai; Wu, Ching-Yang; Yuan, Hsu-Chia; Ko, Po-Jen; Liu, Yun-Hen; Liu, Hui-Ping
2015-04-01
Single-port transumbilical surgery is a well-established platform for minimally invasive abdominal surgery. The aim of this study was to compare the hemodynamics and inflammatory response of a novel transumbilical technique with that of a conventional transthoracic technique in thoracic exploration and lung resection in a canine model. Sixteen dogs were randomly assigned to undergo transumbilical thoracoscopy (n = 8) or standard thoracoscopy (n = 8). Animals in the umbilical group received lung resection via a 3-cm transumbilical incision in combination with a 2.5-cm transdiaphragmatic incision. Animals in the standard thoracoscopy group underwent lung resection via a 3-cm thoracic incision. Hemodynamic parameters (e.g., mean arterial pressure, heart rate, cardiac index, systemic vascular resistance, and global end-diastolic volume index) and inflammatory parameters (e.g., neutrophil count, neutrophil 2',7'-dichlorohydrofluorescein [DCFH] expression, monocyte count, monocyte inducible nitric oxide synthase expression, total lymphocyte count, CD4+ and CD8+ lymphocyte counts, the CD4+/CD8+ ratio, plasma C-reactive protein level, and interleukin-6 level) were evaluated before surgery, during the operation, and on postoperative days 1, 3, 7, and 14. Lung resections were successfully performed in all 16 animals. There were 2 surgery-related deaths (1 animal in each group). In the transumbilical group, 1 death was caused by early extubation before the animal fully recovered from anesthesia. In the thoracoscopy group, 1 death was caused by respiratory distress and the complication of sepsis 5 days after surgery. There was no significant difference between the two techniques with regard to the hemodynamic and immunologic impact of the surgeries. This study suggests that the hemodynamic and inflammatory changes with endoscopic lung resection performed by the transumbilical approach are comparable to those after using the conventional transthoracic approach. This information is novel and relevant for surgeons interested in developing new surgical techniques in minimally invasive surgery. Copyright © 2015 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.
Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H
2015-12-01
Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.
NASA Astrophysics Data System (ADS)
Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo
2017-03-01
Contrast-enhanced mammography has been used to demonstrate functional information about a breast tumor by injecting contrast agents. However, the conventional technique with a single exposure degrades the efficiency of tumor detection due to structure overlapping. Dual-energy techniques with energy-integrating detectors (EIDs) also cause an increase in radiation dose and an inaccuracy of material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) is able to resolve the issues induced by the conventional technique and EIDs using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented by using a polychromatic dual-energy model, and the proposed technique was compared with the dual-energy technique with an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved the quantitative accuracy as well as reduced the radiation dose compared with the dual-energy technique with an EID. The quantitative accuracy of contrast-enhanced spectral mammography based on a PCD was slightly improved as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD is able to provide useful information for detecting breast tumors and improving diagnostic accuracy.
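At its core, dual-energy material decomposition inverts a small linear system linking measured log-attenuations in two energy bins to the areal densities of two basis materials; a highly simplified linearized sketch with hypothetical attenuation coefficients (beam hardening and the paper's polychromatic model are not represented):

    import numpy as np

    # Rows: low- and high-energy bins; columns: [soft tissue, iodine].
    # Values are hypothetical mass attenuation coefficients (cm^2/g).
    A = np.array([[0.50, 4.00],
                  [0.30, 1.50]])
    y = np.array([2.1, 1.0])            # measured -ln(I/I0) per energy bin
    densities = np.linalg.solve(A, y)   # areal density (g/cm^2) per material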
Efficient statistical mapping of avian count data
Royle, J. Andrew; Wikle, C.K.
2005-01-01
We develop a spatial modeling framework for count data that is efficient to implement in high-dimensional prediction problems. We consider spectral parameterizations for the spatially varying mean of a Poisson model. The spectral parameterization of the spatial process is very computationally efficient, enabling effective estimation and prediction in large problems using Markov chain Monte Carlo techniques. We apply this model to creating avian relative abundance maps from North American Breeding Bird Survey (BBS) data. Variation in the ability of observers to count birds is modeled as spatially independent noise, resulting in over-dispersion relative to the Poisson assumption. This approach represents an improvement over existing approaches used for spatial modeling of BBS data, which are either inefficient for continental-scale modeling and prediction or fail to accommodate important distributional features of count data, thus leading to inaccurate accounting of prediction uncertainty.
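A toy sketch of the spectral idea, assuming statsmodels: the Poisson log-mean is expanded in a low-order Fourier basis over scaled spatial coordinates, so the parameter count stays small even for dense prediction grids (the paper's MCMC treatment and observer-effect model are not reproduced):

    import numpy as np
    import statsmodels.api as sm

    def fourier_basis(coords, n_freq=2):
        """Low-order 2-D Fourier design matrix; coords scaled to [0, 2*pi)."""
        cols = [np.ones(len(coords))]
        for k in range(1, n_freq + 1):
            for c in coords.T:
                cols += [np.cos(k * c), np.sin(k * c)]
        return np.column_stack(cols)

    rng = np.random.default_rng(0)
    coords = rng.uniform(0.0, 2.0 * np.pi, size=(500, 2))   # scaled lon/lat
    counts = rng.poisson(np.exp(0.5 + 0.8 * np.sin(coords[:, 0])))
    X = fourier_basis(coords)
    fit = sm.GLM(counts, X, family=sm.families.Poisson()).fit()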
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ripley, S.; Wakefield, T.; Spaulding, S.
1985-05-01
In this investigation platelet deposition in polytetrafluoroethylene (PTFE) thoracoabdominal grafts (TAGs) was evaluated using two different semi-quantitative techniques. Ten PTFE TAGs 6 mm in diameter and 30 cm in length were inserted into 10 mongrel dogs. One, 4 and 6 weeks after graft implantation the animals were injected with autologous In-111 platelets labelled by a modified Thakur technique. Platelet imaging in grafts was performed 48 hrs after injection. Blood pool was determined by Tc-99m labelled RBCs (in vivo/in vitro technique). Semi-quantitative analysis was performed by subdividing the imaged graft into three major regions and selecting a reference region from either the native aorta or common iliac artery. Excess platelet deposition was determined by two methods: 1) the ratio of In-111 counts in the graft ROIs to the reference region and 2) the percent In-111 excess using the Tc-99m blood pool subtraction technique (TBPST). Animals were sacrificed 7 weeks after implantation and radioactivity in the excised grafts was determined using a well counter. A positive correlation was found to exist between the In-111 ratio percent analysis (IRPA) and direct gamma counting (DGC) for all three segments of the prosthetic graft. Correlation coefficients for the thorax, midsegment and abdominal segments were 0.80, 0.73 and 0.48, respectively. There was no correlation between TBPST and DGC. Using the IRPA technique the thrombogenicity of TAGs can be routinely assessed and is clinically applicable for patient use. TBPST should probably be limited to the extremities to avoid error due to free Tc-99m counts from kidneys and ureters.
Extending unbiased stereology of brain ultrastructure to three-dimensional volumes
NASA Technical Reports Server (NTRS)
Fiala, J. C.; Harris, K. M.; Koslow, S. H. (Principal Investigator)
2001-01-01
OBJECTIVE: Analysis of brain ultrastructure is needed to reveal how neurons communicate with one another via synapses and how disease processes alter this communication. In the past, such analyses have usually been based on single or paired sections obtained by electron microscopy. Reconstruction from multiple serial sections provides a much needed, richer representation of the three-dimensional organization of the brain. This paper introduces a new reconstruction system and new methods for analyzing in three dimensions the location and ultrastructure of neuronal components, such as synapses, which are distributed non-randomly throughout the brain. DESIGN AND MEASUREMENTS: Volumes are reconstructed by defining transformations that align the entire area of adjacent sections. Whole-field alignment requires rotation, translation, skew, scaling, and second-order nonlinear deformations. Such transformations are implemented by a linear combination of bivariate polynomials. Computer software for generating transformations based on user input is described. Stereological techniques for assessing structural distributions in reconstructed volumes are the unbiased bricking, disector, unbiased ratio, and per-length counting techniques. A new general method, the fractional counter, is also described. This unbiased technique relies on the counting of fractions of objects contained in a test volume. A volume of brain tissue from stratum radiatum of hippocampal area CA1 is reconstructed and analyzed for synaptic density to demonstrate and compare the techniques. RESULTS AND CONCLUSIONS: Reconstruction makes practicable volume-oriented analysis of ultrastructure using such techniques as the unbiased bricking and fractional counter methods. These analysis methods are less sensitive to the section-to-section variations in counts and section thickness, factors that contribute to the inaccuracy of other stereological methods. In addition, volume reconstruction facilitates visualization and modeling of structures and analysis of three-dimensional relationships such as synaptic connectivity.
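One of the counting rules named above, the unbiased brick, can be illustrated by reducing each object to a unique leading point and testing it against a half-open box, so objects touching the shared faces of adjacent bricks are counted exactly once. A minimal sketch with hypothetical coordinates:

    def in_brick(point, lo, hi):
        """Half-open [lo, hi) test implements inclusion/exclusion faces."""
        return all(l <= c < h for c, l, h in zip(point, lo, hi))

    leading_points = [(1.2, 3.4, 0.5), (9.9, 0.1, 4.9), (10.0, 5.0, 2.0)]
    count = sum(in_brick(p, (0.0, 0.0, 0.0), (10.0, 10.0, 5.0))
                for p in leading_points)
    print(count)   # -> 2; the point at x = 10.0 lies on an exclusion face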
NASA Technical Reports Server (NTRS)
Peters, Kevin A.; Hammond, Ernest C., Jr.
1987-01-01
The age of the surf clam (Spisula solidissima) can be determined with the use of the Digital Image Processor. This technique is used in conjunction with a modified method for aging, refined by John Ropes of the Woods Hole Laboratory, Massachusetts. This method utilizes a thinned sectioned chondrophore of the surf clam which contains annual rings. The rings of the chondrophore are then counted to determine age. By digitizing the chondrophore, the Digital Image Processor is clearly able to separate these annual rings more accurately. This technique produces an easier and more efficient way to count annual rings to determine the age of the surf clam.
Counting malaria parasites with a two-stage EM-based algorithm using crowdsourced data.
Cabrera-Bean, Margarita; Pages-Zamora, Alba; Diaz-Vilor, Carles; Postigo-Camps, Maria; Cuadrado-Sanchez, Daniel; Luengo-Oroz, Miguel Angel
2017-07-01
Worldwide malaria eradication is currently one of the WHO's main global goals. In this work, we focus on the use of human-machine interaction strategies for low-cost, fast, and reliable malaria diagnosis based on a crowdsourced approach. The technical problem addressed consists of detecting spots in images even under very harsh conditions, when positive objects are very similar to some artifacts. The clicks or tags delivered by several annotators labeling an image are modeled as a robust finite mixture, and techniques based on the Expectation-Maximization (EM) algorithm are proposed for accurately counting malaria parasites on thick blood smears obtained by microscopic Giemsa-stained techniques. This approach outperforms other traditional methods, as is shown through experimentation with real data.
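As a rough illustration of the counting idea (with scikit-learn's Gaussian mixture standing in for the paper's robust two-stage EM): annotator clicks are clustered, and the parasite count is read off the mixture order selected by BIC.

    import numpy as np
    from sklearn.mixture import GaussianMixture

    def count_spots(clicks, max_spots=30):
        """clicks: (n, 2) array of annotator click coordinates."""
        best_k, best_bic = 1, np.inf
        for k in range(1, max_spots + 1):
            gm = GaussianMixture(n_components=k, random_state=0).fit(clicks)
            bic = gm.bic(clicks)
            if bic < best_bic:
                best_k, best_bic = k, bic
        return best_k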
γγ coincidence spectrometer for instrumental neutron-activation analysis
NASA Astrophysics Data System (ADS)
Tomlin, B. E.; Zeisler, R.; Lindstrom, R. M.
2008-05-01
Neutron-activation analysis (NAA) is an important technique for the accurate and precise determination of trace and ultra-trace elemental compositions. The application of γγ coincidence counting to NAA in order to enhance specificity was first explored over 40 years ago but has not evolved into a regularly used technique. A γγ coincidence spectrometer has been constructed at the National Institute of Standards and Technology, using two HPGe γ-ray detectors and an all-digital data-acquisition system, for the purpose of exploring coincidence NAA and its value in characterizing reference materials. This paper describes the initial evaluation of the quantitative precision of coincidence counting versus singles spectrometry, based upon a sample of neutron-irradiated bovine liver material.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morris, Christopher; Durham, J. Matthew; Guardincerri, Elena
Cosmic ray muon imaging has been studied for the past several years as a possible technique for nuclear warhead inspection and verification as part of the New Strategic Arms Reduction Treaty between the United States and the Russian Federation. The Los Alamos team has studied two different muon imaging methods for this application, using detectors on two sides and one side of the object of interest. In this report we present results obtained on single-sided imaging of configurations aimed at demonstrating the potential of this technique for counting nuclear warheads in place with detectors above the closed hatch of a ballistic missile submarine.
Mastalerz, Maria; Gurba, L.W.
2001-01-01
This paper discusses nitrogen determination with the Cameca SX50 electron microprobe using PCO as an analyzing crystal. A set of conditions using differing accelerating voltages, beam currents, beam sizes, and counting times were tested to determine the parameters that would give the most reliable nitrogen determination. The results suggest that, for the instrumentation used, an accelerating voltage of 10 kV, a beam current of 20 nA, and a counting time of 20 s provide the most reliable nitrogen determination, with a much lower detection limit than the typical concentration of this element in coal. The study demonstrates that the electron microprobe technique can be used to determine the nitrogen content of coal macerals successfully and accurately. © 2001 Elsevier Science B.V. All rights reserved.
Nakamura, Keisuke; Shirato, Midori; Kanno, Taro; Örtengren, Ulf; Lingström, Peter; Niwano, Yoshimi
2016-10-01
Prevention of dental caries with maximum conservation of intact tooth substance remains a challenge in dentistry. The present study aimed to evaluate the antimicrobial effect of H2O2 photolysis on Streptococcus mutans biofilm, which may be a novel antimicrobial chemotherapy for treating caries. S. mutans biofilm was grown on disk-shaped hydroxyapatite specimens. After 1-24 h of incubation, growth was assessed by confocal laser scanning microscopy and viable bacterial counting. Resistance to antibiotics (amoxicillin and erythromycin) was evaluated by comparing bactericidal effects on the biofilm with those on planktonic bacteria. To evaluate the effect of the antimicrobial technique, the biofilm was immersed in 3% H2O2 and was irradiated with an LED at 365 nm for 1 min. Viable bacterial counts in the biofilm were determined by colony counting. The thickness and surface coverage of S. mutans biofilm increased with time, whereas viable bacterial counts plateaued after 6 h. When 12- and 24-h-old biofilms were treated with the minimum concentration of antibiotics that killed viable planktonic bacteria with 3 log reduction, their viable counts were not significantly decreased, suggesting the biofilm acquired antibiotic resistance by increasing its thickness. By contrast, hydroxyl radicals generated by photolysis of 3% H2O2 effectively killed S. mutans in 24-h-old biofilm, with greater than 5 log reduction. The technique based on H2O2 photolysis is a potentially powerful adjunctive antimicrobial chemotherapy for caries treatment. Copyright © 2016 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.
Design and evaluation of a nondestructive fissile assay device for HTGR fuel samples
DOE Office of Scientific and Technical Information (OSTI.GOV)
McNeany, S. R.; Knoll, R. W.; Jenkins, J. D.
1979-02-01
Nondestructive assay of fissile material plays an important role in nuclear fuel processing facilities. Information for product quality control, plant criticality safety, and nuclear materials accountability can be obtained from assay devices. All of this is necessary for a safe, efficient, and orderly operation of a production plant. Presented here is a design description and an operational evaluation of a device developed to nondestructively assay small samples of High-Temperature Gas-Cooled Reactor (HTGR) fuel. The measurement technique employed consists of thermal-neutron irradiation of a sample followed by pneumatic transfer to a high-efficiency neutron detector where delayed neutrons are counted. In general, samples undergo several irradiation and count cycles during a measurement. The total number of delayed-neutron counts accumulated is translated into grams of fissile mass through comparison with the counts accumulated in an identical irradiation and count sequence of calibration standards. Successful operation of the device through many experiments over a one-year period indicates high operational reliability. Tests of assay precision show this to be better than 0.25% for measurements of 10 min. Assay biases may be encountered if calibration standards are not representative of unknown samples, but reasonable care in construction and control of standards should lead to no more than 0.2% bias in the measurements. Nondestructive fissile assay of HTGR fuel samples by thermal-neutron irradiation and delayed-neutron detection has been demonstrated to be a rapid and accurate analysis technique. However, careful attention and control must be given to calibration standards to see that they remain representative of unknown samples.
Takeo, H; Sakurai, T; Amaki, I
1983-01-01
The techniques of aseptic procedures in the laminar airflow room (LAF) were evaluated in 110 adult patients undergoing antileukemic chemotherapy for remission induction. The patients were divided into three groups according to the regimens: Group A, consisting of 20 patients who stayed in the LAF and received the gown technique + sterile food + prophylactic oral and topical antibiotics; Group B, consisting of 12 patients who stayed in the LAF and received sterile food + prophylactic oral antibiotics; and Group C, consisting of 78 patients in open wards, who received prophylactic oral antibiotics alone. The species and numbers of microorganisms on the skin surface were far fewer in the patients in Group A than in those in Group B. Airborne microorganisms were counted by the air sampling method. No microorganisms could be detected while the patient was at rest or during blood collection in either Group A or B. Electrocardiography and X-ray examination increased the airborne count to more than one colony in Group B, whereas Group A remained below 0.5 colony. The colony counts became negative within 5 min after the cessation of each operation. The percentage of febrile days for patients with a peripheral granulocyte count of less than 100/microliter was 29% in Group A, 21% in Group B and 44% in Group C. The incidence of documented infections during the total hospital stay was 25% (5/20), 42% (5/12) and 86% (67/78), respectively. The aseptic procedures in Group B were not as strict as in Group A, but the incidence of infections in Group B was significantly lower than in Group C.
A new model to predict weak-lensing peak counts. II. Parameter constraint strategies
NASA Astrophysics Data System (ADS)
Lin, Chieh-An; Kilbinger, Martin
2015-11-01
Context. Peak counts have been shown to be an excellent tool for extracting the non-Gaussian part of the weak lensing signal. Recently, we developed a fast stochastic forward model to predict weak-lensing peak counts. Our model is able to reconstruct the underlying distribution of observables for analysis. Aims: In this work, we explore and compare various strategies for constraining a parameter using our model, focusing on the matter density Ωm and the density fluctuation amplitude σ8. Methods: First, we examine the impact from the cosmological dependency of covariances (CDC). Second, we perform the analysis with the copula likelihood, a technique that makes a weaker assumption than does the Gaussian likelihood. Third, direct, non-analytic parameter estimations are applied using the full information of the distribution. Fourth, we obtain constraints with approximate Bayesian computation (ABC), an efficient, robust, and likelihood-free algorithm based on accept-reject sampling. Results: We find that neglecting the CDC effect enlarges parameter contours by 22% and that the covariance-varying copula likelihood is a very good approximation to the true likelihood. The direct techniques work well in spite of noisier contours. Concerning ABC, the iterative process converges quickly to a posterior distribution that is in excellent agreement with results from our other analyses. The time cost for ABC is reduced by two orders of magnitude. Conclusions: The stochastic nature of our weak-lensing peak count model allows us to use various techniques that approach the true underlying probability distribution of observables, without making simplifying assumptions. Our work can be generalized to other observables where forward simulations provide samples of the underlying distribution.
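The accept-reject ABC step mentioned in this abstract can be illustrated in a few lines: draw parameters from the prior, forward-simulate a summary statistic, and keep the draws whose summaries land closest to the observed one. The toy Poisson model below merely stands in for the peak-count forward simulation; nothing about it comes from the paper.

```python
import numpy as np

def abc_rejection(observed_summary, simulate, prior_draw,
                  n_draws=20000, quantile=0.01, rng=None):
    """Likelihood-free accept-reject ABC: sample parameters from the
    prior, simulate a summary statistic for each, and keep the draws
    whose summaries are closest to the observation."""
    rng = rng or np.random.default_rng()
    thetas = np.array([prior_draw(rng) for _ in range(n_draws)])
    dists = np.array([abs(simulate(t, rng) - observed_summary)
                      for t in thetas])
    eps = np.quantile(dists, quantile)   # acceptance tolerance
    return thetas[dists <= eps]

# Toy model: the summary is a noisy count that grows with theta
simulate = lambda theta, rng: rng.poisson(10 * theta)
prior_draw = lambda rng: rng.uniform(0.1, 2.0)
posterior = abc_rejection(observed_summary=8.0, simulate=simulate,
                          prior_draw=prior_draw)
print(posterior.mean(), posterior.std())   # approximate posterior
```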
Kinoshita, S; Suzuki, T; Yamashita, S; Muramatsu, T; Ide, M; Dohi, Y; Nishimura, K; Miyamae, T; Yamamoto, I
1992-01-01
A new radionuclide technique for the calculation of left ventricular (LV) volume by the first-pass (FP) method was developed and examined. Using a semi-geometric count-based method, the LV volume can be measured by the following equations: CV = CM/(L/d) and V = (CT/CV) × d³ = (CT/CM) × L × d², where V = LV volume, CV = voxel count, CM = the maximum LV count, CT = the total LV count, L = the LV depth at which the maximum count was obtained, and d = pixel size. This theorem was applied to FP LV images obtained in the 30-degree right anterior oblique position. Frame-mode acquisition was performed and the LV end-diastolic maximum count and total count were obtained. The maximum LV depth was obtained as the maximum width of the LV on the FP end-diastolic image, using the assumption that the LV cross-section is circular. These values were substituted in the above equation and the LV end-diastolic volume (FP-EDV) was calculated. A routine equilibrium (EQ) study was done, and the end-diastolic maximum count and total count were obtained. The LV maximum depth was measured on the FP end-diastolic frame, as the maximum length of the LV image. Using these values, the EQ-EDV was calculated and the FP-EDV was compared to the EQ-EDV. The correlation coefficient for these two values was r = 0.96 (n = 23, p less than 0.001), and the standard error of the estimated volume was 10 ml.(ABSTRACT TRUNCATED AT 250 WORDS)
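For readers who want to check the arithmetic, the count-based volume relation quoted above reduces to a one-line function. The numbers in the example are invented for illustration, not taken from the study.

```python
def lv_volume(total_count, max_count, depth_cm, pixel_cm):
    """First-pass LV volume from the count-based relation in the
    abstract: CV = CM / (L / d), so V = (CT / CM) * L * d**2."""
    return (total_count / max_count) * depth_cm * pixel_cm ** 2

# Illustrative numbers only (not from the study): volume in mL (cm^3)
print(lv_volume(total_count=2.4e5, max_count=1.8e3,
                depth_cm=8.0, pixel_cm=0.3))   # -> 96.0
```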
A method for the in vivo measurement of americium-241 at long times post-exposure
DOE Office of Scientific and Technical Information (OSTI.GOV)
Neton, J.W.
1988-01-01
This study investigated an improved method for the quantitative measurement, calibration and calculation of 241Am organ burdens in humans. The techniques developed correct for cross-talk, or count-rate contributions from surrounding and adjacent organ burdens, and ensure the proper assignment of activity to the lungs, liver and skeleton. In order to predict the net count-rates for the measurement geometries of the skull, liver and lung, a background prediction method was developed. This method utilizes data obtained from the measurement of a group of control subjects. Based on these data, a linear prediction equation was developed for each measurement geometry. In order to correct for the cross-contributions among the various deposition loci, a series of surrogate human phantom structures were measured. The results of measurements of 241Am depositions in six exposure cases have been evaluated using these new techniques and indicate that lung burden estimates could be in error by as much as 100 percent when corrections are not made for contributions to the count-rate from other organs.
Effect of surgical hand scrub time on subsequent bacterial growth.
Wheelock, S M; Lookinland, S
1997-06-01
In this experimental study, the researchers evaluated the effect of surgical hand scrub time on subsequent bacterial growth and assessed the effectiveness of the glove juice technique in a clinical setting. In a randomized crossover design, 25 perioperative staff members scrubbed for two or three minutes in the first trial and vice versa in the second trial, after which they wore sterile surgical gloves for one hour under clinical conditions. The researchers then sampled the subjects' nondominant hands for bacterial growth, cultured aliquots from the sampling solution, and counted microorganisms. Scrubbing for three minutes produced lower mean log bacterial counts than scrubbing for two minutes. Although the mean bacterial count differed significantly (P = .02) between the two-minute and three-minute surgical hand scrub times, the difference fell below 0.5 log, which is the threshold for practical and clinical significance. This finding suggests that a two-minute surgical hand scrub is clinically as effective as a three-minute surgical hand scrub. The glove juice technique demonstrated sensitivity and reliability in enumerating bacteria on the hands of perioperative staff members in a clinical setting.
Kaur, S; Nieuwenhuijsen, M J
2009-07-01
Short-term human exposure concentrations to PM2.5, ultrafine particle counts (particle range: 0.02-1 μm), and carbon monoxide (CO) were investigated at and around a street canyon intersection in Central London, UK. During a four-week field campaign, groups of four volunteers collected samples at three times of day (morning, lunch, and afternoon), along two different routes (a heavily trafficked route and a backstreet route), via five modes of transport (walking, cycling, bus, car, and taxi). This was followed by an investigation into the determinants of exposure using a regression technique which incorporated the site-specific traffic counts, meteorological variables (wind speed and temperature), and the mode of transport used. The analyses explained 9, 62, and 43% of the variability observed in the exposure concentrations to PM2.5, ultrafine particle counts, and CO, respectively. The mode of transport was a statistically significant determinant of personal exposure to PM2.5, ultrafine particle counts, and CO, and for PM2.5 and ultrafine particle counts it was the most important determinant. Traffic count explained little of the variability in the PM2.5 concentrations, but it had a greater influence on ultrafine particle count and CO concentrations. The analyses showed that temperature had a statistically significant impact on ultrafine particle count and CO concentrations. Wind speed also had a statistically significant, though smaller, effect. The small proportion of variability explained for PM2.5, compared with the larger proportions for ultrafine particle counts and CO, may be due to the effect of long-range transboundary sources; for ultrafine particle counts and CO, local traffic is the main source.
Bekibele, Charles O; Kehinde, Aderemi O; Ajayi, Benedictus G K
2010-12-01
To determine the effect of face washing with soap and water and cleaning with povidone iodine and cetrimide/chlorhexidine gluconate (Savlon) on upper-lid bacteria. Prospective, nonrandomized clinical trial. Eighty patients attending the Eye Clinic, University College Hospital, Ibadan, Nigeria. Eighty patients assigned to 4 groups had swabs of the upper eyelid skin taken before and after face wash with soap and water, and cleansing with Savlon and 5% povidone iodine. Specimens were cultured and Gram stained. Bacterial counts were carried out using standard techniques. Face washing with soap and water increased the proportion of patients with bacterial isolates from 80.0% to 87.5%. The average colony count increased from 187.1 to 318.5 colony units per mL (p = 0.02). Application of 5% povidone iodine without face washing with soap and water reduced the proportion of patients with bacterial isolates from 82.6% (mean count 196.5) to 28.6% (mean count 34.1)(p = 0.001); in comparison, the application of 5% povidone iodine after face washing with soap and water reduced the proportion from 71.4% (mean count 133.9) to 40.0% (mean count 69.0)(p = 0.01). Application of Savlon without face washing with soap and water reduced the proportion of patients with bacterial isolates from 100% (mean count 310.9) to 41.2% (mean count 19.8)(p = 0.004) compared with the application after face washing, which reduced the proportion from 89.5% (mean count 240.3) to 41.2% (mean count 82.9)(p = 0.02). Both povidone and Savlon are effective in reducing periocular bacteria in an African setting. Prior face washing with soap and water had no added benefit in reducing bacterial colony count.
Benítez, Francisco Moreno; Camacho, Antonio Letrán; Del Cuvillo Bernal, Alfonso; de Medina, Pedro Lobatón Sánchez; Cózar, Francisco J García; Romeu, Ma Luisa Espinazo
2013-07-10
Background: There is an increase in the incidence of pollen-related allergy, so information on pollen schedules would be a great asset for physicians seeking to improve the clinical care of patients. Cypress pollen sensitization shows a high prevalence among the causes of allergic rhinitis, and it is therefore of interest as a study model, distinguishing between cypress pollen count and allergenic load level. In this work, we use a flow cytometry based technique to obtain both the Cupressus arizonica pollen count and the allergenic load, using the specific rabbit polyclonal antibody Cup a1, and compare it with optical microscopy measurements. Methods: Airborne samples were collected with Burkard Spore-Trap and Burkard Cyclone samplers. Cupressus arizonica pollen was studied using the specific rabbit polyclonal antibody Cup a1, labelled with AlexaFluor® 488 or 750, and analysed by flow cytometry in both EPICS XL and Cyan ADP cytometers (Beckman Coulter®). The optical microscopy study was performed with a Leica optical microscope. Bland & Altman analysis was used to determine agreement between the two measurement techniques. Results: We identified three different populations based on rabbit polyclonal antibody Cup a1 staining. The main region (44.5%) had 97.3% recognition, a second region (25%) 28%, and a third region (30.5%) 68%, respectively. Immunofluorescence and confocal microscopy showed that the main region corresponds to whole pollen grains, the second region to pollen without exine, and the third region to smaller particles with allergenic properties. The pollen schedule showed a high correlation between optical microscopy and flow cytometry for the pollen count (p-value: 0.0008E-2) and for the smaller particles (p-value: 0.0002), and the Bland & Altman analysis showed good agreement between the two techniques (p-value: 0.0003). Conclusion: Determination of pollen count and allergenic load by flow cytometry represents an important tool in the determination of airborne respiratory allergens. We showed that not only whole pollen but also smaller particles could induce allergic sensitization. This is the first study in which flow cytometry is used to calculate pollen counts and allergenic load. © 2013 Clinical Cytometry Society.
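The Bland & Altman comparison used here is easy to reproduce for any pair of counting techniques: compute the mean bias between paired measurements and the 95% limits of agreement. The sketch below uses hypothetical daily pollen counts, not the study's data.

```python
import numpy as np

def bland_altman_limits(a, b):
    """Bland & Altman agreement between two measurement techniques:
    returns the mean bias and the lower/upper 95% limits of
    agreement (bias +/- 1.96 * SD of the paired differences)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

microscopy = [120, 85, 230, 40, 310, 150]   # hypothetical daily counts
cytometry = [115, 90, 221, 44, 298, 160]
print(bland_altman_limits(microscopy, cytometry))
```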
DOE Office of Scientific and Technical Information (OSTI.GOV)
Schwartz, A.N.; Graham, M.M.; Ferency, G.F.
1989-04-01
Radioisotope penile plethysmography is a nuclear medicine technique which assists in the evaluation of patients with erectile dysfunction. This technique attempts to noninvasively quantitate penile corpora cavernosal blood flow during early penile tumescence using technetium-99m-labeled red blood cells. Penile images and counts were acquired in a steady-state blood-pool phase prior to and after the administration of intracorporal papaverine. Penile counts, images, and time-activity curves were computer analyzed in order to determine peak corporal flow and volume changes. Peak corporal flow rates were compared to arterial integrity (determined by angiography) and venosinusoidal corporal leak (determined by cavernosometry). Peak corporal flow correlated well with arterial integrity (r = 0.91) but did not correlate with venosinusoidal leak parameters (r = 0.01). This report focuses on the methodology and the assumptions which form the foundation of this technique. The strong correlation of peak corporal flow and angiography suggests that radioisotope penile plethysmography could prove useful in the evaluation of arterial inflow disorders in patients with erectile dysfunction.
The Measurement of Human Body-Fluid Volumes: Resting Fluid Volumes Before and After Heat Acclimation
2001-01-01
equilibration period. Erythrocyte aliquots were haemolysed before counting with saponin. Both counts were used to correct the derived ECFV, which was... was largely in accordance with the procedures of Greenleaf et al. (1980). This technique used an extraction procedure in which the dye was first... collection. Therefore, the above extraction procedure was not used. A major limitation of using a cellulose column is the possibility of not collecting all
Method of detecting and counting bacteria
NASA Technical Reports Server (NTRS)
Picciolo, G. L.; Chappelle, E. W. (Inventor)
1976-01-01
An improved method is provided for determining bacterial levels, especially in samples of aqueous physiological fluids. The method depends on the quantitative determination of bacterial adenosine triphosphate (ATP) in the presence of nonbacterial ATP. The bacterial ATP is released by cell rupture and is measured by an enzymatic bioluminescent assay. A concentration technique is included to make the method more sensitive. It is particularly useful where the fluid to be measured contains an unknown or low bacteria count.
Semi-automated identification of cones in the human retina using circle Hough transform
Bukowska, Danuta M.; Chew, Avenell L.; Huynh, Emily; Kashani, Irwin; Wan, Sue Ling; Wan, Pak Ming; Chen, Fred K
2015-01-01
A large number of human retinal diseases are characterized by a progressive loss of cones, the photoreceptors critical for visual acuity and color perception. Adaptive Optics (AO) imaging presents a potential method to study these cells in vivo. However, AO imaging in ophthalmology is a relatively new phenomenon and quantitative analysis of these images remains difficult and tedious using manual methods. This paper illustrates a novel semi-automated quantitative technique enabling registration of AO images to macular landmarks, cone counting and its radius quantification at specified distances from the foveal center. The new cone counting approach employs the circle Hough transform (cHT) and is compared to automated counting methods, as well as arbitrated manual cone identification. We explore the impact of varying the circle detection parameter on the validity of cHT cone counting and discuss the potential role of using this algorithm in detecting both cones and rods separately. PMID:26713186
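A minimal version of the circle-Hough-transform cone detection can be assembled from standard scikit-image calls: edge-detect the frame, vote over a range of candidate radii, and read off the strongest circle peaks. The synthetic image, the radius range, and the request for exactly three peaks are assumptions for this toy example, not the authors' tuned parameters.

```python
import numpy as np
from skimage.draw import disk
from skimage.feature import canny
from skimage.transform import hough_circle, hough_circle_peaks

# Synthetic AO-like frame: three bright circular "cones" on a dark field
image = np.zeros((120, 120))
for center in [(30, 40), (60, 80), (90, 25)]:
    rr, cc = disk(center, 6, shape=image.shape)
    image[rr, cc] = 1.0

edges = canny(image, sigma=1.5)          # edge map for Hough voting
radii = np.arange(4, 10)                 # candidate cone radii (pixels)
hspaces = hough_circle(edges, radii)
accums, cx, cy, found_radii = hough_circle_peaks(hspaces, radii,
                                                 total_num_peaks=3)
print(len(cx), list(zip(cy, cx, found_radii)))  # count, (row, col, r)
```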
FPGA-based gating and logic for multichannel single photon counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pooser, Raphael C; Earl, Dennis Duncan; Evans, Philip G
2012-01-01
We present results characterizing multichannel InGaAs single photon detectors utilizing gated passive quenching circuits (GPQC), self-differencing techniques, and field programmable gate array (FPGA)-based logic for both diode gating and coincidence counting. Utilizing FPGAs for the diode gating frontend and the logic counting backend has the advantage of low cost compared to custom built logic circuits and current off-the-shelf detector technology. Further, FPGA logic counters have been shown to work well in quantum key distribution (QKD) test beds. Our setup combines multiple independent detector channels in a reconfigurable manner via an FPGA backend and post processing in order to perform coincidence measurements between any two or more detector channels simultaneously. Using this method, states from a multi-photon polarization entangled source are detected and characterized via coincidence counting on the FPGA. Photon detection events are also processed by the quantum information toolkit for application testing (QITKAT).
Winter, F H; York, G K; el-Nakhal, H
1971-07-01
A rapid method for estimating the extent of microbial contamination on food and on food processing equipment is described. Microbial cells are rinsed from food or swab samples with sterile diluent and concentrated on the surface of membrane filters. The filters are incubated on a suitable bacteriological medium for 4 hr at 30°C, heated at 105°C for 5 min, and stained. The membranes are then dried at 60°C for 15 min, rendered transparent with immersion oil, and examined microscopically. Data obtained by the rapid method were compared with counts of the same samples determined by the standard plate count method. Over 60 comparisons resulted in a correlation coefficient of 0.906. Because the rapid technique can provide reliable microbiological count information in extremely short times, it can be a most useful tool in the routine evaluation of microbial contamination of food processing facilities and for some foods.
New method for estimating bacterial cell abundances in natural samples by use of sublimation
NASA Technical Reports Server (NTRS)
Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.
2004-01-01
We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples, including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert, were heated to a temperature of 500°C for several seconds under reduced pressure. The sublimate was collected on a cold finger, and the amount of adenine released from the samples was then determined by high-performance liquid chromatography with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10⁵ to 10⁹ E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI (4′,6-diamidino-2-phenylindole) staining.
A rapid method for counting nucleated erythrocytes on stained blood smears by digital image analysis
Gering, E.; Atkinson, C.T.
2004-01-01
Measures of parasitemia by intraerythrocytic hematozoan parasites are normally expressed as the number of infected erythrocytes per n erythrocytes and are notoriously tedious and time consuming to measure. We describe a protocol for generating rapid counts of nucleated erythrocytes from digital micrographs of thin blood smears that can be used to estimate intensity of hematozoan infections in nonmammalian vertebrate hosts. This method takes advantage of the bold contrast and relatively uniform size and morphology of erythrocyte nuclei on Giemsa-stained blood smears and uses ImageJ, a java-based image analysis program developed at the U.S. National Institutes of Health and available on the internet, to recognize and count these nuclei. This technique makes feasible rapid and accurate counts of total erythrocytes in large numbers of microscope fields, which can be used in the calculation of peripheral parasitemias in low-intensity infections.
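The protocol above uses ImageJ; the Python/SciPy sketch below mimics the same threshold-and-label logic so the idea is runnable here. The toy image, the threshold, and the minimum-area filter are invented stand-ins for real Giemsa-stained smear parameters.

```python
import numpy as np
from scipy import ndimage

def count_nuclei(gray, threshold, min_area=20):
    """ImageJ-style count of darkly stained erythrocyte nuclei:
    threshold the grayscale smear image, label connected components,
    and keep only blobs above a plausible nucleus size."""
    mask = gray < threshold                    # nuclei are darker
    labels, n = ndimage.label(mask)
    areas = ndimage.sum(mask, labels, index=np.arange(1, n + 1))
    return int(np.count_nonzero(areas >= min_area))

# Toy smear: three dark square "nuclei" on a bright field
img = np.full((100, 100), 220.0)
for r, c in [(10, 10), (40, 60), (75, 30)]:
    img[r:r + 8, c:c + 8] = 60.0
print(count_nuclei(img, threshold=128))        # -> 3
```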
Roteta, Miguel; Peyres, Virginia; Rodríguez Barquero, Leonor; García-Toraño, Eduardo; Arenillas, Pablo; Balpardo, Christian; Rodrígues, Darío; Llovera, Roberto
2012-09-01
The radionuclide (68)Ga is one of the few positron emitters that can be prepared in-house without the use of a cyclotron. It disintegrates to the ground state of (68)Zn partially by positron emission (89.1%) with a maximum energy of 1899.1 keV, and partially by electron capture (10.9%). This nuclide has been standardized in the frame of a cooperation project between the Radionuclide Metrology laboratories from CIEMAT (Spain) and CNEA (Argentina). Measurements involved several techniques: 4πβ-γ coincidences, integral gamma counting and Liquid Scintillation Counting using the triple to double coincidence ratio and the CIEMAT/NIST methods. Given the short half-life of the radionuclide assayed, a direct comparison between results from both laboratories was excluded and a comparison of experimental efficiencies of similar NaI detectors was used instead. Copyright © 2012 Elsevier Ltd. All rights reserved.
Hassan, Afrah Fatima; Yadav, Gunjan; Tripathi, Abhay Mani; Mehrotra, Mridul; Saha, Sonali; Garg, Nishita
2016-01-01
Caries excavation is a noninvasive technique of caries removal with maximum preservation of healthy tooth structure. To compare the efficacy of three different caries excavation techniques in reducing the count of cariogenic flora. Sixty primary molars with occlusal carious lesions without pulpal involvement were selected from 26 healthy children and divided into three groups, in which caries excavation was done with (1) a carbide bur, (2) a polymer bur using a slow-speed handpiece, and (3) an ultrasonic tip with an ultrasonic machine. Samples were collected before and after caries excavation for microbiological analysis with a sterile sharp spoon excavator. Samples were inoculated on blood agar plates and incubated at 37°C for 48 hours. After bacterial cultivation, the count of Streptococcus mutans was obtained. All statistical analyses were performed using SPSS statistical software, version 13. Kruskal-Wallis analysis of variance, Wilcoxon matched pairs test, and Z test were performed to assess statistical significance. The decrease in the S. mutans count before and after caries excavation was significant (p < 0.001) in all three groups. The carbide bur showed the most efficient reduction in cariogenic flora, the ultrasonic tip showed almost comparable results, and the polymer bur showed the least reduction in cariogenic flora after caries excavation. Hassan AF, Yadav G, Tripathi AM, Mehrotra M, Saha S, Garg N. A Comparative Evaluation of the Efficacy of Different Caries Excavation Techniques in reducing the Cariogenic Flora: An in vivo Study. Int J Clin Pediatr Dent 2016;9(3):214-217.
MEASUREMENTS OF AIRBORNE CONCENTRATIONS OF RADON AND THORON DECAY PRODUCTS.
Chalupnik, S; Skubacz, K; Urban, P; Wysocka, M
2017-11-01
Liquid scintillation counting (LSC) is a measuring technique broadly applied in environmental monitoring of radionuclides. One possible application of LSC is the measurement of radon and thoron decay products, but this method is suitable only for grab sampling. For long-term measurements a different technique can be applied: monitors of potential alpha energy concentration (PAEC) with thermoluminescent detectors (TLD). In these devices, called the Alfa-2000 sampling probe, TL detectors (CaSO4:Dy) are applied for alpha particle counting. Three independent heads are placed over the membrane filter in a dust sampler's microcyclone. Such a solution enables simultaneous measurements of PAEC and dust content. Moreover, the information stored in the TLD chips is the energy of alpha particles, not the number of counted particles. Therefore, the readout of the TL detector shows potential alpha energy directly, with no dependence on the equilibrium factor, etc. This technique, which had previously been used only for radon decay product measurements, was modified by the authors to allow simultaneous measurements of radon and thoron PAEC. The LSC method can be used for calibration of portable radon decay product monitors. The LSC method has the advantage of being an absolute one; the TLD method measures directly the (dose-relevant) deposited energy. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
Playing at Statistical Mechanics
ERIC Educational Resources Information Center
Clark, Paul M.; And Others
1974-01-01
Discussed are the applications of counting techniques of a sorting game to distributions and concepts in statistical mechanics. Included are the following distributions: Fermi-Dirac, Bose-Einstein, and most probable. (RH)
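The three distributions named in this abstract have closed-form state counts, which a short script can compare for a classroom-sized sorting game. The particle and box numbers below are arbitrary; this is a generic illustration, not the article's specific game.

```python
from math import comb, factorial

def maxwell_boltzmann(n, g):
    """Distinguishable particles in g states: g**n arrangements,
    divided by n! for the 'corrected' classical count."""
    return g ** n / factorial(n)

def bose_einstein(n, g):
    """Indistinguishable bosons in g states: C(n + g - 1, n)."""
    return comb(n + g - 1, n)

def fermi_dirac(n, g):
    """Indistinguishable fermions, at most one per state: C(g, n)."""
    return comb(g, n)

# 3 particles sorted into 5 boxes, as in a classroom sorting game
print(maxwell_boltzmann(3, 5), bose_einstein(3, 5), fermi_dirac(3, 5))
# -> 20.833..., 35, 10
```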
Free Radical Polymerization of Styrene: A Radiotracer Experiment
ERIC Educational Resources Information Center
Mazza, R. J.
1975-01-01
Describes an experiment designed to acquaint the chemistry student with polymerization reactions, vacuum techniques, liquid scintillation counting, gas-liquid chromatography, and the handling of radioactive materials. (MLH)
NASA Astrophysics Data System (ADS)
Englander, J. G.; Brodrick, P. G.; Brandt, A. R.
2015-12-01
Fugitive emissions from oil and gas extraction have become a greater concern with the recent increases in development of shale hydrocarbon resources. There are significant gaps in the tools and research used to estimate fugitive emissions from oil and gas extraction. Two approaches exist for quantifying these emissions: atmospheric (or 'top down') studies, which measure methane fluxes remotely, and inventory-based ('bottom up') studies, which aggregate leakage rates on an equipment-specific basis. Bottom-up studies require counting or estimating how many devices might be leaking (called an 'activity count'), as well as how much each device might leak on average (an 'emissions factor'). In a real-world inventory, there is uncertainty in both activity counts and emissions factors. Even at the well level there are significant disagreements in data reporting. For example, some prior studies noted a ~5x difference in the number of reported well completions in the United States between EPA and private data sources. The purpose of this work is to address activity count uncertainty by using machine learning algorithms to classify oilfield surface facilities using high-resolution spatial imagery. This method can help estimate venting and fugitive emissions sources from regions where reporting of oilfield equipment is incomplete or non-existent. This work will utilize high resolution satellite imagery to count well pads in the Bakken oil field of North Dakota. This initial study examines an area of ~2,000 km² with ~1000 well pads. We compare different machine learning classification techniques, and explore the impact of training set size, input variables, and image segmentation settings to develop efficient and robust techniques for identifying well pads. We discuss the tradeoffs inherent to different classification algorithms, and determine the optimal algorithms for oilfield feature detection. In the future, the results of this work will be leveraged to provide activity counts of oilfield surface equipment including tanks, pumpjacks, and holding ponds.
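The classification step can be sketched with a generic supervised learner on per-tile image features. The random forest below is one plausible choice among the algorithms the abstract says were compared; the features and labels are synthetic stand-ins for real satellite imagery.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Toy stand-in for per-tile image features (e.g., band means,
# variances, texture scores); labels: 1 = well pad, 0 = background.
rng = np.random.default_rng(3)
n_tiles = 1000
X = rng.normal(size=(n_tiles, 6))
y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(0, 0.5, n_tiles) > 1.2).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print(cross_val_score(clf, X, y, cv=5).mean())    # hold-out accuracy
print(clf.fit(X, y).feature_importances_.round(2))  # which features matter
```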
NASA Astrophysics Data System (ADS)
Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.
1996-02-01
Neutron coincidence counting is commonly used for the non-destructive assay of plutonium bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is related to the fact that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu-eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste however may have quite different matrices and source distributions compared to the calibration samples. This often results in a bias of the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are aimed at being measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu-eq mass. The presented theory—which will be indicated as Time Interval Analysis (TIA)—is complementary to Time Correlation Analysis (TCA) theories which were developed in the past, but is from the theoretical point of view much simpler and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
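The one-dimensional Rossi-alpha distribution described here is simply a histogram of delays between each trigger pulse and the later pulses inside an inspection window. The sketch below builds it from a synthetic pulse train; the correlated-pair model and time constants are invented, and real data would come from the time-interval analyser.

```python
import numpy as np

def rossi_alpha(times, window, n_bins=100):
    """One-dimensional Rossi-alpha distribution: histogram of delays
    between each trigger pulse and every later pulse within the
    inspection window. Correlated pairs pile up at short delays;
    purely accidental pairs give the flat tail."""
    times = np.asarray(times)
    edges = np.linspace(0.0, window, n_bins + 1)
    hist = np.zeros(n_bins, dtype=int)
    stop = np.searchsorted(times, times + window, side="right")
    for i, j in enumerate(stop):
        hist += np.histogram(times[i + 1:j] - times[i], bins=edges)[0]
    return edges, hist

# Toy pulse train: random background plus correlated pulse pairs
rng = np.random.default_rng(4)
bg = rng.uniform(0, 1.0, 20000)
pairs = np.repeat(rng.uniform(0, 1.0, 2000), 2)
pairs[1::2] += rng.exponential(50e-6, 2000)    # correlated partner
times = np.sort(np.concatenate([bg, pairs]))
edges, hist = rossi_alpha(times, window=500e-6)
print(hist[:5], hist[-5:])   # peaked at short delays, flat at long ones
```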
Viana, Marcelo Tavares; Perez, Manuella Cavalcanti; Ribas, Valdenilson Ribeiro; Martins, Gilberto de Freire; de Castro, Célia Maria Machado Barbosa
2012-01-01
Objective To analyze the impact of moderate physical exercise on the total and differential leukocyte counts and red blood cell count of 36 sixty-day-old adult male Wistar rats subjected to early malnourishment. Methods The rats were divided into nourished (N - casein 17%) and malnourished (M - casein 8%) groups, and these groups were then subdivided into trained (T) and untrained (U), creating four groups: NT, NU, MT and MU. The NT and MT groups were submitted to moderate physical exercise using a treadmill (60 min/day, 5 days/week for 8 weeks). On the 1st day, before the training started (T0), and 24 hours after the last training day of each week (T1 until T8), a 1 mL aliquot of blood was collected from the animals' tails for analysis. The total leukocyte count was evaluated in a cell counter with an electronic microscope. The cyanmethemoglobin technique was used to measure the hemoglobin level. The hematocrit values were determined as a percentage using the micro-hematocrit technique with a microcapillary reader, and a cell counter was used to determine the red blood cell count. The t-test was used for statistical analysis and a p-value < 0.05 was considered significant. Data are expressed as means ± standard deviation. Results There was a significant difference in the total leukocyte count between the NT (9.1 ± 0.1) and MT groups (8.0 ± 0.1) from T1, and in neutrophils between the NT (22.1 ± 0.6) and MT groups (24.6 ± 1.8) from T7 (p < 0.05). There was no statistical significance in the hemoglobin, hematocrit and red blood cell count from T1. Conclusions According to the results of this study, moderate physical exercise seems to have induced physiologic adaptation in adult rats from T1. PMID:23049442
New isotope technologies in environmental physics
NASA Astrophysics Data System (ADS)
Povinec, P. P.; Betti, M.; Jull, A. J. T.; Vojtyla, P.
2008-02-01
As the levels of radionuclides observed at present in the environment are very low, highly sensitive analytical systems are required for carrying out environmental investigations. We review recent progress in low-level counting techniques in both the radiometrics and mass spectrometry sectors, with emphasis on underground laboratories, Monte Carlo (GEANT) simulation of the background of HPGe detectors operating in various configurations, secondary ionisation mass spectrometry, and accelerator mass spectrometry. Applications of radiometrics and mass spectrometry techniques in radioecology and climate change studies are also presented and discussed. The review should help orient readers to recent developments in the field of low-level counting and spectrometry, advise on construction principles for underground laboratories, and offer criteria for choosing low- or high-energy mass spectrometers for environmental investigations.
SPAD electronics for high-speed quantum communications
NASA Astrophysics Data System (ADS)
Bienfang, Joshua C.; Restelli, Alessandro; Migdall, Alan
2011-01-01
We discuss high-speed electronics that support the use of single-photon avalanche diodes (SPADs) in gigahertz single-photon communications systems. For InGaAs/InP SPADs, recent work has demonstrated that reduced afterpulsing and count rates approaching 500 MHz can be achieved with gigahertz periodic-gating techniques designed to minimize the total avalanche charge to less than 100 fC. We investigate afterpulsing in this regime and establish a connection to observations using more conventional techniques. For Si SPADs, we report the benefits of improved timing electronics that enhance the temporal resolution of Si SPADs used in a free-space quantum key distribution (QKD) system operating in the GHz regime. We establish that the effects of count-rate fluctuations induced by daytime turbulent scintillation are significantly reduced, benefitting the performance of the QKD system.
Quantitative gene expression analysis in Caenorhabditis elegans using single molecule RNA FISH.
Bolková, Jitka; Lanctôt, Christian
2016-04-01
Advances in fluorescent probe design and synthesis have allowed the uniform in situ labeling of individual RNA molecules. In a technique referred to as single molecule RNA FISH (smRNA FISH), the labeled RNA molecules can be imaged as diffraction-limited spots and counted using image analysis algorithms. Single RNA counting has provided valuable insights into the process of gene regulation. This microscopy-based method has often revealed a high cell-to-cell variability in expression levels, which has in turn led to a growing interest in investigating the biological significance of gene expression noise. Here we describe the application of the smRNA FISH technique to samples of Caenorhabditis elegans, a well-characterized model organism. Copyright © 2015 Elsevier Inc. All rights reserved.
Dating Tectonic Activity on Mercury’s Large-Scale Lobate-Scarp Thrust Faults
NASA Astrophysics Data System (ADS)
Barlow, Nadine G.; E Banks, Maria
2017-10-01
Mercury’s widespread large-scale lobate-scarp thrust faults reveal that the planet’s tectonic history has been dominated by global contraction, primarily due to cooling of its interior. Constraining the timing and duration of this contraction provides key insight into Mercury’s thermal and geologic evolution. We combine two techniques to enhance the statistical validity of size-frequency distribution crater analyses and constrain timing of the 1) earliest and 2) most recent detectable activity on several of Mercury’s largest lobate-scarp thrust faults. We use the sizes of craters directly transected by or superposed on the edge of the scarp face to define a count area around the scarp, a method we call the Modified Buffered Crater Counting Technique (MBCCT). We developed the MBCCT to avoid the issue of a near-zero scarp width since feature widths are included in area calculations of the commonly used Buffered Crater Counting Technique (BCCT). Since only craters directly intersecting the scarp face edge conclusively show evidence of crosscutting relations, we increase the number of craters in our analysis (and reduce uncertainties) by using the morphologic degradation state (i.e. relative age) of these intersecting craters to classify other similarly degraded craters within the count area (i.e., those with the same relative age) as superposing or transected. The resulting crater counts are divided into two categories: transected craters constrain the earliest possible activity and superposed craters constrain the most recent detectable activity. Absolute ages are computed for each population using the Marchi et al. [2009] model production function. A test of the Blossom lobate scarp indicates the MBCCT gives statistically equivalent results to the BCCT. We find that all scarps in this study crosscut surfaces Tolstojan or older in age (>~3.7 Ga). The most recent detectable activity along lobate-scarp thrust faults ranges from Calorian to Kuiperian (~3.7 Ga to present). Our results complement previous relative-age studies with absolute ages and indicate global contraction continued over the last ~3-4 Gyr. At least some thrust fault activity occurred on Mercury in relatively recent times (<280 Ma).
Santra, Kalyan; Zhan, Jinchun; Song, Xueyu; ...
2016-02-10
The need for measuring fluorescence lifetimes of species in subdiffraction-limited volumes in, for example, stimulated emission depletion (STED) microscopy, entails the dual challenge of probing a small number of fluorophores and fitting the concomitant sparse data set to the appropriate excited-state decay function. This need has stimulated a further investigation into the relative merits of two fitting techniques commonly referred to as “residual minimization” (RM) and “maximum likelihood” (ML). Fluorescence decays of the well-characterized standard, rose bengal in methanol at room temperature (530 ± 10 ps), were acquired in a set of five experiments in which the total number of “photon counts” was approximately 20, 200, 1000, 3000, and 6000 and there were about 2–200 counts at the maxima of the respective decays. Each set of experiments was repeated 50 times to generate the appropriate statistics. Each of the 250 data sets was analyzed by ML and two different RM methods (differing in the weighting of residuals) using in-house routines and compared with a frequently used commercial RM routine. Convolution with a real instrument response function was always included in the fitting. While RM using Pearson’s weighting of residuals can recover the correct mean result with a total number of counts of 1000 or more, ML distinguishes itself by yielding, in all cases, the same mean lifetime within 2% of the accepted value. For 200 total counts and greater, ML always provides a standard deviation of <10% of the mean lifetime, and even at 20 total counts there is only 20% error in the mean lifetime. Here, the robustness of ML advocates its use for sparse data sets such as those acquired in some subdiffraction-limited microscopies, such as STED, and, more importantly, provides greater motivation for exploiting the time-resolved capacities of this technique to acquire and analyze fluorescence lifetime data.
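The maximum-likelihood approach favored by this study can be illustrated with a bare-bones Poisson ML fit of a single-exponential decay. Unlike the authors' analysis, the sketch below ignores the instrument response function and uses simulated photons, so it conveys only the shape of the method.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ml_lifetime(bin_centers, counts, t_max):
    """Maximum-likelihood lifetime from Poisson-distributed photon
    counts, assuming a single-exponential decay restricted to the
    measurement window [0, t_max] and no instrument response."""
    def neg_log_like(tau):
        p = np.exp(-bin_centers / tau)
        p /= p.sum()                      # per-bin model probability
        return -np.sum(counts * np.log(p + 1e-300))
    res = minimize_scalar(neg_log_like, bounds=(1e-3, t_max),
                          method="bounded")
    return res.x

# Simulate a sparse decay: ~200 photons, true lifetime 0.53 ns
rng = np.random.default_rng(5)
edges = np.linspace(0, 10.0, 257)          # 256 time bins over 10 ns
photons = rng.exponential(0.53, 200)
counts, _ = np.histogram(photons[photons < 10.0], bins=edges)
centers = 0.5 * (edges[:-1] + edges[1:])
print(ml_lifetime(centers, counts, t_max=10.0))   # ~0.53 ns
```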
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2002-01-01
Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.
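A simplified constant-rate version of the removal model can be fit in a few lines: if a bird escapes detection each minute with probability 1 - p, the chance that it is first recorded in interval i is (1-p)^T(i-1) - (1-p)^T(i). The sketch below maximizes the conditional multinomial likelihood over the 3/2/5-minute intervals; the counts and the constant-rate assumption are illustrative, not the authors' exact model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def removal_mle(counts, interval_lengths):
    """Removal-model MLE for the per-minute detection probability p,
    conditioning on birds detected at least once during the count."""
    counts = np.asarray(counts, float)
    t = np.concatenate([[0.0], np.cumsum(interval_lengths)])
    def nll(p):
        q = 1.0 - p
        cell = q ** t[:-1] - q ** t[1:]   # first detection in each interval
        cell /= 1.0 - q ** t[-1]          # condition on being detected
        return -np.sum(counts * np.log(cell))
    res = minimize_scalar(nll, bounds=(1e-4, 0.9999), method="bounded")
    p = res.x
    detectability = 1.0 - (1.0 - p) ** t[-1]   # within the whole count
    return p, detectability

# Hypothetical 10-min count split 3/2/5 min: 40, 12, 18 first detections
print(removal_mle([40, 12, 18], [3, 2, 5]))
```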
Casey, D T; Frenje, J A; Séguin, F H; Li, C K; Rosenberg, M J; Rinderknecht, H; Manuel, M J-E; Gatu Johnson, M; Schaeffer, J C; Frankel, R; Sinenian, N; Childs, R A; Petrasso, R D; Glebov, V Yu; Sangster, T C; Burke, M; Roberts, S
2011-07-01
A magnetic recoil spectrometer (MRS) has been built and successfully used at OMEGA for measurements of down-scattered neutrons (DS-n), from which areal densities in both warm-capsule and cryogenic-DT implosions have been inferred. Another MRS is currently being commissioned on the National Ignition Facility (NIF) for diagnosing low-yield tritium-hydrogen-deuterium implosions and high-yield DT implosions. As CR-39 detectors are used in the MRS, the principal sources of background are neutron-induced tracks and intrinsic tracks (defects in the CR-39). The coincidence counting technique was developed to reduce these types of background tracks to the required level for the DS-n measurements at OMEGA and the NIF. Using this technique, it has been demonstrated that the number of background tracks is reduced by a couple of orders of magnitude, which exceeds the requirement for the DS-n measurements at both facilities.
Patrizio, Angela; Specht, Christian G.
2016-01-01
The ability to count molecules is essential to elucidating cellular mechanisms, as these often depend on the absolute numbers and concentrations of molecules within specific compartments. Such is the case at chemical synapses, where the transmission of information from presynaptic to postsynaptic terminals requires complex interactions between small sets of molecules. Be it the subunit stoichiometry specifying neurotransmitter receptor properties, the copy numbers of scaffold proteins setting the limit of receptor accumulation at synapses, or protein packing densities shaping the molecular organization and plasticity of the postsynaptic density, all of these depend on exact quantities of components. A variety of proteomic, electrophysiological, and quantitative imaging techniques have yielded insights into the molecular composition of synaptic complexes. In this review, we compare the different quantitative approaches and consider the potential of single molecule imaging techniques for the quantification of synaptic components. We also discuss specific neurobiological data to contextualize the obtained numbers and to explain how they aid our understanding of synaptic structure and function. PMID:27335891
Rapid enumeration of low numbers of moulds in tea based drinks using an automated system.
Tanaka, Kouichi; Yamaguchi, Nobuyasu; Baba, Takashi; Amano, Norihide; Nasu, Masao
2011-01-31
Aseptically prepared cold drinks based on tea have become popular worldwide. Contamination of these drinks with harmful microbes is a potential health problem because such drinks are kept free from preservatives to maximize aroma and flavour. Heat-tolerant conidia and ascospores of fungi can survive pasteurization, and need to be detected as quickly as possible. We were able to rapidly and accurately detect low numbers of conidia and ascospores in tea-based drinks using fluorescent staining followed by an automated counting system. Conidia or ascospores were inoculated into green tea and oolong tea, and samples were immediately filtered through nitrocellulose membranes (pore size: 0.8 μm) to concentrate fungal propagules. These were transferred onto potato dextrose agar and incubated for 23 h at 28 °C. Fungi germinating on the membranes were fluorescently stained for 30 min. The stained mycelia were counted selectively within 90 s using an automated counting system (MGS-10LD; Chuo Electric Works, Osaka, Japan). Very low numbers (1 CFU/100 ml) of conidia or ascospores could be rapidly counted, in contrast to traditional labour-intensive techniques. All tested mould strains were detected within 24 h, while conventional plate counting required 72 h for colony enumeration. Counts of slow-growing fungi (Cladosporium cladosporioides) obtained by automated counting and by conventional plate counting were close (r² = 0.986). Our combination of methods enables counting of both fast- and slow-growing fungi, and should be useful for microbiological quality control of tea-based and also other drinks. Copyright © 2011 Elsevier B.V. All rights reserved.
Multiple-Event, Single-Photon Counting Imaging Sensor
NASA Technical Reports Server (NTRS)
Zheng, Xinyu; Cunningham, Thomas J.; Sun, Chao; Wang, Kang L.
2011-01-01
The single-photon counting imaging sensor is typically an array of silicon Geiger-mode avalanche photodiodes that are monolithically integrated with CMOS (complementary metal oxide semiconductor) readout, signal processing, and addressing circuits located in each pixel and the peripheral area of the chip. The major problem is its single-event method for photon count registration. A single-event single-photon counting imaging array only allows registration of up to one photon count in each of its pixels during a frame time, i.e., the interval between two successive pixel reset operations. Since the frame time can't be too short, this leads to very low dynamic range and makes the sensor useful only for very low flux environments. The second problem of the prior technique is a limited fill factor resulting from consumption of chip area by the monolithically integrated CMOS readout in pixels. The resulting low photon collection efficiency substantially ruins any benefit gained from the very sensitive single-photon counting detection. The single-photon counting imaging sensor developed in this work has a novel multiple-event architecture, which allows each of its pixels to register one million or more photon-counting events during a frame time. Because of the consequently boosted dynamic range, the imaging array of the invention is capable of performing single-photon counting in ultra-low-light through high-flux environments. On the other hand, since the multiple-event architecture is implemented in a hybrid structure, back-illumination and a close-to-unity fill factor can be realized, and maximized quantum efficiency can also be achieved in the detector array.
Microcoupon Assay Of Adhesion And Growth Of Bacterial Films
NASA Technical Reports Server (NTRS)
Pierson, Duane L.; Koenig, David W.
1994-01-01
Microbiological assay technique facilitates determination of some characteristics of sessile bacteria like those that attach to and coat interior walls of water-purification systems. Biofilms cause sickness and interfere with purification process. Technique enables direct measurement of rate of attachment of bacterial cells, their metabolism, and effects of chemicals on them. Used to quantify effects of both bactericides and growth-stimulating agents and in place of older standard plate-count and tube-dilution techniques.
Walsh, Alex J.; Sharick, Joe T.; Skala, Melissa C.; Beier, Hope T.
2016-01-01
Time-correlated single photon counting (TCSPC) enables acquisition of fluorescence lifetime decays with high temporal resolution within the fluorescence decay. However, many thousands of photons per pixel are required for accurate lifetime decay curve representation, instrument response deconvolution, and lifetime estimation, particularly for two-component lifetimes. TCSPC imaging speed is inherently limited due to the single photon per laser pulse nature and low fluorescence event efficiencies (<10%) required to reduce bias towards short lifetimes. Here, simulated fluorescence lifetime decays are analyzed by SPCImage and SLIM Curve software to determine the limiting lifetime parameters and photon requirements of fluorescence lifetime decays that can be accurately fit. Data analysis techniques to improve fitting accuracy for low photon count data were evaluated. Temporal binning of the decays from 256 time bins to 42 time bins significantly (p<0.0001) improved fit accuracy in SPCImage and enabled accurate fits with low photon counts (as low as 700 photons/decay), a 6-fold reduction in required photons and therefore improvement in imaging speed. Additionally, reducing the number of free parameters in the fitting algorithm by fixing the lifetimes to known values significantly reduced the lifetime component error from 27.3% to 3.2% in SPCImage (p<0.0001) and from 50.6% to 4.2% in SLIM Curve (p<0.0001). Analysis of nicotinamide adenine dinucleotide–lactate dehydrogenase (NADH-LDH) solutions confirmed temporal binning of TCSPC data and a reduced number of free parameters improves exponential decay fit accuracy in SPCImage. Altogether, temporal binning (in SPCImage) and reduced free parameters are data analysis techniques that enable accurate lifetime estimation from low photon count data and enable TCSPC imaging speeds up to 6x and 300x faster, respectively, than traditional TCSPC analysis. PMID:27446663
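Temporal binning of the kind evaluated here is a one-line array operation: sum adjacent time bins so a 256-bin decay becomes a 42-bin one. A minimal sketch follows, assuming a binning factor of 6 with leftover bins discarded (the abstract does not state how SPCImage handles the remainder).

```python
import numpy as np

def rebin_decay(counts, factor):
    """Temporal binning of a TCSPC decay: sum adjacent time bins,
    trading time resolution for more photons per bin. Bins that do
    not fill a complete group are discarded."""
    counts = np.asarray(counts)
    n = (len(counts) // factor) * factor
    return counts[:n].reshape(-1, factor).sum(axis=1)

decay = np.random.default_rng(6).poisson(5.0, 256)  # sparse 256-bin decay
coarse = rebin_decay(decay, 6)
print(len(coarse), decay.sum(), coarse.sum())       # 42 bins after rebinning
```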
Winkelman, James W; Tanasijevic, Milenko J; Zahniser, David J
2017-08-01
A novel automated slide-based approach to the complete blood count and white blood cell differential count is introduced. The objective was to present proof of concept for an image-based approach to the complete blood count, based on a new slide preparation technique; a preliminary data comparison with the current flow-based technology is shown. A prototype instrument uses a proprietary method and technology to deposit a precise volume of undiluted peripheral whole blood in a monolayer onto a glass microscope slide so that every cell can be distinguished, counted, and imaged. The slide is stained, and multispectral image analysis is then used to measure the complete blood count parameters. Images from a 600-cell white blood cell differential count, as well as 5000 red blood cells and a variable number of platelets present in 600 high-power fields, are made available for a technologist to view on a computer screen. An initial comparison of the basic complete blood count parameters was performed on 1857 specimens run on both the new instrument and a flow-based hematology analyzer. Excellent correlations were obtained between the prototype instrument and the flow-based system. The primary parameters of white blood cell, red blood cell, and platelet counts yielded correlation coefficients (r) of 0.99, 0.99, and 0.98, respectively. Other indices included hemoglobin (r = 0.99), hematocrit (r = 0.99), mean cellular volume (r = 0.90), mean corpuscular hemoglobin (r = 0.97), and mean platelet volume (r = 0.87). For the automated white blood cell differential counts, r values were calculated for neutrophils (r = 0.98), lymphocytes (r = 0.97), monocytes (r = 0.76), eosinophils (r = 0.96), and basophils (r = 0.63). Quantitative results for components of the complete blood count and automated white blood cell differential count can be developed by image analysis of a monolayer preparation of a known volume of peripheral blood.
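The method-comparison statistic reported above is a simple Pearson correlation between paired instrument readings; a minimal sketch with purely hypothetical values (the study's data are not reproduced here):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired WBC counts (10^9/L) from the slide-based and flow-based instruments
slide_based = np.array([4.5, 6.2, 8.1, 11.3, 3.9, 7.4])
flow_based  = np.array([4.7, 6.0, 8.3, 11.0, 4.1, 7.6])

r, p = pearsonr(slide_based, flow_based)
print(f"r = {r:.3f}, p = {p:.3g}")
```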
Evaluation of counting methods for oceanic radium-228
NASA Astrophysics Data System (ADS)
Orr, James C.
1988-07-01
Measurement of open ocean 228Ra is difficult, typically requiring at least 200 L of seawater. The burden of collecting and processing these large-volume samples severely limits the widespread use of this promising tracer. To use smaller-volume samples, a more sensitive means of analysis is required. To seek out new and improved counting methods, conventional 228Ra counting methods have been compared with some promising techniques currently used for other radionuclides. Of the conventional methods, α spectrometry possesses the highest efficiency (3-9%) and lowest background (0.0015 cpm), but it suffers from the need for complex chemical processing after sampling and the need to allow about 1 year for adequate ingrowth of the 228Th granddaughter. The other two conventional counting methods measure the short-lived 228Ac daughter while it remains supported by 228Ra, thereby avoiding the complex sample processing and the long delay before counting. The first of these, high-resolution γ spectrometry, offers the simplest processing and an efficiency (4.8%) comparable to α spectrometry; yet its high background (0.16 cpm) and substantial equipment cost (~$30,000) limit its widespread use. The second no-wait method, β-γ coincidence spectrometry, also offers comparable efficiency (5.3%), but it possesses both lower background (0.0054 cpm) and lower initial cost (~$12,000). Three new (i.e., untried for 228Ra) techniques all seem to promise about a fivefold increase in efficiency over conventional methods. By employing liquid scintillation methods, both α spectrometry and β-γ coincidence spectrometry can improve their counter efficiency while retaining low background. The third new 228Ra counting method could be adapted from a technique which measures 224Ra by 220Rn emanation. After allowing for ingrowth and then counting the 224Ra great-granddaughter, 228Ra could be back-calculated, thereby yielding a high-efficiency method that requires no sample processing. The efficiency and background of each of the three new methods have been estimated and are compared with those of the three methods currently employed to measure oceanic 228Ra. From efficiency and background, the relative figure of merit and the detection limit have been determined for each of the six counters. These data suggest that the new counting methods have the potential to measure most 228Ra samples with just 30 L of seawater, to better than 5% precision. Not only would this reduce the time, effort, and expense involved in sample collection, but 228Ra could then be measured on many small-volume samples (20-30 L) previously collected with only 226Ra in mind. By measuring 228Ra quantitatively on such small-volume samples, three analyses (large-volume 228Ra, large-volume 226Ra, and small-volume 226Ra) could be reduced to one, thereby dramatically improving analytical precision.
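The counter comparison above can be reproduced from the quoted efficiencies and backgrounds, taking the usual counting figure of merit ε²/B and the Currie formula for the detection limit. Only ε and B come from the abstract; the FOM definition and the 3-day count time are assumptions for illustration:

```python
import math

# (efficiency, background cpm) quoted above for the three conventional counters
counters = {
    "alpha spectrometry":      (0.06,  0.0015),   # mid-range of the 3-9% efficiency
    "HPGe gamma spectrometry": (0.048, 0.16),
    "beta-gamma coincidence":  (0.053, 0.0054),
}
t = 3 * 24 * 60  # assumed 3-day count time, in minutes

for name, (eff, bkg) in counters.items():
    fom = eff**2 / bkg                           # relative figure of merit
    l_d = 2.71 + 4.65 * math.sqrt(bkg * t)       # Currie detection limit, net counts
    min_rate = l_d / (eff * t)                   # minimum detectable source rate, dpm
    print(f"{name:24s} FOM={fom:7.3f}  L_D={l_d:7.1f} counts  min rate ~{min_rate:.4f} dpm")
```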
Fundamental limits to superresolution fluorescence microscopy
NASA Astrophysics Data System (ADS)
Small, Alex
2013-02-01
Superresolution fluorescence microscopy techniques such as PALM, STORM, STED, and Structured Illumination Microscopy (SIM) enable imaging of live cells at nanometer resolution. The common theme in all of these techniques is that the diffraction limit is circumvented by controlling the states of fluorescent molecules. Although the samples are labeled very densely (i.e. with spacing much smaller than the Airy distance), not all of the molecules are emitting at the same time. Consequently, one does not encounter overlapping blurs. In the deterministic techniques (STED, SIM) the achievable resolution scales as the wavelength of light divided by the square root of the intensity of a beam used to control the fluorescent state. In the stochastic techniques (PALM, STORM), the achievable resolution scales as the wavelength of light divided by the square root of the number of photons collected. Although these limits arise from very different mechanisms (parabolic beam profiles for STED and SIM, statistics for PALM and STORM), in all cases the resolution scales inversely with the square root of a measure of the number of photons used in the experiment. We have developed a proof that this relationship between resolution and photon count is universal to techniques that control the states of fluorophores using classical light. Our proof encompasses linear and nonlinear optics, as well as computational post-processing techniques for extracting information beyond the diffraction limit. If there are techniques that can achieve a more efficient relationship between resolution and photon count, those techniques will require light exhibiting non-classical correlations.
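The two scaling laws described here have standard textbook forms, which can be written explicitly (these expressions are added for reference and are not quoted from the abstract):

```latex
\Delta_{\mathrm{STED}} \approx \frac{\lambda}{2\,\mathrm{NA}\sqrt{1 + I/I_{\mathrm{sat}}}},
\qquad
\sigma_{\mathrm{loc}} \approx \frac{\sigma_{\mathrm{PSF}}}{\sqrt{N}},
```

where NA is the numerical aperture, I the depletion (or patterned-illumination) intensity, I_sat the saturation intensity, and N the number of photons collected per molecule. Both expressions scale inversely with the square root of a photon-number-like quantity, which is exactly the relationship the proof discussed above shows to be universal for classical light.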
Power counting to better jet observables
NASA Astrophysics Data System (ADS)
Larkoski, Andrew J.; Moult, Ian; Neill, Duff
2014-12-01
Optimized jet substructure observables for identifying boosted topologies will play an essential role in maximizing the physics reach of the Large Hadron Collider. Ideally, the design of discriminating variables would be informed by analytic calculations in perturbative QCD. Unfortunately, explicit calculations are often not feasible due to the complexity of the observables used for discrimination, and so many validation studies rely heavily, and solely, on Monte Carlo. In this paper we show how methods based on the parametric power counting of the dynamics of QCD, familiar from effective theory analyses, can be used to design, understand, and make robust predictions for the behavior of jet substructure variables. As a concrete example, we apply power counting for discriminating boosted Z bosons from massive QCD jets using observables formed from the n-point energy correlation functions. We show that power counting alone gives a definite prediction for the observable that optimally separates the background-rich from the signal-rich regions of phase space. Power counting can also be used to understand effects of phase space cuts and the effect of contamination from pile-up, which we discuss. As these arguments rely only on the parametric scaling of QCD, the predictions from power counting must be reproduced by any Monte Carlo, which we verify using Pythia 8 and Herwig++. We also use the example of quark versus gluon discrimination to demonstrate the limits of the power counting technique.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Van Esch, Patrick; Crisanti, Marta; Mutti, Paolo
2015-07-01
A research project is presented in which we aim at counting individual neutrons with CCD-like cameras. We explore theoretically a technique that allows us to use imaging detectors as counting detectors at lower counting rates, and transits smoothly to continuous imaging at higher counting rates. As such, the hope is to combine the good background rejection properties of standard neutron counting detectors with the absence of dead time of integrating neutron imaging cameras, as well as their very good spatial resolution. Compared to X-ray detection, the essence of thermal neutron detection is the nuclear conversion reaction. The released energies involved are of the order of a few MeV, while X-ray detection releases energies of the order of the photon energy, which is in the 10 keV range. Thanks to advances in camera technology which have resulted in increased quantum efficiency, lower noise, and increased frame rates up to 100 fps for CMOS-type cameras, this more than 100-fold higher available detection energy implies that the individual neutron detection light signal can be significantly above the noise level, thus allowing discrimination and individual counting, which is hard to achieve with X-rays. The time scale of CMOS-type cameras does not allow one to consider time-of-flight measurements, but kinetic experiments in the 10 ms range are possible. The theory is then confronted with the first experimental results.
Alworth, Leanne C; Berghaus, Roy D; Kelly, Lisa M; Supakorndej, Prasit; Burkman, Erica J; Savadelis, Molly D; Cooper, Tanya L; Salyards, Gregory W; Harvey, Stephen B; Moorhead, Andrew R
2015-01-01
The NIH guidelines for survival bleeding of mice and rats note that using the retroorbital plexus has a greater potential for complications than do other methods of blood collection and that this procedure should be performed on anesthetized animals. Lateral saphenous vein puncture has a low potential for complications and can be performed without anesthesia. Mongolian gerbils (Meriones unguiculatus) are the preferred rodent model for filarial parasite research. To monitor microfilaria counts in the blood, blood sampling from the orbital plexus has been the standard. Our goal was to refine the blood collection technique. To determine whether blood collection from the lateral saphenous vein was a feasible alternative to retroorbital sampling, we compared microfilaria counts in blood samples collected by both methods from 21 gerbils infected with the filarial parasitic worm Brugia pahangi. Lateral saphenous vein counts were equivalent to retroorbital counts at relatively high counts (greater than 50 microfilariae per 20 µL) but were significantly lower than retroorbital counts when microfilarial concentrations were lower. Our results indicate that although retroorbital collection may be preferable when low concentrations of microfilariae need to be enumerated, the lateral saphenous vein is a suitable alternative site for blood sampling to determine microfilaremia and is a feasible refinement that can benefit the wellbeing of gerbils. PMID:26678366
Wang, Xiaodan; Yamaguchi, Nobuyasu; Someya, Takashi; Nasu, Masao
2007-10-01
The micro-colony method was used to enumerate viable bacteria in composts. Cells were vacuum-filtered onto polycarbonate filters and incubated for 18 h on LB medium at 37 degrees C. Bacteria on the filters were stained with SYBR Green II, and enumerated using a newly developed micro-colony auto counting system which can automatically count micro-colonies on half the area of the filter within 90 s. A large number of bacteria in samples retained physiological activity and formed micro-colonies within 18 h, whereas most could not form large colonies on conventional media within 1 week. The results showed that this convenient technique can enumerate viable bacteria in compost rapidly for its efficient quality control.
NASA Astrophysics Data System (ADS)
Shields, C. A.; Ullrich, P. A.; Rutz, J. J.; Wehner, M. F.; Ralph, M.; Ruby, L.
2017-12-01
Atmospheric rivers (ARs) are long, narrow filamentary structures that transport large amounts of moisture in the lower layers of the atmosphere, typically from subtropical regions to mid-latitudes. ARs play an important role in regional hydroclimate by supplying significant amounts of precipitation that can alleviate drought or, in extreme cases, produce dangerous floods. Accurately detecting, or tracking, ARs is important not only for weather forecasting, but also for understanding how these events may change under global warming. Detection algorithms are applied on both regional and global scales, most accurately with high-resolution datasets or model output. Different detection algorithms can produce different answers. Detection algorithms found in the current literature fall broadly into two categories: "time-stitching", where the AR is tracked with a Lagrangian approach through time and space; and "counting", where ARs are identified at a single point in time for a single location. Counting routines can be further subdivided into algorithms that use absolute thresholds with specific geometry requirements, algorithms that use relative thresholds, algorithms based on statistics, and pattern recognition or machine learning techniques. With such a large diversity in detection code, AR tracks and "counts" can vary widely from technique to technique. Uncertainty increases for future climate scenarios, where the difference between relative and absolute thresholding produces vastly different counts, simply due to the moister background state in a warmer world. In an effort to quantify the uncertainty associated with tracking algorithms, the AR detection community has come together to participate in ARTMIP, the Atmospheric River Tracking Method Intercomparison Project. Each participant will provide AR metrics to the greater group by applying their code to a common reanalysis dataset. MERRA2 data were chosen for both temporal and spatial resolution. After completion of this first phase, Tier 1, ARTMIP participants may choose to contribute to Tier 2, which will range from reanalysis uncertainty to analysis of future climate scenarios from high-resolution model output. ARTMIP's experimental design, techniques, and preliminary metrics will be presented.
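The absolute-versus-relative threshold distinction can be made concrete with a toy integrated-vapor-transport (IVT) mask. The 250 kg m⁻¹ s⁻¹ absolute cutoff and the 85th-percentile relative cutoff are common choices in the AR literature, assumed here for illustration; the field itself is synthetic:

```python
import numpy as np

rng = np.random.default_rng(2)
ivt = rng.gamma(shape=2.0, scale=120.0, size=(90, 180))  # toy IVT field, kg m^-1 s^-1

mask_abs = ivt > 250.0                       # absolute threshold: fixed cutoff
mask_rel = ivt > np.percentile(ivt, 85)      # relative threshold: climatology-based

print("absolute-threshold fraction:", round(mask_abs.mean(), 3))
print("relative-threshold fraction:", round(mask_rel.mean(), 3))
# In a warmer (moister) climate the IVT distribution shifts upward: the absolute
# mask grows, while the relative mask keeps flagging ~15% of points by construction.
```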
Mueller, Sherry A; Anderson, James E; Kim, Byung R; Ball, James C
2009-04-01
Effective bacterial control in cooling-tower systems requires accurate and timely methods to count bacteria. Plate-count methods are difficult to implement on-site, because they are time- and labor-intensive and require sterile techniques. Several field-applicable methods (dipslides, Petrifilm, and adenosine triphosphate [ATP] bioluminescence) were compared with the plate count for two sample matrices--phosphate-buffered saline solution containing a pure culture of Pseudomonas fluorescens and cooling-tower water containing an undefined mixed bacterial culture. For the pure culture, (1) counts determined on nutrient agar and plate-count agar (PCA) media and expressed as colony-forming units (CFU) per milliliter were equivalent to those on R2A medium (p = 1.0 and p = 1.0, respectively); (2) Petrifilm counts were not significantly different from R2A plate counts (p = 0.99); (3) the dipslide counts were up to 2 log units higher than R2A plate counts, but this discrepancy was not statistically significant (p = 0.06); and (4) a discernable correlation (r2 = 0.67) existed between ATP readings and plate counts. For cooling-tower water samples (n = 62), (1) bacterial counts using R2A medium were higher (but not significant; p = 0.63) than nutrient agar and significantly higher than tryptone-glucose yeast extract (TGE; p = 0.03) and PCA (p < 0.001); (2) Petrifilm counts were significantly lower than nutrient agar or R2A (p = 0.02 and p < 0.001, respectively), but not statistically different from TGE, PCA, and dipslides (p = 0.55, p = 0.69, and p = 0.91, respectively); (3) the dipslide method yielded bacteria counts 1 to 3 log units lower than nutrient agar and R2A (p < 0.001), but was not significantly different from Petrifilm (p = 0.91), PCA (p = 1.00) or TGE (p = 0.07); (4) the differences between dipslides and the other methods became greater with a 6-day incubation time; and (5) the correlation between ATP readings and plate counts varied from system to system, was poor (r2 values ranged from < 0.01 to 0.47), and the ATP method was not sufficiently sensitive to measure counts below approximately 10(4) CFU/mL.
Baeten; Bruggeman; Paepen; Carchon
2000-03-01
The non-destructive quantification of transuranic elements in nuclear waste management or in safeguards verifications is commonly performed by passive neutron assay techniques. To minimise the number of unknown sample-dependent parameters, Neutron Multiplicity Counting (NMC) is applied. We developed a new NMC-technique, called Time Interval Correlation Spectroscopy (TICS), which is based on the measurement of Rossi-alpha time interval distributions. Compared to other NMC-techniques, TICS offers several advantages.
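A Rossi-alpha time-interval distribution of the kind TICS builds on can be accumulated from a list of detection timestamps as follows (a generic sketch of the histogram construction, not the TICS algorithm itself):

```python
import numpy as np

def rossi_alpha(times, window, nbins=100):
    """Histogram of time differences from each event to all later events within `window`."""
    times = np.sort(times)
    hist = np.zeros(nbins)
    edges = np.linspace(0.0, window, nbins + 1)
    for i, t0 in enumerate(times):
        j = np.searchsorted(times, t0 + window, side="right")
        dt = times[i + 1 : j] - t0
        hist += np.histogram(dt, bins=edges)[0]
    return edges, hist

# Toy Poisson event train: ~1 kHz for 10 s
rng = np.random.default_rng(3)
ts = np.cumsum(rng.exponential(1e-3, 10_000))
edges, hist = rossi_alpha(ts, window=0.01)
print(hist[:5])  # flat for a pure Poisson source; correlated fission chains add a decaying excess
```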
Inflight Radiometric Calibration of New Horizons' Multispectral Visible Imaging Camera (MVIC)
NASA Technical Reports Server (NTRS)
Howett, C. J. A.; Parker, A. H.; Olkin, C. B.; Reuter, D. C.; Ennico, K.; Grundy, W. M.; Graps, A. L.; Harrison, K. P.; Throop, H. B.; Buie, M. W.;
2016-01-01
We discuss two semi-independent calibration techniques used to determine the inflight radiometric calibration for the New Horizons Multispectral Visible Imaging Camera (MVIC). The first calibration technique compares the measured number of counts (DN) observed from a number of well-calibrated stars to those predicted using the component-level calibration. The ratio of these values provides a multiplicative factor that allows a conversion from the preflight calibration to the more accurate inflight one, for each detector. The second calibration technique is a channel-wise relative radiometric calibration for MVIC's blue, near-infrared and methane color channels, using Hubble and New Horizons observations of Charon and scaling from the red-channel stellar calibration. Both calibration techniques produce very similar results (better than 7% agreement), providing strong validation for the techniques used. Since the stellar calibration described here can be performed without a color target in the field of view and covers all of MVIC's detectors, it was used to provide the radiometric keyword values delivered by the New Horizons project to the Planetary Data System (PDS). These keyword values allow each observation to be converted from counts to physical units; a description of how these keyword values were generated is included. Finally, mitigation techniques adopted for the gain drift observed in the near-infrared detector and one of the panchromatic framing cameras are also discussed.
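The stellar technique amounts to a ratio estimate over the calibration stars; a schematic sketch in which every value is a placeholder, not actual MVIC data:

```python
import numpy as np

# Measured count rates (DN/s) for calibration stars vs. rates predicted from the
# component-level (preflight) calibration -- all values hypothetical.
measured  = np.array([1520.0, 890.0, 2310.0, 640.0])
predicted = np.array([1480.0, 905.0, 2250.0, 655.0])

factor = np.mean(measured / predicted)   # multiplicative preflight-to-inflight correction
print(f"inflight/preflight factor = {factor:.4f}")
# Converting an observation to physical units then divides the measured DN rate
# by this factor times the preflight conversion constant for that detector.
```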
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and conventional frequentist analyses of counts in a fixed count time [Bayesian (cnt) and the single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a detection probability similar to that of Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for cases with a very short presence of the source (< count time), time-interval information is more sensitive for detecting a change than count information, since the source signal is diluted by the background over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
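For Poisson counting, inter-pulse intervals are exponentially distributed, so a Gamma prior on the count rate is conjugate and the decision can be updated pulse by pulse. The sketch below is a generic conjugate-update illustration of this idea, not the authors' exact algorithm; prior parameters and data are made up:

```python
from scipy.stats import gamma

# Gamma(a, b) prior on the rate (shape a, rate b), loosely matched to a ~1 cps background
a, b = 2.0, 2.0
background_rate = 1.0

intervals = [0.21, 0.15, 0.08, 0.11, 0.09]   # seconds between successive pulses (toy data)
for dt in intervals:
    a, b = a + 1.0, b + dt                   # conjugate update per observed interval
    # Posterior P(rate > background): survival function of Gamma(shape=a, scale=1/b)
    p_alarm = gamma.sf(background_rate, a, scale=1.0 / b)
    print(f"after dt={dt:.2f}s  P(rate > bkg) = {p_alarm:.3f}")
```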
Spatial patterns in vegetation fires in the Indian region.
Vadrevu, Krishna Prasad; Badarinath, K V S; Anuradha, Eaturu
2008-12-01
In this study, we used fire count datasets derived from the Along Track Scanning Radiometer (ATSR) satellite to characterize spatial patterns in fire occurrences across highly diverse geographical, vegetation and topographic gradients in the Indian region. For characterizing the spatial patterns of fire occurrences, the observed fire point patterns were tested against the hypothesis of complete spatial randomness (CSR) using three different techniques: quadrat analysis, nearest neighbor analysis and Ripley's K function. A hierarchical nearest neighbor technique was used to depict the 'hotspots' of fire incidents. Of the different states, the highest fire counts were recorded in Madhya Pradesh (14.77%) followed by Gujarat (10.86%), Maharashtra (9.92%), Mizoram (7.66%), Jharkhand (6.41%), etc. With respect to the vegetation categories, the highest number of fires was recorded in agricultural regions (40.26%) followed by tropical moist deciduous vegetation (12.72%), dry deciduous vegetation (11.40%), abandoned slash-and-burn secondary forests (9.04%), and tropical montane forests (8.07%), followed by others. Analysis of fire counts based on elevation and slope suggested that the maximum number of fires occurred in low- and medium-elevation types and in very-low- to low-slope categories. Results from the three spatial techniques suggested a clustered pattern in fire events compared to CSR. Most importantly, results from Ripley's K statistic suggested that fire events are highly clustered at a lag distance of 125 miles. The hierarchical nearest neighbor clustering technique identified significant clusters of fire 'hotspots' in different states in northeast and central India. The implications of these results for fire management and mitigation are discussed. This study also highlights the potential of spatial point pattern statistics in environmental monitoring and assessment studies, with special reference to fire events in the Indian region.
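A naive (edge-uncorrected) Ripley's K estimate compared against the CSR expectation πr² can be sketched as follows; the point pattern is synthetic and the study's fire locations are not reproduced here:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley's K at radius r (no edge correction)."""
    n = len(points)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    pairs = (d <= r).sum() - n            # drop self-pairs on the diagonal
    return area * pairs / (n * (n - 1))

rng = np.random.default_rng(4)
pts = rng.uniform(0, 100, size=(500, 2))  # CSR pattern on a 100 x 100 region
for r in (5, 10, 20):
    print(f"r={r:2d}  K={ripley_k(pts, r, 100 * 100):8.1f}  CSR pi*r^2={np.pi * r * r:8.1f}")
# K well above pi*r^2 at some scale indicates clustering there, as found for the fire data.
```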
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2000-01-01
We adapted a removal model to estimate detection probability during point count surveys. The model assumes that one factor influencing detection during point counts is the singing frequency of birds. This is plausible for surveys of forest songbirds, where most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently, such as Winter Wren and Acadian Flycatcher, had high detection probabilities (about 90%), whereas species that call infrequently, such as Pileated Woodpecker, had low detection probability (36%). We also found that detection probabilities varied with the time of day for some species (e.g., thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
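Under the simplest version of such a removal model — a constant per-minute detection rate λ and intervals 0-2, 2-5, 5-10 min — the cell probabilities are differences of exponentials, and λ (hence overall detectability) can be estimated by maximum likelihood. This is a schematic sketch with made-up counts, not the authors' full model with covariates:

```python
import numpy as np
from scipy.optimize import minimize_scalar

cuts = np.array([0.0, 2.0, 5.0, 10.0])    # interval boundaries, minutes
counts = np.array([62, 28, 10])           # hypothetical first-detection counts per interval

def neg_loglik(lam):
    surv = np.exp(-lam * cuts)                        # P(not yet detected) at each cut
    cell = (surv[:-1] - surv[1:]) / (1 - surv[-1])    # conditional on detection by 10 min
    return -(counts * np.log(cell)).sum()             # multinomial log-likelihood

lam = minimize_scalar(neg_loglik, bounds=(1e-4, 5.0), method="bounded").x
print(f"lambda = {lam:.3f}/min, detection prob over 10 min = {1 - np.exp(-10 * lam):.3f}")
```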
Real-time passenger counting by active linear cameras
NASA Astrophysics Data System (ADS)
Khoudour, Louahdi; Duvieubourg, Luc; Deparis, Jean-Pierre
1996-03-01
The companies operating subways are very much concerned with counting the passengers traveling through their transport systems. One of the most widely used systems for counting passengers consists of a mechanical gate equipped with a counter. However, such simple systems are not able to count passengers jumping above the gates. Moreover, passengers carrying large luggage or bags may meet some difficulties when going through such gates. The ideal solution is a contact-free counting system that would bring more comfort of use for the passengers. For these reasons, we propose to use a video processing system instead of these mechanical gates. The optical sensors discussed in this paper offer several advantages including well defined detection areas, fast response time and reliable counting capability. A new technology has been developed and tested, based on linear cameras. Preliminary results show that this system is very efficient when the passengers crossing the optical gate are well separated. In other cases, such as in compact crowd conditions, reasonable accuracy has been demonstrated. These results are illustrated by means of a number of sequences shot in field conditions. It is our belief that more precise measurements could be achieved, in the case of compact crowd, by other algorithms and acquisition techniques of the line images that we are presently developing.
Cross-Section Measurements via the Activation Technique at the Cologne Clover Counting Setup
NASA Astrophysics Data System (ADS)
Heim, Felix; Mayer, Jan; Netterdon, Lars; Scholz, Philipp; Zilges, Andreas
The activation technique is a widely used method for determining cross-section values of charged-particle induced reactions at astrophysically relevant energies. Since network calculations of nucleosynthesis processes often depend on reaction rates calculated within the Hauser-Feshbach statistical model, these cross sections can be used to improve nuclear-physics input parameters such as optical-model potentials (OMP), γ-ray strength functions, and nuclear level densities. In order to extend the available experimental database, the 108Cd(α, n)111Sn reaction cross section was investigated at ten energies between 10.2 and 13.5 MeV. As this reaction at these energies is sensitive almost exclusively to the α-decay width, the results were compared to statistical model calculations using different models for the α-OMP. The irradiation as well as the subsequent γ-ray counting were performed at the Institute for Nuclear Physics of the University of Cologne using the 10 MV FN-Tandem accelerator and the Cologne Clover Counting Setup. This setup consists of two clover-type high-purity germanium (HPGe) detectors in a close face-to-face geometry, covering a solid angle of almost 4π.
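For context, the cross section in an activation measurement follows from the standard activation relation (textbook form under constant beam flux, stated here as background rather than quoted from the paper):

```latex
N_{\mathrm{prod}}(t_i) = \frac{N_T\,\sigma\,\Phi}{\lambda}\left(1 - e^{-\lambda t_i}\right),
\qquad
C_\gamma = N_{\mathrm{prod}}(t_i)\,\varepsilon\,I_\gamma\,e^{-\lambda t_w}\left(1 - e^{-\lambda t_c}\right),
```

where N_T is the number of target nuclei, Φ the projectile flux, λ the decay constant of the reaction product, t_i, t_w, t_c the irradiation, waiting, and counting times, ε the photopeak efficiency of the detectors, and I_γ the γ-ray intensity; inverting the measured photopeak counts C_γ for σ yields the cross section.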
NASA Technical Reports Server (NTRS)
Wilson, James Charles
1994-01-01
The ER-2 condensation nuclei counter (CNC) has been modified to reduce the diffusive losses of particles within the instrument. These changes have been successful in improving the counting efficiency of small particles at low pressures. Two techniques for measuring the size distributions of particles with diameters less than 0.17 micrometers have been evaluated. Both of these methods, the differential mobility analyzer (DMA) and the diffusion battery, have fundamental problems that limit their usefulness for stratospheric applications. We cannot recommend either for this application. Newly developed, alternative methods for measuring small particles include inertial separation with a low-loss critical orifice and a thin-plate impactor device. This technique is now used to collect particles in the multisample aerosol collector housed in the ER-2 CNC-2, and shows some promise for particle size measurements when coupled with a CNC as a counting device. The modified focused-cavity aerosol spectrometer (FCAS) can determine the size distribution of particles with ambient diameters as small as about 0.07 micrometers. Data from this instrument indicate the presence of a nuclei mode when CNC-2 indicates high concentrations of particles, but cannot resolve important parameters of the distribution.
NASA Astrophysics Data System (ADS)
Itabashi, Masaaki; Nakajima, Shigeru; Fukuda, Hiroshi
After an unexpected failure of a metallic structure, a microscopic investigation is usually performed. Generally, such an investigation is limited to searching for striation patterns with a SEM (scanning electron microscope). But when the cause of the failure was not severe repeated stress, this investigation is ineffective. In this paper, a new microscopic observation technique is proposed to detect a low-cycle fatigue and impact tensile loading history. Al alloys 6061-T6 and 2219-T87 were fractured in dynamic tension after severe pre-fatigue. The side surface of the fractured specimens was observed with a SEM. Near the fractured surface, many opened cracks were generated on the side surface. For each specimen, the number of cracks was counted, together with information on individual sizes and geometric features. For the 6061-T6 alloy specimen with pre-fatigue, the number of cracks is greater than for the specimen without pre-fatigue. For the 2219-T87 alloy, the same tendency can be found after a certain screening of the crack counts. Therefore, the crack counting technique may be useful for detecting the existence of pre-fatigue from the surface of a dynamically fractured specimen.
Scoglio, M E; Di Pietro, A; Anzalone, C; Calimeri, S; Lo Giudice, D; Trimarchi, G R
2000-01-01
The toxicity of synthetic sewage containing increasing concentrations of arsenic (0.125, 0.25, 0.5, 1.0 mg L-1), cadmium (0.02, 0.05, 0.1, 0.2 mg L-1), lead (0.2, 0.5, 1.0, 2.0 mg L-1) and nickel (0.5, 1.0, 2.0, 4.0 mg L-1) has been investigated by determining the total direct count (TDC) and the direct viable count (DVC) of Salmonella enteritidis by means of an immunofluorescence technique (IFA). This has been done in order to evaluate the possibility of using the IFA technique to estimate the toxicity of complex effluents. Arsenic, cadmium and nickel produced a concentration-dependent reduction in the number of viable bacterial cells. This was clearer when the viable bacterial cells were considered than when only the culturable fraction was used. Lead did not show a concentration-dependent and reproducible effect. At the highest concentrations allowed by the Italian wastewater regulations, lead, cadmium, arsenic and nickel reduced the viable/total bacterial cell ratio to 74.5%, 68.5%, 28.4% and 6.9%, respectively. The toxic effects of the metals were also tested using the standard Microtox assay.
1980-01-01
In a multi-laboratory trial with the membrane filtration technique, three surfactants--Teepol 610 (T610), Tergitol 7 (T7) and sodium lauryl sulphate (LS)--were compared in media for the enumeration of coliform organisms and Escherichia coli in water. A total of 179 samples of water (87 raw and 92 marginally chlorinated) were examined for colony counts of coliform organisms, and 185 water samples (94 raw and 91 marginally chlorinated) for E. coli. Slight differences in the confirmed colony counts between the three media were noted, but few of these were observed consistently in every laboratory. In most laboratories, T7 gave slightly higher counts of E. coli than LS with chlorinated waters; a higher incidence of false-positive results for E. coli at 44 degrees C was also noted with T7. As there were no outstanding differences in the trial, sodium lauryl sulphate, which is chemically defined, cheap and readily available, is therefore recommended for use at a concentration of 0.1% instead of Teepol 610 in the standard medium for the enumeration of coliform organisms and E. coli in water by the membrane filtration technique. PMID:7005324
Bruserud, Ø; Liseth, K; Stamnesfet, S; Cacic, D L; Melve, G; Kristoffersen, E; Hervig, T; Reikvam, H
2013-12-01
Hyperleukocytosis is usually defined as a leukocyte count >100 × 10(9) L(-1) and can be seen in newly diagnosed leukaemias. Hyperleukocytic leukaemia is associated with a risk of organ failure and early death secondary to leukostasis. Mechanical removal of leukocytes by the apheresis technique, leukocytapheresis, is a therapeutic option in these patients. During a 16-year period, 16 patients were treated with leukocytapheresis (35 apheresis procedures) for hyperleukocytosis/leukostasis. We present our experience, and in addition we review previous studies of hyperleukocytosis/leukocytapheresis in patients with acute myeloid leukaemia (AML). We used a highly standardised approach for leukocytapheresis in leukaemia patients with hyperleukocytosis. The average number of leukocytapheresis procedures per patient was 2.2 (range 1-6). The median leukocyte count before apheresis was 309 × 10(9) L(-1) (range 104-935); the mean leukocyte count reduction was 71%, corresponding to a mean absolute reduction of 219 × 10(9) L(-1). No serious side effects were seen during or immediately after apheresis. The data suggest that our standardised technique for leukocytapheresis effectively reduced the peripheral blood leukaemia cell counts. Previous studies in AML also support the conclusion that this is a safe and effective procedure for the treatment of a potentially life-threatening complication, but apheresis should always be combined with early chemotherapy. © 2013 The Authors. Transfusion Medicine © 2013 British Blood Transfusion Society.
Terminology used in estimating number of birds
C. John Ralph
1981-01-01
Hundreds of papers, such as those presented in this volume, are produced annually by workers using various techniques. The purpose of this section is twofold: (1) to provide for readers unfamiliar with counting techniques a handy guide to the most common terms and methods used in the field; and (2) to attempt to set a consensus on the meanings of some terms that are...
The temperature of large dust grains in molecular clouds
NASA Technical Reports Server (NTRS)
Clark, F. O.; Laureijs, R. J.; Prusti, T.
1991-01-01
The temperature of the large dust grains is calculated for three molecular clouds ranging in visual extinction from 2.5 to 8 mag, by comparing maps of either extinction (derived from star counts) or gas column density (derived from molecular observations) to the 100 micron intensity I(100). Both techniques show the dust temperature declining into the clouds. The two techniques do not agree in absolute scale.
A Comparison of Methods to Analyze Aquatic Heterotrophic Flagellates of Different Taxonomic Groups.
Jeuck, Alexandra; Nitsche, Frank; Wylezich, Claudia; Wirth, Olaf; Bergfeld, Tanja; Brutscher, Fabienne; Hennemann, Melanie; Monir, Shahla; Scherwaß, Anja; Troll, Nicole; Arndt, Hartmut
2017-08-01
Heterotrophic flagellates contribute significantly to the matter flux in aquatic and terrestrial ecosystems. Even today, their quantification and taxonomic classification pose several problems in field studies, though these methodological problems seem to be increasingly ignored in current ecological studies. Here we describe and test different methods: the live-counting technique, different fixation techniques, cultivation methods like the liquid aliquot method (LAM), and a molecular single-cell survey called aliquot PCR (aPCR). All these methods have been tested using either aquatic field samples or cultures of freshwater and marine taxa. Each of the described methods has its advantages and disadvantages, which have to be considered in every single case. With the live-counting technique, detection of living cells up to the morphospecies level is possible. Fixation of cells and staining methods are advantageous because they allow long-term storage and observation of samples. Cultivation methods (LAM) offer the possibility of subsequent molecular analyses, and aPCR tools might compensate for the deficiency of LAM in terms of the missing detection of non-cultivable flagellates. In summary, we propose a combination of several investigation techniques to reduce the gap between the different methodological problems. Copyright © 2017 Elsevier GmbH. All rights reserved.
NASA Astrophysics Data System (ADS)
Jakopic, Rozle; Richter, Stephan; Kühn, Heinz; Benedik, Ljudmila; Pihlar, Boris; Aregbe, Yetunde
2009-01-01
A sample preparation procedure for isotopic measurements using thermal ionization mass spectrometry (TIMS) was developed which employs the technique of carburization of rhenium filaments. Carburized filaments were prepared in a special vacuum chamber in which the filaments were exposed to benzene vapour as a carbon supply and carburized electrothermally. To find the optimal conditions for the carburization and the isotopic measurements using TIMS, the influence of various parameters such as benzene pressure, carburization current and exposure time was tested. As a result, carburization of the filaments improved the overall efficiency by one order of magnitude. Additionally, a new "multi-dynamic" measurement technique was developed for Pu isotope ratio measurements using a "multiple ion counting" (MIC) system. This technique was combined with filament carburization and applied to the NBL-137 isotopic standard and samples of the NUSIMEP 5 inter-laboratory comparison campaign, which included certified plutonium materials at the ppt level. The multi-dynamic measurement technique for plutonium, in combination with filament carburization, has been shown to significantly improve the precision and accuracy of isotopic analysis of environmental samples with low levels of plutonium.
Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine
2015-10-27
Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, promoting user fatigue and subjectivity of the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
Closed circuit TV system automatically guides welding arc
NASA Technical Reports Server (NTRS)
Stephans, D. L.; Wall, W. A., Jr.
1968-01-01
Closed circuit television (CCTV) system automatically guides a welding torch to position the welding arc accurately along weld seams. Digital counting and logic techniques incorporated in the control circuitry ensure performance reliability.
Time estimation as a secondary task to measure workload: Summary of research
NASA Technical Reports Server (NTRS)
Hart, S. G.; Mcpherson, D.; Loomis, L. L.
1978-01-01
Actively produced intervals of time were found to increase in length and variability, whereas retrospectively produced intervals decreased in length although they also increased in variability, with the addition of a variety of flight-related tasks. If pilots counted aloud while making a production, however, the impact of concurrent activity was minimized, at least for the moderately demanding primary tasks that were selected. The effects of feedback on estimation accuracy and consistency were greatly enhanced if a counting or tapping production technique was used. This compares with the minimal effect that feedback had when no overt timekeeping technique was used. Actively made verbal estimates of sessions decreased in length, whereas retrospectively made verbal estimates increased in length, as the amount and complexity of activities performed during the interval were increased.
Mass-dependent channel electron multiplier operation. [for ion detection
NASA Technical Reports Server (NTRS)
Fields, S. A.; Burch, J. L.; Oran, W. A.
1977-01-01
The absolute counting efficiency and pulse height distributions of a continuous-channel electron multiplier used in the detection of hydrogen, argon and xenon ions are assessed. The assessment technique, which involves the post-acceleration of 8-eV ion beams to energies from 100 to 4000 eV, provides information on counting efficiency versus post-acceleration voltage characteristics over a wide range of ion mass. The charge pulse height distributions for H2 (+), A (+) and Xe (+) were measured by operating the experimental apparatus in a marginally gain-saturated mode. It was found that gain saturation occurs at lower channel multiplier operating voltages for light ions such as H2 (+) than for the heavier ions A (+) and Xe (+), suggesting that the technique may be used to discriminate between these two classes of ions in electrostatic analyzers.
Growth of alveoli during postnatal development in humans based on stereological estimation.
Herring, Matt J; Putney, Lei F; Wyatt, Gregory; Finkbeiner, Walter E; Hyde, Dallas M
2014-08-15
Alveolarization in humans and nonhuman primates begins during prenatal development. Advances in stereological counting techniques allow accurate assessment of alveolar number; however, these techniques have not been applied to the developing human lung. Based on the recent American Thoracic Society guidelines for stereology, lungs from human autopsies, ages 2 mo to 15 yr, were fractionated and isometric uniform randomly sampled to count the number of alveoli. The number of alveoli was compared with age, weight, and height as well as growth between right and left lungs. The number of alveoli in the human lung increased exponentially during the first 2 yr of life but continued to increase albeit at a reduced rate through adolescence. Alveolar numbers also correlated with the indirect radial alveolar count technique. Growth curves for human alveolarization were compared using historical data of nonhuman primates and rats. The alveolar growth rate in nonhuman primates was nearly identical to the human growth curve. Rats were significantly different, showing a more pronounced exponential growth during the first 20 days of life. This evidence indicates that the human lung may be more plastic than originally thought, with alveolarization occurring well into adolescence. The first 20 days of life in rats implies a growth curve that may relate more to prenatal growth in humans. The data suggest that nonhuman primates are a better laboratory model for studies of human postnatal lung growth than rats. Copyright © 2014 the American Physiological Society.
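In fractionator-based stereology of this kind, the total object number is recovered from the counted objects and the product of the sampling fractions; the generic fractionator formula is sketched below (the sampling fractions shown are hypothetical, not those of this study):

```python
def fractionator_total(q_minus, ssf, asf, tsf):
    """Total object number N = counted objects / (section * area * thickness fractions)."""
    return q_minus / (ssf * asf * tsf)

# Hypothetical fractions: 1/10 of sections, 1/50 of section area, 1/4 of section thickness
print(fractionator_total(q_minus=420, ssf=1/10, asf=1/50, tsf=1/4))  # -> 840,000 alveoli
```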
Advanced analysis techniques for uranium assay
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geist, W. H.; Ensslin, Norbert; Carrillo, L. A.
2001-01-01
Uranium has a negligible passive neutron emission rate, making its assay practicable only with an active interrogation method. The active interrogation uses external neutron sources to induce fission events in the uranium in order to determine the mass. This technique requires careful calibration with standards that are representative of the items to be assayed. The samples to be measured are not always well represented by the available standards, which often leads to large biases. A technique of active multiplicity counting is being developed to reduce some of these assay difficulties. Active multiplicity counting uses the measured doubles and triples count rates to determine the neutron multiplication and the product of the source-sample coupling (C) and the 235U mass (m). Since the 235U mass always appears in the multiplicity equations as the product Cm, the coupling needs to be determined before the mass can be known. A relationship has been developed that relates the coupling to the neutron multiplication. The relationship is based on both an analytical derivation and empirical observations. To determine a scaling constant present in this relationship, known standards must be used. Evaluation of experimental data revealed an improvement over the traditional calibration-curve analysis method of fitting the doubles count rate to the 235U mass. Active multiplicity assay appears to relax the requirement that the calibration standards and unknown items have the same chemical form and geometry.
Jones, Jeryl C; Appt, Susan E; Bourland, J Daniel; Hoyer, Patricia B; Clarkson, Thomas B; Kaplan, Jay R
2007-09-01
Macaques are important models for menopause and associated diseases in women. A sensitive, noninvasive technique for quantifying changes in ovarian morphology would facilitate longitudinal studies focused on the health-related sequelae of naturally occurring or experimentally induced alterations in ovarian structure and function. Multidetector computed tomography (MDCT) is a fast, non-invasive imaging technique that uses X-rays, multiple rows of detectors, and computers to generate detailed slice images of structures. The purpose of this study was to describe the utility of MDCT for reliably characterizing ovarian morphology in macaques. Five macaques were scanned using contrast-enhanced MDCT. The following characteristics were described: 1) appearance of ovaries and adjacent landmarks, 2) effects of varying technical protocols on ovarian image quality, 3) radiation doses delivered to the pelvic region during scanning, and 4) MDCT estimates of ovarian volume and antral follicle counts versus those measured directly in ovarian tissue. Ovaries were distinguishable in all MDCT scans and exhibited heterogeneous contrast enhancement. Antral follicles appeared as focal areas of nonenhancement. Ovarian image quality with 5 pediatric scanning protocols was sufficient for discriminating ovarian margins. Pelvic region radiation doses ranged from 0.5 to 0.7 rad. Antral follicles counted using MDCT ranged from 3 to 5 compared with 3 to 4 counted using histology. Ovarian volumes measured using MDCT ranged from 0.41 to 0.67 ml compared with 0.40 to 0.65 ml by water displacement. MDCT is a promising technique for measuring longitudinal changes in macaque ovarian morphology reliably and noninvasively.
Nasi, Milena; De Biasi, Sara; Bianchini, Elena; Gibellini, Lara; Pinti, Marcello; Scacchetti, Tiziana; Trenti, Tommaso; Borghi, Vanni; Mussini, Cristina; Cossarizza, Andrea
2015-01-01
An accurate and affordable CD4+ T cell count is an essential tool in the fight against HIV/AIDS. Flow cytometry (FCM) is the "gold standard" for counting such cells, but this technique is expensive and requires sophisticated equipment, temperature-sensitive monoclonal antibodies (mAbs) and trained personnel. The lack of access to technical support and quality assurance programs thus limits the use of FCM in resource-constrained countries. We have tested the accuracy, precision and carry-over contamination of the Partec CyFlow MiniPOC, a portable and economically affordable flow cytometer designed for CD4+ counts and percentages, used along with the "CD4% Count Kit-Dry". Venous blood from 59 adult HIV+ patients (age: 25-58 years; 43 males and 16 females) was collected and stained with the "MiniPOC CD4% Count Kit-Dry". CD4+ count and percentage were then determined in triplicate by the CyFlow MiniPOC. In parallel, CD4 count was performed using mAbs and a CyFlow Counter, or by a dual-platform system (from Beckman Coulter) based upon a Cytomic FC500 ("Cytostat tetrachrome kit" for mAbs) and a Coulter HmX Hematology Analyzer (for absolute cell count). The accuracy of the CyFlow MiniPOC against the Cytomic FC500 showed a correlation coefficient (CC) of 0.98 and 0.97 for CD4+ count and percentage, respectively. The accuracy of the CyFlow MiniPOC against the CyFlow Counter showed a CC of 0.99 and 0.99 for CD4 T cell count and percentage, respectively. The CyFlow MiniPOC showed excellent repeatability: CD4+ cell count and percentage were analyzed on two instruments, with an intra-assay precision below ±5% deviation. Finally, there was no carry-over contamination for samples at any CD4 value, regardless of their position in the sequence of analysis. The cost-effective CyFlow MiniPOC produces rapid, reliable and accurate results that are fully comparable with those from highly expensive dual-platform systems.
Modeling the frequency-dependent detective quantum efficiency of photon-counting x-ray detectors.
Stierstorfer, Karl
2018-01-01
The aim is to find a simple model for the frequency-dependent detective quantum efficiency (DQE) of photon-counting detectors in the low flux limit. Formulae for the spatial cross-talk, the noise power spectrum and the DQE of a photon-counting detector working at a given threshold are derived. The parameters are probabilities for types of events, such as a single count in the central pixel, double counts in the central pixel and a neighboring pixel, or a single count in a neighboring pixel only. These probabilities can be derived in a simple model by extensive use of Monte Carlo techniques: the Monte Carlo x-ray propagation program MOCASSIM is used to simulate the energy deposition from the x-rays in the detector material, and a simple charge cloud model using Gaussian clouds of fixed width is used for the propagation of the electric charge generated by the primary interactions. Both stages are combined in a Monte Carlo simulation randomizing the location of impact, which finally produces the required probabilities. The parameters of the charge cloud model are fitted to the spectral response to a polychromatic spectrum measured with our prototype detector. Based on the Monte Carlo model, the DQE of photon-counting detectors as a function of spatial frequency is calculated for various pixel sizes, photon energies, and thresholds. The frequency-dependent DQE of a photon-counting detector in the low flux limit can be described with an equation containing only a small set of probabilities as input. Estimates for the probabilities can be derived from a simple model of the detector physics. © 2017 American Association of Physicists in Medicine.
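The event-type probabilities that feed such a model can be estimated exactly as described — randomize the impact position, spread a Gaussian charge cloud, apply the threshold — as in this simplified sketch (pixel pitch, cloud width, and threshold are illustrative, not the fitted MOCASSIM parameters):

```python
import math
import numpy as np

rng = np.random.default_rng(5)
pitch, sigma, thresh = 0.5, 0.1, 0.25   # pixel pitch (mm), cloud sigma (mm), charge fraction

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def pixel_fraction(x0, y0, ix, iy):
    """Fraction of a Gaussian charge cloud at (x0, y0) collected by pixel (ix, iy)."""
    fx = phi(((ix + 1) * pitch - x0) / sigma) - phi((ix * pitch - x0) / sigma)
    fy = phi(((iy + 1) * pitch - y0) / sigma) - phi((iy * pitch - y0) / sigma)
    return fx * fy

singles = multiples = 0
trials = 20_000
for _ in range(trials):
    x0, y0 = rng.uniform(0.0, pitch, 2)        # impact uniform within the central pixel
    hits = sum(pixel_fraction(x0, y0, ix, iy) > thresh
               for ix in (-1, 0, 1) for iy in (-1, 0, 1))
    if hits == 1:
        singles += 1
    elif hits >= 2:
        multiples += 1

print(f"P(single) ~ {singles / trials:.3f}, P(multiple) ~ {multiples / trials:.3f}")
```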
Sample to answer visualization pipeline for low-cost point-of-care blood cell counting
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter
2015-03-01
We present a visualization pipeline from sample to answer for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high-speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first-iteration microfluidic device [3] showed that the simplest - and thus lowest-cost - approach to microfluidic component implementation was not adequate compared to techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
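The HSV-based segmentation step can be sketched with OpenCV; the file name, hue/saturation/value ranges, and morphology settings are placeholders, not the pipeline's actual parameters:

```python
import cv2
import numpy as np

img = cv2.imread("microscopy_frame.png")           # BGR frame from the microscopy system
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Keep pixels whose hue/saturation/value fall in the (assumed) stained-cell range
mask = cv2.inRange(hsv, np.array([100, 60, 60]), np.array([140, 255, 255]))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((3, 3), np.uint8))

n_labels, _ = cv2.connectedComponents(mask)
print("cell-candidate count:", n_labels - 1)       # label 0 is the background
```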
NASA Astrophysics Data System (ADS)
Hess, Dale; van Lieshout, Marie-Colette; Payne, Bill; Stein, Alfred
This paper describes how spatial statistical techniques may be used to analyse weed occurrence in tropical fields. Quadrat counts of weed numbers are available over a series of years, as well as data on explanatory variables, and the aim is to smooth the data and assess spatial and temporal trends. We review a range of models for correlated count data. As an illustration, we consider data on striga infestation of a 60 × 24 m2 millet field in Niger collected from 1985 until 1991, modelled by independent Poisson counts with an autoregressive prior term enforcing spatial coherence. The smoothed fields show the presence of a seed bank, and the estimated model parameters indicate a decay in the striga numbers over time, as well as a clear correlation with the amount of rainfall in the 15 consecutive days following the sowing date. Such results could contribute to precision agriculture as a guide to more cost-effective striga control strategies.
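A minimal version of this model class — independent Poisson counts with a Gaussian autoregressive (pairwise-difference) penalty on the log-intensity — can be fit by penalized maximum likelihood, as in the sketch below (grid size, smoothing weight, and data are illustrative, not the authors' exact prior or field data):

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
ny, nx, tau = 12, 30, 4.0                       # grid of quadrats, smoothing weight
truth = 2.0 + np.sin(np.linspace(0, 3, nx))[None, :] * np.ones((ny, 1))
y = rng.poisson(truth)                          # observed weed counts per quadrat

def neg_post(eta_flat):
    eta = eta_flat.reshape(ny, nx)              # eta = log intensity
    nll = (np.exp(eta) - y * eta).sum()         # Poisson negative log-likelihood
    pen = ((eta[1:, :] - eta[:-1, :])**2).sum() + ((eta[:, 1:] - eta[:, :-1])**2).sum()
    return nll + 0.5 * tau * pen                # Gaussian MRF / autoregression penalty

res = minimize(neg_post, np.log(y + 0.5).ravel(), method="L-BFGS-B")
smooth = np.exp(res.x.reshape(ny, nx))          # smoothed intensity surface
print("raw mean:", round(y.mean(), 2), " smoothed mean:", round(smooth.mean(), 2))
```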
Fluorometric determination of the DNA concentration in municipal drinking water.
McCoy, W F; Olson, B H
1985-01-01
DNA concentrations in municipal drinking water samples were measured by fluorometry, using Hoechst 33258 fluorochrome. The concentration, extraction, and detection methods used were adapted from existing techniques. The method is reproducible, fast, accurate, and simple. The amounts of DNA per cell for five different bacterial isolates obtained from drinking water samples were determined by measuring DNA concentration and total cell concentration (acridine orange epifluorescence direct cell counting) in stationary pure cultures. The relationship between DNA concentration and epifluorescence total direct cell concentration in 11 different drinking water samples was linear and positive; the amounts of DNA per cell in these samples did not differ significantly from the amounts in pure culture isolates. We found significant linear correlations between DNA concentration and colony-forming unit concentration, as well as between epifluorescence direct cell counts and colony-forming unit concentration. DNA concentration measurements of municipal drinking water samples appear to monitor changes in bacteriological quality at least as well as total heterotrophic plate counting and epifluorescence direct cell counting. PMID:3890737
Woodcock singing-ground counts and habitat changes in the northeastern United States
Dwyer, T.J.; McAuley, D.G.; Derleth, E.L.
1983-01-01
Aerial photography from the late 1960's and the late 1970's was used to study habitat changes along 78 American woodcock (Scolopax minor) singing-ground routes in 9 northeastern states. The most noticeable changes were declines in the amount of abandoned field, cropland, shrubland, and field/pasture. The amount of land in the urban/industrial type increased 33.4% from the late 1960's to the late 1970's. We examined relationships between the woodcock call-count index and habitat variables using multiple-regression techniques. The abundance of calling male woodcock was positively correlated with the amount of abandoned field and alder (Alnus sp.) and negatively correlated with the amount of urban/industrial type. However, only the change in the urban/industrial type was significantly (P < 0.05) related to the change in the call-count index. Urban/industrial area increased, whereas the call-count index declined on average in our sample of routes by 1.4 birds/route (40.5%).
Angel, J.C.; Nelson, D.O.; Panno, S.V.
2004-01-01
A new Geographic Information System (GIS) method was developed as an alternative to the hand-counting of sinkholes on topographic maps for density and distribution studies. Sinkhole counts were prepared by hand and compared to those generated from USGS DLG data using ArcView 3.2 and the ArcInfo Workstation component of ArcGIS 8.1 software. The study area for this investigation, chosen for its great density of sinkholes, included the 42 public land survey sections that reside entirely within the Renault Quadrangle in southwestern Illinois. Differences between the sinkhole counts derived from the two methods for the Renault Quadrangle study area were negligible. Although the initial development and refinement of the GIS method required considerably more time than counting sinkholes by hand, the flexibility of the GIS method is expected to provide significant long-term benefits and time savings when mapping larger areas and expanding research efforts. © 2004 by The National Speleological Society.
SPITZER 70 AND 160 μm OBSERVATIONS OF THE COSMOS FIELD
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frayer, D. T.; Huynh, M. T.; Bhattacharya, B.
2009-11-15
We present Spitzer 70 and 160 μm observations of the COSMOS Spitzer survey (S-COSMOS). The data processing techniques are discussed for the publicly released products consisting of images and source catalogs. We present accurate 70 and 160 μm source counts of the COSMOS field and find reasonable agreement with measurements in other fields and with model predictions. The previously reported counts for GOODS-North and the extragalactic First Look Survey are updated with the latest calibration, and counts are measured based on the large area SWIRE survey to constrain the bright source counts. We measure an extragalactic confusion noise level of σc = 9.4 ± 3.3 mJy (q = 5) for the MIPS 160 μm band based on the deep S-COSMOS data and report an updated confusion noise level of σc = 0.35 ± 0.15 mJy (q = 5) for the MIPS 70 μm band.
Photon Counting System for High-Sensitivity Detection of Bioluminescence at Optical Fiber End.
Iinuma, Masataka; Kadoya, Yutaka; Kuroda, Akio
2016-01-01
The technique of photon counting is widely used in various fields and is also applicable to high-sensitivity detection of luminescence. Thanks to the recent development of single-photon detectors based on avalanche photodiodes (APDs), the photon counting system with an optical fiber has become a powerful means of detecting bioluminescence at an optical fiber end, because it fully exploits the compactness, simple operation, and high quantum efficiency of APD detectors. This optical fiber-based system also offers the possibility of improving the sensitivity of local detection of adenosine triphosphate (ATP) through high-sensitivity detection of the bioluminescence. In this chapter, we introduce the basic concept of the optical fiber-based system and explain how to construct and use it.
Jiménez-Banzo, Ana; Ragàs, Xavier; Kapusta, Peter; Nonell, Santi
2008-09-01
Two recent advances in optoelectronics, namely novel near-IR sensitive photomultipliers and inexpensive yet powerful diode-pumped solid-state lasers working at kHz repetition rate, enable the time-resolved detection of singlet oxygen (O2(a1Deltag)) phosphorescence in photon counting mode, thereby boosting the time-resolution, sensitivity, and dynamic range of this well-established detection technique. Principles underlying this novel approach and selected examples of applications are provided in this perspective, which illustrate the advantages over the conventional analog detection mode.
Hemispheric specialization in quantification processes.
Pasini, M; Tessari, A
2001-01-01
Three experiments were carried out to study hemispheric specialization for subitizing (the rapid enumeration of small patterns) and counting (the serial quantification process based on some formal principles). The experiments consisted of numerosity identification of dot patterns presented in one visual field, using a tachistoscopic technique or with eye movements monitored through glasses, and of comparison between centrally presented dot patterns and lateralized, tachistoscopically presented digits. Our experiments show a left visual field advantage in the identification and comparison tasks in the subitizing range, whereas a right visual field advantage was found in the comparison task for the counting range.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Morton, April M; Piburn, Jesse O; McManamay, Ryan A
2017-01-01
Monte Carlo simulation is a popular numerical experimentation technique used in a range of scientific fields to obtain the statistics of unknown random output variables. Despite its widespread applicability, it can be difficult to infer required input probability distributions when they are related to population counts unknown at desired spatial resolutions. To overcome this challenge, we propose a framework that uses a dasymetric model to infer the probability distributions needed for a specific class of Monte Carlo simulations which depend on population counts.
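A toy sketch of the framework's role, under assumed numbers: a dasymetric model supplies a probability distribution over the unknown population count of a small area, and a Monte Carlo simulation propagates that uncertainty into an output statistic (here, a hypothetical per-capita demand).

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical dasymetric output: plausible population counts for one
# grid cell, expressed as a discrete distribution.
pop_values = np.array([800, 900, 1000, 1100, 1200])
pop_probs = np.array([0.10, 0.25, 0.30, 0.25, 0.10])

# Monte Carlo: draw a population count, then simulate the output
# variable that depends on it (per-capita demand is an assumed model).
n_draws = 100_000
pops = rng.choice(pop_values, size=n_draws, p=pop_probs)
demand = pops * rng.normal(2.5, 0.4, size=n_draws)  # units per person

print(f"mean demand: {demand.mean():.0f}, 95% interval: "
      f"({np.percentile(demand, 2.5):.0f}, {np.percentile(demand, 97.5):.0f})")
```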
Microradiography with Semiconductor Pixel Detectors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jakubek, Jan; Cejnarova, Andrea; Dammer, Jiri
High-resolution radiography (with X-rays, neutrons, heavy charged particles, ...), often also exploited in tomographic mode to provide 3D images, is a powerful imaging technique for instant and nondestructive visualization of the fine internal structure of objects. Novel types of semiconductor single-particle-counting pixel detectors offer many advantages for radiation imaging: high detection efficiency, energy discrimination or direct energy measurement, noiseless digital integration (counting), high frame rate and virtually unlimited dynamic range. This article shows the application and potential of pixel detectors (such as Medipix2 or TimePix) in different fields of radiation imaging.
A Motivational Technique for Business Math
ERIC Educational Resources Information Center
Voelker, Pamela
1977-01-01
The author suggests the use of simulation and role playing as a method of motivating students in business math. Examples of career-oriented business math simulation games are counting change, banking, payrolls, selling, and shopping. (MF)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peronio, P.; Acconcia, G.; Rech, I.
Time-Correlated Single Photon Counting (TCSPC) has long been recognized as the most sensitive method for fluorescence lifetime measurements, but it often requires "long" data acquisition times. This drawback is related to the limited counting capability of the TCSPC technique, due to pile-up and counting loss effects. In recent years, multi-module TCSPC systems have been introduced to overcome this issue. Splitting the light into several detectors connected to independent TCSPC modules proportionally increases the counting capability. Of course, multi-module operation also increases the system cost and can cause space and power supply problems. In this paper, we propose an alternative approach based on a new detector and processing electronics designed to reduce the overall system dead time, thus enabling efficient photon collection at high excitation rate. We present a fast active quenching circuit for single-photon avalanche diodes which features a minimum dead time of 12.4 ns. We also introduce a new Time-to-Amplitude Converter (TAC) able to attain extra-short dead time thanks to the combination of a scalable array of monolithically integrated TACs and a sequential router. The fast TAC (F-TAC) makes it possible to operate the system towards the upper limit of detector count rate capability (∼80 Mcps) with reduced pile-up losses, addressing one of the historic criticisms of TCSPC. Preliminary measurements on the F-TAC are presented and discussed.
Atom-counting in High Resolution Electron Microscopy:TEM or STEM - That's the question.
Gonnissen, J; De Backer, A; den Dekker, A J; Sijbers, J; Van Aert, S
2017-03-01
In this work, a recently developed quantitative approach based on the principles of detection theory is used in order to determine the possibilities and limitations of High Resolution Scanning Transmission Electron Microscopy (HR STEM) and HR TEM for atom-counting. So far, HR STEM has been shown to be an appropriate imaging mode to count the number of atoms in a projected atomic column. Recently, it has been demonstrated that HR TEM, when using negative spherical aberration imaging, is suitable for atom-counting as well. The capabilities of both imaging techniques are investigated and compared using the probability of error as a criterion. It is shown that for the same incoming electron dose, HR STEM outperforms HR TEM under common practice standards, i.e. when the decision is based on the probability function of the peak intensities in HR TEM and of the scattering cross-sections in HR STEM. If the atom-counting decision is based on the joint probability function of the image pixel values, the dependence of all image pixel intensities as a function of thickness should be known accurately. Under this assumption, the probability of error may decrease significantly for atom-counting in HR TEM and may, in theory, become lower as compared to HR STEM under the predicted optimal experimental settings. However, the commonly used standard for atom-counting in HR STEM leads to a high performance and has been shown to work in practice. Copyright © 2017 Elsevier B.V. All rights reserved.
Fu, Xin; Huang, Kelong; Liu, Suqin
2010-02-01
In this paper, a rapid, simple, and sensitive method is described for detection of the total bacterial count using SiO₂-coated CdSe/ZnS quantum dots (QDs) as a fluorescence marker, covalently coupled to bacteria using glutaraldehyde as the crosslinker. Highly luminescent CdSe/ZnS QDs were prepared by applying cadmium oxide and zinc stearate as precursors instead of pyrophoric organometallic precursors. A reverse-microemulsion technique was used to synthesize CdSe/ZnS/SiO₂ composite nanoparticles with a SiO₂ surface coating. Our results showed that CdSe/ZnS/SiO₂ composite nanoparticles prepared with this method were highly luminescent, biologically functional, and monodisperse, and could successfully be covalently conjugated with bacteria. As a demonstration, the method showed higher sensitivity and could detect bacteria at 3 × 10² CFU/mL, lower than the conventional plate counting and organic dye-based methods. A linear relationship between the fluorescence peak intensity (Y) and the total bacterial count (X) was established in the range of 3 × 10² to 10⁷ CFU/mL using the equation Y = 374.82X − 938.27 (R = 0.99574). The results of the determination of the total count of bacteria in seven real samples were identical to those of the conventional plate count method, and the standard deviation was satisfactory.
Conclusions on measurement uncertainty in microbiology.
Forster, Lynne I
2009-01-01
Since its first issue in 1999, testing laboratories wishing to comply with all the requirements of ISO/IEC 17025 have been collecting data for estimating uncertainty of measurement for quantitative determinations. In the microbiological field of testing, some debate has arisen as to whether uncertainty needs to be estimated for each method performed in the laboratory for each type of sample matrix tested. Queries also arise concerning the estimation of uncertainty when plate/membrane filter colony counts are below recommended method counting range limits. A selection of water samples (with low to high contamination) was tested in replicate with the associated uncertainty of measurement being estimated from the analytical results obtained. The analyses performed on the water samples included total coliforms, fecal coliforms, fecal streptococci by membrane filtration, and heterotrophic plate counts by the pour plate technique. For those samples where plate/membrane filter colony counts were ≥20, uncertainty estimates at a 95% confidence level were very similar for the methods, being estimated as 0.13, 0.14, 0.14, and 0.12, respectively. For those samples where plate/membrane filter colony counts were <20, estimated uncertainty values for each sample showed close agreement with published confidence limits established using a Poisson distribution approach.
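A sketch of one common way such uncertainty estimates are derived from replicate platings, in the spirit of ISO/TS 19036: the repeatability standard deviation of log10-transformed colony counts across duplicate analyses. The counts below are placeholders, not the study's data.

```python
import numpy as np

# Hypothetical duplicate colony counts (cfu) from a series of samples.
a = np.array([34, 120, 58, 210, 95], dtype=float)
b = np.array([29, 131, 66, 188, 101], dtype=float)

# Repeatability standard deviation of log10 counts from n duplicate
# pairs: s = sqrt( sum (log10 a_i - log10 b_i)^2 / (2 n) )
d = np.log10(a) - np.log10(b)
s = np.sqrt(np.sum(d**2) / (2 * len(d)))

# Expanded uncertainty at ~95% confidence (coverage factor k = 2), on
# the log10 scale -- comparable in form to the 0.12-0.14 values above.
print(f"s = {s:.3f} log10 units, U = {2 * s:.3f}")
```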
Reticulocyte analysis using flow cytometry.
Corberand, J X
1996-12-01
Automation of the reticulocyte count by means of flow cytometry has considerably improved the quality of this investigation. This article deals firstly with the reasons for the poor performance of the microscopic technique and with the physiological principles underlying identification and classification of reticulocytes using RNA labeling. It then outlines the automated methods currently on the market, which can be classified in three categories: a) "general-purpose" cytofluorometers, which in clinical laboratories usually deal with lymphocyte immunophenotyping; b) the only commercially available cytofluorometer dedicated to the reticulocyte count; this instrument has the advantage of requiring no human intervention, as it merely needs to be fed with samples; c) hematology analyzers with specific modules for automatic counting of reticulocytes previously incubated with a non-fluorescent dye. Of the various fluorescent markers available, thiazole orange, DEQTC iodide and auramine are most often used for this basic hematology test. The quality of the count, the availability of new reticulocyte indices (maturation index, percentage of young reticulocytes) and the rapidity of the count give this test renewed value in the practical approach to the diagnosis of anemia, and also open new perspectives in the surveillance of aplastic anemia after chemotherapy or bone marrow grafting.
Development of an Automatic Echo-counting Program for HROFFT Spectrograms
NASA Astrophysics Data System (ADS)
Noguchi, Kazuya; Yamamoto, Masa-Yuki
2008-06-01
Radio meteor observations of Ham-band beacons or FM radio broadcasts using the automatic "Ham-band Radio meteor Observation Fast Fourier Transform" (HROFFT) software have become widespread in recent years. Previously, counting of meteor echoes on the spectrograms of radio meteor observation was performed manually by observers. In the present paper, we introduce an automatic meteor echo counting software application. Although output images of the HROFFT contain both the features of meteor echoes and those of various types of noise, a newly developed image processing technique has been applied, resulting in software that serves as a useful auto-counting tool. A slight processing error remains on spectrograms when the observation site is affected by many sources of interference. Nevertheless, comparison between software and manual counting revealed an agreement of almost 90%. Therefore, we can easily obtain a dataset of detection time, duration, signal strength, and Doppler shift for each meteor echo from the HROFFT spectrograms. Using this software, statistical analyses of meteor activity can be based on the results obtained at many Ham-band Radio meteor Observation (HRO) sites throughout the world, providing a very useful "standard" for monitoring meteor stream activities in real time.
Data analysis in emission tomography using emission-count posteriors
NASA Astrophysics Data System (ADS)
Sitek, Arkadiusz
2012-11-01
A novel approach to the analysis of emission tomography data using the posterior probability of the number of emissions per voxel (emission count) conditioned on acquired tomographic data is explored. The posterior is derived from the prior and the Poisson likelihood of the emission-count data by marginalizing voxel activities. Based on emission-count posteriors, examples of Bayesian analysis including estimation and classification tasks in emission tomography are provided. The application of the method to computer simulations of 2D tomography is demonstrated. In particular, the minimum-mean-square-error point estimator of the emission count is demonstrated. The process of finding this estimator can be considered as a tomographic image reconstruction technique since the estimates of the number of emissions per voxel divided by voxel sensitivities and acquisition time are the estimates of the voxel activities. As an example of a classification task, a hypothesis stating that some region of interest (ROI) emitted at least or at most r-times the number of events in some other ROI is tested. The ROIs are specified by the user. The analysis described in this work provides new quantitative statistical measures that can be used in decision making in diagnostic imaging using emission tomography.
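Schematically, and in notation assumed here rather than taken from the paper, the emission-count posterior with voxel activities marginalized out can be written as:

```latex
% n = vector of per-voxel emission counts, g = acquired projection data,
% lambda = voxel activities (marginalized out under a prior p(lambda)).
\[
  p(n \mid g) \;\propto\; p(g \mid n)
  \int p(n \mid \lambda)\, p(\lambda)\, d\lambda ,
  \qquad
  p(n \mid \lambda) \;=\; \prod_{v} \frac{(\lambda_v t)^{n_v}
  e^{-\lambda_v t}}{n_v!} ,
\]
% where t is the acquisition time. The minimum-mean-square-error
% estimator is the posterior mean \hat{n} = E[n | g], and dividing
% \hat{n}_v by the voxel sensitivity s_v and acquisition time t
% recovers the activity estimate described in the abstract.
```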
Weirs: Counting and sampling adult salmonids in streams and rivers
Zimmerman, Christian E.; Zabkar, Laura M.; Johnson, David H.; Shrier, Brianna M.; O'Neal, Jennifer S.; Knutzen, John A.; Augerot, Xanthippe; O'Neal, Thomas A.; Pearsons, Todd N.
2007-01-01
Weirs—which function as porous barriers built across streams—have long been used to capture migrating fish in flowing waters. For example, the Netsilik peoples of northern Canada used V-shaped weirs constructed of river rocks gathered onsite to capture migrating Arctic char Salvelinus alpinus (Balikci 1970). Similarly, fences constructed of stakes and a latticework of willow branches or staves were used by Native Americans to capture migrating salmon in streams along the West Coast of North America (Stewart 1994). In modern times, weirs have also been used in terminal fisheries and to capture brood fish for use in fish culture. Weirs have been used to gather data on age structure, condition, sex ratio, spawning escapement, abundance, and migratory patterns of fish in streams. One of the critical elements of fisheries management and stock assessment of salmonids is a count of adult fish returning to spawn. Weirs are frequently used to capture or count fish to determine status and trends of populations or to direct inseason management of fisheries; generally, weirs are the standard against which other techniques are measured. To evaluate fishery management actions, the number of fish escaping to spawn is often compared to river-specific target spawning requirements (O’Connell and Dempson 1995). A critical factor in these analyses is the determination of total run size (O’Connell 2003). O’Connell compared methods of run-size estimation against absolute counts from a rigid weir and concluded that, given the uncertainty of estimators, the absolute counts obtained at the weir were significantly better than modeled estimates, which deviated as much as 50–60% from actual counts. The use of weirs is generally restricted to streams and small rivers because of construction expense, formation of navigation barriers, and the tendency of weirs to clog with debris, which can cause flooding and collapse of the structure (Hubert 1996). When feasible, however, weirs are generally regarded as the most accurate technique available to quantify escapement as the result is supposedly an absolute count (Cousens et al. 1982). Weirs also provide the opportunity to capture fish for observation and sampling of biological characteristics and tissues; they may also serve as recapture sites for basin-wide, mark–recapture population estimates. Temporary weirs are useful in monitoring wild populations of salmonids as well as for capturing broodstock for artificial propagation.
A technique for sampling low shrub vegetation, by crown volume classes
Jay R. Bentley; Donald W. Seegrist; David A. Blakeman
1970-01-01
The effects of herbicides or other cultural treatments of low shrubs can be sampled by a new technique using crown volume as the key variable. Low shrubs were grouped in 12 crown volume classes with index values based on height times surface area of crown. The number of plants, by species, in each class is counted on quadrats. Many quadrats are needed for highly...
ERIC Educational Resources Information Center
Hsu, Guo-Liang; Tang, Jung-Chang; Hwang, Wu-Yuin; Li, Yung-Chang; Hung, Jung-Chao; Wei, Chun-Hwa
2016-01-01
The demands of money-counting skills potentially limit individuals with intellectual disability (ID) to master the one-more-than technique, particularly in Taiwan, which requires high daily minimum living expense for supporting an individual's daily life. Employing a multiple treatment design across price ranges and settings, this study compared…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shcheslavskiy, V. I.; Institute of Biomedical Technologies, Nizhny Novgorod State Medical Academy, Minin and Pozharsky Square, 10/1, Nizhny Novgorod 603005; Neubauer, A.
We present a lifetime imaging technique that simultaneously records fluorescence and phosphorescence lifetime images in confocal laser scanning systems. It is based on modulating a high-frequency pulsed laser synchronously with the pixel clock of the scanner, and recording the fluorescence and phosphorescence signals with a multidimensional time-correlated single-photon counting board. We demonstrate the technique by recording fluorescence/phosphorescence lifetime images of human embryonic kidney cells under different environmental conditions.
Preparation and validation of gross alpha/beta samples used in EML's quality assessment program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Scarpitta, S.C.
1997-10-01
A set of water and filter samples has been incorporated into the existing Environmental Measurements Laboratory's (EML) Quality Assessment Program (QAP) for gross alpha/beta determinations by participating DOE laboratories. The participating laboratories are evaluated by comparing their results with the EML value. The preferred EML method for measuring water and filter samples, described in this report, uses gas flow proportional counters with 2 in. detectors. Procedures for sample preparation, quality control and instrument calibration are presented. Liquid scintillation (LS) counting is an alternative technique that is suitable for quantifying both the alpha (²⁴¹Am, ²³⁰Th and ²³⁸Pu) and beta (⁹⁰Sr/⁹⁰Y) activity concentrations in the solutions used to prepare the QAP water and air filter samples. Three LS counting techniques (Cerenkov, dual dpm and full spectrum analysis) are compared. These techniques may be used to validate the activity concentrations of each component in the alpha/beta solution before the QAP samples are actually prepared.
NASA Astrophysics Data System (ADS)
Clarke, James; Cheng, Kwan; Shindell, Orrin; Wang, Exing
We have designed and constructed a high-throughput electrofusion chamber and an incubator to fabricate Giant Unilamellar Vesicles (GUVs) consisting of high-melting lipids, low-melting lipids, cholesterol, and both ordered- and disordered-phase-sensitive fluorescent probes (DiIC12, dehydroergosterol and BODIPY-cholesterol). GUVs were formed in a three-stage pulse-sequence electrofusion process with voltages ranging from 50 mVpp to 2.2 Vpp and frequencies from 5 Hz to 10 Hz. Steady-state and time-correlated single-photon counting (TCSPC) fluorescence lifetime (FLIM) based confocal and/or multi-photon microscopic techniques were used to characterize phase-separated lipid domains in GUVs. Confocal imaging measures the probe concentration and the chemical environment of the system. TCSPC techniques determine the chemical environment through the perturbation of the fluorescence lifetimes of the probes in the system. These techniques will be applied to investigate protein-lipid interactions involving domain formation. Specifically, the mechanisms governing lipid domain formation in the above systems, which mimic the lipid rafts in cells, will be explored. Supported by a Murchison Fellowship at Trinity University.
High order statistical signatures from source-driven measurements of subcritical fissile systems
NASA Astrophysics Data System (ADS)
Mattingly, John Kelly
1998-11-01
This research focuses on the development and application of high order statistical analyses applied to measurements performed with subcritical fissile systems driven by an introduced neutron source. The signatures presented are derived from counting statistics of the introduced source and radiation detectors that observe the response of the fissile system. It is demonstrated that successively higher order counting statistics possess progressively higher sensitivity to reactivity. Consequently, these signatures are more sensitive to changes in the composition, fissile mass, and configuration of the fissile assembly. Furthermore, it is shown that these techniques are capable of distinguishing the response of the fissile system to the introduced source from its response to any internal or inherent sources. This ability combined with the enhanced sensitivity of higher order signatures indicates that these techniques will be of significant utility in a variety of applications. Potential applications include enhanced radiation signature identification of weapons components for nuclear disarmament and safeguards applications and augmented nondestructive analysis of spent nuclear fuel. In general, these techniques expand present capabilities in the analysis of subcritical measurements.
Van Wyk, J A; Van Wyk, Laetitia
2002-12-01
Faecal pellets from a sheep that was artificially infected with a monoculture of Haemonchus contortus were collected over a 2-h period in the morning. In the laboratory the faeces were thoroughly mixed by hand and 48 one-gram aliquots of the pellets were sealed in plastic bags, from which the air had gently been expressed. The faecal worm egg count of the sheep was about 14,000 eggs per gram. Varying numbers of the bags were either processed for faecal worm egg counting (FEC) by the McMaster technique on day 0, or were stored at one of the following temperatures: about 4 degrees C, -10 degrees C or -170 degrees C before processing. The faecal aliquots that were frozen were thawed at room temperature after having been frozen for either 2 h or 7 days, and processing of aliquots maintained at 4 degrees C proceeded shortly after the samples had been removed from the refrigerator. A dramatic reduction in egg numbers was found in all the aliquots that were frozen at -170 degrees C before faecal worm egg counts were done, as well as in those frozen for 7 days at about -10 degrees C. Numerous empty, or partially empty, egg shells were observed when performing the counts in faeces that had been frozen. In contrast, there was no significant reduction in the numbers of eggs in aliquots maintained for 7 days in a refrigerator at ±4 degrees C before examination, when compared with others examined shortly after collection of the faeces. Since H. contortus eggs in faeces are damaged by freezing, some methods that can be used for short term preservation are outlined. It is concluded that all nematode egg counts from cryopreserved faeces (whether in a freezer at -10 degrees C or in liquid nitrogen) should possibly be regarded as being inaccurate, unless the contrary can be demonstrated for different worm genera. However, exceptions are expected for the more rugged ova, such as those of the ascarids and Trichuris spp.
Semi-automatic assessment of skin capillary density: proof of principle and validation.
Gronenschild, E H B M; Muris, D M J; Schram, M T; Karaca, U; Stehouwer, C D A; Houben, A J H M
2013-11-01
Skin capillary density and recruitment have been proven to be relevant measures of microvascular function. Unfortunately, the assessment of skin capillary density from movie files is very time-consuming, since this is done manually. This impedes the use of this technique in large-scale studies. We aimed to develop a (semi-) automated assessment of skin capillary density. CapiAna (Capillary Analysis) is a newly developed semi-automatic image analysis application. The technique involves four steps: 1) movement correction, 2) selection of the frame range and positioning of the region of interest (ROI), 3) automatic detection of capillaries, and 4) manual correction of detected capillaries. To gain insight into the performance of the technique, skin capillary density was measured in twenty participants (ten women; mean age 56.2 [42-72] years). To investigate the agreement between CapiAna and the classic manual counting procedure, we used weighted Deming regression and Bland-Altman analyses. In addition, intra- and inter-observer coefficients of variation (CVs), and differences in analysis time were assessed. We found a good agreement between CapiAna and the classic manual method, with a Pearson's correlation coefficient (r) of 0.95 (P<0.001) and a Deming regression coefficient of 1.01 (95%CI: 0.91; 1.10). In addition, we found no significant differences between the two methods, with an intercept of the Deming regression of 1.75 (-6.04; 9.54), while the Bland-Altman analysis showed a mean difference (bias) of 2.0 (-13.5; 18.4) capillaries/mm². The intra- and inter-observer CVs of CapiAna were 2.5% and 5.6% respectively, while for the classic manual counting procedure these were 3.2% and 7.2%, respectively. Finally, the analysis time for CapiAna ranged between 25 and 35 min versus 80 and 95 min for the manual counting procedure. We have developed a semi-automatic image analysis application (CapiAna) for the assessment of skin capillary density, which agrees well with the classic manual counting procedure, is time-saving, and has a better reproducibility as compared to the classic manual counting procedure. As a result, the use of skin capillaroscopy is feasible in large-scale studies, which importantly extends the possibilities to perform microcirculation research in humans. © 2013.
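The agreement statistics quoted above can be reproduced on any paired dataset in a few lines; the sketch below computes the Bland-Altman bias and limits of agreement for hypothetical paired density measurements (semi-automatic versus manual counts), not the study's data.

```python
import numpy as np

# Hypothetical paired capillary densities (capillaries/mm^2) for the
# same subjects: semi-automatic (CapiAna-style) and manual counts.
semi_auto = np.array([48.0, 62.5, 55.1, 71.3, 66.8, 59.4])
manual = np.array([46.2, 64.0, 52.9, 70.1, 68.3, 57.5])

# Bland-Altman: bias = mean difference; limits of agreement are
# bias +/- 1.96 times the SD of the differences.
diff = semi_auto - manual
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)
print(f"bias = {bias:.1f}, limits of agreement = "
      f"({bias - loa:.1f}, {bias + loa:.1f}) capillaries/mm^2")
```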
Imputing missing data via sparse reconstruction techniques.
DOT National Transportation Integrated Search
2017-06-01
The State of Texas does not currently have an automated approach for estimating volumes for links without counts. This research project proposes the development of an automated system to efficiently estimate the traffic volumes on uncounted links, in...
1999-12-07
Scientists using NASA's Hubble Space Telescope are studying the colors of star clusters to determine the age and history of starburst galaxies, a technique somewhat similar to learning the age of a tree by counting its rings.
Accelerating the two-point and three-point galaxy correlation functions using Fourier transforms
NASA Astrophysics Data System (ADS)
Slepian, Zachary; Eisenstein, Daniel J.
2016-01-01
Though Fourier transforms (FTs) are a common technique for finding correlation functions, they are not typically used in computations of the anisotropy of the two-point correlation function (2PCF) about the line of sight in wide-angle surveys because the line-of-sight direction is not constant on the Cartesian grid. Here we show how FTs can be used to compute the multipole moments of the anisotropic 2PCF. We also show how FTs can be used to accelerate the 3PCF algorithm of Slepian & Eisenstein. In both cases, these FT methods allow one to avoid the computational cost of pair counting, which scales as the square of the number density of objects in the survey. With the upcoming large data sets of the Dark Energy Spectroscopic Instrument, Euclid, and the Large Synoptic Survey Telescope, FT techniques will therefore offer an important complement to simple pair or triplet counts.
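A minimal numpy sketch of the core idea: once objects are painted onto a Cartesian grid, the isotropic 2PCF follows from two FFTs instead of an O(N²) pair count. The grid size, catalog, and normalization are all simplified assumptions; a uniform random catalog is used, so the result should be near zero apart from shot noise, which demonstrates the machinery rather than real clustering.

```python
import numpy as np

rng = np.random.default_rng(1)
ngrid, nobj = 64, 5000

# Paint a hypothetical 3D catalog onto a density grid
# (nearest-grid-point assignment).
pos = rng.uniform(0, ngrid, size=(nobj, 3)).astype(int) % ngrid
delta = np.zeros((ngrid, ngrid, ngrid))
np.add.at(delta, (pos[:, 0], pos[:, 1], pos[:, 2]), 1.0)
delta = delta / delta.mean() - 1.0  # density contrast

# Correlation function via the Wiener-Khinchin theorem:
# xi(r) = IFFT(|FFT(delta)|^2), avoiding explicit pair counting.
power = np.abs(np.fft.fftn(delta)) ** 2
xi = np.fft.ifftn(power).real / delta.size

# Bin xi by (periodic) radial separation on the grid.
coords = np.indices(delta.shape)
r = np.sqrt(sum(np.minimum(c, ngrid - c) ** 2 for c in coords))
for lo in (1, 2, 4, 8):
    shell = (r >= lo) & (r < lo + 1)
    print(f"r ~ {lo}: xi = {xi[shell].mean():+.4f}")
```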
NASA Astrophysics Data System (ADS)
Huismann, Immo; Stiller, Jörg; Fröhlich, Jochen
2017-10-01
The paper proposes a novel factorization technique for static condensation of a spectral-element discretization matrix that yields a linear operation count of just 13N multiplications for the residual evaluation, where N is the total number of unknowns. In comparison to previous work it saves a factor of more than 3 and outpaces unfactored variants for all polynomial degrees. Using the new technique as a building block for a preconditioned conjugate gradient method yields linear scaling of the runtime with N, which is demonstrated for polynomial degrees from 2 to 32. This makes the spectral-element method cost effective even for low polynomial degrees. Moreover, the dependence of the iterative solution on the element aspect ratio is addressed, showing only a slight increase in the number of iterations for aspect ratios up to 128. Hence, the solver is very robust for practical applications.
High channel count and high precision channel spacing multi-wavelength laser array for future PICs.
Shi, Yuechun; Li, Simin; Chen, Xiangfei; Li, Lianyan; Li, Jingsi; Zhang, Tingting; Zheng, Jilin; Zhang, Yunshan; Tang, Song; Hou, Lianping; Marsh, John H; Qiu, Bocang
2014-12-09
Multi-wavelength semiconductor laser arrays (MLAs) have wide applications in wavelength division multiplexing (WDM) networks. In spite of their tremendous potential, adoption of the MLA has been hampered by a number of issues, particularly wavelength precision and fabrication cost. In this paper, we report high channel count MLAs in which the wavelength of each channel can be determined precisely through low-cost standard μm-level photolithography/holographic lithography and the reconstruction-equivalent-chirp (REC) technique. 60-wavelength MLAs with good wavelength spacing uniformity have been demonstrated experimentally, in which nearly 83% of the lasers are within a wavelength deviation of ±0.20 nm, corresponding to a tolerance of ±0.032 nm in the period pitch. As a result of employing the equivalent phase shift technique, the single longitudinal mode (SLM) yield is nearly 100%, while the theoretical yield of standard DFB lasers is only around 33.3%.
Machine Learning Based Single-Frame Super-Resolution Processing for Lensless Blood Cell Counting
Huang, Xiwei; Jiang, Yu; Liu, Xu; Xu, Hang; Han, Zhi; Rong, Hailong; Yang, Haiping; Yan, Mei; Yu, Hao
2016-01-01
A lensless blood cell counting system integrating a microfluidic channel and a complementary metal oxide semiconductor (CMOS) image sensor is a promising technique to miniaturize the conventional optical lens based imaging system for point-of-care testing (POCT). However, such a system has limited resolution, making it imperative to improve resolution at the system level using super-resolution (SR) processing. Yet, how to improve resolution towards better cell detection and recognition with a low cost of processing resources and without degrading system throughput is still a challenge. In this article, two machine learning based single-frame SR processing approaches are proposed and compared for lensless blood cell counting, namely the Extreme Learning Machine based SR (ELMSR) and the Convolutional Neural Network based SR (CNNSR). Moreover, lensless blood cell counting prototypes using commercial CMOS image sensors and custom designed backside-illuminated CMOS image sensors are demonstrated with ELMSR and CNNSR. When one captured low-resolution lensless cell image is input, an improved high-resolution cell image will be output. The experimental results show that the cell resolution is improved by 4×, and CNNSR has a 9.5% improvement over the ELMSR on resolution enhancing performance. The cell counting results also match well with a commercial flow cytometer. Such ELMSR and CNNSR therefore have the potential for efficient resolution improvement in lensless blood cell counting systems towards POCT applications. PMID:27827837
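As an illustration of the single-frame CNN-based SR idea, not the authors' exact network: a tiny SRCNN-style model in PyTorch that maps one upsampled low-resolution lensless frame to a sharper output. Layer sizes and the 4× scale factor are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinySRCNN(nn.Module):
    """SRCNN-style single-frame super-resolution: feature extraction,
    non-linear mapping, reconstruction."""
    def __init__(self):
        super().__init__()
        self.extract = nn.Conv2d(1, 32, kernel_size=9, padding=4)
        self.map = nn.Conv2d(32, 16, kernel_size=1)
        self.reconstruct = nn.Conv2d(16, 1, kernel_size=5, padding=2)

    def forward(self, x):
        # Classic SRCNN upsamples first, then refines with convolutions.
        x = F.interpolate(x, scale_factor=4, mode="bicubic")
        x = F.relu(self.extract(x))
        x = F.relu(self.map(x))
        return self.reconstruct(x)

# One hypothetical 64x64 low-resolution lensless frame -> 256x256 output.
lr_frame = torch.rand(1, 1, 64, 64)
sr_frame = TinySRCNN()(lr_frame)
print(sr_frame.shape)  # torch.Size([1, 1, 256, 256])
```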
Monitoring Microbial Numbers in Food by Density Centrifugation
Basel, Richard M.; Richter, Edward R.; Banwart, George J.
1983-01-01
Some foods contain low numbers of microbes that may be difficult to enumerate by the plate count method due to small food particles that interfere with the counting of colonies. Ludox colloidal silica was coated with reducing agents to produce a nontoxic density medium. Food homogenates were applied to a layered 10 and 80% mixture of modified Ludox and centrifuged at low speed. The top and bottom of the tube contained the food material, and the Ludox-containing portion was evaluated by conventional pour plate techniques. Plate counts of the Ludox mixture agreed with plate counts of the food homogenate alone. The absence of small food particles from pour plates resulted in plates that were more easily read than pour plates of the homogenate alone. Modified Ludox was evaluated for its effect on bacteria at 4°C during a 24-h incubation period. No inhibition was observed. This method is applicable to food products, such as doughnuts, spices, tomato products, and meat, in which small food particles often interfere with routine plate counts or low dilution may inhibit colony formation. Inhibitory substances can be removed from spices, resulting in higher counts. Ludox is more economical than similar products, such as Percoll. Modified Ludox is easily rendered nontoxic by the addition of common laboratory reagents. In addition, the mixture is compatible with microbiological media. PMID:6303217
NASA Astrophysics Data System (ADS)
Shearer, P.; Jawed, M. K.; Raines, J. M.; Lepri, S. T.; Gilbert, J. A.; von Steiger, R.; Zurbuchen, T.
2013-12-01
The SWICS instruments aboard ACE and Ulysses have performed in situ measurements of individual solar wind ions for a period spanning over two decades. Solar wind composition is determined by accumulating the measurements into an ion count histogram in which each species appears as a distinct peak. Assigning counts to the appropriate species is a challenging statistical problem because of the limited counts for some species and overlap between some peaks. We show that the most commonly used count assignment methods can suffer from significant bias when a highly abundant species overlaps with a much less abundant one. For ACE/SWICS data, this bias results in an overestimated Ne/O ratio. Bias is greatly reduced by switching to a rigorous maximum likelihood count assignment method, resulting in a 30-50% reduction in the estimated Ne abundance. We will discuss the new Ne/O values and put them in context with the solar system abundances for Ne derived from other techniques, such as in situ collection from Genesis and its heritage instrument, the Solar Foil experiment during the Apollo era. The new count assignment method is currently being applied to reanalyze the archived ACE and Ulysses data and obtain revised abundances of C, N, O, Ne, Mg, Si, S, and Fe, leading to revised datasets that will be made publicly available.
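A schematic of what maximum-likelihood count assignment means here, with made-up numbers: model the histogram as a sum of two overlapping peaks of known shape (an abundant species next to a rare one, like O next to Ne) with Poisson-distributed bin counts, and maximize the likelihood over the peak amplitudes. The Gaussian peak templates and all values are assumptions, not the SWICS response functions.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

# Hypothetical 1D ion histogram: abundant peak at 50, rare peak at 55.
x = np.arange(30, 80)
shape_a = norm.pdf(x, loc=50, scale=3)  # known per-species response
shape_b = norm.pdf(x, loc=55, scale=3)  # templates (assumed Gaussian)
rng = np.random.default_rng(7)
data = rng.poisson(20000 * shape_a + 800 * shape_b)

def neg_log_like(theta):
    # Poisson log-likelihood of the bin counts given peak amplitudes;
    # the log(data!) term is constant and omitted.
    mu = np.clip(theta[0] * shape_a + theta[1] * shape_b, 1e-12, None)
    return np.sum(mu - data * np.log(mu))

fit = minimize(neg_log_like, x0=[10000.0, 10000.0], method="Nelder-Mead")
print(fit.x)  # amplitudes ~ [20000, 800]: counts assigned per species
```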
High linearity SPAD and TDC array for TCSPC and 3D ranging applications
NASA Astrophysics Data System (ADS)
Villa, Federica; Lussana, Rudi; Bronzi, Danilo; Dalla Mora, Alberto; Contini, Davide; Tisa, Simone; Tosi, Alberto; Zappa, Franco
2015-01-01
An array of 32 × 32 Single-Photon Avalanche Diodes (SPADs) and Time-to-Digital Converters (TDCs) has been fabricated in a 0.35 μm automotive-certified CMOS technology. The overall dimension of the chip is 9 × 9 mm². Each pixel is able to detect photons in the 300-900 nm wavelength range with a fill factor of 3.14% and either to count them or to time-stamp their arrival time. In photon-counting mode an in-pixel 6-bit counter provides photon-number-resolved intensity movies at 100 kfps, whereas in photon-timing mode the 10-bit in-pixel TDC provides time-resolved maps (Time-Correlated Single-Photon Counting measurements) or 3D depth-resolved (through the direct time-of-flight technique) images and movies, with 312 ps resolution. The photodetector is a 30 μm diameter SPAD with low Dark Count Rate (120 cps at room temperature, 3% hot pixels) and 55% peak Photon Detection Efficiency (PDE) at 450 nm. The TDC has a 6-bit counter and a 4-bit fine interpolator, based on a Delay-Locked Loop (DLL) line, which makes the TDC insensitive to process, voltage, and temperature drifts. The implemented sliding-scale technique improves linearity, giving 2% LSB DNL and 10% LSB INL. The single-shot precision is 260 ps rms, comprising SPAD, TDC and driving board jitter. Both optical and electrical crosstalk among SPADs and TDCs are negligible. 2D fast movies and 3D reconstructions with centimeter resolution are reported.
Muramatsu, Keita; Matsuo, Koichiro; Kawai, Yusuke; Yamamoto, Tsukasa; Hara, Yoshitaka; Shimomura, Yasuyo; Yamashita, Chizuru; Nishida, Osamu
2018-06-26
Endotracheal intubation of critically ill patients increases the risk of aspiration pneumonia, which can be reduced by regular oral care. However, the rinsing of the residual oral contaminants after mechanical cleaning carries the risk of aspirating the residue during the intubation period. Removing the contaminants by wiping with mouth wipes could be an alternative to rinsing with water because of no additional fluid. This study tested: (i) the amount of oral bacteria during endotracheal intubation and after extubation; and (ii) the changes in the bacterial count during oral care procedures. Thirty-five mechanically ventilated patients in the intensive care unit were enrolled. The amount of bacteria on the dorsal tongue surface was counted before and following oral care and then after the elimination of contaminants either by rinsing with water and suctioning or by wiping with mouth wipes. The oral bacterial amount was compared statistically between the intubation and extubation status and among set time points during the oral care procedure. The oral bacterial count was significantly decreased after extubation. During the oral care procedure, the oral bacterial amount was significantly lower after eliminating the contaminants either by rinsing or wiping, with no remarkable difference between the elimination techniques. The findings suggest that the oral bacterial amount is elevated during endotracheal intubation, which could increase the risk of aspiration pneumonia. The significant reduction in the bacterial count by wiping indicates that it might be a suitable alternative to rinsing for mechanically ventilated patients. © 2018 Japan Academy of Nursing Science.
Establishment of HPC(R2A) for regrowth control in non-chlorinated distribution systems.
Uhl, Wolfgang; Schaule, Gabriela
2004-05-01
Drinking water distributed without disinfection and without regrowth problems for many years may show bacterial regrowth when the residence time and/or temperature in the distribution system increases or when substrate and/or bacterial concentration in the treated water increases. An example of a regrowth event in a major German city is discussed. Regrowth of HPC bacteria occurred unexpectedly at the end of a very hot summer. No pathogenic or potentially pathogenic bacteria were identified. Increased residence times in the distribution system and temperatures up to 25 degrees C were identified as most probable causes and the regrowth event was successfully overcome by changing flow regimes and decreasing residence times. Standard plate counts of HPC bacteria using the spread plate technique on nutrient rich agar according to German Drinking Water Regulations (GDWR) had proven to be a very good indicator of hygienically safe drinking water and to demonstrate the effectiveness of water treatment. However, the method proved insensitive for early regrowth detection. Regrowth experiments in the lab and sampling of the distribution system during two summers showed that spread plate counts on nutrient-poor R2A agar after 7-day incubation yielded 100 to 200 times higher counts. Counts on R2A after 3-day incubation were three times less than after 7 days. As the precision of plate count methods is very poor for counts less than 10 cfu/plate, a method yielding higher counts is better suited to detect upcoming regrowth than a method yielding low counts. It is shown that for the identification of regrowth events HPC(R2A) gives a further margin of about 2 weeks for reaction before HPC(GDWR). Copyright 2003 Elsevier B.V.
Santra, Kalyan; Smith, Emily A.; Petrich, Jacob W.; ...
2016-12-12
It is often convenient to know the minimum amount of data needed in order to obtain a result of desired accuracy and precision. It is a necessity in the case of subdiffraction-limited microscopies, such as stimulated emission depletion (STED) microscopy, owing to the limited sample volumes and the extreme sensitivity of the samples to photobleaching and photodamage. We present a detailed comparison of probability-based techniques (the maximum likelihood method and methods based on the binomial and the Poisson distributions) with residual minimization-based techniques for retrieving the fluorescence decay parameters for various two-fluorophore mixtures, as a function of the total number of photon counts, in time-correlated, single-photon counting experiments. The probability-based techniques proved to be the most robust (insensitive to initial values) in retrieving the target parameters and, in fact, performed equivalently to 2-3 significant figures. This is to be expected, as we demonstrate that the three methods are fundamentally related. Furthermore, methods based on the Poisson and binomial distributions have the desirable feature of providing a bin-by-bin analysis of a single fluorescence decay trace, which thus permits statistics to be acquired using only the one trace for not only the mean and median values of the fluorescence decay parameters but also for the associated standard deviations. Lastly, these probability-based methods lend themselves well to the analysis of the sparse data sets that are encountered in subdiffraction-limited microscopies.
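A condensed sketch of the maximum-likelihood approach compared above: fit an exponential decay to TCSPC histogram counts by maximizing the Poisson likelihood bin by bin. The paper treats two-fluorophore mixtures; a single component keeps the example short, and all numbers are invented.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)

# Hypothetical TCSPC decay: 256 bins of 50 ps, lifetime tau = 2.5 ns.
t = np.arange(256) * 0.05  # ns
true_amp, true_tau = 300.0, 2.5
data = rng.poisson(true_amp * np.exp(-t / true_tau))

def neg_log_like(theta):
    # abs() keeps the search in the physical region without bounds.
    amp, tau = abs(theta[0]), abs(theta[1])
    mu = np.clip(amp * np.exp(-t / tau), 1e-12, None)
    # Poisson likelihood per bin; log(data!) is constant in theta.
    return np.sum(mu - data * np.log(mu))

fit = minimize(neg_log_like, x0=[100.0, 1.0], method="Nelder-Mead")
print(np.abs(fit.x))  # ~ [300, 2.5] even for modest photon counts
```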
Gervais, David; Corn, Tim; Downer, Andrew; Smith, Stuart; Jennings, Alan
2014-07-01
In order to generate further characterisation data for the lyophilised product Erwinia chrysanthemi L-asparaginase, reconstituted drug product (DP; marketed as Erwinase or Erwinaze) was analysed for subvisible (2-10 μm) particulate content using both the light obscuration (LO) method and the newer flow-imaging microscopy (FIM) technique. No correlation of subvisible particulate counts exists between FIM and LO nor do the counts correlate with activity at both release and on stability. The subvisible particulate content of lyophilised Erwinia L-asparaginase appears to be consistent and stable over time and in line with other parenteral biopharmaceutical products. The majority (ca. 75%) of subvisible particulates in L-asparaginase DP were at the low end of the measurement range by FIM (2-4 μm). In this size range, FIM was unable to definitively classify the particulates as either protein or non-protein. More sensitive measurement techniques would be needed to classify the particulates in lyophilised L-asparaginase into type (protein and non-protein), so the LO technique has been chosen for on-going DP analyses. E. chrysanthemi L-asparaginase has a lower rate of hypersensitivity compared with native Escherichia coli preparations, but a subset of patients develop hypersensitivity to the Erwinia enzyme. A DP lot that had subvisible particulate counts on the upper end of the measurement range by both LO and FIM had the same incidence of allergic hypersensitivity in clinical experience as lots at all levels of observed subvisible particulate content, suggesting that the presence of L-asparaginase subvisible particulates is not important with respect to allergic response.
Analytical method for measuring cosmogenic 35S in natural waters
Uriostegui, Stephanie H.; Bibby, Richard K.; Esser, Bradley K.; ...
2015-05-18
Here, cosmogenic sulfur-35 in water as dissolved sulfate (³⁵SO₄) has successfully been used as an intrinsic hydrologic tracer in low-SO₄, high-elevation basins. Its application in environmental waters containing high SO₄ concentrations has been limited because only small amounts of SO₄ can be analyzed using current liquid scintillation counting (LSC) techniques. We present a new analytical method for analyzing large amounts of BaSO₄ for ³⁵S. We quantify efficiency gains when suspending BaSO₄ precipitate in Insta-Gel Plus cocktail, purify BaSO₄ precipitate to remove dissolved organic matter, mitigate interference of radium-226 and its daughter products by selection of high-purity barium chloride, and optimize LSC counting parameters for ³⁵S determination in larger masses of BaSO₄. Using this improved procedure, we achieved counting efficiencies that are comparable to published LSC techniques despite a 10-fold increase in the SO₄ sample load. ³⁵SO₄ was successfully measured in high-SO₄ surface waters and groundwaters containing low ratios of ³⁵S activity to SO₄ mass, demonstrating that this new analytical method expands the analytical range of ³⁵SO₄ and broadens the utility of ³⁵SO₄ as an intrinsic tracer in hydrologic settings.
A technology review of time-of-flight photon counting for advanced remote sensing
NASA Astrophysics Data System (ADS)
Lamb, Robert A.
2010-04-01
Time-correlated single photon counting (TCSPC) has made tremendous progress during the past ten years, enabling improved performance in precision time-of-flight (TOF) rangefinding and lidar. In this review the development and performance of several ranging systems that use TCSPC for accurate ranging and range profiling over distances up to 17 km is presented. A range resolution of a few millimetres is routinely achieved over distances of several kilometres. These systems include single-wavelength devices operating in the visible; multi-wavelength systems covering the visible and near infra-red; the use of electronic gating to reduce in-band solar background; and, most recently, operation at high repetition rates without range aliasing, typically 10 MHz over several kilometres. These systems operate at very low optical power (<100 μW). The technique therefore has potential for eye-safe lidar monitoring of the environment and obvious military, security and surveillance sensing applications. The review highlights the theoretical principles of photon counting and progress made in developing absolute ranging techniques that enable high repetition rate data acquisition while avoiding range aliasing. Technology trends in TCSPC rangefinding are merging with those of quantum cryptography, and future application to revolutionary quantum imaging offers diverse and exciting research into secure covert sensing, ultra-low-power active imaging and quantum rangefinding.
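The range-aliasing constraint mentioned above follows from a one-line relation: with a pulsed source at repetition rate f, echoes are unambiguous only within c/(2f), since the round trip must fit between pulses. A quick numeric check (assumed rates) shows why kilometre-scale ranging at 10 MHz requires the aliasing-resolving techniques reviewed here:

```python
C = 299_792_458.0  # speed of light, m/s

for f_rep in (10e6, 1e6, 100e3):  # pulse repetition rates, Hz
    # Maximum unambiguous range: round-trip time <= pulse period.
    r_max = C / (2.0 * f_rep)
    print(f"{f_rep/1e6:6.2f} MHz -> unambiguous range {r_max:8.1f} m")
# 10 MHz gives only ~15 m; ranging over kilometres at that rate
# therefore aliases unless an absolute-ranging scheme is used.
```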
Radioactive equilibrium in ancient marine sediments
Breger, I.A.
1955-01-01
Radioactive equilibrium in eight marine sedimentary formations has been studied by means of direct determinations of uranium, radium and thorium. Alpha-particle counting has also been carried out in order to cross-calibrate thick-source counting techniques. The maximum deviation from radioactive equilibrium that has been noted is 11 per cent, indicating that there is probably equilibrium in all the formations analyzed. Thick-source alpha-particle counting by means of a proportional counter or an ionization chamber leads to high results when the samples contain less than about 10 p.p.m. of uranium. For samples having a higher content of uranium the results are in excellent agreement with each other and with those obtained by direct analytical techniques. The thorium contents that have been obtained correspond well to the average values reported in the literature. The uranium content of marine sediments may be appreciably higher than the average values that have been reported for sedimentary rocks. Data show that there is up to fourteen times the percentage of uranium as of thorium in the formations studied and that the percentage of thorium never exceeds that of uranium. While the proximity of a depositional environment to a land mass may influence the concentration of uranium in a marine sediment, this is not true of thorium. © 1955.
NASA Astrophysics Data System (ADS)
Jacq, Thomas S.; Lardizabal, Carlos F.
2017-11-01
In this work we consider open quantum random walks on the non-negative integers. By considering orthogonal matrix polynomials we are able to describe transition probability expressions for classes of walks via a matrix version of the Karlin-McGregor formula. We focus on absorbing boundary conditions and, for simpler classes of examples, we consider path counting and the corresponding combinatorial tools. A non-commutative version of the gambler's ruin is studied by obtaining the probability of reaching a certain fortune and the mean time to reach a fortune or ruin in terms of generating functions. In the case of the Hadamard coin, a counting technique for boundary restricted paths in a lattice is also presented. We discuss an open quantum version of Foster's Theorem for the expected return time together with applications.
NASA Astrophysics Data System (ADS)
Carpentieri, C.; Schwarz, C.; Ludwig, J.; Ashfaq, A.; Fiederle, M.
2002-07-01
High precision in the dose calibration of X-ray sources is required when counting and integrating methods are compared. The dose calibration for a dental X-ray tube was performed with dedicated dose calibration equipment (a dosimeter) as a function of exposure time and rate. Results were compared with a benchmark spectrum and agree within ±1.5%. Dead time investigations with the Medipix1 photon-counting chip (PCC) have been performed by rate variations. Two different types of dead time, paralysable and non-paralysable, are discussed. The dead time depends on the settings of the front-end electronics and is a function of signal height, which can lead to systematic errors. Dead time losses in excess of 30% have been found for the PCC at 200 kHz of absorbed photons per pixel.
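The two dead-time models named above have standard closed forms: with dead time τ and true rate n, a non-paralysable detector records m = n/(1 + nτ), while a paralysable one records m = n·exp(−nτ). A short numeric sketch with an assumed τ (not the Medipix1 value):

```python
import numpy as np

tau = 1e-6  # assumed dead time per event, seconds (1 us)

for n in (1e4, 1e5, 5e5):  # true photon rates, counts/s
    m_nonpar = n / (1.0 + n * tau)  # non-paralysable model
    m_par = n * np.exp(-n * tau)    # paralysable model
    loss = 100.0 * (1.0 - m_par / n)
    print(f"true {n:9.0f} cps: non-par {m_nonpar:9.0f}, "
          f"par {m_par:9.0f} ({loss:4.1f}% lost)")
# At the highest rate the paralysable loss approaches 40%, illustrating
# how losses above 30% arise well below the nominal count rate limit.
```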
De Backer, A; Martinez, G T; MacArthur, K E; Jones, L; Béché, A; Nellist, P D; Van Aert, S
2015-04-01
Quantitative annular dark field scanning transmission electron microscopy (ADF STEM) has become a powerful technique to characterise nano-particles on an atomic scale. Because of their limited size and beam sensitivity, the atomic structure of such particles may become extremely challenging to determine. Therefore keeping the incoming electron dose to a minimum is important. However, this may reduce the reliability of quantitative ADF STEM which will here be demonstrated for nano-particle atom-counting. Based on experimental ADF STEM images of a real industrial catalyst, we discuss the limits for counting the number of atoms in a projected atomic column with single atom sensitivity. We diagnose these limits by combining a thorough statistical method and detailed image simulations. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Yun, Hoyoung; Bang, Hyunwoo; Lee, Won Gu; Lim, Hyunchang; Park, Junha; Lee, Joonmo; Riaz, Asif; Cho, Keunchang; Chung, Chanil; Han, Dong-Chul; Chang, Jun Keun
2007-12-01
Although CD4+ T-cells are an important target in HIV detection, major problems remain in diagnosis and monitoring in the developing world and in regions with few medical facilities. Low-cost portable diagnostic devices are therefore needed for the enumeration of CD4+ T-cells. In general, a CD4 count below 200 cells/μL makes it necessary to initiate antiretroviral treatment in adults (over 13 years old). However, lymphocyte subsets (including CD4 counts) of infants and young children are higher than those of adults, so the percentage of CD4+ T-cells among blood subsets, i.e., CD4/CD45%, CD4/CD8% or CD4/CD3%, is a more reliable indicator of HIV infection in children than absolute counts. To obtain the percentage of CD4+ T-cells using two fluorescent dyes of different emission wavelengths, at least one laser and two PMT detectors are generally needed, which makes it hard to develop a portable, toaster-sized device, since the peripheral modules add complexity. In this study, we developed a novel technique to control the intensity of fluorescent dye-doped silica nanoparticles. We synthesized FITC-doped silica nanoparticles conjugated with CD4 antibody that are 10 times brighter than FITC-conjugated CD45 antibody. Exploiting the intensity difference between the two fluorescent labels, we measured two parameters using only a single detector and laser. Most experiments were performed with a uFACS (microfabricated fluorescence-activated cell sorter) on an inverted microscope (IX71, Olympus). In conclusion, this method enables us to discriminate between CD4 and CD45 in the intensity domain simultaneously. Furthermore, this technique could enable much cheaper and smaller devices for counting CD4+ T-cells.
Specificity and sensitivity of noninvasive measurement of pulmonary vascular protein leak.
Dauber, I M; Pluss, W T; VanGrondelle, A; Trow, R S; Weil, J V
1985-08-01
Noninvasive techniques employing external counting of radiolabeled protein have the potential for measuring pulmonary vascular protein permeability, but their specificity and sensitivity remain unclear. We tested the specificity and sensitivity of a double-radioisotope method by injecting radiolabeled albumin (131I) and erythrocytes (99mTc) into anesthetized dogs and measuring the counts of each isotope for 150 min after injection with an external gamma probe fixed over the lung. We calculated the rate of increase of albumin counts measured by the probe (which reflects the rate at which protein leaks into the extravascular space). To assess permeability we normalized the rate of increase in albumin counts for changes in labeled erythrocyte signal to minimize influence of changes in vascular surface area and thus derived an albumin leak index. We measured the albumin leak index and gravimetric lung water during hydrostatic edema (acutely elevating left atrial pressure by left atrial balloon inflation: mean pulmonary arterial wedge pressure = 22.6 Torr) and in lung injury edema induced by high- (1.0 g/kg) and low-dose (0.25 g/kg) intravenous thiourea. To test specificity we compared hydrostatic and high-dose thiourea edema. The albumin leak index increased nearly fourfold from control after thiourea injury (27.2 ± 2.3 × 10⁻⁴ vs. 7.6 ± 0.9 × 10⁻⁴ min⁻¹) but did not change from control levels after elevating left atrial pressure (8.9 ± 1.2 × 10⁻⁴ min⁻¹) despite comparable increases in gravimetric lung water. To test sensitivity we compared low-dose thiourea with controls. Following low-dose thiourea, the albumin leak index nearly doubled despite the absence of a measurable increase in lung water. We conclude that a noninvasive double radioisotope measurement of pulmonary vascular protein leak, employing external counting techniques and a simplified method of calculation, is specific for lung injury and is also sensitive enough to detect lung injury insufficient to produce detectable pulmonary edema.
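The normalization described above amounts to tracking the slope of the albumin-to-erythrocyte count ratio over time. A minimal sketch under that assumed form (the functional form and numbers are illustrative, not the authors' exact calculation):

```python
import numpy as np

def albumin_leak_index(t_min, albumin_counts, rbc_counts):
    """Slope (per min) of external albumin counts normalised by the
    erythrocyte signal. Dividing by RBC counts corrects for changes in
    vascular volume within the probe's field of view (assumed form)."""
    ratio = np.asarray(albumin_counts) / np.asarray(rbc_counts)
    ratio = ratio / ratio[0]              # relative to the initial value
    slope, _ = np.polyfit(t_min, ratio, 1)
    return slope

# Hypothetical 150-min probe record with an ~8e-4/min upward drift
t = np.arange(0, 151, 10.0)
alb = 1e5 * (1 + 8e-4 * t)
rbc = np.full_like(t, 5e4)
print(f"leak index ≈ {albumin_leak_index(t, alb, rbc):.2e} per min")
```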
NASA Technical Reports Server (NTRS)
Jones, John H.; Hanson, B. Z.
2011-01-01
Petrologic investigation of the shergottites has been hampered by the fact that most of these meteorites are partial cumulates. Two lines of inquiry have been used to evaluate the compositions of parental liquids: (i) perform melting experiments at different pressures and temperatures until the compositions of cumulate crystal cores are reproduced [e.g., 1]; and (ii) use point-counting techniques to reconstruct the compositions of intercumulus liquids [e.g., 2]. The second of these methods is hampered by the approximate nature of the technique. In effect, element maps are used to construct mineral modes; and average mineral compositions are then converted into bulk compositions. This method works well when the mineral phases are homogeneous [3]. However, when minerals are zoned, with narrow rims contributing disproportionately to the mineral volume, this method becomes problematic. Decisions need to be made about the average composition of the various zones within crystals. And, further, the proportions of those zones also need to be defined. We have developed a new microprobe technique to see whether the point-count method of determining intercumulus liquid composition is realistic. In our technique, the approximating decisions of earlier methods are unnecessary because each pixel of our x-ray maps is turned into a complete eleven-element quantitative analysis. The success or failure of our technique can then be determined by experimentation. As discussed earlier, experiments on our point-count composition can then be used to see whether experimental liquidus phases successfully reproduce natural mineral compositions. Regardless of our ultimate outcome in retrieving shergottite parent liquids, we believe our pixel-by-pixel analysis technique represents a giant step forward in documenting thin-section modes and compositions. For a third time, we have analyzed the groundmass composition of EET 79001,68 [Eg]. The first estimate of Eg was made by [4] and later modified by [5] to take phase diagram considerations into account. The Eg composition of [4] was too olivine-normative to be the true Eg composition, because the ,68 groundmass contains no forsteritic olivine. A later mapping by [2] basically reconfirmed the modifications of [5]. However, even the modified composition of [5] has olivine on the liquidus for 50 °C before low-Ca pyroxene appears [6].
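The pixel-by-pixel bookkeeping described above reduces to simple array operations once each pixel carries a quantitative analysis and a phase label: modes are pixel fractions, and the bulk composition is a straight average, with no decisions about zone boundaries. A schematic sketch with synthetic data (all names and values are illustrative):

```python
import numpy as np

# Hypothetical inputs: per-pixel eleven-element quantitative analyses
# (oxide wt%) and a phase label from some prior classification step.
n_pixels, n_elements = 10_000, 11
rng = np.random.default_rng(1)
pixel_compositions = rng.uniform(0, 60, size=(n_pixels, n_elements))
phase_labels = rng.integers(0, 3, size=n_pixels)  # e.g. px / maskelynite / oxide

# Modal abundance = pixel fraction of each phase (area as a volume proxy)
modes = np.bincount(phase_labels) / n_pixels

# Bulk composition = straight average over all pixels
bulk = pixel_compositions.mean(axis=0)
print("modes:", np.round(modes, 3))
print("bulk (wt%):", np.round(bulk, 1))
```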
Nasi, Milena; De Biasi, Sara; Bianchini, Elena; Gibellini, Lara; Pinti, Marcello; Scacchetti, Tiziana; Trenti, Tommaso; Borghi, Vanni; Mussini, Cristina; Cossarizza, Andrea
2015-01-01
Background An accurate and affordable CD4+ T cells count is an essential tool in the fight against HIV/AIDS. Flow cytometry (FCM) is the “gold standard” for counting such cells, but this technique is expensive and requires sophisticated equipment, temperature-sensitive monoclonal antibodies (mAbs) and trained personnel. The lack of access to technical support and quality assurance programs thus limits the use of FCM in resource-constrained countries. We have tested the accuracy, the precision and the carry-over contamination of Partec CyFlow MiniPOC, a portable and economically affordable flow cytometer designed for CD4+ count and percentage, used along with the “CD4% Count Kit-Dry”. Materials and Methods Venous blood from 59 adult HIV+ patients (age: 25–58 years; 43 males and 16 females) was collected and stained with the “MiniPOC CD4% Count Kit-Dry”. CD4+ count and percentage were then determined in triplicate by the CyFlow MiniPOC. In parallel, CD4 count was performed using mAbs and a CyFlow Counter, or by a dual platform system (from Beckman Coulter) based upon Cytomic FC500 (“Cytostat tetrachrome kit” for mAbs) and Coulter HmX Hematology Analyzer (for absolute cell count). Results The accuracy of CyFlow MiniPOC against Cytomic FC500 showed a correlation coefficient (CC) of 0.98 and 0.97 for CD4+ count and percentage, respectively. The accuracy of CyFlow MiniPOC against CyFlow Counter showed a CC of 0.99 and 0.99 for CD4 T cell count and percentage, respectively. CyFlow MiniPOC showed an excellent repeatability: CD4+ cell count and percentage were analyzed on two instruments, with an intra-assay precision below ±5% deviation. Finally, there was no carry-over contamination for samples at all CD4 values, regardless of their position in the sequence of analysis. Conclusion The cost-effective CyFlow MiniPOC produces rapid, reliable and accurate results that are fully comparable with those from highly expensive dual platform systems. PMID:25622041
High-Precision Isotope Ratio Measurements of Sub-Picogram Actinide Samples
NASA Astrophysics Data System (ADS)
Pollington, A. D.; Kinman, W.
2016-12-01
One of the most exciting trends in analytical geochemistry over the past decade is the push towards smaller and smaller sample sizes while simultaneously achieving high precision isotope ratio measurements. This trend has been driven by advances in clean chemistry protocols, and by significant breakthroughs in mass spectrometer ionization efficiency and detector quality (stability and noise for low signals). In this presentation I will focus on new techniques currently being developed at Los Alamos National Laboratory for the characterization of ultra-small samples (pg, fg, ag), with particular focus on actinide measurements by MC-ICP-MS. Analyses of U, Pu, Th and Am are routinely carried out in our facility using multi-ion counting techniques. I will describe some of the challenges associated with using exclusively ion counting methods (e.g., stability, detector cross calibration, etc.), and how we work to mitigate them. While the focus of much of the work currently being carried out is in the broad field of nuclear forensics and safeguards, the techniques that are being developed are directly applicable to many geologic questions that require analyses of small samples of U and Th, for example. In addition to the description of the technique development, I will present case studies demonstrating the precision and accuracy of the method as applied to real-world samples.
Estimation of Traffic Variables Using Point Processing Techniques
DOT National Transportation Integrated Search
1978-05-01
An alternative approach to estimating aggregate traffic variables on freeways--spatial mean velocity and density--is presented. Vehicle arrival times at a given location on a roadway, typically a presence detector, are regarded as a point or counting...
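A minimal sketch of the standard point-process estimates for flow, space-mean speed, and density (via the fundamental relation q = k·v_s); all names and numbers here are illustrative, not the report's method:

```python
import numpy as np

def traffic_estimates(arrival_times_s, spot_speeds_mps):
    """Aggregate traffic variables from detector events.

    flow q: vehicles/s from the arrival count over the window;
    space-mean speed v_s: harmonic mean of spot speeds (standard result);
    density k: q / v_s via q = k * v_s.
    """
    T = arrival_times_s[-1] - arrival_times_s[0]
    q = (len(arrival_times_s) - 1) / T
    v_s = 1.0 / np.mean(1.0 / np.asarray(spot_speeds_mps))
    return q, v_s, q / v_s

times = np.cumsum(np.full(120, 3.0))                         # one vehicle / 3 s
speeds = np.random.default_rng(2).normal(25, 3, 120).clip(10)
q, v, k = traffic_estimates(times, speeds)
print(f"q={q*3600:.0f} veh/h, v_s={v:.1f} m/s, k={k*1000:.1f} veh/km")
```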
Kebede, Mihiretu; Zegeye, Desalegn Tigabu; Zeleke, Berihun Megabiaw
2017-12-01
To monitor the progress of therapy and disease progression, periodic CD4 counts are required throughout the course of HIV/AIDS care and support. The demand for CD4 count measurement has increased as ART programs have expanded over the last decade. This study aimed to predict CD4 count changes and to identify the predictors of CD4 count changes among patients on ART. A cross-sectional study was conducted at the University of Gondar Hospital on 3,104 adult patients on ART with CD4 counts measured at least twice (baseline and most recent). Data were retrieved from the HIV care clinic electronic database and patients' charts. Descriptive data were analyzed with SPSS version 20. The Cross-Industry Standard Process for Data Mining (CRISP-DM) methodology was followed to undertake the study. WEKA version 3.8 was used to conduct predictive data mining. Before building the predictive data mining models, information gain values and correlation-based feature selection methods were used for attribute selection. Variables were ranked according to their relevance based on their information gain values. J48, Neural Network, and Random Forest algorithms were tested to assess model accuracy. The median duration of ART was 191.5 weeks. The mean CD4 count change was 243 (SD 191.14) cells per microliter. Overall, 2,427 (78.2%) patients had their CD4 counts increase by at least 100 cells per microliter, while 4% had a decline from the baseline CD4 value. Baseline variables including age, educational status, CD8 count, ART regimen, and hemoglobin levels predicted CD4 count changes, with predictive accuracies for J48, Neural Network, and Random Forest of 87.1%, 83.5%, and 99.8%, respectively. The Random Forest algorithm outperformed both J48 and the Artificial Neural Network; its precision, sensitivity and recall values were also above 99%. Near-perfect prediction results were obtained using the Random Forest algorithm, which could be used in a low-resource setting to build a web-based prediction model for CD4 count changes. Copyright © 2017 Elsevier B.V. All rights reserved.
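As a rough illustration of the modelling step, with scikit-learn's RandomForestClassifier standing in for WEKA; the feature set follows the abstract, but the data and encodings are entirely synthetic:

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 3104  # cohort size from the abstract; all values below are invented
X = pd.DataFrame({
    "age": rng.integers(18, 65, n),
    "education_years": rng.integers(0, 16, n),
    "baseline_cd8": rng.normal(900, 300, n),
    "regimen": rng.integers(0, 4, n),       # coded ART regimen
    "hemoglobin": rng.normal(13, 2, n),
})
# Target: whether CD4 rose by >= 100 cells/uL (synthetic labels)
y = (rng.random(n) < 0.78).astype(int)

clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```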
A whole-system approach to x-ray spectroscopy in cargo inspection systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Langeveld, Willem G. J.; Gozani, Tsahi; Ryge, Peter
The bremsstrahlung x-ray spectrum used in high-energy, high-intensity x-ray cargo inspection systems is attenuated and modified by the materials in the cargo in a Z-dependent way. Therefore, spectroscopy of the detected x rays yields information about the Z of the x-rayed cargo material. It has previously been shown that such Z-Spectroscopy (Z-SPEC) is possible under certain circumstances. A statistical approach, Z-SCAN (Z-determination by Statistical Count-rate ANalysis), has also been shown to be effective, and it can be used either by itself or in conjunction with Z-SPEC when the x-ray count rate is too high for individual x-ray spectroscopy. Both techniques require fast x-ray detectors and fast digitization electronics. It is desirable (and possible) to combine all techniques, including x-ray imaging of the cargo, in a single detector array, to reduce costs, weight, and overall complexity. In this paper, we take a whole-system approach to x-ray spectroscopy in x-ray cargo inspection systems, and show how the various parts interact with one another. Faster detectors and read-out electronics are beneficial for both techniques. A higher duty-factor x-ray source allows lower instantaneous count rates at the same overall x-ray intensity, improving the range of applicability of Z-SPEC in particular. Using an intensity-modulated advanced x-ray source (IMAXS) allows reducing the x-ray count rate for cargoes with higher transmission, and a stacked-detector approach may help material discrimination for the lowest attenuations. Image processing and segmentation allow derivation of results for entire objects, and subtraction of backgrounds. We discuss R&D performed under a number of different programs, showing progress made in each of the interacting subsystems. We discuss results of studies into faster scintillation detectors, including ZnO, BaF₂ and PbWO₄, as well as suitable photo-detectors, read-out and digitization electronics. We discuss high-duty-factor linear-accelerator x-ray sources and their associated requirements, and how such sources improve spectroscopic techniques. We further discuss how image processing techniques help in correcting for backgrounds and overlapping materials. In sum, we present an integrated picture of how to optimize a cargo inspection system for x-ray spectroscopy.
Urrutia, H; Vidal, R; Baeza, M; Reyes, J E; Aspe, E
1997-06-01
The efficiency of organic matter degradation in attached biomass reactors depends on the suitable selection of artificial support for the retention of bacterial communities. We have studied the growth on glass and clay beads of methylaminotrophic, acetotrophic and hydrogenotrophic methanogenic bacterial communities isolated from anaerobic reactors. Bacterial counts were performed by the standard MPN technique. Experiments were performed in 50 ml vials for 12 days at 35 °C. Increase in the counts of methylaminotrophic and hydrogenotrophic methanogens occurred on both glass and clay beads. The latter support material also stimulated the growth rate of methylaminotrophic methanogens.
Total body calcium analysis. [neutron irradiation
NASA Technical Reports Server (NTRS)
Lewellen, T. K.; Nelp, W. B.
1974-01-01
A technique to quantitate total body calcium in humans is developed. Total body neutron irradiation is utilized to produce argon-37. The radioargon, which diffuses into the blood stream and is excreted through the lungs, is recovered from the exhaled breath and counted inside a proportional detector. Emphasis is placed on: (1) measurement of the rate of excretion of radioargon following total body neutron irradiation; (2) the development of the radioargon collection, purification, and counting systems; and (3) development of a patient irradiation facility using a 14 MeV neutron generator. Results and applications are discussed in detail.
Liu, Juntao; Zhang, Feng; Wang, Xinguang; Han, Fei; Yuan, Zhelong
2014-12-01
Formation porosity can be determined from the near-to-far detector counting ratio of boron capture gamma rays in a pulsed neutron-gamma element logging tool. The thermal neutron distribution, boron capture gamma spectroscopy and porosity response for formations with different water salinity and wellbore diameter characteristics were simulated using the Monte Carlo method. We found that a boron lining improves the signal-to-noise ratio and that the boron capture gamma ray counting ratio has a higher sensitivity for determining porosity than the total capture gamma ray count. Copyright © 2014 Elsevier Ltd. All rights reserved.
Tanır, A Güneş; Yedek, Hatice; Koç, Kemal; Bölükdemir, M Hicabi
2017-01-01
The scattered doses received by the area surrounding a target subjected to x-rays were investigated. Two experiments were carried out: (1) Al₂O₃:C was used as the dosimeter and the luminescence counts were measured using both the Risø TL/OSL system and an ion chamber; (2) BeO aliquots were used and the counts were read using the IBEOX/OSL system. According to the results, the doses absorbed by the area surrounding the target are significant. Copyright © 2016 Elsevier Ltd. All rights reserved.
Age diagnosis based on incremental lines in dental cementum: a critical reflection.
Grosskopf, Birgit; McGlynn, George
2011-01-01
Age estimation based on the counting of incremental lines in dental cementum is a method frequently used for the estimation of the age at death for humans in bioarchaeology, and increasingly, forensic anthropology. Assessment of applicability, precision, and method reproducibility continue to be the focus of research in this area, and are occasionally accompanied by significant controversy. Differences in methodological techniques for data collection (e.g. number of sections, factor of magnification for counting or interpreting "outliers") are presented. Potential influences on method reliability are discussed, especially for their applicability in forensic contexts.
A Comparison of Automated and Manual Crater Counting Techniques in Images of Elysium Planitia.
NASA Astrophysics Data System (ADS)
Plesko, C. S.; Brumby, S. P.; Asphaug, E.
2004-11-01
Surveys of impact craters yield a wealth of information about Martian geology, providing clues to the relative age, local composition and erosional history of the surface. Martian craters are also of intrinsic geophysical interest, given that the processes by which they form are not entirely clear, especially cratering in ice-saturated regoliths (Plesko et al. 2004, AGU), which appear common on Mars (Squyres and Carr 1986). However, the deluge of data over the last decade has made comprehensive manual counts prohibitive, except in select regions. Given that most small craters on Mars may be secondaries from a few very recent impact events (McEwen et al. in press, Icarus 2004), using select regions for age dating introduces considerable potential for sampling error. Automation is thus an enabling planetary science technology. In contrast to machine counts, human counts are prone to human decision making, thus not intrinsically reproducible. One can address human "noise" by averaging over many human counts (Kanefsky et al. 2001), but this multiplies the already laborious effort required. In this study, we test automated crater counting algorithms developed with the Los Alamos National Laboratory genetic programming suite GENIE (Harvey et al., 2002) against established manual counts of craters in Elysium Planitia, using MOC and THEMIS data. We intend to establish the validity of our method against well-regarded hand counts (Hartmann et al. 2000), and then apply it generally to larger regions of Mars. Previous work on automated crater counting used customized algorithms (Bierhaus et al. 2003, Burl et al. 2001). Algorithms generated by genetic programming have the advantage of requiring little time or user effort to generate, so it is relatively easy to generate a suite of algorithms for varied terrain types, or to compare results from multiple algorithms for improved accuracy (Plesko et al. 2003).
Single photon detection using Geiger mode CMOS avalanche photodiodes
NASA Astrophysics Data System (ADS)
Lawrence, William G.; Stapels, Christopher; Augustine, Frank L.; Christian, James F.
2005-10-01
Geiger mode Avalanche Photodiodes fabricated using complementary metal-oxide-semiconductor (CMOS) fabrication technology combine high sensitivity detectors with pixel-level auxiliary circuitry. Radiation Monitoring Devices has successfully implemented CMOS manufacturing techniques to develop prototype detectors with active diameters ranging from 5 to 60 microns and measured detection efficiencies of up to 60%. CMOS active quenching circuits are included in the pixel layout. The actively quenched pixels have a quenching time less than 30 ns and a maximum count rate greater than 10 MHz. The actively quenched Geiger mode avalanche photodiode (GPD) has linear response at room temperature over six orders of magnitude. When operating in Geiger mode, these GPDs act as single photon-counting detectors that produce a digital output pulse for each photon with no associated read noise. Thermoelectrically cooled detectors have less than 1 Hz dark counts. The detection efficiency, dark count rate, and after-pulsing of two different pixel designs are measured and demonstrate the differences in the device operation. Additional applications for these devices include nuclear imaging and replacement of photomultiplier tubes in dosimeters.
Emery, R J
1997-03-01
Institutional radiation safety programs routinely use wipe test sampling and liquid scintillation counting analysis to indicate the presence of removable radioactive contamination. Significant volumes of liquid waste can be generated by such surveillance activities, and the subsequent disposal of these materials can sometimes be difficult and costly. In settings where large numbers of negative results are regularly obtained, the limited grouping of samples for analysis based on expected value statistical techniques is possible. To demonstrate the plausibility of the approach, single wipe samples exposed to varying amounts of contamination were analyzed concurrently with nine non-contaminated samples. Although the sample grouping inevitably leads to increased quenching with liquid scintillation counting systems, the effect did not impact the ability to detect removable contamination in amounts well below recommended action levels. Opportunities to further improve this cost effective semi-quantitative screening procedure are described, including improvements in sample collection procedures, enhancing sample-counting media contact through mixing and extending elution periods, increasing sample counting times, and adjusting institutional action levels.
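The expected-value argument behind such pooling can be made concrete with Dorfman-style two-stage testing arithmetic (an idealisation assumed here, not necessarily the paper's exact protocol):

```python
def expected_analyses_per_sample(group_size, p_contaminated):
    """Expected LSC analyses per wipe sample under two-stage (Dorfman)
    pooling: one pooled count per group, plus individual recounts when
    the pooled vial reads positive. Assumes independent samples and a
    perfectly sensitive pooled count (an idealisation)."""
    n, p = group_size, p_contaminated
    p_pool_positive = 1 - (1 - p) ** n
    return 1 / n + p_pool_positive

# With 10 samples per vial and 1% contamination, pooling needs
# about 0.2 analyses per sample instead of 1.
print(expected_analyses_per_sample(10, 0.01))  # ~0.196
```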
NASA Astrophysics Data System (ADS)
Hartmann, William K.; Werner, Stephanie C.
2010-06-01
Recent controversies about systems of crater-count dating have been largely resolved, and with continuing refinements, crater counts will offer a fundamental geological tool to interpret not only ages, but also the nature of geological processes altering the surface of Mars. As an example of the latter technique, we present data on two debris aprons east of Hellas. The aprons show much shorter survival times of small craters than do the nearby contiguous plains. The order-of-magnitude depths of layers involved in the loss process can be judged from the depths of the affected craters. We infer that ice-rich layers in the top tens of meters of both aprons have lost crater topography within the last few 10⁸ yr, probably due to flow or sublimation of ice-rich materials. Mantling by ice-rich deposits, associated with climate change cycles of obliquity change, has probably also affected both the aprons and the plains. The crater-count tool thus adds chronological and vertical dimensional information to purely morphological studies.
Intestinal parasitic infections in relation to HIV/AIDS status, diarrhea and CD4 T-cell count.
Assefa, Shimelis; Erko, Berhanu; Medhin, Girmay; Assefa, Zelalem; Shimelis, Techalew
2009-09-18
HIV infection has been modifying both the epidemiology and outcome of parasitic infections. Hence, this study was undertaken to determine the prevalence of intestinal parasitic infection among people with and without HIV infection and its association with diarrhea and CD4 T-cell count. A cross-sectional study was conducted at Hawassa Teaching and Referral Hospital focusing on HIV positive individuals, who gave blood for CD4 T-cell count at their first enrollment, and clients who tested HIV negative from November, 2008 to March, 2009. Data on socio-demographic factors and diarrhea status were obtained by interviewing 378 consecutive participants (214 HIV positive and 164 HIV negative). Stool samples were collected from all study subjects and examined for parasites using direct, formol-ether and modified acid-fast stain techniques. The prevalence of any intestinal parasitic infection was significantly higher among HIV positive participants. Specifically, rates of infection with Cryptosporidium, I. belli, and S. stercoralis were higher, particularly in those with CD4 counts less than 200 cells/µL. Diarrhea was also more frequent at the same lower CD4 T-cell counts. Immunodeficiency increased the risk of having opportunistic parasites and diarrhea. Therefore, raising patient immune status and screening at least for those treatable parasites is important.
Exclusion-Based Capture and Enumeration of CD4+ T Cells from Whole Blood for Low-Resource Settings.
Howard, Alexander L; Pezzi, Hannah M; Beebe, David J; Berry, Scott M
2014-06-01
In developing countries, demand exists for a cost-effective method to evaluate human immunodeficiency virus patients' CD4(+) T-helper cell count. The TH (CD4) cell count is the current marker used to identify when an HIV patient has progressed to acquired immunodeficiency syndrome, which results when the immune system can no longer prevent certain opportunistic infections. A system to perform TH count that obviates the use of costly flow cytometry will enable physicians to more closely follow patients' disease progression and response to therapy in areas where such advanced equipment is unavailable. Our system of two serially-operated immiscible phase exclusion-based cell isolations coupled with a rapid fluorescent readout enables exclusion-based isolation and accurate counting of T-helper cells at lower cost and from a smaller volume of blood than previous methods. TH cell isolation via immiscible filtration assisted by surface tension (IFAST) compares well against the established Dynal T4 Quant Kit and is sensitive at CD4 counts representative of immunocompromised patients (less than 200 TH cells per microliter of blood). Our technique retains use of open, simple-to-operate devices that enable IFAST as a high-throughput, automatable sample preparation method, improving throughput over previous low-resource methods. © 2013 Society for Laboratory Automation and Screening.
Krishnan, T; Reddy, B M
1994-01-01
The graphical technique of biplot due to Gabriel and others is explained, and is applied to ten finger ridge-count means of 239 populations, mostly Indian. The biplots, together with concentration ellipses based on them, are used to study geographical, gender and ethnic/social group variability, to compare Indian populations with other populations and to study relations between individual counts and populations. The correlation structure of ridge-counts exhibits a tripartite division of digits demonstrated by many other studies, but with a somewhat different combination of digits. Comparisons are also made with the results of Leguebe and Vrydagh, who used principal components, discriminant functions, Andrews functions, etc., to study geographical and gender variations. There is a great deal of homogeneity in Indian populations when compared to populations from the rest of the world. Although broad geographical contiguity is reflected in the biplots, local (states within India) level contiguity is not maintained. Mongoloids and Caucasoids have distinct ridge-count structures. The higher level of homogeneity in females and on the left side observed by Leguebe and Vrydagh is also observed in the biplots. A comparison with principal component plots indicates that biplots yield a graphical representation similar to component plots, and convey more information than component plots.
Introductory Laboratory Exercises in Radiobiology
ERIC Educational Resources Information Center
Williams, J. R. Parry; Servant, D. M.
1970-01-01
Describes experiments suitable for introducing use of radioisotopes in biology. Includes demonstrations of tracing food chains, uptake of ions by plants, concentration of elements by insects, tracing photosynthetic reactions, activation analysis of copper, and somatic and genetic effects. Uses autoradiographic and counting techniques. (AL)
A Lively Class Section for the Adult Education Second-Language Course.
ERIC Educational Resources Information Center
Carton, Dana
1983-01-01
Exercises with numbers designed to hold the interest of a heterogeneous group of adult students are described. They include games about age, counting, and cards. Meaningful content and active, interested participation are features of the techniques. (MSE)
Development of DNA-Free Sediment for Ecological Assays with Genomic Endpoints
Recent advances in genomics are currently being exploited to discern ecological changes that have conventionally been measured using laborious counting techniques. For example, next generation sequencing technologies can be used to create DNA libraries from benthic community ass...
Basic techniques in mammalian cell tissue culture.
Phelan, Katy; May, Kristin M
2015-03-02
Cultured mammalian cells are used extensively in cell biology studies. Preserving the structure, function, behavior, and biology of cells in culture requires a number of special skills. This unit describes the basic skills required to maintain and preserve cell cultures: maintaining aseptic technique, preparing media with the appropriate characteristics, passaging, freezing and storage, recovering frozen stocks, and counting viable cells. Copyright © 2015 John Wiley & Sons, Inc.
Di Schiavi, Maria Teresa; Foti, Marina; Mosconi, Maria Cristina; Mattiolo, Giuseppina; Cavallina, Roberta
2014-01-01
Irradiation is a preservation technology used to improve the safety and hygienic quality of food. The aim of this study was to assess the applicability and validity of the microbiological screening method direct epifluorescence filter technique (DEFT)/aerobic plate count (APC) (EN 13783:2001) for the identification of irradiated herbs and spices. Tests on non-irradiated and irradiated samples of dried herbs and spices were performed. The method was based on the comparison of the APC and the count obtained using DEFT. In accordance with the standard reference, this method is not applicable to samples with APC < 10³ colony forming units (CFU)/g, and this is its main limit. The results obtained in our laboratories showed that in 50% of non-irradiated samples and in 96% of the samples treated with ionising radiation, the method was not applicable due to a value below 10³ CFU/g. PMID:27800348
Tyree, Melvin T.; Dixon, Michael A.; Thompson, Robert G.
1984-01-01
An improved method of counting acoustic emission (AE) events from water-stressed stems of cedar (Thuja occidentalis L.) is presented. Amplified AEs are analyzed on a real-time basis by a microcomputer. The instrumentation counts AE events in a fashion nearly analogous to scintillation counting of radioactive materials. The technique was applied to measuring ultrasonic AEs from the stems of cedar inside a pressure bomb. The shoots were originally fully hydrated. When the shoots were dehydrated in the bomb by application of an overpressure, very few AEs were detected. When the bomb pressure was reduced after dehydration of the shoot, AE events could be detected. We conclude that ultrasonic AEs are caused by cavitation events (= structural breakdown of water columns in the tracheids of cedar) and not by the breaking of cellulose fibers in the wood. PMID:16663501
DOE Office of Scientific and Technical Information (OSTI.GOV)
Church, J; Slaughter, D; Norman, E
Error rates in a cargo screening system such as the Nuclear Car Wash [1-7] depend on the standard deviation of the background radiation count rate. Because the Nuclear Car Wash is an active interrogation technique, the radiation signal for fissile material must be detected above a background count rate consisting of cosmic, ambient, and neutron-activated radiations. Previous work [1,6] suggested that this variation could be substantial, and the corresponding negative repercussions for the sensitivity of the system were shown. Therefore, to assure the most accurate estimation of the variation, experiments have been performed to quantify components of the actual variance in the background count rate, including variations in generator power, irradiation time, and container contents. The background variance is determined by these experiments to be a factor of 2 smaller than values assumed in previous analyses, resulting in substantially improved projections of system performance for the Nuclear Car Wash.
Diffusion processes in tumors: A nuclear medicine approach
NASA Astrophysics Data System (ADS)
Amaya, Helman
2016-07-01
The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region; it is not true metabolic information. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient-magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, it is necessary to include pharmacokinetic models to correctly interpret the gradient-magnitude and Laplacian count images.
NASA Astrophysics Data System (ADS)
Lundqvist, Mats; Danielsson, Mats; Cederstroem, Bjoern; Chmill, Valery; Chuntonov, Alexander; Aslund, Magnus
2003-06-01
Sectra Microdose is the first single photon counting mammography detector. An edge-on crystalline silicon detector is connected to application-specific integrated circuits that individually process each photon. The detector is scanned across the breast and the rejection of scattered radiation exceeds 97% without the use of a Bucky grid. Processing each x-ray individually enables an optimization of the information transfer from the x-rays to the image in a way not previously possible. Combined with the near absence of noise from scattered radiation and from electronics, we foresee a possibility to reduce the radiation dose and/or increase the image quality. We will discuss fundamental features of the new direct photon counting technique in terms of dose efficiency and present preliminary measurements for a prototype on physical parameters such as Noise Power Spectra (NPS), MTF and DQE.
Li, Y Z; Hu, X D; Lai, X M; Li, Y F; Lei, Y
2018-01-01
Development of drug therapies and other techniques for wound care has resulted in significant improvement of the cure rate and shortening of the healing time for wounds. A modified technique of regulated oxygen-enriched negative pressure-assisted wound therapy (RO-NPT) has been reported. To evaluate the efficacy and impact of RO-NPT on wound recovery and inflammation, infected wounds were established on 40 adult female white rabbits, which were then randomized to one of four groups: O₂ group, regulated negative pressure-assisted wound therapy (RNPT) group, regulated oxygen-enriched negative pressure-assisted wound therapy (RO-NPT) group and healthy control (HC) group. Each day, the O₂ group was treated with a constant oxygen supply (1 L/min) to the wound, while the RNPT group was treated with continuous regulated negative pressure (70 ± 5 mmHg) and the RO-NPT group was treated with both. The HC group was treated with gauze dressing alone, which was changed every day. Leucocyte count, colony count and wound-healing rate were calculated. Levels of tumour necrosis factor (TNF)-α, interleukin (IL)-1β and IL-8 were evaluated by ELISA. RO-NPT significantly decreased bacterial count and TNF-α level, and increased the wound-healing rate. IL-1β, IL-8 and leucocyte count tended to increase in the early phase of inflammation and to decrease in the later phase in the RO-NPT group. RO-NPT therapy assisted wound recovery and inflammation control compared with the RNPT and oxygen-enriched therapies. RO-NPT therapy also increased levels of IL-1β and IL-8 and attenuated expression of TNF-α in the early phase of inflammation. © 2017 British Association of Dermatologists.
Johnson, G J; Buckworth, R C; Lee, H; Morgan, J A T; Ovenden, J R; McMahon, C R
2017-01-01
Multivariate and machine-learning methods were used to develop field identification techniques for two species of cryptic blacktip shark. From 112 specimens, precaudal vertebrae (PCV) counts and molecular analysis identified 95 Australian blacktip sharks Carcharhinus tilstoni and 17 common blacktip sharks Carcharhinus limbatus. Molecular analysis also revealed 27 of the 112 were C. tilstoni × C. limbatus hybrids, of which 23 had C. tilstoni PCV counts and four had C. limbatus PCV counts. In the absence of further information about hybrid phenotypes, hybrids were assigned as either C. limbatus or C. tilstoni based on PCV counts. Discriminant analysis achieved 80% successful identification, but machine-learning models were better, achieving 100% successful identification, using six key measurements (fork length, caudal-fin peduncle height, interdorsal space, second dorsal-fin height, pelvic-fin length and pelvic-fin midpoint to first dorsal-fin insertion). Furthermore, pelvic-fin markings could be used for identification: C. limbatus has a distinct black mark >3% of the total pelvic-fin area, while C. tilstoni has markings with diffuse edges, or has smaller or no markings. Machine learning and pelvic-fin marking identification methods were field tested achieving 87 and 90% successful identification, respectively. With further refinement, the techniques developed here will form an important part of a multi-faceted approach to identification of C. tilstoni and C. limbatus and have a clear management and conservation application to these commercially important sharks. The methods developed here are broadly applicable and can be used to resolve species identities in many fisheries where cryptic species exist. © 2016 The Fisheries Society of the British Isles.
Is it possible to sanitize athletes' shoes?
Messina, Gabriele; Burgassi, Sandra; Russo, Carmela; Ceriale, Emma; Quercioli, Cecilia; Meniconi, Cosetta
2015-02-01
Footwear should be designed to avoid trauma and injury to the skin of the feet that can favor bacterial and fungal infections. Procedures and substances for sanitizing the interior of shoes are uncommon but are important aspects of primary prevention against foot infections and unpleasant odor. To evaluate the efficacy of a sanitizing technique for reducing bacterial and fungal contamination of footwear. Crossover study. Mens Sana basketball team. Twenty-seven male athletes and 4 coaches (62 shoes). The experimental protocol required a first sample (swab), one per shoe, at time 0 from inside the shoes of all athletes before the sanitizing technique began, and a second sample at time 1, after about 4 weeks (April 2012 to May 2012) of daily use of the sanitizing technique. The differences before and after use of the sanitizing technique for total bacterial count at 36 °C and 22 °C and for Staphylococcus spp, yeasts, molds, Enterococcus spp, Pseudomonas spp, Escherichia coli, and total coliform bacteria were evaluated. Before use of the sanitizing technique, the total bacterial counts at 36 °C and 22 °C and for Staphylococcus spp were greater by a factor of 5.8 (95% confidence interval [CI] = 3.42, 9.84), 5.84 (95% CI = 3.45, 9.78), and 4.78 (95% CI = 2.84, 8.03), respectively. All the other comparisons showed a reduction in microbial loads, whereas E. coli and coliforms were no longer detected. No statistically significant decrease in yeasts (P = .0841) or molds (P = .6913) was recorded, probably because of low contamination. The sanitizing technique significantly reduced the bacterial presence in athletes' shoes.
Hybrid dynamic radioactive particle tracking (RPT) calibration technique for multiphase flow systems
NASA Astrophysics Data System (ADS)
Khane, Vaibhav; Al-Dahhan, Muthanna H.
2017-04-01
The radioactive particle tracking (RPT) technique has been utilized to measure three-dimensional hydrodynamic parameters for multiphase flow systems. An analytical solution to the inverse problem of the RPT technique, i.e. finding the instantaneous tracer positions based upon the instantaneous counts received in the detectors, is not possible. Therefore, a calibration to obtain a counts-distance map is needed. There are major shortcomings in the conventional RPT calibration method that limit its applicability in practical applications. In this work, the design and development of a novel dynamic RPT calibration technique are carried out to overcome these shortcomings. The dynamic RPT calibration technique has been implemented around a test reactor 1 foot in diameter and 1 foot in height, using Cobalt-60 as the isotope tracer particle. Two sets of experiments have been carried out to test the capability of the novel dynamic RPT calibration. In the first set of experiments, a manual calibration apparatus was used to hold the tracer particle at known static locations. In the second set of experiments, the tracer particle was moved vertically downwards along a straight-line path in a controlled manner. The obtained reconstruction results for the tracer particle position were compared with the actual known positions and the reconstruction errors were estimated. The results revealed that the dynamic RPT calibration technique is capable of identifying tracer particle positions with a reconstruction error between 1 and 5.9 mm for the conditions studied, which could be improved depending on various factors outlined here.
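Once a counts-distance map exists, the tracer position can be reconstructed by least-squares inversion of the counts seen by all detectors. A sketch under an assumed detector geometry and an assumed map C(r) = a·exp(-µr)/r² (both hypothetical, not the authors' calibration):

```python
import numpy as np
from scipy.optimize import least_squares

def expected_counts(r, a=5e4, mu=0.05):
    """Assumed counts-distance map: solid angle (1/r^2) with attenuation."""
    return a * np.exp(-mu * r) / r**2

# Hypothetical detector positions around the test vessel (cm)
detectors = np.array([[0, 0, 0], [30, 0, 0], [0, 30, 0], [0, 0, 30],
                      [30, 30, 0], [30, 0, 30]], dtype=float)

def residuals(pos, measured):
    r = np.linalg.norm(detectors - pos, axis=1)
    return expected_counts(r) - measured

true_pos = np.array([12.0, 8.0, 15.0])
measured = expected_counts(np.linalg.norm(detectors - true_pos, axis=1))
measured = np.random.default_rng(3).poisson(measured)  # counting noise

fit = least_squares(residuals, x0=[15, 15, 15], args=(measured,))
print("reconstructed position (cm):", np.round(fit.x, 1))
```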
Preliminary thermal imaging of cotton impurities
USDA-ARS?s Scientific Manuscript database
Discrepancies exist between the Advanced Fiber Information Systems (AFIS) seed coat nep measurements and the seed coat fragment count upon visual inspection. Various studies have indicated that the two techniques may not be sensing the same contaminants as seed coat entities. Thermal imaging is an...
Development of DNA-Free Sediment for Ecological Assays with Genomic Endpoints (NAC SETAC)
Recent advances in genomics are currently being exploited to discern ecological changes that have conventionally been measured using laborious counting techniques. For example, next generation sequencing technologies can be used to create DNA libraries from benthic community ass...
Beyond the standard plate count: genomic views into microbial food ecology
USDA-ARS?s Scientific Manuscript database
Food spoilage is a complex process that involves multiple species with specific niches and metabolic processes; bacterial culturing techniques are the traditional methods for identifying the microbes responsible. These culture-dependent methods may be considered selective, targeting the isolation of...
Surrogate Safety Assessment Model and Validation : Final Report
DOT National Transportation Integrated Search
2008-06-01
Safety of traffic facilities is most often measured by counting the number (and severity) of crashes that occur. It is not possible to apply such a measurement technique to traffic facility designs that have not yet been built or deployed in the real...
A COMPARISON OF ENUMERATION TECHNIQUES FOR CRYPTOSPORIDIUM PARVUM OOCYSTS
A variety of methods have been used to enumerate Cryptosporidium parvum oocysts from source or drinking waters. The reliability of these counting methods varies, in part, with suspension density, sample purity, and other factors. Frequently, the method of determination of suspens...
Teaching Teens "Stuff" That Counts. A Guide for Volunteers.
ERIC Educational Resources Information Center
Hemmerich, Cecelia A.; And Others
The nutrition instruction guide is designed for volunteer leaders in the Expanded Food and Nutrition Education Program (EFNEP), which focuses on youth nutrition education and understanding teenagers. Teaching techniques incorporate the importance of socialization, "discovering" answers, positive reinforcement, and teenager involvement in…
Virus detection and quantification using electrical parameters
NASA Astrophysics Data System (ADS)
Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.
2014-10-01
Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change in dopant concentration of the virus suspension relative to the mock suspension over the change in Debye volume of the virus suspension relative to the mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship, which is unique for each kind of virus, allowing for fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
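Read literally, the empirical estimator described above takes the form below; interpreting "relative to" as a simple difference is an assumption on our part, as are all variable names (the authors' published formula may differ):

```python
def estimated_virus_count(dopant_virus, dopant_mock, debye_virus, debye_mock):
    """Empirical estimator as read from the abstract (an interpretation):
    |delta dopant concentration| / |delta Debye volume|, both differences
    taken against the mock suspension. Units must be chosen consistently
    so that the ratio yields a particle count."""
    return abs((dopant_virus - dopant_mock) / (debye_virus - debye_mock))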
Bowey-Dellinger, Kristen; Dixon, Luke; Ackerman, Kristin; Vigueira, Cynthia; Suh, Yewseok K; Lyda, Todd; Sapp, Kelli; Grider, Michael; Crater, Dinene; Russell, Travis; Elias, Michael; Coffield, V McNeil; Segarra, Verónica A
2017-01-01
Undergraduate students learn about mammalian cell culture applications in introductory biology courses. However, laboratory modules are rarely designed to provide hands-on experience with mammalian cells or teach cell culture techniques, such as trypsinization and cell counting. Students are more likely to learn about cell culture using bacteria or yeast, as they are typically easier to grow, culture, and manipulate given the equipment, tools, and environment of most undergraduate biology laboratories. In contrast, the utilization of mammalian cells requires a dedicated biological safety cabinet and rigorous antiseptic techniques. For this reason, we have devised a laboratory module and method herein that familiarizes students with common cell culture procedures, without the use of a sterile hood or large cell culture facility. Students design and perform a time-efficient inquiry-based cell viability experiment using HeLa cells and tools that are readily available in an undergraduate biology laboratory. Students will become familiar with common techniques such as trypsinizing cells, cell counting with a hemocytometer, performing serial dilutions, and determining cell viability using trypan blue dye. Additionally, students will work with graphing software to analyze their data and think critically about the mechanism of death on a cellular level. Two different adaptations of this inquiry-based lab are presented-one for non-biology majors and one for biology majors. Overall, these laboratories aim to expose students to mammalian cell culture and basic techniques and help them to conceptualize their application in scientific research.
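The counting arithmetic the module teaches follows the standard hemocytometer convention (each 1 mm × 1 mm square holds 0.1 µL); a minimal sketch in which the dilution factor and the counts are illustrative:

```python
import statistics

def cells_per_ml(live_counts, dead_counts, dilution_factor=2):
    """Standard hemocytometer arithmetic (generic, not specific to this
    lab module): concentration = mean count per large square x dilution
    x 1e4. Trypan-blue-stained (dead) cells are excluded from the
    viable count; viability = live / (live + dead)."""
    live = statistics.mean(live_counts)
    dead = statistics.mean(dead_counts)
    viable_per_ml = live * dilution_factor * 1e4
    viability = live / (live + dead)
    return viable_per_ml, viability

# Four corner squares counted after a 1:2 dilution in trypan blue
conc, viab = cells_per_ml([48, 52, 50, 47], [5, 3, 4, 6])
print(f"{conc:.2e} viable cells/mL, {viab:.0%} viable")
```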
NASA Astrophysics Data System (ADS)
Schooneveld, E. M.; Pietropaolo, A.; Andreani, C.; Perelli Cippo, E.; Rhodes, N. J.; Senesi, R.; Tardocchi, M.; Gorini, G.
2016-09-01
Neutron scattering techniques are attracting increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher performance instrumentation. The development of new techniques and concepts, including radiative capture based neutron detection, is therefore a key issue to be addressed. Radiative capture based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selectors and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, has been operating in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize the radiative capture counting techniques will be presented together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative capture based neutron detectors in neutron scattering applications at pulsed neutron sources.
Christie, J; Schwan, E V; Bodenstein, L L; Sommerville, J E M; van der Merwe, L L
2011-06-01
Several faecal examination techniques have shown variable sensitivity in demonstrating Spirocerca lupi (S. lupi) eggs. The objective of this study was to determine which faecal examination technique, including a novel modified centrifugal flotation technique, was most sensitive for diagnosing spirocercosis. Ten coproscopic examinations were performed on faeces collected from 33 dogs confirmed endoscopically to have spirocercosis. The tests included a direct faecal examination, a faecal sedimentation/flotation test, 4 direct faecal flotations and 4 modified faecal centrifugal flotations. The latter two types of flotation test utilised 4 different faecal flotation solutions: NaNO₃ (SG 1.22), MgSO₄ (SG 1.29), ZnSO₄ (SG 1.30) and sugar (SG 1.27). The sensitivity of the tests ranged between 42% and 67%, with the NaNO₃ solution showing the highest sensitivity in both the direct and modified centrifugal flotations. The modified NaNO₃ centrifugal method ranked first with the highest mean egg count (45.24 ± 83), and was superior (i.e. higher egg count) and significantly different (P < 0.05) compared with the routine saturated sugar, ZnSO₄ and MgSO₄ flotation methods. The routine NaNO₃ flotation method was also superior and significantly different (P < 0.05) compared with the routine ZnSO₄ and MgSO₄ flotation methods. Fifteen per cent (n = 5) of dogs had neoplastic oesophageal nodules and a further 18% (n = 6) had both neoplastic and non-neoplastic nodules. S. lupi eggs were demonstrated in 40% of dogs with neoplastic nodules only and 72.9% of the dogs with non-neoplastic nodules. The mean egg count in the non-neoplastic group (61) was statistically greater (P = 0.02) than that of the neoplastic group (1). The results show that faecal examination using a NaNO₃ solution is the most sensitive in the diagnosis of spirocercosis. The modified centrifugal flotation method using this solution has the highest egg count. The study also found that dogs with neoplastic nodules shed significantly fewer eggs than dogs with non-neoplastic nodules.
Prior image constrained image reconstruction in emerging computed tomography applications
NASA Astrophysics Data System (ADS)
Brunner, Stephen T.
Advances have been made in computed tomography (CT), especially in the past five years, by incorporating prior images into the image reconstruction process. In this dissertation, we investigate prior image constrained image reconstruction in three emerging CT applications: dual-energy CT, multi-energy photon-counting CT, and cone-beam CT in image-guided radiation therapy. First, we investigate the application of Prior Image Constrained Compressed Sensing (PICCS) in dual-energy CT, which has been called "one of the hottest research areas in CT." Phantom and animal studies are conducted using a state-of-the-art 64-slice GE Discovery 750 HD CT scanner to investigate the extent to which PICCS can enable radiation dose reduction in material density and virtual monochromatic imaging. Second, we extend the application of PICCS from dual-energy CT to multi-energy photon-counting CT, which has been called "one of the 12 topics in CT to be critical in the next decade." Numerical simulations are conducted to generate multiple energy bin images for a photon-counting CT acquisition and to investigate the extent to which PICCS can enable radiation dose efficiency improvement. Third, we investigate the performance of a newly proposed prior image constrained scatter correction technique to correct scatter-induced shading artifacts in cone-beam CT, which, when used in image-guided radiation therapy procedures, can assist in patient localization, and potentially, dose verification and adaptive radiation therapy. Phantom studies are conducted using a Varian 2100 EX system with an on-board imager to investigate the extent to which the prior image constrained scatter correction technique can mitigate scatter-induced shading artifacts in cone-beam CT. Results show that these prior image constrained image reconstruction techniques can reduce radiation dose in dual-energy CT by 50% in phantom and animal studies in material density and virtual monochromatic imaging, can lead to radiation dose efficiency improvement in multi-energy photon-counting CT, and can mitigate scatter-induced shading artifacts in cone-beam CT in full-fan and half-fan modes.
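PICCS is not spelled out in the abstract above; as commonly written in the literature, its objective mixes sparsity of the image with sparsity of its difference from the prior image. A sketch of such a cost function, with a finite-difference (total-variation-like) operator standing in for the sparsifying transform Ψ and a quadratic data-fidelity penalty (both assumptions, not the dissertation's exact formulation):

```python
import numpy as np

def piccs_objective(x, x_prior, A, y, alpha=0.5, lam=1.0):
    """PICCS-style cost (a sketch):
        alpha * ||Psi(x - x_prior)||_1 + (1 - alpha) * ||Psi x||_1
    with data fidelity Ax ~= y folded in as a quadratic penalty.
    x, x_prior: 2D images; A: system matrix; y: measured projections."""
    def psi(img):
        # Finite differences in both directions as the sparsifier
        return np.concatenate([np.diff(img, axis=0).ravel(),
                               np.diff(img, axis=1).ravel()])
    sparsity = (alpha * np.abs(psi(x - x_prior)).sum()
                + (1 - alpha) * np.abs(psi(x)).sum())
    fidelity = 0.5 * np.sum((A @ x.ravel() - y) ** 2)
    return sparsity + lam * fidelity
```

Minimizing this cost (e.g. by a proximal or gradient-based solver) pulls the reconstruction toward images that are sparse under Ψ and close to the prior image, which is why fewer or noisier projections can suffice.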
The AOLI low-order non-linear curvature wavefront sensor: laboratory and on-sky results
NASA Astrophysics Data System (ADS)
Crass, Jonathan; King, David; MacKay, Craig
2014-08-01
Many adaptive optics (AO) systems in use today require the use of bright reference objects to determine the effects of atmospheric distortions. Typically these systems use Shack-Hartmann Wavefront sensors (SHWFS) to distribute incoming light from a reference object between a large number of sub-apertures. Guyon et al. evaluated the sensitivity of several different wavefront sensing techniques and proposed the non-linear Curvature Wavefront Sensor (nlCWFS) offering improved sensitivity across a range of orders of distortion. On large ground-based telescopes this can provide nearly 100% sky coverage using natural guide stars. We present work being undertaken on the nlCWFS development for the Adaptive Optics Lucky Imager (AOLI) project. The wavefront sensor is being developed as part of a low-order adaptive optics system for use in a dedicated instrument providing an AO corrected beam to a Lucky Imaging based science detector. The nlCWFS provides a total of four reference images on two photon-counting EMCCDs for use in the wavefront reconstruction process. We present results from both laboratory work using a calibration system and the first on-sky data obtained with the nlCWFS at the 4.2 metre William Herschel Telescope, La Palma. In addition, we describe the updated optical design of the wavefront sensor, strategies for minimising intrinsic effects and methods to maximise sensitivity using photon-counting detectors. We discuss on-going work to develop the high speed reconstruction algorithm required for the nlCWFS technique. This includes strategies to implement the technique on graphics processing units (GPUs) and to minimise computing overheads to obtain a prior for a rapid convergence of the wavefront reconstruction. Finally we evaluate the sensitivity of the wavefront sensor based upon both data and low-photon count strategies.
Anti-aliasing techniques in photon-counting depth imaging using GHz clock rates
NASA Astrophysics Data System (ADS)
Krichel, Nils J.; McCarthy, Aongus; Collins, Robert J.; Buller, Gerald S.
2010-04-01
Single-photon detection technologies in conjunction with low laser illumination powers allow for the eye-safe acquisition of time-of-flight range information on non-cooperative target surfaces. We previously presented a photon-counting depth imaging system designed for the rapid acquisition of three-dimensional target models by steering a single scanning pixel across the field angle of interest. To minimise the per-pixel dwelling times required to obtain sufficient photon statistics for accurate distance resolution, periodic illumination at multi-MHz repetition rates was applied. Modern time-correlated single-photon counting (TCSPC) hardware allowed for depth measurements with sub-mm precision. Resolving the absolute target range with a fast periodic signal is only possible at sufficiently short distances: if the round-trip time towards an object is extended beyond the timespan between two trigger pulses, the return signal cannot be assigned to an unambiguous range value. Whereas constructing a precise depth image based on relative results may still be possible, problems emerge for large or unknown pixel-by-pixel separations or in applications with a wide range of possible scene distances. We introduce a technique to avoid range ambiguity effects in time-of-flight depth imaging systems at high average pulse rates. A long pseudo-random bitstream is used to trigger the illuminating laser. A cyclic, fast-Fourier supported analysis algorithm is used to search for the pattern within return photon events. We demonstrate this approach at base clock rates of up to 2 GHz with varying pattern lengths, allowing for unambiguous distances of several kilometers. Scans at long stand-off distances and of scenes with large pixel-to-pixel range differences are presented. Numerical simulations are performed to investigate the relative merits of the technique.
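The core of the anti-aliasing scheme, recovering an absolute delay by cyclically correlating detected photon events against the pseudo-random trigger pattern, can be illustrated compactly. Below is a minimal sketch under idealized, noise-free assumptions; the pattern length, clock rate and all variable names are illustrative rather than taken from the instrument described above.

```python
import numpy as np

rng = np.random.default_rng(0)
clock_hz = 2e9                          # GHz-class base clock, as discussed above
n_bits = 4096                           # illustrative pattern length
pattern = rng.integers(0, 2, n_bits)    # pseudo-random laser trigger bitstream

true_delay = 1234                       # round-trip delay in clock bins (unknown in practice)
returns = np.roll(pattern, true_delay)  # idealized histogram of return photon events

# Cyclic cross-correlation via FFT: the peak locates the absolute delay,
# unambiguous over the full pattern duration (n_bits / clock_hz seconds).
corr = np.fft.ifft(np.fft.fft(returns) * np.conj(np.fft.fft(pattern))).real
delay_bins = int(np.argmax(corr))
range_m = 0.5 * delay_bins * 3.0e8 / clock_hz
print(delay_bins, range_m)              # 1234 bins, about 92.6 m
```

Longer patterns extend the unambiguous range proportionally, which is how kilometre-scale distances become accessible even at a 2 GHz base clock.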
Gadaria-Rathod, Neha; Dentone, Peter G; Peskin, Ellen; Maguire, Maureen G; Moser, Ann; Asbell, Penny A
2013-11-01
To evaluate pill counts and red blood cell (RBC) membrane fatty acid profiles as measures of compliance with oral omega-3 polyunsaturated fatty acids (ω3 PUFAs) and to compare the two techniques. Sixteen dry eye disease subjects were given oral ω3 PUFA or placebo for 3 months. Compliance was measured by pill counts and blood tests at baseline and 3 months. Wilcoxon signed-rank tests and rank-sum tests were used to compare changes from baseline and the difference between the two groups; Spearman correlation coefficients were used to assess the relationship of pill counts to changes in blood FAs. Pill counts for the ω3 (n=7) and placebo (n=9) groups showed a mean consumption of 4.39 and 4.76 pills per day, respectively. In the ω3 group, the median change from baseline was +1.46% for eicosapentaenoic acid (EPA) (P=0.03), +1.49% for docosahexaenoic acid (DHA) (P=0.08), and -1.91% for arachidonic acid (AA) (P=0.02). In the placebo group, median changes in all measured FAs were small and not statistically significant. The difference in change in FA levels between the two groups was significantly greater for EPA (P=0.01) and AA (P=0.04). The correlations between pill counts and changes in EPA (r=0.36, P=0.43) and DHA (r=0.17, P=0.70) were not strong. RBC FA analysis can be used to measure compliance in the active group and also to monitor the placebo group for nonstudy ω3 intake. The low correlation of pill counts with blood levels suggests that pill counts alone may be inaccurate and should be replaced or supplemented with objective measures.
Half-Lives of 101Rh and 108mAg
NASA Astrophysics Data System (ADS)
Norman, Eric; Browne, Edgardo; Shugart, Howard
2014-09-01
Half-lives of short-lived nuclei can easily be measured by direct counting techniques, whereas those of long-lived naturally-occurring nuclei are usually determined by specific activity measurements. However, half-lives in the range of 1 - 1,000,000 years are notoriously difficult to determine. For example, published values for the half-life of 101Rh range from 3.0 +/- 0.4 years to 10 +/- 1 years, and for 108mAg published values range from 127 +/- 21 years to 438 +/- 9 years. In order to resolve the issues of what the half-lives of these isotopes actually are, we set up two separate long-term gamma-ray counting experiments. Gamma-ray data were collected in time bins using high-purity Ge detectors and ORTEC PC-based data acquisition systems. We counted in this manner for a period of approximately 5 years for 101Rh and 3 years for 108mAg. In this talk we will describe the details of these experiments and will present the final results for the half-lives of 101Rh and 108mAg determined from these measurements. This work was supported in part by the U. S. Dept. of Energy under Contract Numbers DE-AC02-05CH11231 and DE-NA0000979.
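The decay-curve analysis underlying such measurements reduces to fitting an exponential to time-binned counts. Below is a minimal sketch with synthetic data and an invented half-life standing in for the multi-year data sets described above.

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0.0, 5.0, 60)              # bin times in years
lam_true = np.log(2) / 3.3                 # hypothetical half-life of 3.3 y
counts = rng.poisson(1e5 * np.exp(-lam_true * t))

# ln(N) = ln(N0) - lambda * t; weight points by sqrt(N) since var(ln N) ~ 1/N.
slope, intercept = np.polyfit(t, np.log(counts), 1, w=np.sqrt(counts))
print(np.log(2) / -slope)                  # recovered half-life in years
```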
Lone, Ayesha; Anany, Hany; Hakeem, Mohammed; Aguis, Louise; Avdjian, Anne-Claire; Bouget, Marina; Atashi, Arash; Brovko, Luba; Rochefort, Dominic; Griffiths, Mansel W
2016-01-18
Due to the lack of adequate control methods to prevent contamination in fresh produce and growing consumer demand for natural products, the use of bacteriophages has emerged as a promising approach to enhance the safety of these foods. This study sought to control Listeria monocytogenes in cantaloupes and RTE meat and Escherichia coli O104:H4 in alfalfa seeds and sprouts under different storage conditions by using specific lytic bacteriophage cocktails applied either free or immobilized. Bacteriophage cocktails were introduced into prototypes of packaging materials using different techniques: i) immobilizing on positively charged modified cellulose membranes, ii) impregnating paper with bacteriophage suspension, and iii) encapsulating in alginate beads followed by application of the beads onto the paper. Phage-treated and non-treated samples were stored for various times at temperatures of 4°C, 12°C or 25°C. In cantaloupe, when the free phage cocktail was added, L. monocytogenes counts dropped below the detection limit of the plating technique (<1 log CFU/g) after 5 days of storage at both 4°C and 12°C. However, at 25°C, counts below the detection limit were observed after 3 and 6 h, and a 2-log CFU/g reduction in cell numbers was seen after 24 h. For the immobilized Listeria phage cocktail, around a 1-log CFU/g reduction in the Listeria count was observed by the end of the storage period for all tested storage temperatures. For the alfalfa seeds and sprouts, regardless of the type of phage application technique (spraying of free phage suspension, or contact with bacteriophage-based materials (paper coated with encapsulated bacteriophage or impregnated with bacteriophage suspension)), the count of E. coli O104:H4 was below the detection limit (<1 log CFU/g) after 1 h in seeds, and about a 1-log cycle reduction in the E. coli count was observed on the germinated sprouts by day 5. In ready-to-eat (RTE) meat, LISTEX™ P100, a commercial phage product, significantly reduced the growth of L. monocytogenes at both storage temperatures, 4°C and 10°C, for 25 days, regardless of the bacteriophage application format (immobilized or free). In conclusion, the developed phage-based materials demonstrated a significant antimicrobial effect when applied to artificially contaminated foods and can be used as prototypes for developing bioactive antimicrobial packaging materials capable of enhancing the safety of fresh produce and RTE meat. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpson, A.; Pitts, M.; Ludowise, J.D.
The Hanford burial grounds contain a broad spectrum of low activity radioactive wastes, transuranic (TRU) wastes, and hazardous wastes including fission products, byproduct material (thorium and uranium), plutonium and laboratory chemicals. A passive neutron non-destructive assay technique has been developed for characterization of shielded concreted drums exhumed from the burial grounds. This method facilitates the separation of low activity radiological waste containers from TRU waste containers exhumed from the burial grounds. Two identical total neutron counting systems have been deployed, each consisting of He-3 detectors surrounded by a polyethylene moderator. The counts are processed through a statistical filter that removes outliers in order to suppress cosmic spallation events and electronic noise. Upon completion of processing, a 'GO / NO GO' signal is provided to the operator based on a threshold level equivalent to 0.5 grams of weapons grade plutonium in the container being evaluated. This approach allows instantaneous decisions to be made on how to proceed with the waste. The counting systems have been set up using initial on-site measurements (neutron emitting standards loaded into surrogate waste containers) combined with Monte Carlo modeling techniques. The benefit of this approach is to allow the systems to extend their measurement ranges, in terms of applicable matrix types and container sizes, with minimal interruption to the operations at the burial grounds. (authors)
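The statistical filtering step can be illustrated with a generic sigma-clipping filter: intervals with counts far from the mean are discarded as bursts, and the surviving mean rate is compared against a calibrated alarm level. The clip level, rates and threshold below are illustrative assumptions, not the deployed system's calibration.

```python
import numpy as np

def filtered_rate(interval_counts, clip_sigma=3.0, passes=3):
    """Iteratively discard intervals far from the mean (burst suppression)."""
    c = np.asarray(interval_counts, dtype=float)
    for _ in range(passes):
        mu, sd = c.mean(), c.std()
        c = c[np.abs(c - mu) <= clip_sigma * sd]
    return c.mean()

rng = np.random.default_rng(2)
counts = np.r_[rng.poisson(40, 300), [400, 380]]  # quiet data plus two bursts
rate = filtered_rate(counts)
THRESHOLD = 55.0  # hypothetical rate equivalent to 0.5 g weapons-grade Pu
print("NO GO" if rate > THRESHOLD else "GO", round(rate, 1))
```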
Mendez, Javier; Monleon-Getino, Antonio; Jofre, Juan; Lucena, Francisco
2017-10-01
The present study aimed to establish the kinetics of the appearance of coliphage plaques using the double agar layer titration technique, to evaluate the feasibility of using traditional coliphage plaque forming unit (PFU) enumeration as a rapid quantification method. Repeated measurements of the appearance of plaques of coliphages titrated according to ISO 10705-2 at different times were analysed using non-linear mixed-effects regression to determine the most suitable model of their appearance kinetics. Although this model is adequate, to simplify its applicability two linear models were developed to predict the numbers of coliphages reliably, using the PFU counts as determined by the ISO after only 3 hours of incubation. One linear model, for cases where the number of plaques detected after 3 hours was between 4 and 26 PFU, had a linear fit of (1.48 × Counts_3h + 1.97); the other, for values >26 PFU, had a fit of (1.18 × Counts_3h + 2.95). If the number of plaques detected after 3 hours was <4 PFU, we recommend incubation for (18 ± 3) hours. The study indicates that the traditional coliphage plating technique has reasonable potential to provide results in a single working day without the need to invest in additional laboratory equipment.
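The two fitted models quoted above amount to a simple piecewise rule for projecting the 3-hour plate count to the standard overnight result; a direct transcription follows (the function and variable names are ours):

```python
def predict_final_pfu(counts_3h: float):
    """Project a 3 h coliphage plaque count to the standard ISO 10705-2 result."""
    if counts_3h < 4:
        return None                        # too few plaques: incubate (18 +/- 3) h
    if counts_3h <= 26:
        return 1.48 * counts_3h + 1.97     # fitted model for 4-26 PFU at 3 h
    return 1.18 * counts_3h + 2.95         # fitted model for >26 PFU at 3 h

print(predict_final_pfu(10.0))             # about 16.8 PFU expected
```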
Development of a high-performance multichannel system for time-correlated single photon counting
NASA Astrophysics Data System (ADS)
Peronio, P.; Cominelli, A.; Acconcia, G.; Rech, I.; Ghioni, M.
2017-05-01
Time-Correlated Single Photon Counting (TCSPC) is one of the most effective techniques for measuring weak and fast optical signals. It outperforms traditional "analog" techniques due to its high sensitivity along with high temporal resolution. Despite those significant advantages, a main drawback still exists: the long acquisition time needed to perform a measurement. In past years many TCSPC systems have been developed with ever higher numbers of channels, aimed at dealing with that limitation. Nevertheless, modern systems suffer from a strong trade-off between parallelism level and performance: the higher the number of channels, the poorer the performance. In this work we present the design of a 32x32 TCSPC system intended to overcome the existing trade-off. To this aim, different technologies have been employed to get the best performance from both the detectors and the sensing circuits. The exploitation of different technologies will be enabled by Through Silicon Vias (TSVs), which will be investigated as a possible solution for connecting the detectors to the sensing circuits. When dealing with a high number of channels, the count rate is inevitably set by the affordable throughput to the external PC. We targeted a throughput of 10 Gb/s, which is beyond the state of the art, and designed the number of TCSPC channels accordingly. A dynamic-routing logic will connect the detectors to the lower number of acquisition chains.
Whales from space: counting southern right whales by satellite.
Fretwell, Peter T; Staniland, Iain J; Forcada, Jaume
2014-01-01
We describe a method of identifying and counting whales using very high resolution satellite imagery, through the example of southern right whales breeding in part of the Golfo Nuevo, Península Valdés in Argentina. Southern right whales have been extensively hunted over the last 300 years and although numbers have recovered from near extinction in the early 20th century, current populations are fragmented and are estimated at only a small fraction of the pre-hunting total. Recent extreme right whale calf mortality events at Península Valdés, which constitutes the largest single population, have raised fresh concern for the future of the species. The WorldView2 satellite has a maximum 50 cm resolution and a water penetrating coastal band in the far-blue part of the spectrum that allows it to see deeper into the water column. Using an image covering 113 km², we identified 55 probable whales and 23 other features that are possibly whales, with a further 13 objects that were only detected by the coastal band. Comparison of a number of classification techniques for automatically detecting whale-like objects showed that a simple thresholding technique on the panchromatic and coastal bands delivered the best results. This is the first successful study using satellite imagery to count whales; a pragmatic, transferable method using this rapidly advancing technology that has major implications for future surveys of cetacean populations.
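The winning approach, thresholding the panchromatic and coastal bands and counting the resulting whale-like objects, could be sketched as follows; the band arrays, thresholds and minimum object size are placeholders rather than values from the study.

```python
import numpy as np
from scipy import ndimage

def count_whale_candidates(pan, coastal, t_pan, t_coastal, min_pixels=4):
    """pan, coastal: 2-D arrays of the two bands; returns the candidate count."""
    mask = (pan > t_pan) | (coastal > t_coastal)     # whale-like bright pixels
    labels, n = ndimage.label(mask)                  # group pixels into objects
    sizes = ndimage.sum(mask, labels, range(1, n + 1))
    return int(np.sum(sizes >= min_pixels))          # reject single-pixel noise
```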
Kwon, Seung Yong; Pham, Tuyen Danh; Park, Kang Ryoung; Jeong, Dae Sik; Yoon, Sungsoo
2016-06-11
Fitness classification is a technique to assess the quality of banknotes in order to determine whether they are usable. Banknote fitness classification techniques are useful in preventing problems that arise from the circulation of substandard banknotes (such as recognition failures, or bill jams in automated teller machines (ATMs) or bank counting machines). By and large, fitness classification continues to be carried out by humans, which can result in different fitness classifications for the same bill by different evaluators and requires a lot of time. To address these problems, this study proposes a fuzzy system-based method that can reduce the processing time needed for fitness classification and can determine the fitness of banknotes through an objective, systematic method rather than subjective judgment. Our algorithm was implemented on an actual banknote counting machine. Based on the results of tests on 3856 banknotes in United States currency (USD), 3956 in Korean currency (KRW), and 2300 banknotes in Indian currency (INR) using visible light reflection (VR) and near-infrared light transmission (NIRT) imaging, the proposed method was found to yield higher accuracy than prevalent banknote fitness classification methods. Moreover, it was confirmed that the proposed algorithm can operate in real time, not only in a normal PC environment but also in the embedded system environment of a banknote counting machine.
Detection and Monitoring of Airborne Nuclear Waste Materials. Annual Report to Department of Energy.
1979-12-04
an active core, its detection by counting techniques is often slow and impractical. For these reasons NRL, under contract with DoE, undertook development ... Protection and Measurements, Tritium Measurement Techniques, NCRP Report No. 47 (1976). 2. "Development of a Continuous Tritium Monitor for Fuel Reprocessing," ... Trans. Am. Nucl. Soc. 21, 91 (1975). 146. "Process Behavior of and Environmental Assessments of C Releases from an HTGR Fuel Reprocessing Facility," J. W
1982-09-01
techniques developed in earlier SYSTAN work (Food System Support of the Relocation Strategy, September, 1975). Using this technique, the total...
Molecular techniques are an alternative to culturing and counting methods in quantifying indoor fungal contamination. Pyrosequencing offers the possibility of identifying unexpected indoor fungi. In this study, 50 house dust samples were collected from homes in the Yakima Valley,...
Improved format for radiocardiographic data
NASA Technical Reports Server (NTRS)
Dimeff, J.; Sevelius, G.
1973-01-01
The technique involves introduction of a radioactive sample into the antecubital vein. A scintillation crystal mounted in a collimating housing views portions of the right and left heart. As the radioactive sample passes through the heart, the counting rate is measured by the crystal and recorded on a strip chart. The data are insensitive to geometric effects and other parameters.
Techniques Suitable for a Portable Wear Metal Analyzer.
1981-09-01
measured by a detector. Commonly used detectors are semiconductor detectors or proportional counters. b. Energy-Dispersive XRPS. In the energy-dispersive ... because the sample must be charred before the analysis. c. X-Ray Fluorescence Spectroscopy. Normally the counting time for XRPS is 100 seconds
Electromigration of Contaminated Soil by Electro-Bioremediation Technique
NASA Astrophysics Data System (ADS)
Azhar, A. T. S.; Nabila, A. T. A.; Nurshuhaila, M. S.; Shaylinda, M. Z. N.; Azim, M. A. M.
2016-07-01
Soil contamination with heavy metals poses major environmental and human health problems. This problem needs an efficient and affordable technological solution such as the electro-bioremediation technique. The electro-bioremediation technique used in this study is a combination of bacterial treatment and the electrokinetic process. The aim of this study is to investigate the effectiveness of Pseudomonas putida bacteria as a biodegradation agent to remediate contaminated soil. 5 kg of kaolin soil was spiked with 5 g of zinc oxide. During this process, the anode reservoir was filled with Pseudomonas putida while the cathode was filled with distilled water, and a 50 V electrical gradient was applied for 5 days. The X-Ray Fluorescence (XRF) test indicated a significant reduction of zinc concentration in the soil near the anode, with 89% removal. The bacteria count was highest near the anode, at 1.3 × 10^7 cfu/gww, whereas the counts at the middle and near the cathode were 5.0 × 10^6 cfu/gww and 8.0 × 10^6 cfu/gww, respectively. The reduction of zinc resulted from the migration of ions toward the oppositely charged electrodes during the electrokinetic process. The results obtained proved that electro-bioremediation reduced the level of contaminants in the soil sample. Thus, the electro-bioremediation technique has the potential to be used in the treatment of contaminated soil.
The Search for AGN in Dusty Star Forming Hosts with JWST
NASA Astrophysics Data System (ADS)
Kirkpatrick, Allison; Alberts, Stacey; Pope, Alexandra; Rieke, George; Sajina, Anna
2018-01-01
The bulk of the stellar growth over cosmic time is dominated by IR luminous galaxies at cosmic noon (z=1-2), many of which harbor a hidden active galactic nucleus (AGN). I use state-of-the-art infrared color diagnostics, combining Spitzer and Herschel observations, to separate dust-obscured AGN from dusty star forming galaxies (SFGs) in the CANDELS and COSMOS surveys. I calculate 24 micron counts of SFGs, AGN/star forming "Composites", and AGN. AGN and Composites dominate the counts above 0.8 mJy at 24 micron, and Composites form at least 25% of an IR sample even at faint detection limits. I develop methods to use the Mid-Infrared Instrument (MIRI) on JWST to identify dust-obscured AGN and Composite galaxies at z~1-2. I demonstrate that MIRI color techniques can select AGN with lower Eddington ratios and higher specific SFRs than X-ray techniques alone. JWST/MIRI will enable critical steps forward in identifying and understanding dust-obscured AGN and the link to their host galaxies.
Air-guided manual deep lamellar keratoplasty.
Caporossi, A; Simi, C; Licignano, R; Traversi, C; Balestrazzi, A
2004-01-01
To evaluate the efficacy of a new modified technique of deep lamellar keratoplasty (DLK). Nine eyes of eight patients with keratoconus of moderate degree were included. All patients underwent DLK with manual dissection from a limbal side port after an air bubble injection in the anterior chamber. The patients underwent a complete ophthalmologic examination 6 months after suture removal, evaluating best-corrected visual acuity, corneal thickness, endothelial cell count, and topographic astigmatism. One case (11.1%) was converted to penetrating keratoplasty because of microperforation. In the eight successful cases, 7 eyes (77.8%) achieved 20/30 or better visual acuity 6 months after suture removal. Mean postoperative pachymetry was 604.76 μm (SD 46.76). Specular microscopy 6 months after suture removal revealed an average endothelial cell count of 2273/mm² (SD 229). This modified DLK technique is a safe and effective procedure and could facilitate, after a short learning curve, this kind of surgery with a low risk of conversion to penetrating keratoplasty.
Use of Aerial Photography to Monitor Fall Chinook Salmon Spawning in the Columbia River
DOE Office of Scientific and Technical Information (OSTI.GOV)
Visser, Richard H.; Dauble, Dennis D.; Geist, David R.
2002-11-01
This paper compares two methods for enumerating salmon redds and their application to monitoring spawning activity. Aerial photographs of fall chinook salmon spawning areas in the Hanford Reach of the Columbia River were digitized and mapped using Geographic Information Systems (GIS) techniques in 1994 and 1995 as part of an annual assessment of the population. The number of visible redds in these photographs was compared to counts obtained from visual surveys with fixed-wing aircraft. The proportion of the total redds within each of five general survey areas was similar for the two monitoring techniques. However, the total number of redds based on aerial photographs was 2.2 and 3.0 times higher than those observed during visual surveys for 1994 and 1995, respectively. The divergence in redd counts was most evident near peak spawning activity, when the number of redds within individual spawning clusters exceeded 500. Aerial photography improved our ability to monitor numbers of visible salmon redds and to quantify habitat use.
A non-contact technique for measuring eccrine sweat gland activity using passive thermal imaging.
Krzywicki, Alan T; Berntson, Gary G; O'Kane, Barbara L
2014-10-01
An approach for monitoring eccrine sweat gland activity using high resolution Mid-Wave Infrared (MWIR) imaging (3-5 μm wave band) is described. This technique is non-contact, passive, and provides high temporal and spatial resolution. Pore activity was monitored on the face and on the volar surfaces of the distal and medial phalanges of the index and middle fingers while participants performed a series of six deep inhalation and exhalation exercises. Two metrics, called the Pore Activation Index (PAI) and Pore Count (PC), were defined as size-weighted and unweighted measures of active sweat gland counts, respectively. PAI transient responses on the finger tips were found to be positively correlated with Skin Conductance Responses (SCRs). PAI responses were also observed on the face, although the finger sites appeared to be more responsive. Results indicate that thermal imaging of the pore response may provide a useful, non-contact correlate measure for electrodermal responses recorded from related sites. Published by Elsevier B.V.
Lahmann, B; Milanese, L M; Han, W; Gatu Johnson, M; Séguin, F H; Frenje, J A; Petrasso, R D; Hahn, K D; Jones, B
2016-11-01
A compact neutron spectrometer, based on a CH foil for the production of recoil protons and CR-39 detection, is being developed for the measurements of the DD-neutron spectrum at the NIF, OMEGA, and Z facilities. As a CR-39 detector will be used in the spectrometer, the principal sources of background are neutron-induced tracks and intrinsic tracks (defects in the CR-39). To reject the background to the required level for measurements of the down-scattered and primary DD-neutron components in the spectrum, the Coincidence Counting Technique (CCT) must be applied to the data. Using a piece of CR-39 exposed to 2.5-MeV protons at the MIT HEDP accelerator facility and DD-neutrons at Z, a significant improvement of a DD-neutron signal-to-background level has been demonstrated for the first time using the CCT. These results are in excellent agreement with previous work applied to DT neutrons.
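At its core the CCT is a spatial coincidence test: a genuine recoil-proton track should appear at nearly the same position in two coincident scans of the CR-39, whereas intrinsic defects and neutron-induced background rarely do. Below is a minimal sketch of such a matcher; the tolerance and data layout are illustrative assumptions, not the published analysis code.

```python
import numpy as np
from scipy.spatial import cKDTree

def coincident_tracks(tracks_a, tracks_b, tol_um=10.0):
    """tracks_a, tracks_b: (N, 2) arrays of track centroids in microns."""
    tree = cKDTree(tracks_b)
    dist, _ = tree.query(tracks_a, k=1)   # nearest partner in the other scan
    return tracks_a[dist <= tol_um]       # background tracks rarely coincide
```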
NASA Astrophysics Data System (ADS)
Medjoubi, K.; Dawiec, A.
2017-12-01
A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting (HPC) pixel detectors. The approach is based on the Photon Transfer Curve (PTC), i.e. the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity and the remnant errors of flat-fielding techniques. The analytical expression of the signal-to-noise-ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The quantitative evaluation of the FPN, described by the photon response non-uniformity (PRNU), is measured for different configurations (threshold adjustment method and flat-fielding technique) and is shown to identify the settings that yield the best image quality from a commercial or R&D detector.
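In outline, the PTC analysis separates the shot-noise part of the flat-field variance, which grows linearly with the mean, from the FPN part, which grows quadratically. Below is a minimal sketch of a PRNU estimate from a stack of flat-field frames, assuming an idealized counting detector with unity gain; the function and variable names are ours.

```python
import numpy as np

def ptc_prnu(frames):
    """frames: (n_frames, ny, nx) stack of flat fields at one exposure level."""
    n = frames.shape[0]
    mean = frames.mean()
    var_total = frames.mean(axis=0).var()  # spatial variance of the mean frame
    var_shot = mean / n                    # Poisson part surviving the average
    var_fpn = max(var_total - var_shot, 0.0)
    return np.sqrt(var_fpn) / mean         # PRNU as a fraction of the signal
```

Repeating the estimate for each threshold-adjustment and flat-fielding configuration, as done above, then ranks the settings by residual fixed pattern noise.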
NASA Astrophysics Data System (ADS)
Palma, K. D.; Pichotka, M.; Hasn, S.; Granja, C.
2017-02-01
In mammography, the difficult task of detecting microcalcifications (≈ 100 μm) and low contrast structures in the breast has been a topic of interest from its beginnings. The possibility of improving image quality motivates the effort to employ novel X-ray imaging techniques, such as phase-contrast imaging, and high resolution detectors. Phase-contrast techniques are promising tools for medical diagnosis because they provide additional and complementary information to traditional absorption-based X-ray imaging methods. In this work a Hamamatsu microfocus X-ray source with tungsten anode and a photon counting detector (Timepix operated in Medipix mode) were used. A significant improvement in the detection of phase effects using the Medipix detector was observed in comparison to a standard flat-panel detector. An optimization of geometrical parameters reveals the dependency on the X-ray propagation path and the small angle deviation. The quantification of these effects was achieved by taking into account the image noise, contrast, spatial resolution of the phase enhancement, absorbed dose, and energy dependence.
The role of vocal individuality in conservation
Terry, Andrew MR; Peake, Tom M; McGregor, Peter K
2005-01-01
Identifying the individuals within a population can generate information on life history parameters, generate input data for conservation models, and highlight behavioural traits that may affect management decisions and error or bias within census methods. Individual animals can be discriminated by features of their vocalisations. This vocal individuality can be utilised as an alternative marking technique in situations where marks are difficult to detect or animals are sensitive to disturbance. Vocal individuality can also be used in cases where the capture and handling of an animal is either logistically or ethically problematic. Many studies have suggested that vocal individuality can be used to count and monitor populations over time; however, few have explicitly tested the method in this role. In this review we discuss methods for extracting individuality information from vocalisations and techniques for using this to count and monitor populations over time. We present case studies in birds where vocal individuality has been applied to conservation, and we discuss its role in mammals. PMID:15960848
NASA Astrophysics Data System (ADS)
Bashkov, O. V.; Bryansky, A. A.; Panin, S. V.; Zaikov, V. I.
2016-11-01
Strength properties of glass fiber reinforced polymers (GFRP) fabricated by vacuum and vacuum autoclave molding techniques were analyzed. Measurements of the porosity of the GFRP parts manufactured by the various molding techniques were conducted with the help of optical microscopy. On the basis of experimental data obtained with an acoustic emission hardware/software setup, a technique has been developed for running diagnostics and forecasting the bearing capacity of polymer composite materials from the results of three-point bending tests. The technique operates by evaluating the change in the power function index of the dependence of the total acoustic emission counts on the loading stress.
Black hole entropy in massive Type IIA
NASA Astrophysics Data System (ADS)
Benini, Francesco; Khachatryan, Hrachya; Milan, Paolo
2018-02-01
We study the entropy of static dyonic BPS black holes in AdS4 in 4d N=2 gauged supergravities with vector and hyper multiplets, and how the entropy can be reproduced with a microscopic counting of states in the AdS/CFT dual field theory. We focus on the particular example of BPS black holes in AdS4 × S6 in massive Type IIA, whose dual three-dimensional boundary description is known and simple. To count the states in field theory we employ a supersymmetric topologically twisted index, which can be computed exactly with localization techniques. We find a perfect match at leading order.
Long-range time-of-flight scanning sensor based on high-speed time-correlated single-photon counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCarthy, Aongus; Collins, Robert J.; Krichel, Nils J.
2009-11-10
We describe a scanning time-of-flight system which uses the time-correlated single-photon counting technique to produce three-dimensional depth images of distant, noncooperative surfaces when these targets are illuminated by a kHz to MHz repetition rate pulsed laser source. The data for the scene are acquired using a scanning optical system and an individual single-photon detector. Depth images have been successfully acquired with centimeter xyz resolution, in daylight conditions, for low-signature targets in field trials at distances of up to 325 m using an output illumination with an average optical power of less than 50 μW.
Standardisation and half-life of 89Zr.
García-Toraño, E; Peyrés, V; Roteta, M; Mejuto, M; Sánchez-Cabezudo, A; Romero, E
2018-04-01
The nuclide 89Zr is being tested for the labelling of compounds with long blood circulation times. It decays by beta plus emission (22.8%) and by electron capture (77.2%) to 89Y. Its half-life has been determined by following the decay rate with two measurement systems, an ionisation chamber and an HPGe detector. The combination of six results gives a value of T1/2 = 78.333 (38) h, slightly lower than the DDEP recommended value of 78.42 (13) h. This radionuclide has also been standardised by liquid scintillation counting, 4πγ counting and coincidence techniques. Copyright © 2017 Elsevier Ltd. All rights reserved.
Super resolution imaging of HER2 gene amplification
NASA Astrophysics Data System (ADS)
Okada, Masaya; Kubo, Takuya; Masumoto, Kanako; Iwanaga, Shigeki
2016-02-01
HER2 positive breast cancer is currently examined by counting HER2 genes using fluorescence in situ hybridization (FISH)-stained breast carcinoma samples. In this research, two-dimensional super resolution fluorescence microscopy based on stochastic optical reconstruction microscopy (STORM), with a spatial resolution of approximately 20 nm in the lateral direction, was used to more precisely distinguish and count HER2 genes in a FISH-stained tissue section. Furthermore, by introducing the double-helix point spread function (DH-PSF), an optical phase modulation technique, into super resolution microscopy, three-dimensional images were obtained of HER2 in a breast carcinoma sample approximately 4 μm thick.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aima, M; Viscariello, N; Patton, T
Purpose: The aim of this work is to propose a method to optimize radioactive source localization (RSL) for non-palpable breast cancer surgery. RSL is commonly used as a guiding technique during surgery for excision of non-palpable tumors. A collimated hand-held detector is used to localize radioactive sources implanted in tumors. Incisions made by the surgeon are based on maximum observed detector counts, and tumors are subsequently resected based on an arbitrary estimate of the counts expected at the surgical margin boundary. This work focuses on building a framework to predict detector counts expected throughout the procedure to improve surgical margins. Methods: A gamma detection system called the Neoprobe GDS was used for this work. The probe consists of a cesium zinc telluride crystal and a collimator. For this work, an I-125 Best Medical model 2301 source was used. The source was placed in three different phantoms: a PMMA phantom, a Breast (25% glandular tissue/75% adipose tissue) phantom, and a Breast (75-25) phantom, with a backscatter thickness of 6 cm. Counts detected by the probe were recorded with varying amounts of phantom thickness placed on top of the source. A calibration curve was generated using MATLAB based on the counts recorded for the calibration dataset acquired with the PMMA phantom. Results: The observed detector counts used as the validation set were accurately predicted to within ±3.2%, ±6.9%, and ±8.4% for the PMMA, Breast (75-25), and Breast (25-75) phantoms, respectively. The average difference between predicted and observed counts was −0.4%, 2.4%, and 1.4%, with standard deviations of 1.2%, 1.8%, and 3.4% for the PMMA, Breast (75-25), and Breast (25-75) phantoms, respectively. Conclusion: The results of this work provide a basis for characterization of a detector used for RSL. Counts were predicted to within ±9% for three different phantoms without the application of a density correction factor.
Gaburro, Julie; Duchemin, Jean-Bernard; Paradkar, Prasad N; Nahavandi, Saeid; Bhatti, Asim
2016-11-18
Widespread in the tropics, the mosquito Aedes aegypti is an important vector of many viruses, posing a significant threat to human health. Vector monitoring often requires fecundity estimation by counting eggs laid by female mosquitoes. Traditionally, manual data analyses have been used, but these require considerable effort and are prone to error. An easy tool to assess the number of eggs laid would facilitate experimentation and vector control operations. This study introduces purpose-built software called ICount that allows automatic egg counting for the mosquito vector Aedes aegypti. ICount egg estimation is statistically equivalent to manual counting, making the software effective for automatic and semi-automatic data analysis. This technique also allows rapid analysis compared to manual methods. Finally, the software has been used to assess p-cresol oviposition choices under laboratory conditions in order to test the system with different egg densities. ICount is a powerful tool for fast and precise egg count analysis, freeing experimenters from manual data processing. Software access is free and its user-friendly interface allows easy use by non-experts. Its efficiency has been tested in our laboratory with oviposition dual choices of Aedes aegypti females. The next step will be the development of a mobile application, based on the ICount platform, for vector monitoring surveys in the field.
Changes in total viable count and TVB-N content in marinated chicken breast fillets during storage
NASA Astrophysics Data System (ADS)
Baltić, T.; Ćirić, J.; Velebit, B.; Petronijević, R.; Lakićević, B.; Đorđević, V.; Janković, V.
2017-09-01
Marination is a popular technique for enhancing meat properties. Depending on the marinade type and ingredients added, marination can improve the sensory, chemical and microbiological quality of meat products. In this study, the total viable count and total volatile basic nitrogen (TVB-N) content in marinated chicken breast fillets were investigated. The possible correlation between bacterial growth and formation of TVB-N was also tested. Chicken breast fillets were immersed in a solution of table salt (as a control) or three different marinades, which consisted of table salt, sodium tripolyphosphate and/or sodium citrate, and stored in air for nine days at 4±1°C. Analyses of the total viable count and TVB-N were performed on days 0, 3, 6 and 9 of storage. The total viable count gradually increased in all examined groups, and statistically significant differences (p<0.01 or p<0.05) between treatments on days 0, 3 and 6 of storage were established. TVB-N values in marinated chicken were significantly higher (p<0.01 or p<0.05) compared to the control. Using multiple linear regression, a positive correlation between total viable count and formation of TVB-N in chicken marinated with sodium citrate was established (p<0.05), while the intensity of TVB-N formation was lowest in chicken marinated with sodium tripolyphosphate.
Sentürk, Damla; Dalrymple, Lorien S; Nguyen, Danh V
2014-11-30
We propose functional linear models for zero-inflated count data with a focus on the functional hurdle and functional zero-inflated Poisson (ZIP) models. Whereas the hurdle model assumes the counts come from a mixture of a degenerate distribution at zero and a zero-truncated Poisson distribution, the ZIP model considers a mixture of a degenerate distribution at zero and a standard Poisson distribution. We extend the generalized functional linear model framework with a functional predictor and multiple cross-sectional predictors to model counts generated by a mixture distribution. We propose an estimation procedure for functional hurdle and ZIP models, called penalized reconstruction, geared towards error-prone and sparsely observed longitudinal functional predictors. The approach relies on dimension reduction and pooling of information across subjects, involving basis expansions and penalized maximum likelihood techniques. The developed functional hurdle model is applied to modeling hospitalizations within the first 2 years from initiation of dialysis, with a high percentage of zeros, in the Comprehensive Dialysis Study participants. Hospitalization counts are modeled as a function of sparse longitudinal measurements of serum albumin concentrations, patient demographics, and comorbidities. Simulation studies are used to study finite sample properties of the proposed method and include comparisons with an adaptation of standard principal components regression. Copyright © 2014 John Wiley & Sons, Ltd.
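The distinction between the two mixtures is easiest to see in their probability mass functions; a short sketch follows (parameter names are ours):

```python
from scipy.stats import poisson

def zip_pmf(k, pi, lam):
    """Zero-inflated Poisson: point mass at zero mixed with a full Poisson."""
    p = (1.0 - pi) * poisson.pmf(k, lam)
    return p + pi if k == 0 else p

def hurdle_pmf(k, pi, lam):
    """Hurdle: point mass at zero mixed with a zero-truncated Poisson."""
    if k == 0:
        return pi
    return (1.0 - pi) * poisson.pmf(k, lam) / (1.0 - poisson.pmf(0, lam))

print(zip_pmf(0, 0.3, 2.0), hurdle_pmf(0, 0.3, 2.0))  # both inflate zeros
```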
Investigation of ultra low-dose scans in the context of quantum-counting clinical CT
NASA Astrophysics Data System (ADS)
Weidinger, T.; Buzug, T. M.; Flohr, T.; Fung, G. S. K.; Kappler, S.; Stierstorfer, K.; Tsui, B. M. W.
2012-03-01
In clinical computed tomography (CT), images from patient examinations taken with conventional scanners exhibit noise characteristics governed by electronics noise when scanning strongly attenuating obese patients or when using an ultra-low X-ray dose. Unlike CT systems based on energy integrating detectors, a system with a quantum counting detector does not suffer from this drawback. Instead, the noise from the electronics mainly affects the spectral resolution of these detectors. Therefore, it does not contribute to the image noise in spectrally non-resolved CT images. This promises improved image quality, due to image noise reduction, in scans obtained from clinical CT examinations with the lowest X-ray tube currents or with obese patients. To quantify the benefits of quantum counting detectors in clinical CT, we have carried out an extensive simulation study of the complete scanning and reconstruction process for both kinds of detectors. The simulation chain encompasses modeling of the X-ray source, beam attenuation in the patient, and calculation of the detector response. Moreover, in each case the subsequent image preprocessing and reconstruction is modeled as well. The simulation-based, theoretical evaluation is validated by experiments with a novel prototype quantum counting system and a Siemens Definition Flash scanner with a conventional energy integrating CT detector. We demonstrate and quantify the improvement from image noise reduction achievable with quantum counting techniques in CT examinations with ultra-low X-ray dose and strong attenuation.
A general study of techniques for ultraviolet astrophysical studies on space vehicles
NASA Technical Reports Server (NTRS)
Moos, H. W.; Fastie, W. G.; Davidsen, A. F.
1977-01-01
Recent accomplishments in three areas of UV instrumentation for space astronomy are discussed. These areas include reliable UV photometry, sensitive photon-detection techniques, and precise telescope pointing. Calibration facilities for spectrometers designed to operate in the spectral regions above 1200 A and down to 400 A are described which employ a series of diodes calibrated against electron synchrotron radiation as well as other radiometric standards. Improvements in photon-detection sensitivity achieved with the aid of pulse-counting electronics and multispectral detectors are reported, and the technique of precise subarcsecond telescope pointing is briefly noted. Some observational results are presented which demonstrate the advantages and precision of the instruments and techniques considered.
Predictions of CD4 lymphocytes’ count in HIV patients from complete blood count
2013-01-01
Background HIV diagnosis, prognosis and treatment require the T CD4 lymphocyte count from flow cytometry, an expensive technique often not available to people in developing countries. The aim of this work is to apply a previously developed methodology that predicts the T CD4 lymphocyte value based on total white blood cell (WBC) count and lymphocyte count applying sets theory, from information taken from the Complete Blood Count (CBC). Methods Sets theory was used to classify into groups named A, B, C and D the number of leucocytes/mm3, lymphocytes/mm3, and the CD4/μL subpopulation per flow cytometry of 800 HIV-diagnosed patients. Unions between sets A and C, and B and D, were assessed, and the intersection between both unions was described in order to establish the belonging percentage to these sets. Results were classified into eight ranges of 1000 leucocytes/mm3 each, calculating the belonging percentage of each range with respect to the whole sample. Results The intersection (A ∪ C) ∩ (B ∪ D) showed an effectiveness in the prediction of 81.44% for the range between 4000 and 4999 leukocytes, 91.89% for the range between 3000 and 3999, and 100% for the range below 3000. Conclusions The usefulness and clinical applicability of a methodology based on sets theory were confirmed for predicting the T CD4 lymphocyte value, beginning with the WBC and lymphocyte counts from the CBC. This methodology is new, objective, and has lower costs than flow cytometry, which is currently considered the Gold Standard. PMID:24034560
NASA Astrophysics Data System (ADS)
Hegenbart, L.; Na, Y. H.; Zhang, J. Y.; Urban, M.; Xu, X. George
2008-10-01
There are currently no physical phantoms available for calibrating in vivo counting devices that represent women with different breast sizes because such phantoms are difficult, time consuming and expensive to fabricate. In this work, a feasible alternative involving computational phantoms was explored. A series of new female voxel phantoms with different breast sizes were developed and ported into a Monte Carlo radiation transport code for performing virtual lung counting efficiency calibrations. The phantoms are based on the RPI adult female phantom, a boundary representation (BREP) model. They were created with novel deformation techniques and then voxelized for the Monte Carlo simulations. Eight models have been selected with cup sizes ranging from AA to G according to brassiere industry standards. Monte Carlo simulations of a lung counting system were performed with these phantoms to study the effect of breast size on lung counting efficiencies, which are needed to determine the activity of a radionuclide deposited in the lung and hence to estimate the resulting dose to the worker. Contamination scenarios involving three different radionuclides, namely Am-241, Cs-137 and Co-60, were considered. The results show that detector efficiencies considerably decrease with increasing breast size, especially for low energy photon emitting radionuclides. When the counting efficiencies of models with cup size AA were compared to those with cup size G, a difference of up to 50% was observed. The detector efficiencies for each radionuclide can be approximated by curve fitting in the total breast mass (polynomial of second order) or the cup size (power).
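As a simple illustration of the curve fitting mentioned above, a second-order polynomial in total breast mass can be fitted to simulated efficiencies; the data points below are invented placeholders, not values from the study.

```python
import numpy as np

mass_kg = np.array([0.4, 0.7, 1.0, 1.4, 1.9, 2.5])            # hypothetical masses
efficiency = np.array([2.1e-3, 1.8e-3, 1.5e-3, 1.3e-3, 1.1e-3, 1.0e-3])

coeffs = np.polyfit(mass_kg, efficiency, 2)  # a*m**2 + b*m + c, as in the paper
print(np.polyval(coeffs, 1.2))               # interpolated efficiency at 1.2 kg
```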
NASA Astrophysics Data System (ADS)
Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.
2018-01-01
Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which likewise have only recently been formulated. The current paper discusses and presents the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm, to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL) and the count rates were extracted up to fifth order and corrected for dead time. In order to assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. The DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.
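For orientation, the singles through pents rates that the DCF algorithm corrects derive from the reduced factorial moments of the measured multiplicity histogram; below is a generic sketch of that first step (the dead time correction itself is beyond this illustration).

```python
import numpy as np
from math import comb

def factorial_moments(hist, k_max=5):
    """hist[n] = number of counting gates in which n neutrons were recorded."""
    p = np.asarray(hist, dtype=float)
    p /= p.sum()                              # normalize to a distribution
    return [sum(comb(n, k) * p[n] for n in range(len(p)))
            for k in range(1, k_max + 1)]     # k = 1..5: singles .. pents
```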
Adrion, Christine; Mansmann, Ulrich
2012-09-10
A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). The instruments under study provide excellent tools for preparing decisions within the SAP in a transparent way when structuring the primary analysis, sensitivity or ancillary analyses, and specific analyses for secondary endpoints. The mean logarithmic score and DIC discriminate well between different model scenarios. It becomes obvious that the naive choice of a conventional random effects Poisson model is often inappropriate for real-life count data. The findings are used to specify an appropriate mixed model employed in the sensitivity analyses of an ongoing phase III trial. The proposed Bayesian methods are not only appealing for inference but notably provide a sophisticated insight into different aspects of model performance, such as forecast verification or calibration checks, and can be applied within the model selection process. The mean of the logarithmic score is a robust tool for model ranking and is not sensitive to sample size. Therefore, these Bayesian model selection techniques offer helpful decision support for shaping sensitivity and ancillary analyses in a statistical analysis plan of a clinical trial with longitudinal count data as the primary endpoint.
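The mean logarithmic score used for model ranking is simply the average negative log predictive probability of each observation, with smaller values indicating a better model. Below is a minimal sketch in which plain Poisson predictives stand in for the INLA leave-one-out predictive distributions used in the paper.

```python
import numpy as np
from scipy.stats import poisson

def mean_log_score(y_obs, pred_means):
    """Average negative log predictive probability (lower = better model)."""
    return float(np.mean(-poisson.logpmf(y_obs, pred_means)))

y = np.array([0, 0, 3, 1, 7, 0, 2])           # toy longitudinal counts
print(mean_log_score(y, np.full(y.shape, y.mean())))
```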
Villamor, N; Kirsch, A; Huhn, D; Vives-Corrons, J L; Serke, S
1996-06-01
Flow cytometric methods have been introduced recently as an alternative to the enumeration of reticulocytes by microscopy. Two of these methods have gained widespread use in haematological practice: the multiparametric flow cytometer using thiazole orange staining (Retic-Count, FACScan) and the single-application reticulocyte counter using auramine-O staining (R-series, Sysmex). Several studies have emphasized the excellent correlations between microscopy and these techniques. The purpose of our study has been to examine the specificity of these automated devices with regard to cells classified as 'reticulocytes' and the effect that this may have on measures of reticulocyte maturity. Our results indicate that the specificity of reticulocyte measurements by both the Sysmex R-1000/-3000 and the Retic-Count system is relatively low. This is due to the presence of leucocytes amongst cells classified as reticulocytes. These leucocytes display intense staining with either dye, leading to an erroneous estimation of the RMI (thiazole orange) and of the high-fluorescence count (R-1000/-3000). This error is directly correlated with the leucocyte count. The basis for reticulocyte identification should be improved before automated estimation of reticulocyte maturation can be used in clinical practice.
NASA Astrophysics Data System (ADS)
Dudak, J.; Zemlicka, J.; Karch, J.; Hermanova, Z.; Kvacek, J.; Krejci, F.
2017-01-01
Photon counting detectors Timepix are known for their unique properties enabling X-ray imaging with an extremely high contrast-to-noise ratio. Their applicability has recently been further improved by the introduction of a dedicated technique for assembling large area Timepix detector arrays. Although the sensitive area of Timepix detectors has been significantly increased, the pixel pitch is kept unchanged (55 microns). This value is much larger than that of the widely used X-ray imaging cameras utilizing scintillation crystals and CCD-based read-out. On the other hand, photon counting detectors provide a steeper point-spread function. Therefore, for a given effective pixel size of an acquired radiograph, Timepix detectors provide higher spatial resolution than cameras with scintillation-based devices, unless the image is affected by penumbral blur. In this paper we take advantage of the steep PSF of photon counting detectors and test the possibility of improving the quality of computed tomography reconstruction using finer sampling of the reconstructed voxel space. The achieved results are presented in comparison with data acquired under the same conditions using a commercially available state-of-the-art CCD X-ray camera.
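The idea of reconstructing onto a voxel grid finer than the native detector sampling can be sketched with scikit-image's filtered back projection, whose output_size argument sets the reconstruction grid independently of the projection bins. This is a toy stand-in (synthetic phantom, recent scikit-image API assumed), not the authors' Timepix pipeline:

    import numpy as np
    from skimage.data import shepp_logan_phantom
    from skimage.transform import radon, iradon, rescale

    # Synthetic projections standing in for photon-counting radiographs.
    image = rescale(shepp_logan_phantom(), 0.25)           # 100 x 100 phantom
    theta = np.linspace(0.0, 180.0, 180, endpoint=False)
    sinogram = radon(image, theta=theta)

    # Native reconstruction: voxel grid matches the detector sampling.
    rec_native = iradon(sinogram, theta=theta, filter_name="ramp")

    # Finer voxel sampling: the same data back projected onto a 2x denser grid.
    rec_fine = iradon(sinogram, theta=theta, filter_name="ramp",
                      output_size=2 * image.shape[0])
    print(rec_native.shape, rec_fine.shape)                # (100, 100) (200, 200)

Whether the finer grid recovers extra detail depends on the detector PSF being steep enough, which is the point the paper tests for Timepix.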
The Effect of Age and Task Difficulty
ERIC Educational Resources Information Center
Mallo, Jason; Nordstrom, Cynthia R.; Bartels, Lynn K.; Traxler, Anthony
2007-01-01
Electronic Performance Monitoring (EPM) is a common technique used to record employee performance. EPM may include counting computer keystrokes, monitoring employees' phone calls or internet activity, or documenting time spent on work activities. Despite EPM's prevalence, no studies have examined how this management tool affects older workers--a…
Novel Visualization of Large Health Related Data Sets
2014-03-01
demonstration of the visualization techniques and results from our earliest visualization, which used counts of the various data elements queried using...locations (e.g. areas with high pollen that increases the need for more intensive health care for people with asthma) and save millions of dollars
Barnard, Ralston W.; Jensen, Dal H.
1982-01-01
Uranium formations are assayed by prompt fission neutron logging techniques. The uranium in the formation is proportional to the ratio of epithermal counts to thermal or epithermal dieaway. Various calibration factors enhance the accuracy of the measurement.
Detecting isotopic ratio outliers
NASA Astrophysics Data System (ADS)
Bayne, C. K.; Smith, D. H.
An alternative method is proposed for improving isotopic ratio estimates. This method mathematically models pulse-count data and uses iterative reweighted Poisson regression to estimate model parameters to calculate the isotopic ratios. This computer-oriented approach provides theoretically better methods than conventional techniques to establish error limits and to identify outliers.
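A minimal sketch of the approach described above, with an assumed intercept-only Poisson model for the pulse counts of each isotope (the paper's actual model terms are not specified here): iteratively reweighted least squares estimates the mean count rates, the isotopic ratio follows from the fitted means, and scans with large Pearson residuals are flagged as candidate outliers.

    import numpy as np

    def poisson_irls(X, y, n_iter=25):
        """Iteratively reweighted least squares for a log-link Poisson GLM."""
        eta = np.log(y + 0.5)                         # safe starting values
        beta = np.linalg.lstsq(X, eta, rcond=None)[0]
        for _ in range(n_iter):
            eta = X @ beta
            mu = np.exp(eta)
            z = eta + (y - mu) / mu                   # working response
            W = mu                                    # IRLS weights for Poisson
            beta = np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (W * z))
        return beta

    rng = np.random.default_rng(0)
    n_scans = 40
    ya = rng.poisson(5000.0, n_scans).astype(float)   # isotope A pulse counts
    yb = rng.poisson(1000.0, n_scans).astype(float)   # isotope B pulse counts
    yb[7] *= 1.5                                      # plant one outlier scan

    X = np.ones((n_scans, 1))                         # intercept-only model
    mu_a = np.exp(poisson_irls(X, ya)[0])
    mu_b = np.exp(poisson_irls(X, yb)[0])
    print("isotopic ratio estimate:", mu_a / mu_b)

    resid = (yb - mu_b) / np.sqrt(mu_b)               # Pearson residuals
    print("candidate outlier scans:", np.where(np.abs(resid) > 3)[0])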
Amorphous silicon ionizing particle detectors
Street, Robert A.; Mendez, Victor P.; Kaplan, Selig N.
1988-01-01
Amorphous silicon ionizing particle detectors having a hydrogenated amorphous silicon (a--Si:H) thin film deposited via plasma assisted chemical vapor deposition techniques are utilized to detect the presence, position and counting of high energy ionizing particles, such as electrons, x-rays, alpha particles, beta particles and gamma radiation.
Which subsystems are used to produce archival science products?
Atmospheric Science Data Center
2014-12-08
... - Combine telemetry and ephemeris data to produce Earth location geometry and convert radiometric counts produced by the ... inversion techniques. Daily and Monthly Time/Space Averaging - Convert from time-ordered to regionally-accessible data ... Data Products - Generate well-documented science archival products in an easily accessible format. ...
Photon Counting Techniques Applied to Single Aerosol Particle Spectroscopy.
NASA Astrophysics Data System (ADS)
Joynson, Steven
Available from UMI in association with The British Library. Optical effects on single airborne particles were examined for their potential use in aerosol characterisation. All phenomena arising from the elastic or quasi-elastic scattering, or the absorption of light were considered. A survey of published research identified the effects that have so far been proposed and investigated by other researchers. The feasibility of using these effects is then discussed and appropriate calculations and measurements made. After reviewing the classical theory of the interaction of light with small particles, it was apparent that there were a number of other effects that had not yet been considered or examined by other researchers. Calculations and measurements of these effects were then made and are also presented here. The effects were examined optically using photon counting equipment to count and store the dynamic light scattering signals from single particles in an aerosol flow. The measurement thus entailed using a low intensity probe beam to measure the effects of higher intensity pump radiation on the motion, shape and scattering properties of a test particle. The amount of information in the probe signal was increased by using a velocimetry arrangement. In the absence of suitable commercially available photon counting equipment, a new system had to be designed and developed at RMCS. Although requiring much time and effort to develop, the equipment allowed a new approach to light scattering research. The successful operation of the equipment was confirmed by the good agreement found when comparing measured photon count series statistics with those of the simulated signals presented by other researchers. Experiments that were done to measure some of the optical effects are described and the results presented. They demonstrate the successful diffusion sizing of individual aerosol particles and their motion under radiation pressure. Further experimental results demonstrate the measurement of radiation absorption by the thermally-increased diffusion rate. Other results provide evidence for what appears to be the explosive vapourisation of material at the peak radiation absorption centres of a liquid droplet. Finally, the uses and limitations of the techniques are summarised and proposals are made for further research.
A semi-automated technique for labeling and counting of apoptosing retinal cells
2014-01-01
Background Retinal ganglion cell (RGC) loss is one of the earliest and most important cellular changes in glaucoma. The DARC (Detection of Apoptosing Retinal Cells) technology enables in vivo real-time non-invasive imaging of single apoptosing retinal cells in animal models of glaucoma and Alzheimer’s disease. To date, apoptosing RGCs imaged using DARC have been counted manually. This is time-consuming, labour-intensive, vulnerable to bias, and has considerable inter- and intra-operator variability. Results A semi-automated algorithm was developed which enabled automated identification of apoptosing RGCs labeled with fluorescent Annexin-5 on DARC images. Automated analysis included a pre-processing stage involving local-luminance and local-contrast “gain control”, a “blob analysis” step to differentiate between cells, vessels and noise, and a method to exclude non-cell structures using specific combined ‘size’ and ‘aspect’ ratio criteria. Apoptosing retinal cells were counted by 3 masked operators, generating ‘gold-standard’ mean manual cell counts, and were also counted using the newly developed automated algorithm. Comparison between automated cell counts and the mean manual cell counts on 66 DARC images showed significant correlation between the two methods (Pearson’s correlation coefficient 0.978, p < 0.001; R squared = 0.956). The intraclass correlation coefficient was 0.986 (95% CI 0.977-0.991, p < 0.001), and Cronbach’s alpha measure of consistency = 0.986, confirming excellent correlation and consistency. No significant difference (p = 0.922, 95% CI: −5.53 to 6.10) was detected between the cell counts of the two methods. Conclusions The novel automated algorithm enabled accurate quantification of apoptosing RGCs that is highly comparable to manual counting, and appears to minimise operator bias, whilst being both fast and reproducible. This may prove to be a valuable method of quantifying apoptosing retinal cells, with particular relevance to translation in the clinic, where a Phase I clinical trial of DARC in glaucoma patients is due to start shortly. PMID:24902592
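The counting pipeline described above (threshold, blob analysis, combined size and aspect-ratio criteria) can be sketched with scikit-image's connected-component labeling. The thresholds below are illustrative placeholders, not the values used by the DARC algorithm:

    import numpy as np
    from skimage.measure import label, regionprops

    def count_cells(image, thresh, min_area=20, max_area=400, max_aspect=2.0):
        """Threshold, label connected blobs, and keep only regions passing
        combined size and aspect-ratio criteria (roughly round, cell-sized);
        elongated regions (vessels) and tiny or huge blobs (noise) are dropped."""
        regions = regionprops(label(image > thresh))
        kept = 0
        for r in regions:
            aspect = r.major_axis_length / max(r.minor_axis_length, 1e-6)
            if min_area <= r.area <= max_area and aspect <= max_aspect:
                kept += 1
        return kept

    # Toy frame: two Gaussian "cells" plus background noise.
    yy, xx = np.mgrid[0:128, 0:128]
    frame = (np.exp(-((yy - 40)**2 + (xx - 40)**2) / 30.0)
             + np.exp(-((yy - 90)**2 + (xx - 80)**2) / 30.0)
             + 0.05 * np.random.default_rng(0).standard_normal((128, 128)))
    print("cells counted:", count_cells(frame, thresh=0.5))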
Inter-rater reliability of malaria parasite counts and comparison of methods
2009-01-01
Background The introduction of artemisinin-based treatment for falciparum malaria has led to a shift away from symptom-based diagnosis. Diagnosis may be achieved by using rapid non-microscopic diagnostic tests (RDTs), of which there are many available. Light microscopy, however, has a central role in parasite identification and quantification; it remains the main method of parasite-based diagnosis in clinic and hospital settings and is necessary for monitoring the accuracy of RDTs. The World Health Organization has prepared a proficiency testing panel containing a range of malaria-positive blood samples of known parasitaemia, to be used for the assessment of commercially available malaria RDTs. Different blood film and counting methods may be used for this purpose, which raises questions regarding accuracy and reproducibility. A comparison was made of the established methods for parasitaemia estimation to determine which would give the least inter-rater and inter-method variation. Methods Experienced malaria microscopists counted asexual parasitaemia on different slides using three methods: the thin film method using the total erythrocyte count, the thick film method using the total white cell count, and the Earle and Perez method. All the slides were stained using Giemsa pH 7.2. Analysis of variance (ANOVA) models were used to find the inter-rater reliability for the different methods. The paired t-test was used to assess any systematic bias between the two methods, and a regression analysis was used to see if there was a changing bias with parasite count level. Results The thin blood film gave parasite counts around 30% higher than those obtained by the thick film and Earle and Perez methods, but exhibited a loss of sensitivity at low parasitaemia. The thick film and Earle and Perez methods showed little or no bias in counts between the two methods; however, estimated inter-rater reliability was slightly better for the thick film method. Conclusion The thin film method gave results closer to the true parasite count but is not feasible at a parasitaemia below 500 parasites per microlitre. The thick film method was both reproducible and practical for this project. The determination of malarial parasitaemia must be applied by skilled operators using standardized techniques. PMID:19939271
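The two film-based conversions referred to above reduce to simple proportions. In the sketch below, the default scaling constants (an assumed 8000 white cells and 5,000,000 red cells per microlitre when measured counts are unavailable) are conventional textbook values, not figures from this study:

    def thick_film_parasitaemia(parasites, wbcs_counted, wbc_per_ul=8000):
        """Thick film: parasites tallied against a number of white cells,
        scaled by the white cell count per microlitre."""
        return parasites * wbc_per_ul / wbcs_counted

    def thin_film_parasitaemia(parasites, rbcs_counted, rbc_per_ul=5_000_000):
        """Thin film: fraction of parasitized red cells times the red cell count."""
        return parasites * rbc_per_ul / rbcs_counted

    print(thick_film_parasitaemia(125, 500))      # 2000.0 parasites/uL
    print(thin_film_parasitaemia(4, 10_000))      # 2000.0 parasites/uL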
PID techniques: Alternatives to RICH Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vavra, J.; /SLAC
2011-03-01
In this review article we discuss the recent progress in PID techniques other than the RICH methods. In particular we mention the recent progress in the Transition Radiation Detector (TRD), dE/dx cluster counting, and Time Of Flight (TOF) techniques. The TRD technique is mature and has been tried in many hadron colliders. It needs space though, about 20 cm of detector radial space for every factor of 10 in the π/e rejection power, and this tends to make such detectors large. Although the cluster counting technique is an old idea, it was never tried in a real physics experiment. Recently, there are efforts to revive it for the SuperB experiment using He-based gases and waveform digitizing electronics. A factor of almost 2 improvement, compared to the classical dE/dx performance, is possible in principle. However, the complexity of the data analysis will be substantial. The TOF technique is well established, but the introduction of new fast MCP-PMT and G-APD detectors creates new possibilities. It seems that resolutions below 20-30 ps may be possible at some point in the future with relatively small systems, and perhaps this could be pushed down to 10-15 ps with very small systems, assuming that one can solve many systematic issues. However, the cost, rate limitation, aging and cross-talk in multi-anode devices at high BW are problems. There are several groups working on these issues, so progress is likely. Table 6 summarizes the author's opinion of the pros and cons of the various detectors presented in this paper based on their operational capabilities. We refer the reader to Ref. 40 for discussion of other more general limits from the PID point of view.
Is it Possible to Sanitize Athletes' Shoes?
Messina, Gabriele; Burgassi, Sandra; Russo, Carmela; Ceriale, Emma; Quercioli, Cecilia; Meniconi, Cosetta
2015-01-01
Context: Footwear should be designed to avoid trauma and injury to the skin of the feet that can favor bacterial and fungal infections. Procedures and substances for sanitizing the interior of shoes are uncommon but are important aspects of primary prevention against foot infections and unpleasant odor. Objective: To evaluate the efficacy of a sanitizing technique for reducing bacterial and fungal contamination of footwear. Design: Crossover study. Setting: Mens Sana basketball team. Patients or Other Participants: Twenty-seven male athletes and 4 coaches (62 shoes). Intervention(s): The experimental protocol required a first sample (swab), 1/shoe, at time 0 from inside the shoes of all athletes before the sanitizing technique began and a second sample at time 1, after about 4 weeks, April 2012 to May 2012, of daily use of the sanitizing technique. Main Outcome Measure(s): The differences before and after use of the sanitizing technique for total bacterial count at 36°C and 22°C for Staphylococcus spp, yeasts, molds, Enterococcus spp, Pseudomonas spp, Escherichia coli, and total coliform bacteria were evaluated. Results: Before use of the sanitizing technique, the total bacterial counts at 36°C and 22°C and for Staphylococcus spp were greater by a factor of 5.8 (95% confidence interval [CI] = 3.42, 9.84), 5.84 (95% CI = 3.45, 9.78), and 4.78 (95% CI = 2.84, 8.03), respectively. All the other comparisons showed a reduction in microbial loads, whereas E coli and coliforms were no longer detected. No statistically significant decrease in yeasts (P = .0841) or molds (P = .6913) was recorded probably because of low contamination. Conclusions: The sanitizing technique significantly reduced the bacterial presence in athletes' shoes. PMID:25415415
Basic Techniques in Mammalian Cell Tissue Culture.
Phelan, Katy; May, Kristin M
2016-11-01
Cultured mammalian cells are used extensively in cell biology studies. Preserving the structure, function, behavior, and biology of cells in culture requires a number of special skills. This unit describes the basic skills required to maintain and preserve cell cultures: maintaining aseptic technique, preparing media with the appropriate characteristics, passaging, freezing and storage, recovering frozen stocks, and counting viable cells. © 2016 by John Wiley & Sons, Inc.
Amorphous silicon ionizing particle detectors
Street, R.A.; Mendez, V.P.; Kaplan, S.N.
1988-11-15
Amorphous silicon ionizing particle detectors having a hydrogenated amorphous silicon (a--Si:H) thin film deposited via plasma assisted chemical vapor deposition techniques are utilized to detect the presence, position and counting of high energy ionizing particles, such as electrons, x-rays, alpha particles, beta particles and gamma radiation. 15 figs.
The Ecological Stewardship Institute at Northern Kentucky University and the U.S. Environmental Protection Agency are collaborating to optimize a harmful algal bloom detection algorithm that estimates the presence and count of cyanobacteria in freshwater systems by image analysis...
Microbial Contamination of Chicken Wings: An Open-Ended Laboratory Project.
ERIC Educational Resources Information Center
Deutch, Charles E.
2001-01-01
Introduces the chicken wing project in which students assess the microbial contamination of chicken wings for the safety of foods. Uses the colony counting technique and direct wash fluid examination for determining the microbial contamination, and investigates methods to reduce the level of microbial contamination. (Contains 14 references.) (YDS)
Exploring Discrete Mathematics with American Football
ERIC Educational Resources Information Center
Muldoon Brown, Tricia; Kahn, Eric B.
2015-01-01
This paper presents an extended project that offers, through American football, an application of concepts from enumerative combinatorics and an introduction to proofs course. The questions in this paper and subsequent details concerning equivalence relations and counting techniques can be used to reinforce these new topics to students in such a…
Barnard, R.W.; Jensen, D.H.
1980-11-05
Uranium formations are assayed by prompt fission neutron logging techniques. The uranium in the formation is proportional to the ratio of epithermal counts to thermal or epithermal dieaway. Various calibration factors enhance the accuracy of the measurement.
The Chronic and Acute Effects of Exercise Upon Selected Blood Measures.
ERIC Educational Resources Information Center
Roitman, J. L.; Brewer, J. P.
This study investigated the effects of chronic and acute exercise upon selected blood measures and indices. Nine male cross-country runners were studied. Red blood count, hemoglobin, and hematocrit were measured using standard laboratory techniques; mean corpuscular volume (MCV), mean corpuscular hemoglobin, and mean corpuscular hemoglobin…
Direct reading of electrocardiograms and respiration rates
NASA Technical Reports Server (NTRS)
Wise, J. P.
1969-01-01
Technique for reading heart and respiration rates is more accurate and direct than the previous method. Index of a plastic calibrated card is aligned with a point on the electrocardiogram. Complexes are counted as indicated on the card and heart or respiration rate is read directly from the appropriate scale.
USDA-ARS's Scientific Manuscript database
The standard sampling technique used to quantify cotton fleahopper, Pseudatomoscelis seriatus (Reuter), abundance in cotton, Gossypium hirsutum L., involves direct counts of adults and nymphs on plants. This method, however, becomes increasingly laborious and time consuming as plants increase in si...
DN/DG Screening of Environmental Swipe Samples: FY2016 Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Glasgow, David C.; Croft, Stephen; Venkataraman, Ramkumar
The Delayed Neutron Delayed Gamma (DNDG) technique provides a new analytical capability to the International Atomic Energy Agency (IAEA) for detecting undeclared nuclear activities. The IAEA's Long Term R&D (LTRD) plan has a stated high-urgency need to develop elemental and isotopic signatures of nuclear fuel cycle activities and processes (LTRD 2.2). The new DNDG capability is used to co-detect both uranium and plutonium as an extension of a DN-only method that is already being utilized by the IAEA for the analysis of swipes to inform on undeclared nuclear activities. The analytical method involving irradiation of swipe samples potentially containing trace quantities of fissile material in a thermal neutron field, followed by the counting of delayed neutrons, is a well-known technique in the field of safeguards and nonproliferation. It is used for detecting the presence of microscopic amounts of fissile material (typically a linear combination of 233U, 235U, 239Pu, and 241Pu) and quantifying it in terms of the equivalent mass of 235U. The delayed neutron (DN) technique is very sensitive and has been routinely employed at the High Flux Isotope Reactor (HFIR) facility at Oak Ridge National Laboratory (ORNL). Both uranium and plutonium are of high safeguards value. However, the DN technique is not well suited for distinguishing between U and Pu isotopes since the decay curves overlap closely. The delayed gamma (DG) technique will help detect the presence of 239Pu in a mixture of U and Pu. Thus the DNDG approach combines the best of both worlds: the sensitivity of DN counting and the isotopic specificity of DG counting. The present work seeks to build on the delayed neutron and delayed gamma methods that have been developed at ORNL. It is recognized that the distribution profile of heavy fission products remains fairly invariant for the fissile nuclides, whereas the distribution of light fission products varies from one isotope to another. That is, the ratio of the yield of a light fission fragment to a heavy fission fragment is isotope specific. Measurement of the ratio of the net full energy peak (FEP) counts from low/high mass fission products is an elegant way to characterize the fraction of fissile materials present in a mixture. By empirically calibrating the ratio of the net FEP counts as a function of known concentration of the binary mixture, one can determine the fraction of fissile isotopes in an unknown sample. In the work done in fiscal year (FY) 2016, samples of single fissile material isotopes as well as binary mixtures were irradiated in a well-thermalized irradiation field in the HFIR. Delayed neutron counting was performed using the neutron counter at the HFIR Neutron Activation Analysis (NAA) laboratory. Delayed gamma counting was performed using a shielded high purity germanium (HPGe) detector. Delayed neutron decay curve results highlighted the difficulty of distinguishing between U and Pu isotopes, and the need for including the delayed gamma component. Based on delayed gamma spectrometry, twelve ratios of low mass/high mass fission product gamma-ray FEPs have been identified as valid candidates. Linearity of the ratios, as a function of the 239Pu fraction in 235U + 239Pu mixtures, was confirmed for the low mass/high mass candidates that were selected. The DNDG method we are spearheading allows not only the presence of total fissile content to be detected, but also whether the material is predominantly U, predominantly Pu, or a mixture. This provides additional safeguards-relevant information.
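The empirical calibration step described above is a straight line fit and inversion. The sketch below uses synthetic ratio values, not measured HFIR data:

    import numpy as np

    # Hypothetical calibration standards: net FEP count ratio (low/high mass
    # fission product peaks) for binary 235U + 239Pu mixtures of known Pu fraction.
    pu_fraction = np.array([0.0, 0.25, 0.50, 0.75, 1.0])
    fep_ratio = np.array([0.82, 0.97, 1.13, 1.27, 1.41])   # synthetic values

    slope, intercept = np.polyfit(pu_fraction, fep_ratio, 1)

    def pu_fraction_from_ratio(ratio):
        """Invert the linear calibration to estimate the 239Pu fraction."""
        return (ratio - intercept) / slope

    print(f"measured ratio 1.20 -> estimated Pu fraction "
          f"{pu_fraction_from_ratio(1.20):.2f}")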
Elastography methods for the non-invasive assessment of portal hypertension.
Roccarina, Davide; Rosselli, Matteo; Genesca, Joan; Tsochatzis, Emmanuel A
2018-02-01
The gold standard to assess the presence and severity of portal hypertension remains the hepatic vein pressure gradient, however the recent development of non-invasive assessment using elastography techniques offers valuable alternatives. In this review, we discuss the diagnostic accuracy and utility of such techniques in patients with portal hypertension due to cirrhosis. Areas covered: A literature search focused on liver and spleen stiffness measurement with different elastographic techniques for the assessment of the presence and severity of portal hypertension and oesophageal varices in people with chronic liver disease. The combination of elastography with parameters such as platelet count and spleen size is also discussed. Expert commentary: Non-invasive assessment of liver fibrosis and portal hypertension is a validated tool for the diagnosis and follow-up of patients. Baveno VI recommended the combination of transient elastography and platelet count for ruling out varices needing treatment in patients with compensated advanced chronic liver disease. Assessment of aetiology specific cut-offs for ruling in and ruling out clinically significant portal hypertension is an unmet clinical need. The incorporation of spleen stiffness measurements in non-invasive algorithms using validated software and improved measuring scales might enhance the non-invasive diagnosis of portal hypertension in the next 5 years.
Link, W.A.; Sauer, J.R.; Niven, D.K.
2006-01-01
Analysis of Christmas Bird Count (CBC) data is complicated by the need to account for variation in effort on counts and to provide summaries over large geographic regions. We describe a hierarchical model for analysis of population change using CBC data that addresses these needs. The effect of effort is modeled parametrically, with parameter values varying among strata as identically distributed random effects. Year and site effects are modeled hierarchically, accommodating large regional variation in number of samples and precision of estimates. The resulting model is complex, but a Bayesian analysis can be conducted using Markov chain Monte Carlo techniques. We analyze CBC data for American Black Ducks (Anas rubripes), a species of considerable management interest that has historically been monitored using winter surveys. Over the interval 1966-2003, Black Duck populations showed distinct regional patterns of population change. The patterns shown by CBC data are similar to those shown by the Midwinter Waterfowl Inventory for the United States.
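A toy version of the count model sketched above (counts modeled as Poisson with a site effect and a parametric effort term, fitted by Markov chain Monte Carlo) can be written in a few dozen lines. This random-walk Metropolis sketch simplifies the paper's model heavily: no year effects or strata, fixed weak priors instead of an estimated hierarchical variance, and a short chain for illustration:

    import numpy as np

    rng = np.random.default_rng(42)

    # Synthetic CBC-style data: counts at S sites over T years, varying effort.
    S, T = 8, 15
    effort = rng.uniform(0.5, 4.0, size=(S, T))
    alpha_true = rng.normal(2.0, 0.4, size=S)     # site effects
    beta_true = 0.6                               # effort exponent
    counts = rng.poisson(np.exp(alpha_true[:, None]) * effort**beta_true)

    def log_post(alpha, beta):
        log_mu = alpha[:, None] + beta * np.log(effort)
        loglik = np.sum(counts * log_mu - np.exp(log_mu))   # Poisson kernel
        # Weak N(0, 10^2) priors on all parameters.
        return loglik - np.sum(alpha**2) / 200.0 - beta**2 / 200.0

    alpha, beta = np.zeros(S), 0.0
    lp, draws = log_post(alpha, beta), []
    for it in range(20000):                       # random-walk Metropolis
        a_prop = alpha + 0.05 * rng.standard_normal(S)
        b_prop = beta + 0.05 * rng.standard_normal()
        lp_prop = log_post(a_prop, b_prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            alpha, beta, lp = a_prop, b_prop, lp_prop
        if it >= 10000:                           # keep post burn-in draws
            draws.append(beta)
    print("posterior mean effort exponent:", np.mean(draws), "truth:", beta_true)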
Analysis of silver stained nucleolar organizing regions in odontogenic cysts and tumors.
Prasanna, Md; Charan, Cr; Reddy Ealla, Kranti Kiran; Surekha, V; Kulkarni, Ganesh; Gokavarapu, Sandhya
2014-09-01
The present study aimed to investigate the probable differences in the cell proliferation index of odontogenic cysts and tumors by means of comparative silver stained nucleolar organizing region (AgNOR) quantification. This descriptive cross-sectional study was done on archival paraffin blocks (n = 62), consisting of 10 odontogenic keratocysts, 10 dentigerous cysts, 10 radicular cysts, 10 conventional ameloblastomas, 10 adenomatoid odontogenic tumors, 10 calcifying epithelial odontogenic tumors and 2 ameloblastic carcinomas. The mean AgNOR count of the odontogenic cysts was 1.709 and of the benign odontogenic tumors was 1.862. The highest AgNOR count was recorded in odontogenic keratocyst and the lowest was seen in radicular cyst. Statistically significant differences in the AgNOR counts of ameloblastoma and adenomatoid odontogenic tumor, ameloblastoma and calcifying epithelial odontogenic tumor, and benign odontogenic tumors and ameloblastic carcinoma were seen. AgNORs in ameloblastic carcinoma were more numerous and more widely spread. The AgNOR technique may be considered a good indicator of cell proliferation in odontogenic cysts and tumors.
Analysis of silver stained nucleolar organizing regions in odontogenic cysts and tumors
Prasanna, MD; Charan, CR; Reddy Ealla, Kranti Kiran; Surekha, V; Kulkarni, Ganesh; Gokavarapu, Sandhya
2014-01-01
Objective: The present study aimed to investigate the probable differences in the cell proliferation index of odontogenic cysts and tumors by means of comparative silver stained nucleolar organizing region (AgNOR) quantification. Study Design: This descriptive cross-sectional study was done on archival paraffin blocks (n = 62), consisting of 10 odontogenic keratocysts, 10 dentigerous cysts, 10 radicular cysts, 10 conventional ameloblastomas, 10 adenomatoid odontogenic tumors, 10 calcifying epithelial odontogenic tumors and 2 ameloblastic carcinomas. Results: The mean AgNOR count of the odontogenic cysts was 1.709 and of the benign odontogenic tumors was 1.862. The highest AgNOR count was recorded in odontogenic keratocyst and the lowest was seen in radicular cyst. Statistically significant differences in the AgNOR counts of ameloblastoma and adenomatoid odontogenic tumor, ameloblastoma and calcifying epithelial odontogenic tumor, and benign odontogenic tumors and ameloblastic carcinoma were seen. AgNORs in ameloblastic carcinoma were more numerous and more widely spread. Conclusion: The AgNOR technique may be considered a good indicator of cell proliferation in odontogenic cysts and tumors. PMID:25364178
Diffusion processes in tumors: A nuclear medicine approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amaya, Helman, E-mail: haamayae@unal.edu.co
The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules that were taken up in a particular anatomical region; it is not true metabolic information. For this reason a mathematical method was used to find a correlation between the number of counts and the 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX viewer software was processed. PET-CT gradient magnitude and Laplacian images could show direct information on diffusive processes for radiopharmaceuticals that enter the cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG it is necessary to include pharmacokinetic models to make a correct interpretation of the gradient magnitude and Laplacian-of-counts images.
Detection and Estimation of an Optical Image by Photon-Counting Techniques. Ph.D. Thesis
NASA Technical Reports Server (NTRS)
Wang, Lily Lee
1973-01-01
A statistical description of a photoelectric detector is given. The photosensitive surface of the detector is divided into many small areas, and the moment generating function of the photocounting statistic is derived for large time-bandwidth product. The detection of a specified optical image in the presence of background light using a hypothesis test is discussed. The ideal detector, based on the likelihood ratio computed from the numbers of photoelectrons ejected from many small areas of the photosensitive surface, is studied and compared with the threshold detector and with a simple detector based on the likelihood ratio of the total number of photoelectrons counted over a finite area of the surface. The intensity of the image is assumed to be Gaussian distributed spatially against the uniformly distributed background light. Numerical approximation by the method of steepest descent is used, and the calculations of the reliabilities of the detectors are carried out by a digital computer.
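The ideal detector described above reduces, for Poisson photocounts, to comparing a log likelihood ratio summed over the small areas against a threshold. A Monte Carlo sketch under assumed intensities (a spatially Gaussian signal on a uniform background, both invented here):

    import numpy as np

    rng = np.random.default_rng(7)

    n = 32
    yy, xx = np.mgrid[0:n, 0:n]
    b = 2.0                                       # background mean counts/area
    s = 3.0 * np.exp(-((yy - 16)**2 + (xx - 16)**2) / (2 * 4.0**2))  # signal means

    def llr(counts):
        """Poisson log likelihood ratio: H1 mean b + s vs H0 mean b."""
        return np.sum(counts * (np.log(b + s) - np.log(b)) - s)

    # Distribution of the statistic under each hypothesis.
    h0 = np.array([llr(rng.poisson(b, (n, n))) for _ in range(2000)])
    h1 = np.array([llr(rng.poisson(b + s)) for _ in range(2000)])
    thr = np.quantile(h0, 0.99)                   # 1% false-alarm threshold
    print("detection probability:", np.mean(h1 > thr))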
Image charge multi-role and function detectors
NASA Astrophysics Data System (ADS)
Milnes, James; Lapington, Jon S.; Jagutzki, Ottmar; Howorth, Jon
2009-06-01
The image charge technique used with microchannel plate imaging tubes provides several operational and practical benefits by serving to isolate the electronic image readout from the detector. The simple dielectric interface between detector and readout provides vacuum isolation and no vacuum electrical feed-throughs are required. Since the readout is mechanically separate from the detector, an image tube of generic design can be simply optimised for various applications by attaching it to different readout devices and electronics. We present imaging performance results using a single image tube with a variety of readout devices suited to differing applications: (a) A four electrode charge division tetra wedge anode, optimised for best spatial resolution in photon counting mode. (b) A cross delay line anode, enabling higher count rate, and the possibility of discriminating near co-incident events, and an event timing resolution of better than 1 ns. (c) A multi-anode readout connected, either to a multi-channel oscilloscope for analogue measurements of fast optical pulses, or alternately, to a multi-channel time correlated single photon counting (TCSPC) card.
Senftle, F.E.; Moxham, R.M.; Tanner, A.B.
1972-01-01
The recent availability of borehole logging sondes employing a source of neutrons and a Ge(Li) detector opens up the possibility of analyzing either decay or capture gamma rays. The most efficient method for a given element can be predicted by calculating the decay-to-capture count ratio for the most prominent peaks in the respective spectra. From a practical point of view such a calculation must be slanted toward short irradiation and count times at each station in a borehole. A simplified method of computation is shown, and the decay-to-capture count ratio has been calculated and tabulated for the optimum value in the decay mode irrespective of the irradiation time, and also for a ten-minute irradiation time. Based on analysis of a single peak in each spectrum, the results indicate the preferred technique and the best decay or capture peak to observe for those elements of economic interest. © 1972.
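A simplified decay-to-capture ratio of the kind described above can be computed from the standard activation equations. This sketch folds cross sections, gamma yields and detector efficiencies for the chosen peaks into two rate constants (set to 1 here) and assumes counting starts immediately after irradiation, with no cooling interval:

    import numpy as np

    def decay_to_capture_ratio(half_life_s, t_irr, t_count,
                               r_decay=1.0, r_capture=1.0):
        """Delayed (decay) gamma counts after irradiation divided by prompt
        (capture) gamma counts accumulated during an equal counting period."""
        lam = np.log(2.0) / half_life_s
        # Activity builds up during irradiation, then decays while counted:
        decay_counts = (r_decay * (1 - np.exp(-lam * t_irr))
                        * (1 - np.exp(-lam * t_count)) / lam)
        # Prompt capture gammas accumulate linearly while the source is on:
        capture_counts = r_capture * t_count
        return decay_counts / capture_counts

    # Product half-life 5 min, ten-minute irradiation and count (illustrative):
    print(decay_to_capture_ratio(300.0, 600.0, 600.0))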
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rivkin, R.B.; Seliger, H.H.
1981-07-01
Short term rates of 14C uptake for single cells and small numbers of isolated algal cells of five phytoplankton species from natural populations were measured by liquid scintillation counting. Regression analysis of uptake rates per cell for cells isolated from unialgal cultures of seven species of dinoflagellates, ranging in volume from ca. 10³ to 10⁷ µm³, gave results identical to uptake rates per cell measured by conventional 14C techniques. Relative standard errors of regression coefficients ranged between 3 and 10%, indicating that for any species there was little variation in photosynthesis per cell.
Performance and capacity analysis of Poisson photon-counting based Iter-PIC OCDMA systems.
Li, Lingbin; Zhou, Xiaolin; Zhang, Rong; Zhang, Dingchen; Hanzo, Lajos
2013-11-04
In this paper, an iterative parallel interference cancellation (Iter-PIC) technique is developed for optical code-division multiple-access (OCDMA) systems relying on shot-noise limited Poisson photon-counting reception. The novel semi-analytical tool of extrinsic information transfer (EXIT) charts is used for analysing both the bit error rate (BER) performance as well as the channel capacity of these systems and the results are verified by Monte Carlo simulations. The proposed Iter-PIC OCDMA system is capable of achieving two orders of magnitude BER improvements and a 0.1 nats of capacity improvement over the conventional chip-level OCDMA systems at a coding rate of 1/10.
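As background to the shot-noise-limited reception assumed above, the following sketch simulates a plain Poisson photon-counting on-off channel with threshold detection. It is a generic illustration under invented count levels, not the paper's Iter-PIC receiver or its EXIT-chart analysis:

    import numpy as np

    rng = np.random.default_rng(3)

    n_bits = 200_000
    k_signal = 20.0     # mean signal photon count per "1" slot (assumed)
    k_dark = 1.0        # mean background/dark count per slot (assumed)

    bits = rng.integers(0, 2, n_bits)
    counts = rng.poisson(np.where(bits == 1, k_signal + k_dark, k_dark))

    # Simple threshold midway between the two conditional means.
    thresh = (k_signal + 2 * k_dark) / 2.0
    decided = (counts > thresh).astype(int)
    print("BER:", np.mean(decided != bits))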
An improved method for chromosome counting in maize.
Kato, A
1997-09-01
An improved method for counting chromosomes in maize (Zea mays L.) is presented. Application of cold treatment (5°C, 24 hr), heat treatment (42°C, 5 min) and a second cold treatment (5°C, 24 hr) to root tips before fixation increased the number of condensed and dispersed countable metaphase chromosome figures. Fixed root tips were prepared by the enzymatic maceration-air drying method and preparations were stained with acetic orcein. Under favorable conditions, one preparation with 50-100 countable chromosome figures could be obtained in diploid maize using this method. Conditions affecting the dispersion of the chromosomes are described. This technique is especially useful for determining the somatic chromosome number in triploid and tetraploid maize lines.
Hamby, David M [Corvallis, OR; Farsoni, Abdollah T [Corvallis, OR; Cazalas, Edward [Corvallis, OR
2011-06-21
A technique and device provide absolute skin dosimetry in real time at multiple tissue depths simultaneously. The device uses a phoswich detector which has multiple scintillators embedded at different depths within a non-scintillating material. A digital pulse processor connected to the phoswich detector measures a differential distribution (dN/dH) of count rate N as a function of pulse height H for signals from each of the multiple scintillators. A digital processor computes in real time, from the differential count-rate distribution for each of the multiple scintillators, an estimate of the ionizing radiation dose delivered to each of the multiple depths of skin tissue corresponding to the scintillators embedded at the corresponding depths within the non-scintillating material.
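The dose estimate described above amounts to integrating energy deposition over the pulse-height spectrum and dividing by the mass of the sensitive volume. A sketch under a linear energy calibration assumption (the patent's actual algorithm is not specified here):

    import numpy as np

    MEV_TO_J = 1.602e-13

    def dose_rate_gy_per_s(dndh, heights, mev_per_height, mass_kg):
        """Integrate (dN/dH) * E(H) over pulse height H to get the energy
        deposition rate, then divide by the scintillator mass.
        E(H) = mev_per_height * H assumes a linear energy calibration."""
        energy_mev_per_s = np.trapz(dndh * mev_per_height * heights, heights)
        return energy_mev_per_s * MEV_TO_J / mass_kg

    heights = np.linspace(0.0, 100.0, 101)     # pulse-height channels
    dndh = 500.0 * np.exp(-heights / 20.0)     # synthetic spectrum, counts/s/channel
    print(dose_rate_gy_per_s(dndh, heights, mev_per_height=0.01,
                             mass_kg=2e-4), "Gy/s")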
Yunoki, A; Kawada, Y; Yamada, T; Unno, Y; Sato, Y; Hino, Y
2013-11-01
We measured 4π and 2π counting efficiencies for internal conversion electrons (ICEs), gross β-particles, and β-rays alone under various source conditions of absorber and backing foil thickness, using the e-X coincidence technique. Dominant differences in the penetration, attenuation and backscattering properties of ICEs and β-rays were revealed. Although the abundance of internal conversion electrons of (137)Cs-(137)Ba is only 9.35%, 60% of gross counts may be attributed to ICEs under unfavorable source conditions. This information will be useful for radionuclide metrology and for surface contamination monitoring. © 2013 Elsevier Ltd. All rights reserved.
Alkali Halide Microstructured Optical Fiber for X-Ray Detection
NASA Technical Reports Server (NTRS)
DeHaven, S. L.; Wincheski, R. A.; Albin, S.
2014-01-01
Microstructured optical fibers containing alkali halide scintillation materials of CsI(Na), CsI(Tl), and NaI(Tl) are presented. The scintillation materials are grown inside the microstructured fibers using a modified Bridgman-Stockbarger technique. The x-ray photon counts of these fibers, with and without an aluminum film coating, are compared to the output of a collimated CdTe solid state detector over an energy range from 10 to 40 keV. The photon count results show significant variations in the fiber output based on the materials. The alkali halide fiber output can exceed that of the CdTe detector, depending upon photon counter efficiency and fiber configuration. The results and associated materials differences are discussed.
As-built design specification for proportion estimate software subsystem
NASA Technical Reports Server (NTRS)
Obrien, S. (Principal Investigator)
1980-01-01
The Proportion Estimate Processor evaluates four estimation techniques in order to get an improved estimate of the proportion of a scene that is planted in a selected crop. The four techniques to be evaluated were provided by the techniques development section and are: (1) random sampling; (2) proportional allocation, relative count estimate; (3) proportional allocation, Bayesian estimate; and (4) sequential Bayesian allocation. The user is given two options for computation of the estimated mean square error. These are referred to as the cluster calculation option and the segment calculation option. The software for the Proportion Estimate Processor is operational on the IBM 3031 computer.
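Of the four techniques listed above, the Bayesian ones have a particularly compact core: with a Beta prior on the crop proportion, each labeled segment updates the posterior in closed form. A miniature sketch of that sequential updating idea (segment sizes and counts invented; this is not the processor's actual estimator):

    import numpy as np

    rng = np.random.default_rng(11)
    true_p = 0.3                    # true proportion of the scene in the crop

    a, b = 1.0, 1.0                 # uniform Beta(1, 1) prior
    for _ in range(50):             # segments arrive sequentially
        crop_pixels = rng.binomial(100, true_p)   # 100 labeled pixels/segment
        a += crop_pixels
        b += 100 - crop_pixels

    post_mean = a / (a + b)
    post_sd = np.sqrt(a * b / ((a + b)**2 * (a + b + 1)))
    print(f"posterior mean {post_mean:.3f} +/- {post_sd:.4f}")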
Breast tissue decomposition with spectral distortion correction: A postmortem study
Ding, Huanjun; Zhao, Bo; Baturin, Pavlo; Behroozi, Farnaz; Molloi, Sabee
2014-01-01
Purpose: To investigate the feasibility of an accurate measurement of water, lipid, and protein composition of breast tissue using a photon-counting spectral computed tomography (CT) with spectral distortion corrections. Methods: Thirty-eight postmortem breasts were imaged with a cadmium-zinc-telluride-based photon-counting spectral CT system at 100 kV. The energy-resolving capability of the photon-counting detector was used to separate photons into low and high energy bins with a splitting energy of 42 keV. The estimated mean glandular dose for each breast ranged from 1.8 to 2.2 mGy. Two spectral distortion correction techniques were implemented, respectively, on the raw images to correct the nonlinear detector response due to pulse pileup and charge-sharing artifacts. Dual energy decomposition was then used to characterize each breast in terms of water, lipid, and protein content. In the meantime, the breasts were chemically decomposed into their respective water, lipid, and protein components to provide a gold standard for comparison with dual energy decomposition results. Results: The accuracy of the tissue compositional measurement with spectral CT was determined by comparing to the reference standard from chemical analysis. The averaged root-mean-square error in percentage composition was reduced from 15.5% to 2.8% after spectral distortion corrections. Conclusions: The results indicate that spectral CT can be used to quantify the water, lipid, and protein content in breast tissue. The accuracy of the compositional analysis depends on the applied spectral distortion correction technique. PMID:25281953
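Dual energy decomposition with three basis materials, as above, needs a third equation; a common closure is requiring the volume fractions to sum to one. The sketch below solves that 3x3 linear system with illustrative attenuation coefficients (not calibrated values from the study):

    import numpy as np

    # Rows: low energy bin, high energy bin, volume constraint.
    # Columns: water, lipid, protein. Coefficients are assumed, in 1/cm.
    A = np.array([[0.268, 0.226, 0.310],
                  [0.180, 0.160, 0.195],
                  [1.000, 1.000, 1.000]])

    truth = np.array([0.5, 0.3, 0.2])        # 50/30/20 water/lipid/protein
    m = A @ truth                            # simulated measurements + constraint
    fractions = np.linalg.solve(A, m)
    print(dict(zip(["water", "lipid", "protein"], np.round(fractions, 3))))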
Li, Xu; Zhang, Feng; Zhang, Wenzhi; Shang, Xifu; Han, Jintao; Liu, Pengfei
2017-03-01
Technique note. To report a new method for precisely controlling the depth of percutaneous pedicle screws (PPS), without radiation exposure to surgeons and with less fluoroscopy exposure to patients than with conventional methods. PPS placement is widely used in minimally invasive spine surgery; the advantages include reduced muscle damage, pain, and hospital stays. However, placement of PPS demands repeated checking with fluoroscopy; thus, radiation exposure is considerable for both surgeons and patients. The PPS depth was determined by counting rotations of the screws. The distance between screw threads can be measured for a particular screw; thus, each full rotation of the PPS advances the screw into the pedicle by the distance between screw threads. To fully insert a screw into the pedicle, the number of full rotations must equal the number of threads on the PPS. We applied this technique in 58 patients with thoracolumbar fracture. The position and depth of the screws were checked during the operation with the C-arm and after the operation by anteroposterior X-ray film or computed tomography. No additional procedures were required to correct the screws; we observed no neurological deficits or malpositioning of the screws. In the screw placement procedure, the radiation exposure for surgeons is zero, and the patient is well protected from extensive radiation exposure. This method of counting screw rotations is a safe way to precisely determine the depth of PPS in the placement procedure. IV.
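The depth arithmetic described above is simply rotations times thread pitch. A toy check with hypothetical screw dimensions (not taken from the paper):

    def insertion_depth_mm(full_rotations, thread_pitch_mm):
        """Each full rotation advances the screw by one thread pitch."""
        return full_rotations * thread_pitch_mm

    pitch_mm, n_threads = 2.75, 16     # hypothetical pedicle screw
    print(insertion_depth_mm(10, pitch_mm), "mm after 10 full turns")
    print(insertion_depth_mm(n_threads, pitch_mm), "mm when fully seated")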
NASA Astrophysics Data System (ADS)
Jochum, K. P.; Seufert, H. M.
1995-09-01
We have developed new spark source mass spectrometric (SSMS) techniques for simultaneous analysis of platinum-group elements (PGE) together with other trace elements in stony meteorites. We have measured elemental abundances of Rh, Ru, Os, Ir, Pt, Au in carbonaceous chondrites of different types including the two CI chondrites Orgueil and Ivuna. These data are relevant for the determination of solar-system abundances. Whereas the solar-system abundances of most PGE are well known, this is not the case for Rh, and no literature data exist for carbonaceous chondrites, mainly because of analytical difficulties. The SSMS techniques include new calibration procedures and the use of a recently developed multi-ion counting (MIC) system [1]. The mono-isotopic element Rh and the other PGE were determined by using internal standard elements (e.g., Nd, U) that were measured by isotope dilution in the same sample electrode material. The data were calibrated with certified standard solutions of PGE which were doped on trace-element-poor rock samples. Ion abundances were measured using both conventional photoplate detection and the ion-counting technique. The new MIC technique, which uses up to 20 small channeltrons for ion counting measurements, has the advantage of improved precision, detection limits and analysis time compared to photoplate detection. Tab. 1 shows the Rh analyses for the meteorites Orgueil, Ivuna, Murchison, Allende and Karoonda obtained by conventional photoplate detection. These are the first Rh results for carbonaceous chondrites. The data for the two CI chondrites Orgueil and Ivuna are identical and agree within 4% with the CI estimate of Anders and Grevesse [2], which was derived indirectly from analyses of H-chondrites. The PGE Os, Ir, Pt, Au and W, Re, Th, U concentrations were determined by both detection systems. Data obtained with the MIC system are more precise (about 4% for concentrations in the ppb range) than those from the photoplate detection system (about 10-15%). Both data sets agree within error limits. Rhodium correlates well with Pt and other PGE, indicating no significant fractionation between the different types of carbonaceous chondrites (Tab. 1). References: [1] Jochum K. P. et al. (1994) Fresenius J. Anal. Chem., 350, 642-644. [2] Anders E. and Grevesse N. (1989) GCA, 53, 197-214.
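Quantification against an internal standard, as described above, boils down to scaling a measured ion-intensity ratio by the known concentration of the standard element and a relative sensitivity factor. A schematic sketch with invented numbers (not values from the abstract):

    def conc_by_internal_standard(intensity_ratio, c_standard_ppb, rsf=1.0):
        """Analyte concentration from the ion-intensity ratio to an internal
        standard of known (isotope dilution) concentration, times a relative
        sensitivity factor (RSF, assumed 1 here)."""
        return intensity_ratio * c_standard_ppb * rsf

    # Hypothetical: Rh/Nd intensity ratio of 0.4 against 600 ppb Nd.
    print(conc_by_internal_standard(0.4, 600.0), "ppb Rh")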
NO TIME FOR DEAD TIME: TIMING ANALYSIS OF BRIGHT BLACK HOLE BINARIES WITH NuSTAR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bachetti, Matteo; Barret, Didier; Harrison, Fiona A.
Timing of high-count-rate sources with the NuSTAR Small Explorer Mission requires specialized analysis techniques. NuSTAR was primarily designed for spectroscopic observations of sources with relatively low count rates rather than for timing analysis of bright objects. The instrumental dead time per event is relatively long (∼2.5 msec) and varies event-to-event by a few percent. The most obvious effect is a distortion of the white noise level in the power density spectrum (PDS) that cannot be easily modeled with standard techniques due to the variable nature of the dead time. In this paper, we show that it is possible to exploit the presence of two completely independent focal planes and use the cospectrum, the real part of the cross PDS, to obtain a good proxy of the white-noise-subtracted PDS. Thereafter, one can use a Monte Carlo approach to estimate the remaining effects of dead time, namely, a frequency-dependent modulation of the variance and a frequency-independent drop of the sensitivity to variability. In this way, most of the standard timing analysis can be performed, albeit with a sacrifice in signal-to-noise ratio relative to what would be achieved using more standard techniques. We apply this technique to NuSTAR observations of the black hole binaries GX 339–4, Cyg X-1, and GRS 1915+105.
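The cospectrum trick described above relies on noise that is independent between the two detectors, including each module's dead-time-distorted white noise, averaging to zero in the real part of the cross spectrum. A minimal sketch with a common signal and independent Poisson noise (no dead time simulated; normalization conventions omitted):

    import numpy as np

    def cospectrum(lc_a, lc_b, dt):
        """Real part of the cross spectrum of two simultaneous light curves."""
        fa, fb = np.fft.rfft(lc_a), np.fft.rfft(lc_b)
        freqs = np.fft.rfftfreq(lc_a.size, dt)
        return freqs[1:], (fa.conj() * fb).real[1:]   # drop the DC bin

    rng = np.random.default_rng(5)
    t = np.arange(0, 64.0, 1.0 / 64)                  # 64 s sampled at 64 Hz
    mean_rate = 50 + 10 * np.sin(2 * np.pi * 2.0 * t) # common 2 Hz signal
    lc_a = rng.poisson(mean_rate)                     # module A counts
    lc_b = rng.poisson(mean_rate)                     # module B counts
    freqs, cs = cospectrum(lc_a, lc_b, dt=1.0 / 64)
    print("peak frequency:", freqs[np.argmax(cs)], "Hz")   # ~2 Hz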
LPS-induced microvascular leukocytosis can be assessed by blue-field entoptic phenomenon.
Kolodjaschna, Julia; Berisha, Fatmire; Lung, Solveig; Schaller, Georg; Polska, Elzbieta; Jilma, Bernd; Wolzt, Michael; Schmetterer, Leopold
2004-08-01
Administration of low doses of Escherichia coli endotoxin [a lipopolysaccharide (LPS)] to humans enables the study of inflammatory mechanisms. The purpose of the present study was to investigate whether the blue-field entoptic technique may be used to quantify the increase in circulating leukocytes in the ocular microvasculature after LPS infusion. In addition, combined laser Doppler velocimetry and retinal vessel size measurement were used to study red blood cell movement. Twelve healthy male volunteers received 20 IU/kg iv LPS as a bolus infusion. Outcome parameters were measured at baseline and 4 h after LPS administration. In the first protocol (n = 6 subjects), ocular hemodynamic effects were assessed with the blue-field entoptic technique, the retinal vessel analyzer, and laser Doppler velocimetry. In the second protocol (n = 6 subjects), white blood cell (WBC) counts from peripheral blood samples and blue-field entoptic technique measurements were performed. LPS caused peripheral blood leukocytosis and increased WBC density in ocular microvessels (by 49%; P = 0.036) but did not change WBC velocity. In addition, retinal venous diameter was increased (by 9%; P = 0.008), but red blood cell velocity remained unchanged. The LPS-induced changes in retinal WBC density and leukocyte counts were significantly correlated (r = 0.87). The present study indicates that the blue-field entoptic technique can be used to assess microvascular leukocyte recruitment in vivo. In addition, our data indicate retinal venous dilation in response to endotoxin.
An Extremely Low Power Quantum Optical Communication Link for Autonomous Robotic Explorers
NASA Technical Reports Server (NTRS)
Lekki, John; Nguyen, Quang-Viet; Bizon, Tom; Nguyen, Binh; Kojima, Jun
2007-01-01
One concept for planetary exploration involves using many small robotic landers that can cover more ground than a single conventional lander. In addressing this vision, NASA has been challenged in the National Nanotechnology Initiative to research the development of miniature robots built from nano-sized components. These robots have very significant challenges, such as mobility and communication, given the small size and limited power generation capability. The research presented here has been focused on developing a communications system that has the potential for providing ultra-low power communications for robots such as these. In this paper an optical communications technique that is based on transmitting recognizable sets of photons is presented. Previously, pairs of photons that have an entangled quantum state have been shown to be recognizable in ambient light. The main drawback to utilizing entangled photons is that they can only be generated through a very energy inefficient nonlinear process. In this paper a new technique that generates sets of photons from pulsed sources is described and an experimental system demonstrating this technique is presented. This technique of generating photon sets from pulsed sources has the distinct advantage that it is much more flexible and energy efficient, and is well suited to take advantage of the very high energy efficiencies that are possible when using nano scale sources. For these reasons the communication system presented in this paper is well suited for use in very small, low power landers and rovers. In this paper a very low power optical communications system for miniature robots, as small as 1 cu cm, is addressed. The communication system is a variant of photon counting communications. Instead of counting individual photons, the system only counts the arrival of time-coincident sets of photons. Using sets of photons significantly decreases the bit error rate because they are highly identifiable in the presence of ambient light. An experiment demonstrating reliable communication over a distance of 70 meters using less than a billionth of a watt of radiated power is presented. The components used in this system were chosen so that they could in the future be integrated into a cubic centimeter device.
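Picking time-coincident photon sets out of ambient light, as described above, can be illustrated with a two-pointer sweep over sorted detector timestamps. The rates and window below are invented, not taken from the 70 m experiment:

    import numpy as np

    def coincidences(t_a, t_b, window):
        """Count pairs of events from two sorted timestamp streams that fall
        within +/- window seconds of each other (each event used once)."""
        i = j = hits = 0
        while i < len(t_a) and j < len(t_b):
            dt = t_b[j] - t_a[i]
            if dt < -window:
                j += 1
            elif dt > window:
                i += 1
            else:
                hits += 1
                i += 1
                j += 1
        return hits

    rng = np.random.default_rng(9)
    sets = np.sort(rng.uniform(0.0, 1.0, 200))    # 200 transmitted photon sets / s
    bg_a = rng.uniform(0.0, 1.0, 5000)            # ambient photons at detector A
    bg_b = rng.uniform(0.0, 1.0, 5000)            # ambient photons at detector B
    t_a = np.sort(np.concatenate([sets, bg_a]))
    t_b = np.sort(np.concatenate([sets, bg_b]))
    # Expected accidental coincidences ~ Na * Nb * 2 * window / T << 1 here.
    print("coincidences in a 0.5 ns window:", coincidences(t_a, t_b, 0.5e-9))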
Lockhart, M.; Henzlova, D.; Croft, S.; ...
2017-09-20
Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which themselves have only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates encountered in practical applications.
Adult Hematology and Clinical Chemistry Laboratory Reference Ranges in a Zimbabwean Population.
Samaneka, Wadzanai P; Mandozana, Gibson; Tinago, Willard; Nhando, Nehemiah; Mgodi, Nyaradzo M; Bwakura-Dangarembizi, Mutsawashe F; Munjoma, Marshall W; Gomo, Zvenyika A R; Chirenje, Zvavahera M; Hakim, James G
2016-01-01
Laboratory reference ranges used for clinical care and clinical trials in various laboratories in Zimbabwe were derived from textbooks and from research studies conducted more than ten years ago. Periodic verification of these ranges is essential to track changes over time. The purpose of this study was to establish hematology and clinical chemistry laboratory reference ranges using more rigorous methods. A community-based cross-sectional study was carried out in Harare, Chitungwiza, and Mutoko. A multistage sampling technique was used. Samples were transported from the field for analysis at the ISO 15189-certified University of Zimbabwe-University of California San Francisco Central Research Laboratory. Lower and upper reference limits for the hematology and clinical chemistry reference ranges were estimated at the 2.5th and 97.5th percentiles, respectively. A total of 769 adults (54% males) aged 18 to 55 years were included in the analysis. Median age was 28 [IQR: 23-35] years. Males had significantly higher red cell counts, hemoglobin, hematocrit, and mean corpuscular hemoglobin compared to females. Females had higher white cell counts, platelets, absolute neutrophil counts, and absolute lymphocyte counts compared to males. There were no gender differences in eosinophils, monocytes, and absolute basophil count. Males had significantly higher levels of urea, sodium, potassium, calcium, creatinine, amylase, total protein, albumin, and liver enzymes compared to females. Females had higher cholesterol and lipase compared with males. There are notable differences in the white cell counts, neutrophils, cholesterol, and creatine kinase when compared with the currently used reference ranges. Data from this study provide new country-specific reference ranges which should be immediately adopted for routine clinical care and accurate monitoring of adverse events in research studies.
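The nonparametric reference limits referred to above are the 2.5th and 97.5th percentiles of the healthy-population values. A one-function sketch on synthetic data (the distribution below is invented, not the Zimbabwean cohort):

    import numpy as np

    def reference_interval(values, low=2.5, high=97.5):
        """Nonparametric reference limits as sample percentiles."""
        return np.percentile(values, [low, high])

    rng = np.random.default_rng(2016)
    wbc = rng.lognormal(mean=np.log(5.5), sigma=0.25, size=769)  # x10^9/L, synthetic
    lo, hi = reference_interval(wbc)
    print(f"WBC reference interval: {lo:.2f} - {hi:.2f} x10^9/L")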
Dodd, C.K.; Dorazio, R.M.
2004-01-01
A critical variable in both ecological and conservation field studies is determining how many individuals of a species are present within a defined sampling area. Labor intensive techniques such as capture-mark-recapture and removal sampling may provide estimates of abundance, but there are many logistical constraints to their widespread application. Many studies on terrestrial and aquatic salamanders use counts as an index of abundance, assuming that detection remains constant while sampling. If this constancy is violated, determination of detection probabilities is critical to the accurate estimation of abundance. Recently, a model was developed that provides a statistical approach that allows abundance and detection to be estimated simultaneously from spatially and temporally replicated counts. We adapted this model to estimate these parameters for salamanders sampled over a six year period in area-constrained plots in Great Smoky Mountains National Park. Estimates of salamander abundance varied among years, but annual changes in abundance did not vary uniformly among species. Except for one species, abundance estimates were not correlated with site covariates (elevation, soil and water pH, conductivity, air and water temperature). The uncertainty in the estimates was so large as to make correlations ineffectual in predicting which covariates might influence abundance. Detection probabilities also varied among species and sometimes among years for the six species examined. We found such a high degree of variation in our counts and in estimates of detection among species, sites, and years as to cast doubt upon the appropriateness of using count data to monitor population trends using a small number of area-constrained survey plots. Still, the model provided reasonable estimates of abundance that could make it useful in estimating population size from count surveys.
Criado, Ignacio; Muñoz-Criado, Santiago; Rodríguez-Caballero, Arancha; Nieto, Wendy G.; Romero, Alfonso; Fernández-Navarro, Paulino; Alcoceba, Miguel; Contreras, Teresa; González, Marcos; Orfao, Alberto; Almeida, Julia
2017-01-01
Patients diagnosed with chronic lymphocytic leukemia (CLL) display a high incidence of infections due to an associated immunodeficiency that includes hypogammaglobulinemia. A higher risk of infections has also been recently reported for high-count monoclonal B-cell lymphocytosis, while no information is available in low-count monoclonal B-cell lymphocytosis. Here, we evaluated the status of the humoral immune system in patients with chronic lymphocytic leukemia (n=58), as well as in low- (n=71) and high- (n=29) count monoclonal B-cell lymphocytosis versus healthy donors (n=91). Total free plasma immunoglobulin titers and specific levels of antibodies against cytomegalovirus, Epstein-Barr virus, influenza and S. pneumoniae were measured by nephelometry and ELISA-based techniques, respectively. Overall, our results show that both CLL and high-count monoclonal B-cell lymphocytosis patients, but not low-count monoclonal B-cell lymphocytosis subjects, present with relatively high levels of antibodies specific for the latent viruses investigated, associated with progressively lower levels of S. pneumoniae-specific immunoglobulins. These findings probably reflect asymptomatic chronic reactivation of humoral immune responses against host viruses associated with expanded virus-specific antibody levels and progressively decreased protection against other micro-organisms, denoting a severe humoral immunodeficiency state not reflected by the overall plasma immunoglobulin levels. Alternatively, these results could reflect a potential role of ubiquitous viruses in the pathogenesis of the disease. Further analyses are necessary to establish the relevance of such asymptomatic humoral immune responses against host viruses in the expansion of the tumor B-cell clone and progression from monoclonal B-cell lymphocytosis to CLL. PMID:28385786
NASA Astrophysics Data System (ADS)
Croft, Stephen; Favalli, Andrea
2017-10-01
Neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates (singles, doubles, and triples) are measured and, in combination with a simple analytical point-model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, but even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.
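The Dytlewski-type corrections discussed above are not reproduced in these abstracts, so the sketch below shows only the textbook non-paralyzable singles correction, as a minimal illustration of what a dead time correction does. It is emphatically not the Dytlewski or DCF multiplicity algorithm, and the 100 ns dead time in the example is an assumed value.

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Invert m = n / (1 + n * tau) to recover the true rate n from the
    measured rate m under a non-paralyzable dead time tau. Textbook
    singles-rate correction only; NOT the Dytlewski/DCF treatment of
    correlated doubles-through-pents rates."""
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("measured rate exceeds the model's saturation limit")
    return measured_rate / (1.0 - loss)

# Example: 100 kcps measured with an assumed 100 ns dead time
print(true_rate_nonparalyzable(1.0e5, 100e-9))  # ~101,010 true cps
```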
Evaluation of cell proliferation in malignant and potentially malignant oral lesions
Madan, Mani; Chandra, Shaleen; Raj, Vineet; Madan, Rohit
2015-01-01
Aims: To evaluate the cell proliferation rate by the expression of proliferating cell nuclear antigen (PCNA) and argyrophilic nucleolar organizing region (AgNOR) counts and to assess its usefulness as a marker for malignant potential in oral epithelial lesions. Materials and Methods: The study group included 30 cases of leukoplakia, 15 nondysplastic (NDL), 15 dysplastic (DL), 15 cases of oral squamous cell carcinoma (OSCC) and 5 cases of normal oral mucosa. Formalin fixed paraffin embedded tissues were subjected to immunohistochemical staining for PCNA and the AgNOR technique. The PCNA labeling index (LI) and the AgNOR dots were evaluated for the entire sample. Statistical Analysis Used: ANOVA, Tukey honestly significant difference, Pearson's correlation. Results: In this study, the AgNOR count of OSCC was lower than that of the DL lesions; moreover, the AgNOR counts were found to be higher in normal mucosa than in the DL and NDL epithelium. The study results also showed that the mean AgNOR count failed to distinguish between DL and NDL lesions. Overall, we observed increased PCNA expression from normal epithelium to NDL to DL lesions. Conclusions: Based on the findings of the present study on oral epithelial precancerous and cancerous lesions, we conclude that the mean AgNOR count alone cannot be a valuable parameter to distinguish between normal, NDL, and DL epithelium and OSCC; on the other hand, PCNA can be a useful biomarker for delineating normal epithelium from DL epithelium and OSCC. PMID:26980956
NASA Astrophysics Data System (ADS)
Hong, Inki; Cho, Sanghee; Michel, Christian J.; Casey, Michael E.; Schaefferkoetter, Joshua D.
2014-09-01
A new data handling method is presented for improving the image noise distribution and reducing bias when reconstructing very short frames from low count dynamic PET acquisition. The new method, termed 'Complementary Frame Reconstruction' (CFR), involves the indirect formation of a count-limited emission image in a short frame through subtraction of two frames with longer acquisition time, where the short time frame data is excluded from the second long frame data before the reconstruction. This approach can be regarded as an alternative to the AML algorithm recently proposed by Nuyts et al. as a method to reduce the bias for the maximum likelihood expectation maximization (MLEM) reconstruction of count limited data. CFR uses long scan emission data to stabilize the reconstruction and avoids modification of algorithms such as MLEM. The subtraction between two long frame images naturally allows negative voxel values and significantly reduces bias introduced in the final image. Simulations based on phantom and clinical data were used to evaluate the accuracy of the reconstructed images to represent the true activity distribution. Applicability to determine the arterial input function in human and small animal studies is also explored. In situations with limited count rate, e.g. pediatric applications, gated abdominal or cardiac studies, or when using limited doses of short-lived isotopes such as 15O-water, the proposed method will likely be preferred over independent frame reconstruction to address bias and noise issues.
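A minimal sketch of the CFR idea as described above, assuming a placeholder reconstruction operator: the short-frame image is formed as the difference of two long-frame reconstructions rather than reconstructed directly. The `recon` stand-in and the toy sinogram shapes are illustrative assumptions, not the authors' MLEM implementation.

```python
import numpy as np

def cfr_short_frame(recon, counts_full_long, counts_short):
    """CFR sketch: the short-frame image is the difference between the
    reconstruction of the full long frame and the reconstruction of the
    long frame with the short-frame counts excluded. `recon` is a
    placeholder for any sinogram-to-image reconstruction (e.g. MLEM)."""
    img_long = recon(counts_full_long)
    img_complement = recon(counts_full_long - counts_short)
    return img_long - img_complement  # may legitimately contain negative voxels

# Toy stand-ins: a trivial "reconstruction" and Poisson sinogram counts
rng = np.random.default_rng(0)
recon = lambda y: y.astype(float)          # identity placeholder, not MLEM
short = rng.poisson(0.5, size=(64, 64))    # counts in the short window
rest = rng.poisson(50.0, size=(64, 64))    # counts in the rest of the long frame
short_img = cfr_short_frame(recon, rest + short, short)
```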
Determining the Uncertainty of X-Ray Absorption Measurements
Wojcik, Gary S.
2004-01-01
X-ray absorption (or more properly, x-ray attenuation) techniques have been applied to study the moisture movement in and moisture content of materials like cement paste, mortar, and wood. An increase in the number of x-ray counts with time at a location in a specimen may indicate a decrease in moisture content. The uncertainty of measurements from an x-ray absorption system, which must be known to properly interpret the data, is often assumed to be the square root of the number of counts, as in a Poisson process. No detailed studies have heretofore been conducted to determine the uncertainty of x-ray absorption measurements or the effect of averaging data on the uncertainty. In this study, the Poisson estimate was found to adequately approximate normalized root mean square errors (a measure of uncertainty) of counts for point measurements and profile measurements of water specimens. The Poisson estimate, however, was not reliable in approximating the magnitude of the uncertainty when averaging data from paste and mortar specimens. Changes in uncertainty from differing averaging procedures were well-approximated by a Poisson process. The normalized root mean square errors decreased when the x-ray source intensity, integration time, collimator size, and number of scanning repetitions increased. Uncertainties in mean paste and mortar count profiles were kept below 2 % by averaging vertical profiles at horizontal spacings of 1 mm or larger with counts per point above 4000. Maximum normalized root mean square errors did not exceed 10 % in any of the tests conducted. PMID:27366627
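A quick numerical check of the Poisson estimate discussed above: for repeated measurements of N counts, the normalized RMS error should approach 1/sqrt(N). The sketch below uses 4000 counts per point, matching the paper's threshold for keeping profile uncertainties below 2%.

```python
import numpy as np

rng = np.random.default_rng(1)
mean_counts = 4000                      # counts per point, the paper's threshold
repeats = rng.poisson(mean_counts, size=10_000)

# Empirical normalized RMS error of the repeated count measurements...
nrmse_empirical = np.sqrt(np.mean((repeats - mean_counts) ** 2)) / mean_counts
# ...against the Poisson estimate sqrt(N) / N = 1 / sqrt(N)
nrmse_poisson = 1.0 / np.sqrt(mean_counts)

print(f"empirical {nrmse_empirical:.3%} vs Poisson {nrmse_poisson:.3%}")  # both ~1.6%
```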
Azimuthal Structure of the Sand Erg that Encircles the North Polar Water-Ice Cap
NASA Astrophysics Data System (ADS)
Teodoro, L. A.; Elphic, R. C.; Eke, V. R.; Feldman, W. C.; Maurice, S.; Pathare, A.
2011-12-01
The sand erg that completely encircles the perennial water-ice cap covering the Martian north geographic pole displays considerable azimuthal structure as seen in visible and near-IR images. Much of this structure is associated with the terminations of the many steep spiral troughs that cut the approximately 3 km thick polar ice cap. Other contributions come from the katabatic winds that spill over steep-sided edges of the cap, such as the edge that bounds the largest set of dunes comprising Olympia Undae. During the spring and summer months, these winds, which initiate at higher altitudes containing sublimating CO2 ice and are therefore very cold and dry, heat adiabatically as they compress while losing altitude. These winds should then remove H2O moisture from the uppermost layer of the sand dunes directly in their path. Two likely locations where this desiccation may occur preferentially are at the termination of Chasma Boreale and at the ice cap edge at Olympia Undae. We will search for this effect by sharpening the spatial structure of the epithermal neutron counting rates measured at northern high latitudes using the Mars Odyssey Neutron Spectrometer (MONS). The epithermal range of neutron energies is nearly uniquely sensitive to the hydrogen content of surface soils, which should likely be in the form of H2O/OH molecules/radicals. We therefore convert epithermal counting rates to Water-Equivalent-Hydrogen, WEH. However, MONS counting-rate data have a FWHM of ~550 km, which is sufficiently broad to prevent a close association of WEH variability with images of geological features. In this study, we reduce spurious features in the instrument-smeared neutron counting rates through deconvolution. We choose the PIXON numerical deconvolution technique for this purpose. This technique uses a statistical approach (Pina 2001, Eke 2001) that is capable of removing spurious features in the data in the presence of noise. We have previously carried out a detailed study of the martian polar regions applying such a methodology to Martian epithermal neutrons (e.g. Teodoro 2010, 2011). In the present study, we will apply this technique to the recent reanalysis of MONS epithermal data (Maurice et al., 2011), which is marked by significantly lower statistical and systematic uncertainties than those that have plagued older versions of these data.
Heer, D M; Passel, J F
1987-01-01
This article compares 2 different methods for estimating the number of undocumented Mexican adults in Los Angeles County. The 1st method, the survey-based method, uses a combination of 1980 census data and the results of a survey conducted in Los Angeles County in 1980 and 1981. A sample was selected from babies born in Los Angeles County who had a mother or father of Mexican origin. The survey included questions about the legal status of the baby's parents and certain other relatives. The resulting estimates of undocumented Mexican immigrants are for males aged 18-44 and females aged 18-39. The 2nd method, the residual method, involves comparison of census figures for aliens counted with estimates of legally resident aliens developed principally with data from the Immigration and Naturalization Service (INS). For this study, estimates by age, sex, and period of entry were produced for persons born in Mexico and living in Los Angeles County. The results of this research indicate that it is possible to measure undocumented immigration with different techniques, yet obtain results that are similar. Both techniques presented here are limited in that they represent estimates of undocumented aliens based on the 1980 census. The number of additional undocumented aliens not counted remains a subject of conjecture. The fact that the survey-based estimate (228,700) is of the same order as the residual estimate (317,800) suggests that the number of undocumented aliens not counted in the census may not be an extremely large fraction of the undocumented population. The survey-based estimates have some significant advantages over the residual estimates. The survey provides tabulations of the undocumented population by characteristics other than the limited demographic information provided by the residual technique. On the other hand, the survey-based estimates require that a survey be conducted and, if national or regional estimates are called for, they may require a number of surveys. The residual technique also requires a data source other than the census, and the INS discontinued the annual registration of aliens after 1981. Thus, estimates of undocumented aliens based on the residual technique will probably not be possible for subnational areas using the 1990 census unless the registration program is reinstituted. Perhaps the best information on the undocumented population in the 1990 census will come from an improved version of the survey-based technique described here applied in selected local areas.
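The residual method described above is, at its core, a subtraction performed cell by cell. The figures in the sketch below are hypothetical placeholders for one age-sex-period-of-entry cell, not the study's actual tabulations.

```python
def residual_undocumented(census_alien_count, estimated_legal_aliens):
    """Residual method: aliens counted in the census minus legally
    resident aliens estimated from INS registration data. The inputs
    below are hypothetical, not the study's Los Angeles County figures."""
    return census_alien_count - estimated_legal_aliens

# Hypothetical single age-sex-period-of-entry cell
print(residual_undocumented(52_000, 31_000))  # 21,000 undocumented
```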
NASA Astrophysics Data System (ADS)
Scott, K. S.; Yun, M. S.; Wilson, G. W.; Austermann, J. E.; Aguilar, E.; Aretxaga, I.; Ezawa, H.; Ferrusca, D.; Hatsukade, B.; Hughes, D. H.; Iono, D.; Giavalisco, M.; Kawabe, R.; Kohno, K.; Mauskopf, P. D.; Oshima, T.; Perera, T. A.; Rand, J.; Tamura, Y.; Tosaki, T.; Velazquez, M.; Williams, C. C.; Zeballos, M.
2010-07-01
We present the first results from a confusion-limited map of the Great Observatories Origins Deep Survey-South (GOODS-S) taken with the AzTEC camera on the Atacama Submillimeter Telescope Experiment. We imaged a field to a 1σ depth of 0.48-0.73 mJy beam-1, making this one of the deepest blank-field surveys at mm-wavelengths ever achieved. Although by traditional standards our GOODS-S map is extremely confused due to a sea of faint underlying sources, we demonstrate through simulations that our source identification and number counts analyses are robust, and the techniques discussed in this paper are relevant for other deeply confused surveys. We find a total of 41 dusty starburst galaxies with signal-to-noise ratios S/N >= 3.5 within this uniformly covered region, where only two are expected to be false detections, and an additional seven robust source candidates located in the noisier (1σ ~ 1 mJy beam-1) outer region of the map. We derive the 1.1 mm number counts from this field using two different methods, a fluctuation or "P(d)" analysis and a semi-Bayesian technique, and find that both methods give consistent results. Our data are well fit by a Schechter function model. Given the depth of this survey, we put the first tight constraints on the 1.1 mm number counts at S1.1mm = 0.5 mJy, and we find evidence that the faint end of the number counts from various SCUBA surveys towards lensing clusters is biased high. In contrast to the 870 μm survey of this field with the LABOCA camera, we find no apparent underdensity of sources compared to previous surveys at 1.1 mm; the estimates of the number counts of SMGs at flux densities >1 mJy determined here are consistent with those measured from the AzTEC/SHADES survey. Additionally, we find a significant number of SMGs not identified in the LABOCA catalogue. We find that in contrast to observations at λ <= 500 μm, MIPS 24 μm sources do not resolve the total energy density in the cosmic infrared background at 1.1 mm, demonstrating that a population of z >~ 3 dust-obscured galaxies that are unaccounted for at these shorter wavelengths potentially contribute to a large fraction (~2/3) of the infrared background at 1.1 mm.
Cho, H-M; Ding, H; Ziemer, B P; Molloi, S
2014-12-07
Accurate energy calibration is critical for the application of energy-resolved photon-counting detectors in spectral imaging. The aim of this study is to investigate the feasibility of energy response calibration and characterization of a photon-counting detector using x-ray fluorescence. A comprehensive Monte Carlo simulation study was performed using Geant4 Application for Tomographic Emission (GATE) to investigate the optimal technique for x-ray fluorescence calibration. Simulations were conducted using a 100 kVp tungsten-anode spectrum with a 2.7 mm Al filter for a single pixel cadmium telluride (CdTe) detector with a 3 × 3 mm² detection area. The angular dependence of x-ray fluorescence and scatter background was investigated by varying the detection angle from 20° to 170° with respect to the beam direction. The effects of the detector material, shape, and size on the recorded x-ray fluorescence were investigated. The fluorescent material size effect was considered with and without the container for the fluorescent material. In order to provide validation for the simulation result, the angular dependence of x-ray fluorescence from five fluorescent materials was experimentally measured using a spectrometer. Finally, eleven of the fluorescent materials were used for energy calibration of a CZT-based photon-counting detector. The optimal detection angle was determined to be approximately 120° with respect to the beam direction, which showed the highest fluorescence to scatter ratio (FSR) with a weak dependence on the fluorescent material size. The feasibility of x-ray fluorescence for energy calibration of photon-counting detectors in the diagnostic x-ray energy range was verified by successfully calibrating the energy response of a CZT-based photon-counting detector. The results of this study can be used as a guideline to implement the x-ray fluorescence calibration method for photon-counting detectors in a typical imaging laboratory.
Ward-Paige, Christine; Mills Flemming, Joanna; Lotze, Heike K.
2010-01-01
Background Increasingly, underwater visual censuses (UVC) are used to assess fish populations. Several studies have demonstrated the effectiveness of protected areas for increasing fish abundance or provided insight into the natural abundance and structure of reef fish communities in remote areas. Recently, high apex predator densities (>100,000 individuals·km−2) and biomasses (>4 tonnes·ha−1) have been reported for some remote islands suggesting the occurrence of inverted trophic biomass pyramids. However, few studies have critically evaluated the methods used for sampling conspicuous and highly mobile fish such as sharks. Ideally, UVC are done instantaneously, however, researchers often count animals that enter the survey area after the survey has started, thus performing non-instantaneous UVC. Methodology/Principal Findings We developed a simulation model to evaluate counts obtained by divers deploying non-instantaneous belt-transect and stationary-point-count techniques. We assessed how fish speed and survey procedure (visibility, diver speed, survey time and dimensions) affect observed fish counts. Results indicate that the bias caused by fish speed alone is huge, while survey procedures had varying effects. Because the fastest fishes tend to be the largest, the bias would have significant implications on their biomass contribution. Therefore, caution is needed when describing abundance, biomass, and community structure based on non-instantaneous UVC, especially for highly mobile species such as sharks. Conclusions/Significance Based on our results, we urge that published literature state explicitly whether instantaneous counts were made and that survey procedures be accounted for when non-instantaneous counts are used. Using published density and biomass values of communities that include sharks we explore the effect of this bias and suggest that further investigation may be needed to determine pristine shark abundances and the existence of inverted biomass pyramids. Because such studies are used to make important management and conservation decisions, incorrect estimates of animal abundance and biomass have serious and significant implications. PMID:20661304
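A deliberately crude 1-D cartoon of the non-instantaneous-count bias discussed above, assuming a diver who sweeps a line and counts every fish that passes (or is passed by) the sweep point. The paper's 2-D belt-transect and stationary-point-count simulations are richer, so treat this only as intuition for how counts inflate with fish speed.

```python
import numpy as np

rng = np.random.default_rng(2)

def transect_count(fish_speed, diver_speed=1.0, length=100.0,
                   density=1.0, pad=500.0, n_trials=200):
    """1-D cartoon of a non-instantaneous belt transect: a diver sweeps
    from 0 to `length`; a fish is counted whenever it and the diver
    pass each other during the sweep. An instantaneous census would
    count density * length fish on average."""
    t_end = length / diver_speed
    counts = []
    for _ in range(n_trials):
        n_fish = rng.poisson(density * (length + 2 * pad))
        x0 = rng.uniform(-pad, length + pad, n_fish)
        direction = rng.choice([-1.0, 1.0], n_fish)
        rel_start = x0 - 0.0                                    # fish minus diver, t = 0
        rel_end = x0 + direction * fish_speed * t_end - length  # fish minus diver, t = end
        counts.append(np.sum(rel_start * rel_end < 0))          # sign flip = they crossed
    return np.mean(counts)

true_count = 1.0 * 100.0  # density * length
for v in [0.0, 0.5, 2.0, 4.0]:
    print(v, transect_count(v) / true_count)  # in this cartoon, counts inflate once fish outpace the diver
```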
NASA Astrophysics Data System (ADS)
Favalli, Andrea; Iliev, Metodi; Ianakiev, Kiril; Hunt, Alan W.; Ludewigt, Bernhard
2018-01-01
High-energy delayed γ-ray spectroscopy is a potential technique for directly assaying spent fuel assemblies and achieving the safeguards goal of quantifying nuclear material inventories for spent fuel handling, interim storage, reprocessing facilities, repository sites, and final disposal. Requirements for the γ-ray detection system, up to ∼6 MeV, can be summarized as follows: high efficiency at high γ-ray energies, high energy resolution, good linearity between γ-ray energy and output signal amplitude, ability to operate at very high count rates, and ease of use in industrial environments such as nuclear facilities. High Purity Germanium Detectors (HPGe) are the state of the art and provide excellent energy resolution but are limited in their count rate capability. Lanthanum Bromide (LaBr3) scintillation detectors offer significantly higher count rate capabilities at lower energy resolution. Thus, LaBr3 detectors may be an effective alternative for nuclear spent-fuel applications, where count-rate capability is a requirement. This paper documents the measured performance of a 2" (length) × 2" (diameter) LaBr3 scintillation detector system, coupled to a negatively biased PMT and a tapered active high voltage divider, with count rates up to ∼3 Mcps. An experimental methodology was developed that uses the average current from the PMT's anode and a dual source method to characterize the detector system at specific very high count rate values. Delayed γ-ray spectra were acquired with the LaBr3 detector system at the Idaho Accelerator Center, Idaho State University, where samples of ∼3 g of 235U were irradiated with moderated neutrons from a photo-neutron source. Results of the spectroscopy characterization and analysis of the delayed γ-ray spectra acquired indicate the possible use of LaBr3 scintillation detectors when high count rate capability may outweigh the lower energy resolution.
What's better than the pill, vasectomy, celibacy and rhythm?
Rorvik, D M
1975-01-01
A report of a new technique of male contraception involves the use of heat, which lowers the sperm count. Early reports of lowered sperm counts in men wearing jockstraps, or increased sperm counts in men whose testicles have been cooled several degrees, have led to experimentation in the male rat on the effects of heat and ultrasound on the sperm count and the ability to fertilize the female. 250 male rats were divided into 5 groups: 1) control, 2) a 60 degree C water circulating testicle cup with 15 minutes exposure, 3) exposure to radiant energy for 15 minutes raising scrotal temperatures to 60 degrees C, 4) exposure to microwaves of varying powers, and 5) exposure to ultrasound of 1 w/cm to 2 w/cm for 1 minute. Group 2 results indicated that libido was uninhibited, testosterone levels undisturbed and organ sizes unaffected by the hot-water treatment. It took 30-35 days for any pregnancies to occur after a single treatment. In group 3, results were substantially the same except that it took 60-75 days for any pregnancies to occur. In group 4, a 20% exposure to radiation for 5 minutes impaired fertility for 65-80 days, while those exposed to 20% for 15 minutes were still infertile at the end of the 10-month study. In these cases libido was also unimpaired. Animals in group 5, exposed to 1 w/cm for 1 minute, had impaired fertility for 150-210 days although testicular temperature rose to only 38 degrees C. When exposure to ultrasound was doubled, fertility was impaired throughout the study. In all cases where fertility was restored, resulting offspring appeared normal and were themselves capable of reproducing normal-appearing offspring. Ultrasound was considered the most promising heat source. In all cases the germinal epithelium function is arrested. Future research should be directed to making the ultrasound technique more finely tuned to a contraceptive function.
Parameter Estimation in Astronomy with Poisson-Distributed Data. I. The χ²γ Statistic
NASA Technical Reports Server (NTRS)
Mighell, Kenneth J.
1999-01-01
Applying the standard weighted mean formula, $[\sum_i n_i \sigma_i^{-2}]/[\sum_i \sigma_i^{-2}]$, to determine the weighted mean of data, $n_i$, drawn from a Poisson distribution, will, on average, underestimate the true mean by approximately 1 for all true mean values larger than approximately 3 when the common assumption is made that the error of the $i$th observation is $\sigma_i = \max(\sqrt{n_i}, 1)$. This small but statistically significant offset explains the long-known observation that chi-square minimization techniques which use the modified Neyman's $\chi^2$ statistic, $\chi^2_N \equiv \sum_i (n_i - y_i)^2/\max(n_i, 1)$, to compare Poisson-distributed data with model values, $y_i$, will typically predict a total number of counts that underestimates the true total by about 1 count per bin. Based on my finding that the weighted mean of data drawn from a Poisson distribution can be determined using the formula $[\sum_i (n_i + \min(n_i, 1))(n_i + 1)^{-1}]/[\sum_i (n_i + 1)^{-1}]$, I propose that a new $\chi^2$ statistic, $\chi^2_\gamma \equiv \sum_i [n_i + \min(n_i, 1) - y_i]^2/(n_i + 1)$, should always be used to analyze Poisson-distributed data in preference to the modified Neyman's $\chi^2$ statistic. I demonstrate the power and usefulness of $\chi^2_\gamma$ minimization by using two statistical fitting techniques and five $\chi^2$ statistics to analyze simulated X-ray power-law 15-channel spectra with large and small counts per bin. I show that $\chi^2_\gamma$ minimization with the Levenberg-Marquardt or Powell's method can produce excellent results (mean slope errors below about 3%) with spectra having as few as 25 total counts.
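A direct implementation of the two statistics as given above (following Mighell 1999), useful for checking the per-bin bias claim on simulated Poisson data.

```python
import numpy as np

def chi2_gamma(n, y):
    """Mighell's statistic: sum_i [n_i + min(n_i, 1) - y_i]^2 / (n_i + 1)."""
    n = np.asarray(n, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.sum((n + np.minimum(n, 1.0) - y) ** 2 / (n + 1.0)))

def chi2_neyman_modified(n, y):
    """Modified Neyman statistic: sum_i (n_i - y_i)^2 / max(n_i, 1)."""
    n = np.asarray(n, dtype=float)
    y = np.asarray(y, dtype=float)
    return float(np.sum((n - y) ** 2 / np.maximum(n, 1.0)))

# Simulated 15-channel Poisson spectrum against a flat 5-count model
rng = np.random.default_rng(3)
counts = rng.poisson(5.0, size=15)
print(chi2_gamma(counts, 5.0), chi2_neyman_modified(counts, 5.0))
```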
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, H; Cho, H; Molloi, S
Purpose: To investigate the feasibility of energy response calibration of a Si strip photon-counting detector by using the x-ray fluorescence technique. Methods: X-ray fluorescence was generated by using a pencil beam from a tungsten anode x-ray tube with 2 mm Al filtration. Spectra were acquired at 90° from the primary beam direction with an energy-resolved photon-counting detector based on Si strips. The distances from the source to target and the target to detector were approximately 19 and 11 cm, respectively. Four different materials, containing Ag, I, Ba, and Gd, were placed in small plastic aliquots with a diameter of approximately 0.7 cm for x-ray fluorescence measurements. Linear regression analysis was performed to derive the gain and offset values for the correlation between the measured fluorescence peak center and the known energies for materials. The energy resolution was derived from the full width at half maximum (FWHM) of the fluorescence peaks. In addition, the angular dependence of the recorded fluorescence spectra was studied at 30°, 60°, and 120°. Results: Strong fluorescence signals of all four target materials were recorded with the investigated geometry for the Si strip detector. The recorded pulse height was calibrated with respect to photon energy and the gain and offset values were calculated to be 7.0 mV/keV and −69.3 mV, respectively. Negligible variation in energy calibration was observed among the four energy thresholds. The variation among different pixels was estimated to be approximately 1 keV. The energy resolution of the detector was estimated to be 7.9% within the investigated energy range. Conclusion: The performance of a spectral imaging system using energy-resolved photon-counting detectors is very dependent on the energy calibration of the detector. The proposed x-ray fluorescence technique provides an accurate and efficient way to calibrate the energy response of a photon-counting detector.
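The gain/offset regression described above is an ordinary least-squares line through (energy, peak center) pairs. In the sketch below the K-alpha energies are nominal literature values and the peak centers are hypothetical numbers back-computed from the reported 7.0 mV/keV gain and -69.3 mV offset, purely for illustration.

```python
import numpy as np

# Nominal K-alpha fluorescence energies (keV) for the four targets
energies_keV = np.array([22.2, 28.6, 32.2, 43.0])      # Ag, I, Ba, Gd

# Hypothetical measured peak centers (mV), back-computed for illustration
# from the reported gain (7.0 mV/keV) and offset (-69.3 mV)
peak_centers_mV = np.array([86.1, 130.9, 156.1, 231.7])

# Linear regression: pulse height = gain * energy + offset
gain, offset = np.polyfit(energies_keV, peak_centers_mV, deg=1)
print(f"gain = {gain:.2f} mV/keV, offset = {offset:.1f} mV")

# Inverting the calibration converts a threshold setting to energy
to_keV = lambda mv: (mv - offset) / gain
print(f"150 mV threshold -> {to_keV(150.0):.1f} keV")
```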
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saha, K; Barbarits, J; Humenik, R
Purpose: Chang's mathematical formulation is a common method of attenuation correction applied to reconstructed Jaszczak phantom images. Though Chang's attenuation correction method has been used for 360° angle acquisition, its applicability to 180° angle acquisition remains a question, with one vendor's camera software producing artifacts. The objective of this work is to ensure that Chang's attenuation correction technique can be applied to reconstructed Jaszczak phantom images acquired in both 360° and 180° modes. Methods: The Jaszczak phantom filled with 20 mCi of diluted Tc-99m was placed on the patient table of Siemens e.cam™ (n = 2) and Siemens Symbia™ (n = 1) dual head gamma cameras, centered both in lateral and axial directions. A total of 3 scans were done at 180° and 2 scans at 360° orbit acquisition modes. Thirty two million counts were acquired for both modes. Reconstruction of the projection data was performed using filtered back projection smoothed with a pre-reconstruction Butterworth filter (order: 6, cutoff: 0.55). Reconstructed transaxial slices were attenuation corrected by Chang's attenuation correction technique as implemented in the camera software. Corrections were also done using a modified technique where photon path lengths for all possible attenuation paths through a pixel in the image space were added to estimate the corresponding attenuation factor. The inverse of the attenuation factor was utilized to correct the attenuated pixel counts. Results: Comparable uniformity and noise were observed for 360° acquired phantom images attenuation corrected by the vendor technique (28.3% and 7.9%) and the proposed technique (26.8% and 8.4%). The difference in uniformity for 180° acquisition between the proposed technique (22.6% and 6.8%) and the vendor technique (57.6% and 30.1%) was more substantial. Conclusion: Assessment of attenuation correction performance by phantom uniformity analysis illustrated improved uniformity with the proposed algorithm compared to the camera software.
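For context, a minimal sketch of the classic first-order Chang correction for a uniform circular phantom: average the attenuation survival factor over projection angles at each pixel, then invert. This is a generic illustration, not the vendor's implementation or the authors' modified path-length technique.

```python
import numpy as np

def chang_correction(shape, radius, mu, n_angles=64, pixel_size=0.2):
    """First-order Chang map for a uniform circular attenuator: at each
    pixel, average exp(-mu * path-to-boundary) over projection angles,
    then invert. Units: radius and pixel_size in cm, mu in 1/cm."""
    ny, nx = shape
    yy, xx = np.mgrid[:ny, :nx]
    px = (xx - nx / 2 + 0.5) * pixel_size
    py = (yy - ny / 2 + 0.5) * pixel_size
    r2 = px**2 + py**2
    inside = r2 < radius**2
    survival = np.zeros(shape)
    for theta in np.linspace(0.0, 2.0 * np.pi, n_angles, endpoint=False):
        ux, uy = np.cos(theta), np.sin(theta)
        pu = px * ux + py * uy
        # Distance from each pixel to the circle boundary along (ux, uy)
        path = -pu + np.sqrt(np.maximum(pu**2 + radius**2 - r2, 0.0))
        survival += np.exp(-mu * path)
    survival /= n_angles
    return np.where(inside, 1.0 / survival, 1.0)

# 22 cm diameter phantom, mu ~ 0.15 /cm for Tc-99m in water, 2 mm pixels
corr = chang_correction((128, 128), radius=11.0, mu=0.15)
print(corr.max())  # largest boost at the phantom centre, ~exp(0.15 * 11) ~ 5.2
```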
Multi-core fiber amplifier arrays for intra-satellite links
NASA Astrophysics Data System (ADS)
Kechagias, Marios; Crabb, Jonathan; Stampoulidis, Leontios; Farzana, Jihan; Kehayas, Efstratios; Filipowicz, Marta; Napierala, Marek; Murawski, Michal; Nasilowski, Tomasz; Barbero, Juan
2017-09-01
In this paper we present erbium-doped fibre (EDF) amplifiers aimed at signal amplification within satellite photonic payload systems operating in the C telecommunication band. In such volume-hungry applications, the use of advanced optical transmission techniques such as space division multiplexing (SDM) can be advantageous to reduce the component and cable count.
Optical remote sensing for forest area estimation
Randolph H. Wynne; Richard G. Oderwald; Gregory A. Reams; John A. Scrivani
2000-01-01
The air photo dot-count method is now widely and successfully used for estimating operational forest area in the USDA Forest Inventory and Analysis (FIA) program. Possible alternatives that would provide for more frequent updates, spectral change detection, and maps of forest area include the AVHRR calibration center technique and various Landsat TM classification...
A Classroom Technique for Demonstrating Negative Attitudes toward Aging.
ERIC Educational Resources Information Center
Panek, Paul E.
1984-01-01
Students ask five individuals what three terms or words come to mind when they hear the term old person. The student prepares an overall list and frequency count of responses. Students present their findings to the class. Each response is discussed and its connotation (e.g., negative, positive, neutral) determined. (RM)
ERIC Educational Resources Information Center
Sullivan, Megan
2005-01-01
If you are an athlete or sports enthusiast, you know that every second counts. To find that 1-2% improvement that can make the difference between 1st and 5th place, sport biomechanists use science to investigate sports techniques and equipment, seeking ways to improve athlete performance and reduce injury risk. In essence, they want athletes to…
Reliability and precision of pellet-group counts for estimating landscape-level deer density
David S. deCalesta
2013-01-01
This study provides hitherto unavailable methodology for reliably and precisely estimating deer density within forested landscapes, enabling quantitative rather than qualitative deer management. Reliability and precision of the deer pellet-group technique were evaluated in 1 small and 2 large forested landscapes. Density estimates, adjusted to reflect deer harvest and...
Novel permutation measures for image encryption algorithms
NASA Astrophysics Data System (ADS)
Abd-El-Hafiz, Salwa K.; AbdElHaleem, Sherif H.; Radwan, Ahmed G.
2016-10-01
This paper proposes two measures for the evaluation of permutation techniques used in image encryption. First, a general mathematical framework for describing the permutation phase used in image encryption is presented. Using this framework, six different permutation techniques, based on chaotic and non-chaotic generators, are described. The two new measures are, then, introduced to evaluate the effectiveness of permutation techniques. These measures are (1) Percentage of Adjacent Pixels Count (PAPC) and (2) Distance Between Adjacent Pixels (DBAP). The proposed measures are used to evaluate and compare the six permutation techniques in different scenarios. The permutation techniques are applied on several standard images and the resulting scrambled images are analyzed. Moreover, the new measures are used to compare the permutation algorithms on different matrix sizes irrespective of the actual parameters used in each algorithm. The analysis results show that the proposed measures are good indicators of the effectiveness of the permutation technique.
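The exact definitions of PAPC and DBAP are given in the paper; the sketch below encodes one plausible reading (the fraction of originally adjacent pixel pairs that remain neighbours, and the mean distance between their destinations) and should be treated as an assumed interpretation rather than the published formulas.

```python
import numpy as np

def permutation_measures(dest_rows, dest_cols):
    """For each source pixel (i, j), dest_rows/dest_cols give its
    position after permutation. Assumed readings: PAPC = percentage of
    horizontally adjacent source pairs whose destinations are still
    8-neighbours (lower is better); DBAP = mean Euclidean distance
    between those destinations (higher is better)."""
    dr = dest_rows[:, 1:] - dest_rows[:, :-1]
    dc = dest_cols[:, 1:] - dest_cols[:, :-1]
    dist = np.hypot(dr, dc)
    papc = 100.0 * np.mean(dist <= np.sqrt(2.0))
    dbap = float(dist.mean())
    return papc, dbap

# Toy evaluation: a uniformly random permutation of a 64 x 64 image grid
rng = np.random.default_rng(4)
n = 64
flat = rng.permutation(n * n)
rows, cols = np.divmod(flat, n)
print(permutation_measures(rows.reshape(n, n), cols.reshape(n, n)))
```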
NASA Astrophysics Data System (ADS)
Babick, Frank; Mielke, Johannes; Wohlleben, Wendel; Weigel, Stefan; Hodoroaba, Vasile-Dan
2016-06-01
Currently established and projected regulatory frameworks require the classification of materials (whether nano or non-nano) as specified by respective definitions, most of which are based on the size of the constituent particles. This brings up the question if currently available techniques for particle size determination are capable of reliably classifying materials that potentially fall under these definitions. In this study, a wide variety of characterisation techniques, including counting, fractionating, and spectroscopic techniques, has been applied to the same set of materials under harmonised conditions. The selected materials comprised well-defined quality control materials (spherical, monodisperse) as well as industrial materials of complex shapes and considerable polydispersity. As a result, each technique could be evaluated with respect to the determination of the number-weighted median size. Recommendations on the most appropriate and efficient use of techniques for different types of material are given.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tarisien, M.; Plaisir, C.; Gobet, F.
2011-02-15
We present a stand-alone system to characterize the high-energy particles emitted in the interaction of ultrahigh intensity laser pulses with matter. According to the laser and target characteristics, electrons or protons are produced with energies higher than a few mega electron volts. Selected material samples can, therefore, be activated via nuclear reactions. A multidetector, named NATALIE, has been developed to count the β+ activity of these irradiated samples. The coincidence technique used, designed in an integrated system, results in very low background in the data, which is required for low activity measurements. It therefore allows good precision on the nuclear activation yields of the produced radionuclides. The system allows high counting rates and online correction of the dead time. It also provides, online, a quick control of the experiment. Geant4 simulations are used at different steps of the data analysis to deduce, from the measured activities, the energy and angular distributions of the laser-induced particle beams. Two applications are presented to illustrate the characterization of electrons and protons.
Pulse pile-up in hard X-ray detector systems. [for solar X-rays
NASA Technical Reports Server (NTRS)
Datlowe, D. W.
1975-01-01
When pulse-height spectra are measured by a nuclear detection system at high counting rates, the probability that two or more pulses will arrive within the resolving time of the system is significant. This phenomenon, pulse pile-up, distorts the pulse-height spectrum and must be considered in the interpretation of spectra taken at high counting rates. A computational technique for the simulation of pile-up is developed. The model is examined in the three regimes where (1) the time between pulses is long compared to the detector-system resolving time, (2) the time between pulses is comparable to the resolving time, and (3) many pulses occur within the resolving time. The technique is used to model the solar hard X-ray experiment on the OSO-7 satellite; comparison of the model with data taken during three large flares shows excellent agreement. The paper also describes rule-of-thumb tests for pile-up and identifies the important detector design factors for minimizing pile-up, i.e., thick entrance windows and short resolving times in the system electronics.
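A toy version of the pile-up simulation idea: generate Poisson arrivals, then merge any run of pulses closer than the resolving time into one recorded event with the summed amplitude. The exponential pulse-height spectrum and the rate/resolving-time values are arbitrary assumptions, not the OSO-7 model.

```python
import numpy as np

def simulate_pileup(rate, resolving_time, duration, rng):
    """Merge any run of pulses whose successive gaps are shorter than
    the resolving time into one recorded event with summed amplitude."""
    n = rng.poisson(rate * duration)
    times = np.sort(rng.uniform(0.0, duration, n))
    amps = rng.exponential(1.0, n)   # arbitrary assumed true spectrum
    recorded = []
    i = 0
    while i < n:
        j, total = i, amps[i]
        while j + 1 < n and times[j + 1] - times[j] < resolving_time:
            j += 1
            total += amps[j]         # amplitudes pile up into one pulse
        recorded.append(total)
        i = j + 1
    return np.array(recorded)

rng = np.random.default_rng(5)
rec = simulate_pileup(rate=5.0e4, resolving_time=2.0e-6, duration=1.0, rng=rng)
print(len(rec), rec.mean())  # fewer, larger recorded pulses than true events
```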
Bender, L.C.; Myers, W.L.; Gould, W.R.
2003-01-01
Both ground and helicopter surveys are commonly used to collect sex and age composition data for ungulates. Little attention has been paid, however, to whether data collected by each technique are similar. We compared helicopter and ground composition data for both elk Cervus elaphus and mule deer Odocoileus hemionus across a variety of habitats in the state of Washington, USA. We found that ground and helicopter counts differed (P's < 0.002) consistently in male age structure estimates for elk, and that the two survey methods differed in estimates of adult sex ratios for mule deer (P = 0.023). Counts from helicopters provided larger sample sizes, tended to be more consistent annually in their results, and were corroborated by other demographic studies of the test populations. We conclude that helicopter and ground surveys differ for male age structure and perhaps male:female ratios, but are similar for young:female ratios. Managers should maintain a standardized technique using the same survey vehicle for trend analysis of composition data.
Medipix2 based CdTe microprobe for dental imaging
NASA Astrophysics Data System (ADS)
Vykydal, Z.; Fauler, A.; Fiederle, M.; Jakubek, J.; Svestkova, M.; Zwerger, A.
2011-12-01
Medical imaging devices and techniques are demanded to provide high resolution and low dose images of samples or patients. Hybrid semiconductor single photon counting devices together with suitable sensor materials and advanced techniques of image reconstruction fulfil these requirements. In particular cases such as the direct observation of dental implants also the size of the imaging device itself plays a critical role. This work presents the comparison of 2D radiographs of tooth provided by a standard commercial dental imaging system (Gendex 765DC X-ray tube with VisualiX scintillation detector) and two Medipix2 USB Lite detectors one equipped with a Si sensor (300 μm thick) and one with a CdTe sensor (1 mm thick). Single photon counting capability of the Medipix2 device allows virtually unlimited dynamic range of the images and thus increases the contrast significantly. The dimensions of the whole USB Lite device are only 15 mm × 60 mm of which 25% consists of the sensitive area. Detector of this compact size can be used directly inside the patients' mouth.
NASA Technical Reports Server (NTRS)
Wilson, James Charles
1994-01-01
There were two principal objectives of the cooperative agreement between NASA and the University of Denver. The first goal was to modify the design of the ER-2 condensation nuclei counter (CNC) so that the effective lower detection limit would be improved at high altitudes. This improvement was sought because, in the instrument used prior to 1993, diffusion losses prevented the smallest detectable particles from reaching the detection volume of the instrument during operation at low pressure. Therefore, in spite of the sensor's ability to detect particles as small as 0.008 microns in diameter, many of these particles were lost in transport to the sensing region and were not counted. Most of the particles emitted by aircraft are smaller than 0.1 micron in diameter. At the start date of this work, May 1990, continuous sizing techniques available on the ER-2 were only capable of detecting particles larger than 0.17 micron. Thus, the second objective of this work was to evaluate candidate sizing techniques in an effort to gain additional information concerning the size of particles emitted by aircraft.
Veatch, Sarah L.; Machta, Benjamin B.; Shelby, Sarah A.; Chiang, Ethan N.; Holowka, David A.; Baird, Barbara A.
2012-01-01
We present an analytical method using correlation functions to quantify clustering in super-resolution fluorescence localization images and electron microscopy images of static surfaces in two dimensions. We use this method to quantify how over-counting of labeled molecules contributes to apparent self-clustering and to calculate the effective lateral resolution of an image. This treatment applies to distributions of proteins and lipids in cell membranes, where there is significant interest in using electron microscopy and super-resolution fluorescence localization techniques to probe membrane heterogeneity. When images are quantified using pair auto-correlation functions, the magnitude of apparent clustering arising from over-counting varies inversely with the surface density of labeled molecules and does not depend on the number of times an average molecule is counted. In contrast, we demonstrate that over-counting does not give rise to apparent co-clustering in double label experiments when pair cross-correlation functions are measured. We apply our analytical method to quantify the distribution of the IgE receptor (FcεRI) on the plasma membranes of chemically fixed RBL-2H3 mast cells from images acquired using stochastic optical reconstruction microscopy (STORM/dSTORM) and scanning electron microscopy (SEM). We find that apparent clustering of FcεRI-bound IgE is dominated by over-counting labels on individual complexes when IgE is directly conjugated to organic fluorophores. We verify this observation by measuring pair cross-correlation functions between two distinguishably labeled pools of IgE-FcεRI on the cell surface using both imaging methods. After correcting for over-counting, we observe weak but significant self-clustering of IgE-FcεRI in fluorescence localization measurements, and no residual self-clustering as detected with SEM. We also apply this method to quantify IgE-FcεRI redistribution after deliberate clustering by crosslinking with two distinct trivalent ligands of defined architectures, and we evaluate contributions from both over-counting of labels and redistribution of proteins. PMID:22384026
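A minimal sketch of the pair auto-correlation estimator described above, assuming a simple histogram-of-distances form with no edge correction; the blinking statistics and field size in the demo are illustrative, not taken from the paper.

```python
import numpy as np
from scipy.spatial.distance import pdist

def pair_autocorrelation(points, area, r_edges):
    """Estimate the pair auto-correlation g(r) of 2D localizations.
    Minimal estimator without edge correction; g(r) -> 1 for a random
    distribution, and over-counting of single molecules shows up as a
    short-range peak whose magnitude scales as 1/(surface density)."""
    n = len(points)
    density = n / area
    d = pdist(points)                          # all pairwise distances
    counts, _ = np.histogram(d, bins=r_edges)
    # Expected pairs per annulus for a uniform (uncorrelated) distribution.
    annulus = np.pi * (r_edges[1:]**2 - r_edges[:-1]**2)
    expected = 0.5 * n * density * annulus     # ~n^2/2 pairs overall
    return counts / expected

rng = np.random.default_rng(1)
# Each true molecule "blinks" and is localized ~4 times with 20 nm jitter,
# mimicking the over-counting that produces apparent self-clustering.
mol = rng.uniform(0, 5000, size=(500, 2))      # nm, 5x5 um field
loc = np.repeat(mol, 4, axis=0)
loc = loc + rng.normal(0, 20, loc.shape)
g = pair_autocorrelation(loc, area=5000**2, r_edges=np.arange(0, 400, 25))
print(np.round(g, 2))    # >1 at short r purely from repeated counting
```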
Grigoryan, Artyom M; Dougherty, Edward R; Kononen, Juha; Bubendorf, Lukas; Hostetter, Galen; Kallioniemi, Olli
2002-01-01
Fluorescence in situ hybridization (FISH) is a molecular diagnostic technique in which a fluorescently labeled probe hybridizes to a target nucleotide sequence of deoxyribonucleic acid. Upon excitation, each chromosome containing the target sequence produces a fluorescent signal (spot). Because fluorescent spot counting is tedious and often subjective, automated digital algorithms to count spots are desirable. New technology provides a stack of images on multiple focal planes throughout a tissue sample. Multiple-focal-plane imaging helps overcome the biases and imprecision inherent in single-focal-plane methods. This paper proposes an algorithm for global spot counting in stacked three-dimensional slice FISH images without the necessity of nuclei segmentation. It is designed to work in complex backgrounds, when there are agglomerated nuclei, and in the presence of illumination gradients. It is based on the morphological top-hat transform, which locates intensity spikes on irregular backgrounds. After finding signals in the slice images, the algorithm groups these together to form three-dimensional spots. Filters are employed to separate legitimate spots from fluorescent noise. The algorithm is set in a comprehensive toolbox that provides visualization and analytic facilities. It includes simulation software that allows examination of algorithm performance for various image and algorithm parameter settings, including signal size, signal density, and the number of slices.
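A minimal single-slice sketch of the top-hat stage of such an algorithm, assuming SciPy's grey-scale morphology; the structuring-element size and threshold rule are illustrative stand-ins for the paper's parameter settings, and the slice-grouping and spot-filtering steps are omitted.

```python
import numpy as np
from scipy import ndimage

def count_spots(image, struct_size=5, threshold=4.0):
    """Sketch of top-hat based spot counting on an irregular background:
    the grey-scale top-hat keeps only intensity spikes narrower than the
    structuring element, so slowly varying background is removed."""
    tophat = image - ndimage.grey_opening(image, size=struct_size)
    # Threshold relative to the residual noise level (illustrative rule).
    mask = tophat > threshold * tophat.std()
    labels, n_spots = ndimage.label(mask)      # group pixels into spots
    return n_spots, labels

rng = np.random.default_rng(2)
# Synthetic slice: smooth illumination gradient + 20 bright spots + noise.
img = 0.02 * np.mgrid[0:256, 0:256][1] + rng.normal(0, 0.5, (256, 256))
for y, x in rng.integers(10, 246, size=(20, 2)):
    img[y - 1:y + 2, x - 1:x + 2] += 10.0
print(count_spots(img)[0])   # roughly 20, despite the background gradient
```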
Fekrazad, Reza; Seraj, Bahman; Chiniforush, Nasim; Rokouei, Mehrak; Mousavi, Niloofar; Ghadimi, Sara
2017-06-01
Antimicrobial photodynamic therapy (aPDT) is a novel technique for the reduction of pathogenic microorganisms in dentistry. The aim of this study was to evaluate the effects of aPDT on Streptococcus mutans reduction in children with severe early childhood caries. Twenty-two children with severe early childhood caries aged 3-6 years were treated with toluidine blue O (TBO) for 1 min and irradiated by a Light Emitting Diode (LED; FotoSan, CMS Dental, Denmark) with an exposure time of 150 s. Saliva samples were collected at baseline, 1 h and 7 days after treatment. S. mutans counts were determined using the Dentocult SM Strip mutans. The counts of S. mutans in saliva decreased significantly after 1 h (P<0.001). However, the difference in reduction of S. mutans counts in saliva was not significant between the baseline and 7 days after treatment (P>0.05). aPDT seems to be efficient in reducing salivary S. mutans immediately after treatment in children with severe early childhood caries. However, further research is needed to evaluate different doses and frequencies of irradiation, in combination with restoring carious teeth, to achieve more durable results. Copyright © 2017 Elsevier B.V. All rights reserved.
Spatiotemporal Dynamics of Total Viable Vibrio spp. in a NW Mediterranean Coastal Area.
Girard, Léa; Peuchet, Sébastien; Servais, Pierre; Henry, Annabelle; Charni-Ben-Tabassi, Nadine; Baudart, Julia
2017-09-27
A cellular approach combining Direct Viable Counting and Fluorescent In Situ Hybridization using a one-step multiple-probe technique and Solid Phase Cytometry (DVC-FISH-SPC) was developed to monitor total viable vibrios and cover the detection of a large diversity of vibrios. FISH combined three probes in the same assay and targeted sequences located at different positions on the 16S rRNA of Vibrio and Aliivibrio members. We performed a 10-month in situ study to investigate the weekly dynamics of viable vibrios relative to culturable counts at two northwestern Mediterranean coastal sites, and identified the key physicochemical factors for their occurrence in water using a multivariate analysis. Total viable and culturable cell counts showed the same temporal pattern during the warmer season, whereas the ratios between both methods were inverted during the colder seasons (<15°C), indicating that some of the vibrio community had entered into a viable but non-culturable (VBNC) state. We confirmed that Seawater Surface Temperature explained 51-62% of the total variance in culturable counts, and also showed that the occurrence of viable vibrios is controlled by two variables, pheopigment (15%) and phosphate (12%) concentrations, suggesting that other unidentified factors play a role in maintaining viability.
Physiological responses of bacteria in biofilms to disinfection.
Yu, F P; McFeters, G A
1994-01-01
In situ enumeration methods using fluorescent probes and a radioisotope labelling technique were applied to evaluate physiological changes of Klebsiella pneumoniae within biofilms after disinfection treatment. Chlorine (0.25 mg of free chlorine per liter [pH 7.2]) and monochloramine (1 mg/liter [pH 9.0]) were employed as disinfectants in the study. Two fluorogenic compounds, 5-cyano-2,3-ditolyl tetrazolium chloride and rhodamine 123, and tritiated uridine incorporation were chosen for assessment of physiological activities. Results obtained by these methods were compared with those from the plate count and direct viable count methods. 5-Cyano-2,3-ditolyl tetrazolium chloride is an indicator of bacterial respiratory activity, rhodamine 123 is incorporated into bacteria in response to transmembrane potential, and the incorporation of uridine represents the global RNA turnover rate. The results acquired by these methods following disinfection exposure showed a range of responses and suggested different physiological reactions in biofilms exposed to chlorine and monochloramine. The direct viable count response and respiratory activity were affected more by disinfection than were the transmembrane potential and RNA turnover rate, on the basis of comparable efficiency as evaluated by plate count enumeration. Information revealed by these approaches can provide different physiological insights that may be used in evaluating the efficacy of biofilm disinfection. PMID:8074525
Estimation method for serial dilution experiments.
Ben-David, Avishai; Davidson, Charles E
2014-12-01
Titration of microorganisms in infectious or environmental samples is a cornerstone of quantitative microbiology. A simple method is presented to estimate the microbial counts obtained with the serial dilution technique for microorganisms that can grow on bacteriological media and develop into a colony. The number (concentration) of viable microbial organisms is estimated from a single dilution plate (assay) without the need for replicate plates. Our method selects the best agar plate with which to estimate the microbial counts, and takes into account the colony size and plate area, both of which contribute to the likelihood of miscounting the number of colonies on a plate. The estimate of the optimal count given by our method can be used to narrow the search for the best (optimal) dilution plate and saves time. The required inputs are the plate size, the microbial colony size, and the serial dilution factors. The proposed approach shows relative accuracy well within ±0.1 log₁₀ on data produced by computer simulations. The method maintains this accuracy even in the presence of dilution errors of up to 10% (for both the aliquot and diluent volumes), microbial counts between 10⁴ and 10¹² colony-forming units, dilution ratios from 2 to 100, and plate-size to colony-size ratios between 6.25 and 200. Published by Elsevier B.V.
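For orientation, the sketch below shows the classic single-plate estimate that the paper refines; the countable range used to pick the "best" plate is a conventional stand-in for the paper's colony-size/plate-area criterion, and the numbers are illustrative.

```python
def estimate_cfu(counts, dilutions, plated_volume_ml=0.1,
                 countable=(25, 250)):
    """Sketch of the classic serial-dilution estimate: pick the plate whose
    colony count falls in a countable range (too many colonies merge, too
    few are statistically noisy) and back-calculate the concentration.
    The paper's method instead weighs colony size against plate area; the
    fixed countable range here is a stand-in for that criterion."""
    best = None
    for count, dilution in zip(counts, dilutions):
        if countable[0] <= count <= countable[1]:
            best = (count, dilution)
            break          # plates are ordered from least to most dilute
    if best is None:
        raise ValueError("no plate in the countable range")
    count, dilution = best
    # CFU/ml of the undiluted sample: count / (volume plated * dilution).
    return count / (plated_volume_ml * dilution)

# Ten-fold series: 10^-4 ... 10^-7 of a sample near 2e8 CFU/ml.
print(estimate_cfu(counts=[2000, 210, 18, 2],
                   dilutions=[1e-4, 1e-5, 1e-6, 1e-7]))  # ~2.1e8
```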
Effectiveness of two synthetic fiber filters for removing white cells from AS-1 red cells.
Pikul, F J; Farrar, R P; Boris, M B; Estok, L; Marlo, D; Wildgen, M; Chaplin, H
1989-09-01
Two commercially available synthetic fiber filters were studied for their effectiveness at removing white cells (WBCs) from AS-1-preserved red cells (RBCs) stored less than or equal to 14 days. In all, 65 filtrations were performed. An automated microprocessor-controlled hydraulic system designed for use with cellulose acetate fiber filters was employed to prepare filtered RBCs before release for transfusion. Studies were also carried out on polyester fiber filters, which are designed to be used in-line during transfusion. Residual WBCs were below the accurate counting range of Coulter counters and of conventional manual chamber counts. An isosmotic ammonium chloride RBC lysis method, plus a modified chamber counting technique, permitted a 270-fold increase over the number of WBCs counted by the conventional manual method. For the polyester fiber-filtered products, residual WBCs per unit were not affected by speed of filtration, prior length of storage, or mechanical tapping during filtration. The effectiveness of WBC removal (mean, 99.7%), total residual WBCs (means, 4.8 and 5.5 × 10⁶), and RBC recovery (mean, 93%) was the same for both filters. The majority of residual WBCs were lymphocytes. WBC removal and RBC recovery were strikingly superior to results reported with nonfiltration methods.
Persky, Adam M; Hogg, Abigail
2017-08-01
Objective. To examine how instructor-developed reading material relates to pre-class time spent preparing for the readiness assurance process (RAP) in a team-based learning (TBL) course. Methods. Students within pharmacokinetics and physiology were asked to self-report the amount of time spent studying for the RAP. Correlation analysis and multilevel linear regression techniques were used to identify factors within the pre-class reading material that contribute to self-reported study time. Results. On average students spent 3.2 hours preparing for a section of material in the TBL format. The ratio of predicted reading time, based on reading speed and word count, and self-reported study time was greater than 1:3. Self-reported study time was positively correlated with word count, number of tables and figures, and overall page length. For predictors of self-reported study time, topic difficulty and number of figures were negative predictors whereas word count and number of self-assessments were positive predictors. Conclusion. Factors related to reading material are moderate predictors of self-reported student study time for an accountability assessment. A more significant finding is student self-reported study time is much greater than the time predicted by simple word count.
Dudak, Jan; Zemlicka, Jan; Karch, Jakub; Patzelt, Matej; Mrzilkova, Jana; Zach, Petr; Hermanova, Zuzana; Kvacek, Jiri; Krejci, Frantisek
2016-01-01
Using dedicated contrast agents, high-quality X-ray imaging of soft tissue structures with isotropic micrometre resolution has become feasible. This technique is frequently termed virtual histology, as it allows production of slices of tissue without destroying the sample. The use of contrast agents is, however, often an irreversible, time-consuming procedure, and despite the non-destructive principle of X-ray imaging, the sample is usually no longer usable for other research methods. In this work we present the application of a recently developed large-area photon counting detector for high-resolution X-ray micro-radiography and micro-tomography of whole ex-vivo ethanol-preserved mouse organs. Photon counting detectors provide dark-current-free quantum-counting operation, enabling acquisition of data with a virtually unlimited contrast-to-noise ratio (CNR). Thanks to the very high CNR, even ethanol-only preserved soft-tissue samples without the addition of any contrast agent can be visualized in great detail. As ethanol preservation is one of the standard steps of tissue fixation for histology, the presented method can open a way for widespread use of micro-CT, with all its advantages, for routine 3D non-destructive soft-tissue visualisation. PMID:27461900
Use of mixed cultures of biocontrol agents to control sheep nematodes.
Baloyi, M A; Laing, M D; Yobo, K S
2012-03-23
Biological control is a promising non-chemical approach for the control of gastrointestinal nematodes of sheep. The use of combinations of biocontrol agents has been reported to be an effective way to increase the efficacy of biological control. In this study, combinations of two Bacillus thuringiensis (Bt) isolates, two Clonostachys rosea (C. rosea) isolates, or Bt + C. rosea isolates were evaluated in vitro in microtitre plates for their biocontrol activity against sheep nematodes. The Baermann technique was used to extract the surviving L3 larval stages of intestinal nematodes, which were counted under a dissecting microscope to determine the larval counts. Results indicate that there was a significant reduction of nematode counts due to the combination of biocontrol agents (P<0.001). Combinations of Bt isolates reduced nematode counts by 72.8%, 64% and 29.8%. The results revealed a control level of 57% when C. rosea isolates P3+P8 were combined. The combination of Bt and C. rosea isolates B10+P8 caused the greatest mortality, 76.7%. Most combinations were antagonistic, with only a few combinations showing an additive effect. None were synergistic. The isolate combinations were more effective than when isolates were used alone. Copyright © 2011 Elsevier B.V. All rights reserved.
Fayez, A; El Shantaly, K M; Abbas, M; Hauser, S; Müller, S C; Fathy, A
2010-01-01
We compared outcome and complications of three simple varicocelectomy techniques. Groups were divided according to whether they would receive the Ivanissevich technique (n = 55), Tauber's technique (n = 51) or subinguinal sclerotherapy (n = 49). Selection criteria were: infertility >1 year, subnormal semen, sonographic diameter of veins >3 mm and regurgitation time >2 s. Patients were randomly assigned to the treatment groups, with follow-up every 3 months for 1 year. Improvement was seen only in sperm count and total motility in all groups. Pregnancy rates were 20, 13.73 and 12.24%, respectively, with no significant difference between groups. Hydrocele occurred only in the group which received the Ivanissevich technique (5.5%). Tauber's technique is simple; however, it has the disadvantage of multiple branching of small veins. Copyright © 2010 S. Karger AG, Basel.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Keepin, G.R.
Over the years the Los Alamos safeguards program has developed, tested, and implemented a broad range of passive and active nondestructive analysis (NDA) instruments (based on gamma and x-ray detection and neutron counting) that are now widely employed in safeguarding nuclear materials of all forms. Here we very briefly survey the major categories of gamma-ray and neutron based NDA techniques, give some representative examples of NDA instruments currently in use, and cite a few notable instances of state-of-the-art NDA technique development. Historical aspects and a broad overview of the safeguards program are also presented.
Airborne Lidar Measurements of Atmospheric Pressure Made Using the Oxygen A-Band
NASA Technical Reports Server (NTRS)
Riris, Haris; Rodriquez, Michael; Allan, Graham R.; Hasselbrack, William E.; Stephen, Mark A.; Abshire, James B.
2011-01-01
We report on airborne measurements of atmospheric pressure using a fiber-laser based lidar operating in the oxygen A-band near 765 nm and the integrated path differential absorption (IPDA) measurement technique. Our lidar uses fiber optic technology and non-linear optics to generate tunable laser radiation at 765 nm, which overlaps an absorption line pair in the oxygen A-band. We use a pulsed, time-resolved technique that rapidly steps the laser wavelength across the absorption line pair, a 20 cm telescope, and a photon counting detector to measure oxygen concentrations.
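A minimal sketch of the IPDA ratio at the heart of such a measurement, under the simplifying assumptions of equal transmitted pulse energies and background-subtracted counts; the count values are illustrative, and the conversion from optical depth to pressure (via the O2 line parameters) is not reproduced here.

```python
import numpy as np

def differential_optical_depth(n_on, n_off, e_on=1.0, e_off=1.0):
    """Sketch of the integrated-path differential absorption (IPDA)
    retrieval: the ratio of received photon counts at the on-line and
    off-line wavelengths gives the one-way differential optical depth
    for a two-way (round-trip) path. n_on/n_off are background-subtracted
    counts; e_on/e_off normalize for transmitted pulse energy."""
    return 0.5 * np.log((n_off / e_off) / (n_on / e_on))

# Illustrative counts: strong O2 absorption on-line, little off-line.
tau = differential_optical_depth(n_on=1.2e4, n_off=4.8e4)
print(tau)   # ~0.69
# Surface pressure would then follow from tau through the known O2 line
# strength and mixing ratio; that conversion is instrument-specific.
```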
Avoidance of Wrong-level Thoracic Spine Surgery Using Sterile Spinal Needles: A Technical Report.
Chin, Kingsley R; Seale, Jason; Cumming, Vanessa
2017-02-01
A technical report. The aim of the present study was to present an improvement on localization techniques employed in the thoracic spine, using sterile spinal needles docked on the transverse process of each vertebra, which can be performed in both percutaneous and open spinal procedures. Wrong-level surgery may have momentous clinical and emotional implications for a patient and surgeon. It is reported that one in every two spine surgeons will operate on the wrong level during his or her career. Correctly localizing the specific thoracic level remains a significant challenge during spine surgery. Fluoroscopic anteroposterior and lateral views were obtained starting in the lower lumbar spine, and an 18-G spinal needle was placed in the transverse process of L3, counting up from the sacrum, and also at T12. The fluoroscopy was then moved cephalad and, counting from the spinal needle at T12, the other spinal needles were placed at the targeted operating thoracic vertebrae. Once this was done, we were able to accurately determine the thoracic levels for surgical intervention. Using this technique, the markers were kept in place even after the incisions were made. This prevented us from losing our location in the thoracic spine. Correctly placed instrumentation was made evident with postoperative imaging. We have described the successful use of a new technique using spinal needles docked against transverse processes to correctly and reliably identify thoracic levels before instrumentation. The technique was reproducible in both open surgeries and for a percutaneous procedure. This technique maintains the correct spinal level during an open procedure. We posit that wrong-level thoracic spine surgery may be preventable.
NASA Astrophysics Data System (ADS)
Schlacher, Thomas A.; Lucrezi, Serena; Peterson, Charles H.; Connolly, Rod M.; Olds, Andrew D.; Althaus, Franziska; Hyndes, Glenn A.; Maslo, Brooke; Gilby, Ben L.; Leon, Javier X.; Weston, Michael A.; Lastra, Mariano; Williams, Alan; Schoeman, David S.
2016-06-01
Most ecological studies require knowledge of animal abundance, but it can be challenging and destructive of habitat to obtain accurate density estimates for cryptic species, such as crustaceans that tunnel deeply into the seafloor, beaches, or mudflats. Such fossorial species are, however, widely used in environmental impact assessments, requiring sampling techniques that are reliable, efficient, and environmentally benign for these species and environments. Counting and measuring the entrances of burrows made by cryptic species is commonly employed to index population and body sizes of individuals. The fundamental premise is that burrow metrics consistently predict density and size. Here we review the evidence for this premise. We also review criteria for selecting among sampling methods: burrow counts, visual censuses, and physical collections. A simple 1:1 correspondence between the number of holes and population size cannot be assumed. Occupancy rates, indexed by the slope of regression models, vary widely between species and among sites for the same species. Thus, 'average' or 'typical' occupancy rates should not be extrapolated from site- or species-specific field validations and then be used as conversion factors in other situations. Predictions of organism density made from burrow counts often have large uncertainty, being double to half of the predicted mean value. Whether such prediction uncertainty is 'acceptable' depends on investigators' judgements regarding the desired detectable effect sizes. Regression models predicting body size from burrow entrance dimensions are more precise, but parameter estimates of most models are specific to species and subject to site-to-site variation within species. These results emphasise the need to undertake thorough field validations of indirect census techniques that include tests of how sensitive predictive models are to changes in habitat conditions or human impacts. In addition, new technologies (e.g. drones, thermal, acoustic, or chemical sensors) should be used to enhance visual census techniques of burrows and surface-active animals.
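The regression-based validation the review calls for can be illustrated with a short sketch; the burrow-count and density numbers below are invented for illustration. The point is that the fitted slope (occupancy) and the width of the prediction interval, not an assumed 1:1 correspondence, should drive any conversion from burrow counts to density.

```python
import numpy as np
from scipy import stats

# Hypothetical field-validation data: burrow-entrance counts vs. true
# density from physical collections, per plot (illustrative numbers only).
burrows = np.array([4, 7, 9, 12, 15, 18, 22, 25, 30, 34], float)
density = np.array([3, 5, 8, 9, 13, 14, 19, 20, 26, 27], float)

res = stats.linregress(burrows, density)
print(f"occupancy slope = {res.slope:.2f}")   # rarely 1:1, as the review notes

# 95% prediction interval for a new plot with 20 burrow entrances.
n = len(burrows)
x0 = 20.0
y0 = res.intercept + res.slope * x0
resid = density - (res.intercept + res.slope * burrows)
s = np.sqrt(resid.var(ddof=2))                # residual standard error
se = s * np.sqrt(1 + 1/n + (x0 - burrows.mean())**2
                 / ((burrows - burrows.mean())**2).sum())
t = stats.t.ppf(0.975, df=n - 2)
print(f"predicted density {y0:.1f}, 95% PI [{y0 - t*se:.1f}, {y0 + t*se:.1f}]")
```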
Rieger, J.; Twardziok, S.; Huenigen, H.; Hirschberg, R.M.; Plendl, J.
2013-01-01
Staining of mast cells (MCs), including porcine ones, is critically dependent upon the fixation and staining technique. In the pig, mucosal and submucosal MCs do not stain, or stain only faintly, after formalin fixation. Some fixation methods are particularly recommended for MC staining, for example fixation with Carnoy or lead salts. Zinc salt fixation (ZSF) has been reported to work excellently for the preservation of fixation-sensitive antigens. The aim of this study was to establish a reliable histological method for counting of MCs in the porcine intestine. For this purpose, different tissue fixation and staining methods that also allow potential subsequent immunohistochemical investigations were evaluated in the porcine mucosa, as well as the submucosa of the small and large intestine. Tissues were fixed in Carnoy, lead acetate, lead nitrate, Zamboni and ZSF and stained subsequently with either polychromatic methylene blue, alcian blue or toluidine blue. For the first time, our study reveals that ZSF, a heavy metal fixative, preserves metachromatic staining of porcine MCs. Zamboni fixation was not suitable for histochemical visualization of MCs in the pig intestine. All other tested fixatives were suitable. Alcian blue and toluidine blue co-stained intestinal goblet cells, which made a prima facie identification of MCs difficult. The polychromatic methylene blue proved to be the optimal stain. In order to compare MC counting results of the different fixation methods, tissue shrinkage was taken into account. As even the same fixation caused shrinkage differences between tissue from the small and large intestine, different factors for each single fixation and intestinal localization had to be calculated. Tissue shrinkage varied between 19% and 57%; the highest tissue shrinkage was found after fixation with ZSF in the large intestine, the lowest in the small intestine after lead acetate fixation. Our study emphasizes that MC counting results from data using different fixation techniques can only be compared if the respective study-immanent shrinkage factor has been determined and quantification results are adjusted accordingly. PMID:24085270
CombiMotif: A new algorithm for network motifs discovery in protein-protein interaction networks
NASA Astrophysics Data System (ADS)
Luo, Jiawei; Li, Guanghui; Song, Dan; Liang, Cheng
2014-12-01
Discovering motifs in protein-protein interaction networks is a current major challenge in computational biology, since the distribution of the number of network motifs can reveal significant systemic differences among species. However, this task can be computationally expensive because of the involvement of graph isomorphism detection. In this paper, we present a new algorithm (CombiMotif) that incorporates combinatorial techniques to count non-induced occurrences of subgraph topologies in the form of trees. The efficiency of our algorithm is demonstrated by comparing the obtained results with current state-of-the-art subgraph counting algorithms. We also show major differences between unicellular and multicellular organisms. The datasets and source code of CombiMotif are freely available upon request.
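To illustrate the flavor of combinatorial (isomorphism-free) counting of non-induced tree occurrences, the sketch below handles the simplest case, k-stars, via a closed-form degree formula; CombiMotif itself covers general tree topologies, which this sketch does not attempt.

```python
from math import comb
from collections import defaultdict

def count_noninduced_stars(edges, k):
    """Combinatorial counting in the spirit of CombiMotif: the number of
    non-induced occurrences of a k-star (one hub, k leaves) is simply
    the sum over vertices of C(deg(v), k) -- no isomorphism test needed.
    This is the easiest member of the tree-counting family."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return sum(comb(d, k) for d in deg.values())

# Toy interaction network.
edges = [(0, 1), (0, 2), (0, 3), (1, 2), (2, 3), (3, 4)]
print(count_noninduced_stars(edges, 2))   # 2-stars (paths of length 2): 10
```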
Techniques for the correction of topographical effects in scanning Auger electron microscopy
NASA Technical Reports Server (NTRS)
Prutton, M.; Larson, L. A.; Poppa, H.
1983-01-01
A number of ratioing methods for correcting Auger images and linescans for topographical contrast are tested using anisotropically etched silicon substrates covered with Au or Ag. Thirteen well-defined angles of incidence are present on each polyhedron produced on the Si by this etching. If N1 electrons are counted at the energy of an Auger peak and N2 are counted in the background above the peak, then N1, N1 - N2, (N1 - N2)/(N1 + N2) are measured and compared as methods of eliminating topographical contrast. The latter method gives the best compensation but can be further improved by using a measurement of the sample absorption current. Various other improvements are discussed.
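A small numerical sketch of the three treatments compared above, using invented counts in which a facet-dependent topographic factor multiplies both peak and background; it shows why the normalized ratio cancels the topography while the raw and subtracted signals do not.

```python
import numpy as np

def topography_corrected(n1, n2):
    """The three treatments compared in the text: raw peak counts N1,
    background-subtracted N1 - N2, and the normalized ratio
    (N1 - N2)/(N1 + N2), which cancels any topographic factor common
    to peak and background."""
    n1 = np.asarray(n1, float)
    n2 = np.asarray(n2, float)
    return n1, n1 - n2, (n1 - n2) / (n1 + n2)

# Two facets with identical composition but different collection
# geometry: counts scale by a common topographic factor t per facet.
t = np.array([1.0, 0.4])           # illustrative facet-dependent factor
n1 = 1000 * t                      # counts at the Auger peak
n2 = 600 * t                       # counts in the background above it
raw, diff, ratio = topography_corrected(n1, n2)
print(raw, diff, ratio)            # only the ratio is facet-independent
```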
Singh, Ramandeep; Bal, M S; Singla, L D; Kaur, Paramjit
2017-06-01
Anthelmintic resistance against the commonly used anthelmintic fenbendazole was evaluated by employing the faecal egg count reduction test (FECRT) in naturally occurring gastrointestinal (GI) nematodes in semi-organized sheep and goat farms of the Ludhiana and Amritsar districts. A total of 80 animals (20 each for sheep and goats in both districts) were randomly selected and their faecal samples were examined by qualitative and quantitative parasitological techniques. Results indicate the presence of a high level of resistance against fenbendazole in both the sheep and goat populations of the Ludhiana and Amritsar districts. More resistance was observed in the GI nematodes from animals reared in Amritsar district as compared to Ludhiana district. The level of anthelmintic resistance observed was apparently higher in sheep than in goats.
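For reference, a sketch of the standard FECRT reduction formula (arithmetic-mean form); the egg counts and the ~95% resistance threshold are illustrative conventions, since the abstract does not give the study's exact protocol.

```python
import numpy as np

def fecr_percent(pre_epg, post_epg):
    """Standard faecal egg count reduction estimate (arithmetic means):
    FECR% = 100 * (1 - mean(post) / mean(pre)). A reduction below ~95%
    is commonly read as evidence of anthelmintic resistance; the exact
    design used in this study is not given in the abstract."""
    return 100.0 * (1.0 - np.mean(post_epg) / np.mean(pre_epg))

# Illustrative eggs-per-gram counts for one treated flock.
pre = [800, 1200, 600, 950, 700]
post = [300, 500, 250, 400, 350]
print(f"FECR = {fecr_percent(pre, post):.1f}%")  # ~57.6%: resistant
```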
GraphPrints: Towards a Graph Analytic Method for Network Anomaly Detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
Harshaw, Chris R; Bridges, Robert A; Iannacone, Michael D
This paper introduces a novel graph-analytic approach for detecting anomalies in network flow data called GraphPrints. Building on foundational network-mining techniques, our method represents time slices of traffic as a graph, then counts graphlets, small induced subgraphs that describe local topology. By performing outlier detection on the sequence of graphlet counts, anomalous intervals of traffic are identified, and furthermore, individual IPs experiencing abnormal behavior are singled out. Initial testing of GraphPrints is performed on real network data with an implanted anomaly. Evaluation shows false positive rates bounded by 2.84% at the time-interval level and 0.05% at the IP level, with 100% true positive rates at both.
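A toy sketch of the two GraphPrints stages, assuming triangles as the counted graphlet and a median/MAD rule as the outlier detector; both choices are stand-ins, since the paper's actual graphlet set and detector are not specified in the abstract.

```python
import numpy as np

def triangle_count(edges):
    """Count triangles -- the simplest nontrivial graphlet -- in one
    time-slice graph given as an edge list."""
    adj = {}
    for u, v in edges:
        adj.setdefault(u, set()).add(v)
        adj.setdefault(v, set()).add(u)
    # Each triangle is seen once per edge, hence the division by 3.
    return sum(len(adj[u] & adj[v]) for u, v in edges) // 3

def flag_anomalies(counts, k=3.0):
    """Median/MAD outlier rule over the per-interval graphlet counts,
    standing in for whatever detector GraphPrints actually uses."""
    counts = np.asarray(counts, float)
    med = np.median(counts)
    mad = np.median(np.abs(counts - med)) or 1.0
    return np.where(np.abs(counts - med) > k * mad)[0]

print(triangle_count([(0, 1), (1, 2), (0, 2), (2, 3)]))  # -> 1
# Per-interval triangle counts; interval 4 has an implanted anomaly.
print(flag_anomalies([12, 14, 11, 13, 57, 12]))           # -> [4]
```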
Counting of oligomers in sequences generated by markov chains for DNA motif discovery.
Shan, Gao; Zheng, Wei-Mou
2009-02-01
By means of the technique of the imbedded Markov chain, an efficient algorithm is proposed to exactly calculate the first and second moments of word counts and the probability for a word to occur at least once in random texts generated by a Markov chain. A generating function is introduced directly from the imbedded Markov chain to derive asymptotic approximations for the problem. Two Z-scores, one based on the number of sequences with hits and the other on the total number of word hits in a set of sequences, are examined for discovery of motifs in a set of promoter sequences extracted from the A. thaliana genome. Source code is available at http://www.itp.ac.cn/zheng/oligo.c.
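A rough sketch of the kind of word-count Z-score involved, assuming a first-order Markov model fitted to the sequence and a Poisson variance approximation; the exact imbedded-Markov-chain moments (including self-overlap corrections) computed in the paper are not reproduced here.

```python
import numpy as np

def word_zscore(seq, word, alphabet="ACGT"):
    """Rough Z-score for a word count under a first-order Markov model
    fitted to the sequence itself. Uses the Poisson approximation
    var ~ mean, ignoring the self-overlap corrections that the exact
    imbedded-Markov-chain computation handles properly."""
    idx = {c: i for i, c in enumerate(alphabet)}
    trans = np.ones((4, 4))                   # +1 smoothing
    for a, b in zip(seq, seq[1:]):
        trans[idx[a], idx[b]] += 1
    trans /= trans.sum(axis=1, keepdims=True)
    start = np.array([seq.count(c) for c in alphabet], float) / len(seq)

    # Probability of the word at a fixed position, then expected count.
    p = start[idx[word[0]]]
    for a, b in zip(word, word[1:]):
        p *= trans[idx[a], idx[b]]
    positions = len(seq) - len(word) + 1
    expected = positions * p
    observed = sum(seq[i:i + len(word)] == word for i in range(positions))
    return (observed - expected) / np.sqrt(expected)

rng = np.random.default_rng(3)
seq = "".join(rng.choice(list("ACGT"), 10_000))
print(word_zscore(seq, "TATAAA"))   # near 0 for a random sequence
```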
Alkali halide microstructured optical fiber for X-ray detection
DOE Office of Scientific and Technical Information (OSTI.GOV)
DeHaven, S. L., E-mail: stanton.l.dehaven@nasa.gov, E-mail: russel.a.wincheski@nasa.gov; Wincheski, R. A., E-mail: stanton.l.dehaven@nasa.gov, E-mail: russel.a.wincheski@nasa.gov; Albin, S., E-mail: salbin@nsu.edu
Microstructured optical fibers containing the alkali halide scintillation materials CsI(Na), CsI(Tl), and NaI(Tl) are presented. The scintillation materials are grown inside the microstructured fibers using a modified Bridgman-Stockbarger technique. The x-ray photon counts of these fibers, with and without an aluminum film coating, are compared to the output of a collimated CdTe solid state detector over an energy range from 10 to 40 keV. The photon count results show significant variations in the fiber output based on the materials. The alkali halide fiber output can exceed that of the CdTe detector, depending upon photon counter efficiency and fiber configuration. The results and associated materials differences are discussed.
NASA Astrophysics Data System (ADS)
Goldan, A. H.; Karim, K. S.; Reznik, A.; Caldwell, C. B.; Rowlands, J. A.
2008-03-01
The permanent breast seed implant (PBSI) brachytherapy technique was recently introduced as an alternative to high dose rate (HDR) brachytherapy and involves the permanent implantation of radioactive palladium-103 (103Pd) seeds into the surgical cavity of the breast for cancer treatment. To enable accurate seed implantation, this research introduces a gamma camera based on a hybrid amorphous selenium detector and a CMOS readout pixel architecture for real-time imaging of 103Pd seeds during the PBSI procedure. A prototype chip was designed and fabricated in a 0.18-μm n-well CMOS process. We present the experimental results obtained from this integrated photon counting readout pixel.
New methods to detect particle velocity and mass flux in arc-heated ablation/erosion facilities
NASA Technical Reports Server (NTRS)
Brayton, D. B.; Bomar, B. W.; Seibel, B. L.; Elrod, P. D.
1980-01-01
Arc-heated flow facilities with injected particles are used to simulate the erosive and ablative/erosive environments encountered by spacecraft re-entry through fog, clouds, thermo-nuclear explosions, etc. Two newly developed particle diagnostic techniques used to calibrate these facilities are discussed. One technique measures particle velocity and is based on the detection of thermal radiation and/or chemiluminescence from the hot seed particles in a model ablation/erosion facility. The second technique measures a local particle rate, which is proportional to local particle mass flux, in a dust erosion facility by photodetecting and counting the interruptions of a focused laser beam by individual particles.
NASA Astrophysics Data System (ADS)
Spyrison, N. S.; Prommapan, P.; Kim, H.; Maloney, J.; Rustan, G. E.; Kreyssig, A.; Goldman, A. I.; Prozorov, R.
2011-03-01
The incorporation of the Tunnel Diode Resonator (TDR) technique into an ElectroStatic Levitation (ESL) apparatus was explored. The TDR technique is known to operate and behave well at low temperatures with careful attention to coil-sample positioning in a dark, shielded environment. With these specifications, a frequency resolution of 10⁻⁹ in a few seconds of counting time can be achieved. Complications arise when this technique is applied in the ESL chamber, where a sample of molten metal is levitating less than 10 mm from the coil in a large electrostatic field. We have tested a variety of coils unconventional to TDR, including Helmholtz pairs and Archimedean spiral coils. This work was supported by the National Science Foundation under grant DMR-08-17157.
Health diagnosis of arch bridge suspender by acoustic emission technique
NASA Astrophysics Data System (ADS)
Li, Dongsheng; Ou, Jinping
2007-01-01
Conventional non-destructive methods cannot dynamically monitor the damage levels and types of bridge suspenders, so the acoustic emission (AE) technique is proposed to monitor their activity. Valid signals are identified from the relationship between rise time and duration. Ambient noise is eliminated by using a floating threshold value and by placing a guard sensor. The damage level of the cement mortar and steel strand is analyzed by the AE parameter method, and damage types are judged by waveform analysis. Based on these methods, all the suspenders of the Sichuan Ebian Dadu River arch bridge have been monitored using AE techniques. The monitoring results show that AE signal amplitude, energy, and counts can visually display the suspenders' damage levels, while differences in waveform and frequency range indicate different damage types. The test results coincide well with the actual condition of the bridge.
Ding, Huanjun; Molloi, Sabee
2012-08-07
A simple and accurate measurement of breast density is crucial for understanding its impact in breast cancer risk models. The feasibility of quantifying volumetric breast density with a photon-counting spectral mammography system has been investigated using both computer simulations and physical phantom studies. A computer simulation model involving polyenergetic spectra from a tungsten anode x-ray tube and a Si-based photon-counting detector was evaluated for breast density quantification. The figure-of-merit (FOM), defined as the signal-to-noise ratio of the dual energy image with respect to the square root of the mean glandular dose, was chosen to optimize the imaging protocols in terms of tube voltage and splitting energy. A scanning multi-slit photon-counting spectral mammography system was employed in the experimental study to quantitatively measure breast density using dual energy decomposition with glandular and adipose equivalent phantoms of uniform thickness. Four different phantom studies were designed to evaluate the accuracy of the technique, each of which addressed one specific variable in the phantom configurations: thickness, density, area, and shape. In addition to the standard calibration fitting function used for dual energy decomposition, a modified fitting function was proposed, which brings the tube voltage used in the imaging task in as a third variable in dual energy decomposition. For an average sized 4.5 cm thick breast, the FOM was maximized with a tube voltage of 46 kVp and a splitting energy of 24 keV. To be consistent with the tube voltage used in the current clinical screening exam (∼32 kVp), the optimal splitting energy was proposed to be 22 keV, which offered a FOM greater than 90% of the optimal value. In the experimental investigation, the root-mean-square (RMS) error in breast density quantification for all four phantom studies was estimated to be approximately 1.54% using the standard calibration function. The results from the modified fitting function, which integrated the tube voltage as a variable in the calibration, indicated an RMS error of approximately 1.35% for all four studies. The results of the current study suggest that photon-counting spectral mammography systems may potentially be implemented for accurate quantification of volumetric breast density, with an RMS error of less than 2%, using the proposed dual energy imaging technique.
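A minimal sketch of the dual-energy calibration idea, assuming a generic second-order polynomial fit from the two log signals to material thickness and synthetic phantom data with invented attenuation coefficients; the paper's actual fitting functions (standard and kVp-modified) are not reproduced.

```python
import numpy as np

def design(ll, lh):
    """Second-order polynomial design matrix in the two log signals."""
    return np.column_stack([np.ones_like(ll), ll, lh,
                            ll**2, ll*lh, lh**2])

# Synthetic calibration phantoms: independent glandular (tg) and adipose
# (ta) thicknesses, with invented effective attenuations per energy bin.
rng = np.random.default_rng(4)
tg = rng.uniform(0.5, 4.0, 200)                       # cm glandular
ta = rng.uniform(0.5, 4.0, 200)                       # cm adipose
ll = 0.80*tg + 0.45*ta + rng.normal(0, 0.002, 200)    # low-bin log signal
lh = 0.35*tg + 0.25*ta + rng.normal(0, 0.002, 200)    # high-bin log signal

# Calibration: fit each material thickness as a polynomial in (ll, lh).
A = design(ll, lh)
cg, *_ = np.linalg.lstsq(A, tg, rcond=None)
ca, *_ = np.linalg.lstsq(A, ta, rcond=None)

# "Decompose" measurements and report volumetric breast density.
tg_hat, ta_hat = A @ cg, A @ ca
density = 100 * tg_hat / (tg_hat + ta_hat)            # percent glandular
true = 100 * tg / (tg + ta)
print(np.sqrt(np.mean((density - true) ** 2)))        # RMS error, well under 2%
```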