Counting conformal correlators
NASA Astrophysics Data System (ADS)
Kravchuk, Petr; Simmons-Duffin, David
2018-02-01
We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.
Comparison of line transects and point counts for monitoring spring migration in forested wetlands
Wilson, R.R.; Twedt, D.J.; Elliott, A.B.
2000-01-01
We compared the efficacy of 400-m line transects and sets of three point counts at detecting avian richness and abundance in bottomland hardwood forests and intensively managed cottonwood (Populus deltoides) plantations within the Mississippi Alluvial Valley. We detected more species and more individuals on line transects than on three point counts during 218 paired surveys conducted between 24 March and 3 June, 1996 and 1997. Line transects also yielded more birds per unit of time, even though point counts yielded higher estimates of relative bird density. In structurally more-complex bottomland hardwood forests, we detected more species and individuals on line transects, but in more-open cottonwood plantations, transects surpassed point counts only at detecting species within 50 m of the observer. Species richness and total abundance of Nearctic-Neotropical migrants and temperate migrants were greater on line transects within bottomland hardwood forests. Within cottonwood plantations, however, only species richness of Nearctic-Neotropical migrants and total abundance of temperate migrants were greater on line transects. Because we compared survey techniques using the same observer, within the same forest stand on a given day, we assumed that the technique yielding greater estimates of avian species richness and total abundance per unit of effort is superior. Thus, for monitoring migration within hardwood forests of the Mississippi Alluvial Valley, we recommend using line transects instead of point counts.
Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M
2014-02-01
The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.
A land manager's guide to point counts of birds in the Southeast
Hamel, P.B.; Smith, W.P.; Twedt, D.J.; Woehr, J.R.; Morris, E.; Hamilton, R.B.; Cooper, R.J.
1996-01-01
Current widespread concern for the status of neotropical migratory birds has sparked interest in techniques for inventorying and monitoring populations of these and other birds in southeastern forest habitats. The present guide gives detailed instructions for conducting point counts of birds. It further presents a detailed methodology for the design and conduct of inventorial and monitoring surveys based on point counts, including discussion of sample size determination, distribution of counts among habitats, cooperation among neighboring land managers, vegetation sampling, standard data format, and other topics. Appendices provide additional information, making this guide a stand-alone text for managers interested in developing inventories of bird populations on their lands.
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2002-01-01
Use of point-count surveys is a popular method for collecting data on abundance and distribution of birds. However, analyses of such data often ignore potential differences in detection probability. We adapted a removal model to directly estimate detection probability during point-count surveys. The model assumes that singing frequency is a major factor influencing probability of detection when birds are surveyed using point counts. This may be appropriate for surveys in which most detections are by sound. The model requires counts to be divided into several time intervals. Point counts are often conducted for 10 min, where the number of birds recorded is divided into those first observed in the first 3 min, the subsequent 2 min, and the last 5 min. We developed a maximum-likelihood estimator for the detectability of birds recorded during counts divided into those intervals. This technique can easily be adapted to point counts divided into intervals of any length. We applied this method to unlimited-radius counts conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. We found differences in detection probability among species. Species that sing frequently such as Winter Wren (Troglodytes troglodytes) and Acadian Flycatcher (Empidonax virescens) had high detection probabilities (∼90%) and species that call infrequently such as Pileated Woodpecker (Dryocopus pileatus) had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. We used the same approach to estimate detection probability and density for a subset of the observations with limited-radius point counts.
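The removal-model likelihood can be sketched in a few lines: assume a constant per-minute detection rate φ, so a bird is first detected in interval (a, b] with probability proportional to e^(−φa) − e^(−φb). Everything below, counts included, is an illustrative assumption rather than the authors' data or code.

```python
import math

# Hypothetical first-detection counts in the 3-, 2-, and 5-minute intervals
# of a 10-minute point count.
counts = [60, 15, 25]
edges = [0.0, 3.0, 5.0, 10.0]   # interval boundaries in minutes

def log_lik(phi):
    """Multinomial log-likelihood, conditional on detection within 10 min."""
    total = 1.0 - math.exp(-phi * edges[-1])
    return sum(
        n * math.log((math.exp(-phi * a) - math.exp(-phi * b)) / total)
        for n, a, b in zip(counts, edges, edges[1:])
    )

# Crude grid-search MLE for the per-minute detection rate.
phi_hat = max((i / 1000 for i in range(1, 5000)), key=log_lik)
p_detect = 1.0 - math.exp(-phi_hat * edges[-1])  # overall detection probability
print(phi_hat, p_detect)
```

With most birds detected early, the fitted rate is high and the implied 10-minute detection probability approaches one; dividing counts into intervals of other lengths simply changes the multinomial cells.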
Nisari, Mehtap; Ertekin, Tolga; Ozçelik, Ozlem; Cınar, Serife; Doğanay, Selim; Acer, Niyazi
2012-11-01
Brain development in early life is thought to be a critical period in neurodevelopmental disorders, yet knowledge relating to this period is currently quite limited. This study aimed to evaluate the volumes of the total brain (TB), cerebrum, cerebellum and bulbus+pons in newborns by Archimedes' principle (fluid displacement) and by the stereological (point-counting) method, and to compare the two approaches. The study was carried out on five newborn cadavers with a mean weight of 2,220 ± 1,056 g and no signs of neuropathology. The mean (±SD) age of the subjects was 39.7 (±1.5) weeks. The volume and volume fraction of the total brain, cerebrum, cerebellum and bulbus+pons were determined on magnetic resonance (MR) images using the point-counting approach of stereological methods and by the fluid displacement technique. The mean (±SD) TB, cerebrum, cerebellum and bulbus+pons volumes by fluid displacement were 271.48 ± 78.3, 256.6 ± 71.8, 12.16 ± 6.1 and 2.72 ± 1.6 cm3, respectively. By the Cavalieri principle (point counting) using sagittal MR images, they were 262.01 ± 74.9, 248.11 ± 68.03, 11.68 ± 6.1 and 2.21 ± 1.13 cm3, respectively. The mean (±SD) volumes by the point-counting technique using axial MR images were 288.06 ± 88.5, 275.2 ± 83.1, 19.75 ± 5.3 and 2.11 ± 0.7 cm3, respectively. There were no differences between fluid displacement and point counting (using axial and sagittal images) for any structure (p > 0.05). This study presents basic data for studies of newborn brain volume fractions using the two methods. Stereological (point-counting) estimation may be accepted as a beneficial new tool for in vivo neurological evaluation of the brain. Based on the techniques we introduce here, the clinician may evaluate the growth of the brain in a more efficient and precise manner.
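The Cavalieri point-counting estimate referenced above reduces to a product of three numbers: slice spacing, the area each grid point represents, and the total number of grid points landing on the structure. A minimal sketch with invented counts and grid dimensions (not the study's measurements):

```python
# Cavalieri / point-counting volume estimate (all numbers invented).
slice_spacing_cm = 0.5                 # distance between MR slices
grid_spacing_cm = 0.4                  # spacing of the counting grid
area_per_point_cm2 = grid_spacing_cm ** 2
points_per_slice = [0, 12, 48, 96, 120, 110, 70, 30, 5]

volume_cm3 = slice_spacing_cm * area_per_point_cm2 * sum(points_per_slice)
print(volume_cm3)  # ~39.3 cm^3 for these made-up counts
```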
Simple to complex modeling of breathing volume using a motion sensor.
John, Dinesh; Staudenmayer, John; Freedson, Patty
2013-06-01
To compare simple and complex modeling techniques for estimating categories of low, medium, and high ventilation (VE) from ActiGraph™ activity counts. Vertical-axis ActiGraph™ GT1M activity counts, oxygen consumption and VE were measured during treadmill walking and running, sports, household chores and labor-intensive employment activities. Categories of low (<19.3 l/min), medium (19.3 to 35.4 l/min) and high (>35.4 l/min) VE were derived from activity intensity classifications (light <2.9 METs, moderate 3.0 to 5.9 METs and vigorous >6.0 METs). We examined the accuracy of two simple modeling techniques (multiple regression and activity count cut-point analyses) and one complex technique (random forest) in predicting VE from activity counts. Prediction accuracy of the complex random forest technique was marginally better than that of the simple multiple regression method. Both techniques accurately predicted VE categories almost 80% of the time. The multiple regression and random forest techniques were more accurate (85 to 88%) in predicting medium VE. Both techniques predicted high VE (70 to 73%) with greater accuracy than low VE (57 to 60%). ActiGraph™ cut-points for low, medium and high VE were <1381, 1381 to 3660 and >3660 cpm. There were minor differences in prediction accuracy between the multiple regression and random forest techniques. This study provides methods to objectively estimate VE categories using activity monitors that can easily be deployed in the field. Objective estimates of VE should provide a better understanding of the dose-response relationship between internal exposure to pollutants and disease. Copyright © 2013 Elsevier B.V. All rights reserved.
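The activity-count cut-point analysis amounts to a threshold rule; this sketch encodes the cut-points reported in the abstract (<1381, 1381 to 3660, >3660 counts per minute), with the handling of the exact boundary values being our assumption:

```python
def ve_category(counts_per_min):
    """Classify ActiGraph counts per minute into a ventilation category
    using the cut-points reported in the abstract."""
    if counts_per_min < 1381:
        return "low"
    if counts_per_min <= 3660:
        return "medium"
    return "high"

print(ve_category(500), ve_category(2000), ve_category(4000))  # low medium high
```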
Estimation of Traffic Variables Using Point Processing Techniques
DOT National Transportation Integrated Search
1978-05-01
An alternative approach to estimating aggregate traffic variables on freeways--spatial mean velocity and density--is presented. Vehicle arrival times at a given location on a roadway, typically a presence detector, are regarded as a point or counting...
A removal model for estimating detection probabilities from point-count surveys
Farnsworth, G.L.; Pollock, K.H.; Nichols, J.D.; Simons, T.R.; Hines, J.E.; Sauer, J.R.
2000-01-01
We adapted a removal model to estimate detection probability during point count surveys. The model assumes one factor influencing detection during point counts is the singing frequency of birds. This may be true for surveys recording forest songbirds when most detections are by sound. The model requires counts to be divided into several time intervals. We used time intervals of 2, 5, and 10 min to develop a maximum-likelihood estimator for the detectability of birds during such surveys. We applied this technique to data from bird surveys conducted in Great Smoky Mountains National Park. We used model selection criteria to identify whether detection probabilities varied among species, throughout the morning, throughout the season, and among different observers. The overall detection probability for all birds was 75%. We found differences in detection probability among species. Species that sing frequently, such as Winter Wren and Acadian Flycatcher, had high detection probabilities (about 90%), and species that call infrequently, such as Pileated Woodpecker, had low detection probability (36%). We also found detection probabilities varied with the time of day for some species (e.g. thrushes) and between observers for other species. This method of estimating detectability during point count surveys offers a promising new approach to using count data to address questions of bird abundance, density, and population trends.
Fast distributed large-pixel-count hologram computation using a GPU cluster.
Pan, Yuechao; Xu, Xuewu; Liang, Xinan
2013-09-10
Large-pixel-count holograms are an essential component of large-size holographic three-dimensional (3D) display, but the generation of such holograms is computationally demanding. In order to address this issue, we have built a graphics processing unit (GPU) cluster with 32.5 Tflop/s computing power and implemented distributed hologram computation on it with speed improvement techniques such as shared memory on the GPU, GPU-level adaptive load balancing, and node-level load distribution. Using these speed improvement techniques on the GPU cluster, we have achieved a 71.4-fold increase in computation speed for 186M-pixel holograms. Furthermore, we have used the approaches of diffraction limits and subdivision of holograms to overcome the GPU memory limit in computing large-pixel-count holograms. 745M-pixel and 1.80G-pixel holograms were computed in 343 and 3326 s, respectively, for more than 2 million object points with RGB colors. Color 3D objects with 1.02M points were successfully reconstructed from a 186M-pixel hologram computed in 8.82 s with all three of the above speed improvement techniques. It is shown that distributed hologram computation using a GPU cluster is a promising approach to increasing the computation speed of large-pixel-count holograms for large-size holographic displays.
Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.
2007-01-01
This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r² = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r² ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r² ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample).
In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than adequate for the majority of sedimentological applications, especially considering that the autocorrelation technique is estimated to be at least 100 times faster than traditional methods.
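The idea behind the autocorrelation technique can be sketched directly: coarse-grained images stay correlated over longer pixel lags than fine-grained ones, and Rubin's (2004) method calibrates that falloff against samples of known size. The calibration step is omitted here, and the synthetic "textures" are invented stand-ins for real sediment images:

```python
import numpy as np

def autocorrelation_curve(image, max_lag=10):
    """Mean horizontal autocorrelation at lags 1..max_lag (simplified)."""
    img = image.astype(float) - image.mean()
    denom = (img * img).sum()
    return np.array([
        (img[:, :-lag] * img[:, lag:]).sum() / denom
        for lag in range(1, max_lag + 1)
    ])

rng = np.random.default_rng(0)
fine = rng.random((50, 100))                          # pixel-scale "grains"
coarse = np.repeat(rng.random((50, 25)), 4, axis=1)   # 4-pixel "grains"
# Coarse texture decorrelates more slowly than fine texture:
print(autocorrelation_curve(coarse, 3), autocorrelation_curve(fine, 3))
```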
Accelerating the two-point and three-point galaxy correlation functions using Fourier transforms
NASA Astrophysics Data System (ADS)
Slepian, Zachary; Eisenstein, Daniel J.
2016-01-01
Though Fourier transforms (FTs) are a common technique for finding correlation functions, they are not typically used in computations of the anisotropy of the two-point correlation function (2PCF) about the line of sight in wide-angle surveys because the line-of-sight direction is not constant on the Cartesian grid. Here we show how FTs can be used to compute the multipole moments of the anisotropic 2PCF. We also show how FTs can be used to accelerate the 3PCF algorithm of Slepian & Eisenstein. In both cases, these FT methods allow one to avoid the computational cost of pair counting, which scales as the square of the number density of objects in the survey. With the upcoming large data sets of Dark Energy Spectroscopic Instrument, Euclid, and Large Synoptic Survey Telescope, FT techniques will therefore offer an important complement to simple pair or triplet counts.
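The reason FTs remove the pair-counting cost is that gridded pair counts form a circular autocorrelation, which the FFT evaluates in O(N log N) rather than O(N²). A toy periodic-box illustration (none of the paper's survey-geometry or line-of-sight machinery is included):

```python
import numpy as np

n = 32
rng = np.random.default_rng(1)
grid = rng.poisson(0.2, size=(n, n, n)).astype(float)  # gridded object counts

# Circular autocorrelation via FFT: entry [dx, dy, dz] is the weighted
# count of cell pairs separated by that lag vector.
fk = np.fft.rfftn(grid)
paircounts = np.fft.irfftn(fk * np.conj(fk), s=grid.shape)

# Check one lag against a direct roll-and-sum computation:
direct = (grid * np.roll(grid, 1, axis=0)).sum()
print(paircounts[1, 0, 0], direct)  # identical up to FFT rounding
```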
Sample to answer visualization pipeline for low-cost point-of-care blood cell counting
NASA Astrophysics Data System (ADS)
Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter
2015-03-01
We present a visualization pipeline from sample to answer for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate the homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first-iteration microfluidic device [3] showed that the simplest, and thus lowest-cost, approach to microfluidic component implementation was not adequate compared with techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
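For reference, the manual hemocytometer method used for comparison follows a standard calculation; a sketch assuming an improved Neubauer chamber, where each large counting square holds 0.1 µL:

```python
def hemocytometer_concentration(cells_counted, squares_counted,
                                dilution_factor, square_volume_ul=0.1):
    """Cells per microliter from a manual hemocytometer count."""
    return cells_counted / (squares_counted * square_volume_ul) * dilution_factor

# e.g. 200 cells counted over 4 large squares at a 1:10 dilution:
print(hemocytometer_concentration(200, 4, 10))  # ~5000 cells/uL
```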
ERIC Educational Resources Information Center
Bunde, Gary R.
A statistical comparison was made between two automated devices which were used to count data points (words, sentences, and syllables) needed in the Flesch Reading Ease Score to determine the reading grade level of written material. Determination of grade level of all Rate Training Manuals and Non-Resident Career Courses had been requested by the…
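For context, the Flesch Reading Ease score is computed from exactly the three counts those devices produced (words, sentences, and syllables); the standard published formula is:

```python
def flesch_reading_ease(words, sentences, syllables):
    """Standard Flesch Reading Ease formula; higher scores read more easily."""
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

# e.g. a 100-word passage with 5 sentences and 130 syllables:
print(flesch_reading_ease(100, 5, 130))
```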
The effect of trigger point management by positional release therapy on tension type headache.
Ghanbari, Ali; Rahimijaberi, Abbas; Mohamadi, Marzieh; Abbasi, Leila; Sarvestani, Fahimeh Kamali
2012-01-01
The aim of this study was to compare the effectiveness of trigger point management by Positional Release Therapy (PRT) with routine medical therapy in the treatment of Tension Type Headache (TTH). TTH is the most frequent headache and has a basis in myofascial and trigger point disorders. PRT is an indirect technique that treats trigger points. Thirty patients with active trigger points in the cervical muscles entered the study. They were randomly assigned to the PRT or medical therapy group. Headache frequency, intensity and duration, and tablet count were recorded using a daily headache diary. Sensitivity of trigger points was assessed by numeric pain intensity and with a digital force gauge (FG 5020). Both groups showed significant reductions in headache frequency, headache duration and tablet count after the treatment phase. However, the reduction in study variables persisted only in the PRT group after the follow-up phase. There was no significant reduction in headache intensity in either the PRT or the medication group. Sensitivity of trigger points was significantly reduced. Comparing the two study groups, there was no significant difference in headache frequency, intensity, duration or tablet count (p > 0.05). Both procedures were equally effective according to the study; thus, PRT can be a treatment choice for patients with TTH.
A general study of techniques for ultraviolet astrophysical studies on space vehicles
NASA Technical Reports Server (NTRS)
Moos, H. W.; Fastie, W. G.; Davidsen, A. F.
1977-01-01
Recent accomplishments in three areas of UV instrumentation for space astronomy are discussed. These areas include reliable UV photometry, sensitive photon-detection techniques, and precise telescope pointing. Calibration facilities for spectrometers designed to operate in the spectral regions above 1200 A and down to 400 A are described which employ a series of diodes calibrated against electron synchrotron radiation as well as other radiometric standards. Improvements in photon-detection sensitivity achieved with the aid of pulse-counting electronics and multispectral detectors are reported, and the technique of precise subarcsecond telescope pointing is briefly noted. Some observational results are presented which demonstrate the advantages and precision of the instruments and techniques considered.
Garment Counting in a Textile Warehouse by Means of a Laser Imaging System
Martínez-Sala, Alejandro Santos; Sánchez-Aartnoutse, Juan Carlos; Egea-López, Esteban
2013-01-01
Textile logistic warehouses are highly automated, mechanized places where control points are needed to count and validate the number of garments in each batch. This paper proposes and describes a low-cost, small-size automated system designed to count the number of garments by processing an image of the corresponding hanger hooks, generated using an array of phototransistor sensors and a linear laser beam. The generated image is processed using computer vision techniques to infer the number of garment units. The system has been tested in two logistic warehouses with a mean error in the estimated number of hangers of 0.13%. PMID:23628760
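At its core, inferring the garment count from the sensor image is an edge-counting problem: each hanger hook shows up as a bright run in the binarized line signal. A deliberately simplified sketch, with the threshold and signal values invented rather than taken from the paper:

```python
def count_hangers(signal, threshold=0.5):
    """Count rising edges (below -> above threshold) in a 1-D sensor trace;
    each contiguous bright run is taken to be one hanger hook."""
    count, prev_above = 0, False
    for sample in signal:
        above = sample > threshold
        if above and not prev_above:
            count += 1
        prev_above = above
    return count

print(count_hangers([0.1, 0.9, 0.8, 0.2, 0.1, 0.7, 0.1, 0.9, 0.9]))  # 3 hooks
```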
Winston Paul Smith; Daniel J. Twedt; David A. Wiedenfeld; Paul B. Hamel; Robert P. Ford; Robert J. Cooper
1993-01-01
To compare the efficacy of point count sampling in bottomland hardwood forests, we examined the duration of point counts, the number of point counts, the number of visits to each point during a breeding season, and the minimum sample size.
Statistical aspects of point count sampling
Barker, R.J.; Sauer, J.R.; Ralph, C.J.; Sauer, J.R.; Droege, S.
1995-01-01
The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the variability in point counts is caused by the incomplete counting, and this within-count variation can be confounded with ecologically meaningful variation. We recommend caution in the analysis of estimates obtained from point counts. Using our model, we also consider optimal allocation of sampling effort. The critical step in the optimization process is in determining the goals of the study and the methods that will be used to meet these goals. By explicitly defining the constraints on sampling and by estimating the relationship between precision and bias of estimators and time spent counting, we can predict the optimal time at a point for each of several monitoring goals. In general, time spent at a point will differ depending on the goals of the study.
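The confounding of detectability with abundance is easy to see in a toy simulation: two plots with identical true abundance but different (entirely hypothetical) detection probabilities yield raw counts that look like a genuine abundance difference:

```python
import random
random.seed(42)

# Two plots with EQUAL true abundance but different detectability.
N = 50                        # true birds present at each point
p_open, p_dense = 0.6, 0.3    # hypothetical detection probabilities

def point_count(n, p):
    """One incomplete count: each bird is detected with probability p."""
    return sum(random.random() < p for _ in range(n))

open_counts  = [point_count(N, p_open)  for _ in range(200)]
dense_counts = [point_count(N, p_dense) for _ in range(200)]
print(sum(open_counts) / 200, sum(dense_counts) / 200)  # ~30 vs ~15
```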
Mapping of Bird Distributions from Point Count Surveys
John R. Sauer; Grey W. Pendleton; Sandra Orsillo
1995-01-01
Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes...
Tutorial on X-ray photon counting detector characterization.
Ren, Liqiang; Zheng, Bin; Liu, Hong
2018-01-01
Recent advances in photon counting detection technology have led to significant research interest in X-ray imaging. As a tutorial-level review, this paper covers a wide range of aspects related to X-ray photon counting detector characterization. The tutorial begins with a detailed description of the working principle and operating modes of a pixelated X-ray photon counting detector, with basic architecture and detection mechanism. Currently available methods and techniques for characterizing major aspects, including energy response, noise floor, energy resolution, count rate performance (detector efficiency), and the charge sharing effect of photon counting detectors, are comprehensively reviewed. Other characterization aspects such as point spread function (PSF), line spread function (LSF), contrast transfer function (CTF), modulation transfer function (MTF), noise power spectrum (NPS), detective quantum efficiency (DQE), bias voltage, radiation damage, and polarization effect are also discussed. A cadmium telluride (CdTe) pixelated photon counting detector is employed for part of the characterization demonstration and the results are presented. This review can serve as a tutorial for X-ray imaging researchers and investigators to understand, operate, characterize, and optimize photon counting detectors for a variety of applications.
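As one concrete example of count-rate characterization, textbook dead-time models (not taken from this tutorial) predict how the observed rate saturates; under the paralyzable model the observed rate even peaks and then falls as the true rate keeps rising:

```python
import math

def observed_rate_paralyzable(true_rate_cps, dead_time_s):
    """Classic paralyzable dead-time model: m = n * exp(-n * tau)."""
    return true_rate_cps * math.exp(-true_rate_cps * dead_time_s)

tau = 1e-6  # 1 microsecond dead time (illustrative value)
for n in (1e5, 1e6, 5e6):
    print(f"true {n:.0e} cps -> observed {observed_rate_paralyzable(n, tau):.3e} cps")
```

The observed rate is maximized when the true rate equals 1/τ, which is why count-rate performance is usually characterized well below that point.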
Point Counts of Birds: What Are We Estimating?
Douglas H. Johnson
1995-01-01
Point counts of birds are made for many reasons, including estimating local densities, determining population trends, assessing habitat preferences, and exploiting the activities of recreational birdwatchers. Problems arise unless there is a clear understanding of what point counts mean in terms of actual populations of birds. Criteria for conducting point counts...
Book review: Bird census techniques, Second edition
Sauer, John R.
2002-01-01
Conservation concerns, federal mandates to monitor birds, and citizen science programs have spawned a variety of surveys that collect information on bird populations. Unfortunately, all too frequently these surveys are poorly designed and use inappropriate counting methods. Some of the flawed approaches reflect a lack of understanding of statistical design; many ornithologists simply are not aware that many of our most entrenched counting methods (such as point counts) cannot appropriately be used in studies that compare densities of birds over space and time. It is likely that most of the readers of The Condor have participated in a bird population survey that has been criticized for poor sampling methods. For example, North American readers may be surprised to read in Bird Census Techniques that the North American Breeding Bird Survey 'is seriously flawed in its design,' and that 'Analysis of trends is impossible from points that are positioned along roads' (p. 109). Our conservation efforts are at risk if we do not acknowledge these concerns and improve our survey designs. Other surveys suffer from a lack of focus. In Bird Census Techniques, the authors emphasize that all surveys require clear statements of objectives and an understanding of appropriate survey designs to meet their objectives. Too often, we view survey design as the realm of ornithologists who know the life histories and logistical issues relevant to counting birds. This view reflects pure hubris: survey design is a collaboration between ornithologists, statisticians, and managers, in which goals based on management needs are met by applying statistical principles for design to the biological context of the species of interest. Poor survey design is often due to exclusion of some of these partners from survey development. 
Because ornithologists are too frequently unaware of these issues, books such as Bird Census Techniques take on added importance as manuals for educating ornithologists about the relevance of survey design and methods and the often subtle interdisciplinary nature of surveys. Review info: Bird Census Techniques, Second Edition. By Colin J. Bibby, Neil D. Burgess, David A. Hill, and Simon H. Mustoe. 2000. Academic Press, London, UK. xvii + 302 pp. ISBN 0-12-095831-7.
Increasing point-count duration increases standard error
Smith, W.P.; Twedt, D.J.; Hamel, P.B.; Ford, R.P.; Wiedenfeld, D.A.; Cooper, R.J.
1998-01-01
We examined data from point counts of varying duration in bottomland forests of west Tennessee and the Mississippi Alluvial Valley to determine if counting interval influenced sampling efficiency. Estimates of standard error increased as point count duration increased both for cumulative number of individuals and species in both locations. Although point counts appear to yield data with standard errors proportional to means, a square root transformation of the data may stabilize the variance. Using long (>10 min) point counts may reduce sample size and increase sampling error, both of which diminish statistical power and thereby the ability to detect meaningful changes in avian populations.
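The square-root transformation suggested above is the classic variance stabilizer for count data: if counts are roughly Poisson, Var(√X) ≈ 1/4 regardless of the mean. A small generic simulation (not the authors' data):

```python
import math, random
random.seed(0)

def poisson(lam):
    """Knuth's Poisson sampler (adequate for the modest lambdas used here)."""
    threshold, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p < threshold:
            return k
        k += 1

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / (len(xs) - 1)

results = {}
for lam in (4, 16, 64):
    xs = [poisson(lam) for _ in range(5000)]
    results[lam] = (variance(xs), variance([math.sqrt(x) for x in xs]))
    print(lam, results[lam])  # raw variance grows with the mean; sqrt stays ~0.25
```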
An Exploratory Analysis of Waterfront Force Protection Measures Using Simulation
2002-03-01
APPENDIX B. DESIGN POINT DATA: Tables 16 through 18 report, for design points one, two, and three, the breach counts, leaker counts, and mean numberAvailablePBs.
NASA Technical Reports Server (NTRS)
Jones, John H.; Hanson, B. Z.
2011-01-01
Petrologic investigation of the shergottites has been hampered by the fact that most of these meteorites are partial cumulates. Two lines of inquiry have been used to evaluate the compositions of parental liquids: (i) perform melting experiments at different pressures and temperatures until the compositions of cumulate crystal cores are reproduced [e.g., 1]; and (ii) use point-counting techniques to reconstruct the compositions of intercumulus liquids [e.g., 2]. The second of these methods is hampered by the approximate nature of the technique. In effect, element maps are used to construct mineral modes, and average mineral compositions are then converted into bulk compositions. This method works well when the mineral phases are homogeneous [3]. However, when minerals are zoned, with narrow rims contributing disproportionately to the mineral volume, this method becomes problematic. Decisions need to be made about the average composition of the various zones within crystals. And, further, the proportions of those zones also need to be defined. We have developed a new microprobe technique to see whether the point-count method of determining intercumulus liquid composition is realistic. In our technique, the approximating decisions of earlier methods are unnecessary because each pixel of our x-ray maps is turned into a complete eleven-element quantitative analysis. The success or failure of our technique can then be determined by experimentation. As discussed earlier, experiments on our point-count composition can then be used to see whether experimental liquidus phases successfully reproduce natural mineral compositions. Regardless of our ultimate outcome in retrieving shergottite parent liquids, we believe our pixel-by-pixel analysis technique represents a giant step forward in documenting thin-section modes and compositions. For a third time, we have analyzed the groundmass composition of EET 79001,68 [Eg].
The first estimate of Eg was made by [4] and later modified by [5] to take phase diagram considerations into account. The Eg composition of [4] was too olivine-normative to be the true Eg composition, because the ,68 groundmass contains no forsteritic olivine. A later mapping by [2] basically reconfirmed the modifications of [5]. However, even the modified composition of [5] has olivine on the liquidus for 50 °C before low-Ca pyroxene appears [6].
Direct reading of electrocardiograms and respiration rates
NASA Technical Reports Server (NTRS)
Wise, J. P.
1969-01-01
Technique for reading heart and respiration rates is more accurate and direct than the previous method. Index of a plastic calibrated card is aligned with a point on the electrocardiogram. Complexes are counted as indicated on the card and heart or respiration rate is read directly from the appropriate scale.
Spatial patterns in vegetation fires in the Indian region.
Vadrevu, Krishna Prasad; Badarinath, K V S; Anuradha, Eaturu
2008-12-01
In this study, we used fire count datasets derived from the Along Track Scanning Radiometer (ATSR) satellite to characterize spatial patterns in fire occurrences across highly diverse geographical, vegetation, and topographic gradients in the Indian region. For characterizing the spatial patterns of fire occurrences, observed fire point patterns were tested against the hypothesis of complete spatial randomness (CSR) using three different techniques: quadrat analysis, nearest neighbor analysis, and Ripley's K function. A hierarchical nearest neighbor technique was used to depict 'hotspots' of fire incidents. Of the different states, the highest fire counts were recorded in Madhya Pradesh (14.77%), followed by Gujarat (10.86%), Maharashtra (9.92%), Mizoram (7.66%), and Jharkhand (6.41%), among others. With respect to the vegetation categories, the highest number of fires was recorded in agricultural regions (40.26%), followed by tropical moist deciduous vegetation (12.72%), dry deciduous vegetation (11.40%), abandoned slash-and-burn secondary forests (9.04%), and tropical montane forests (8.07%), among others. Analysis of fire counts by elevation and slope suggested that the maximum number of fires occurred in low- and medium-elevation types and in very low to low slope categories. Results from the three spatial techniques indicated a clustered pattern in fire events rather than CSR. Most importantly, results from Ripley's K statistic suggested that fire events are highly clustered at a lag distance of 125 miles. The hierarchical nearest neighbor clustering technique identified significant clusters of fire 'hotspots' in different states in northeast and central India. The implications of these results for fire management and mitigation are discussed. This study also highlights the potential of spatial point pattern statistics in environmental monitoring and assessment studies, with special reference to fire events in the Indian region.
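The CSR test logic behind Ripley's K can be sketched on simulated points: under CSR, K(r) is close to πr², while clustering inflates it. The estimator below is a naive one with no edge correction, and the point patterns, window size, and parameters are invented for illustration, not the authors' data or implementation:

```python
import numpy as np

def ripley_k(points, r, area):
    """Naive Ripley's K-hat (no edge correction): the average number of other
    points within distance r of a point, divided by overall point density."""
    n = len(points)
    d = np.sqrt(((points[:, None, :] - points[None, :, :]) ** 2).sum(-1))
    np.fill_diagonal(d, np.inf)           # exclude self-pairs
    density = n / area
    return (d < r).sum() / (n * density)

rng = np.random.default_rng(1)
csr = rng.uniform(0, 10, (400, 2))        # complete spatial randomness
parents = rng.uniform(0, 10, (20, 2))     # cluster centres
clustered = parents[rng.integers(0, 20, 400)] + rng.normal(0, 0.15, (400, 2))

r = 0.5
k_csr = ripley_k(csr, r, 100.0)        # near pi*r**2 = 0.785 (biased slightly low without edge correction)
k_clu = ripley_k(clustered, r, 100.0)  # far larger: events are clustered
print(k_csr, k_clu)
```

In practice, significance is judged by comparing the observed K(r) against an envelope of K values from many simulated CSR patterns.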
Statistical Aspects of Point Count Sampling
Richard J. Barker; John R. Sauer
1995-01-01
The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the...
Acer, Niyazi; Sahin, Bunyamin; Ucar, Tolga; Usanmaz, Mustafa
2009-01-01
The size of the eyeball has been the subject of a few studies. None of them used stereological methods to estimate the volume. In the current study, we estimated the volume of the eyeball in normal men and women using stereological methods. Eyeball volume (EV) was estimated using the Cavalieri principle as a combination of point-counting and planimetry techniques. We used computed tomography scans taken from 36 participants (15 men and 21 women) to estimate the EV. The mean (SD) EV values obtained by the planimetry method were 7.49 (0.79) and 7.06 (0.85) cm³ in men and women, respectively. By using the point-counting method, the mean (SD) values were 7.48 (0.85) and 7.21 (0.84) cm³ in men and women, respectively. There was no statistically significant difference between the findings from the 2 methods (P > 0.05). A weak correlation was found between the axial length of the eyeball and the EV estimated by point counting and planimetry (P < 0.05, r = 0.494 and r = 0.523, respectively). The findings of the current study using the stereological methods could provide data for the evaluation of normal and pathologic volumes of the eyeball.
Abdominal fat volume estimation by stereology on CT: a comparison with manual planimetry.
Manios, G E; Mazonakis, M; Voulgaris, C; Karantanas, A; Damilakis, J
2016-03-01
To deploy and evaluate a stereological point-counting technique on abdominal CT for the estimation of visceral (VAF) and subcutaneous abdominal fat (SAF) volumes. Stereological volume estimations based on point counting and systematic sampling were performed on images from 14 consecutive patients who had undergone abdominal CT. For the optimization of the method, five sampling intensities in combination with 100 and 200 points were tested. The optimum stereological measurements were compared with VAF and SAF volumes derived by the standard technique of manual planimetry on the same scans. Optimization analysis showed that the selection of 200 points along with the sampling intensity 1/8 provided efficient volume estimations in less than 4 min for VAF and SAF together. The optimized stereology showed strong correlation with planimetry (VAF: r = 0.98; SAF: r = 0.98). No statistical differences were found between the two methods (VAF: P = 0.81; SAF: P = 0.83). The 95% limits of agreement were also acceptable (VAF: -16.5%, 16.1%; SAF: -10.8%, 10.7%) and the repeatability of stereology was good (VAF: CV = 4.5%, SAF: CV = 3.2%). Stereology may be successfully applied to CT images for the efficient estimation of abdominal fat volume and may constitute a good alternative to the conventional planimetric technique. Abdominal obesity is associated with increased risk of disease and mortality. Stereology may quantify visceral and subcutaneous abdominal fat accurately and consistently. The application of stereology to estimating abdominal fat volume reduces processing time. Stereology is an efficient alternative method for estimating abdominal fat volume.
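Both stereology abstracts above rest on the Cavalieri principle: the volume of a structure is estimated as V = t · a_p · ΣP, where t is the section spacing, a_p the area associated with each grid test point, and ΣP the total number of grid points hitting the structure across sections. A hedged sketch on a synthetic object (a voxelised sphere standing in for an organ; all sizes and the grid step are invented for illustration):

```python
import numpy as np

def cavalieri_volume(mask_slices, pixel_area, thickness, grid_step):
    """Cavalieri estimate: V = thickness * a_p * total grid hits,
    where a_p = pixel_area * grid_step**2 is the area per test point."""
    hits = sum(m[::grid_step, ::grid_step].sum() for m in mask_slices)
    return hits * pixel_area * grid_step**2 * thickness

# Synthetic test object: a sphere of radius 1.2 cm voxelised at 0.05 cm/pixel,
# roughly the size of the eyeballs reported above.
px, r = 0.05, 1.2
y, x = np.mgrid[-1.5:1.5:px, -1.5:1.5:px]
slices = [x**2 + y**2 <= max(r**2 - z**2, 0.0) for z in np.arange(-r, r, px)]

v_est = cavalieri_volume(slices, px**2, px, grid_step=4)
v_true = 4 / 3 * np.pi * r**3  # analytic volume, ~7.24 cm^3
print(v_est, v_true)
```

Coarsening the grid (a larger `grid_step`) trades precision for counting effort, which is exactly the sampling-intensity optimization the abdominal-fat study reports.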
Farnsworth, G.L.; Nichols, J.D.; Sauer, J.R.; Fancy, S.G.; Pollock, K.H.; Shriner, S.A.; Simons, T.R.; Ralph, C. John; Rich, Terrell D.
2005-01-01
Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point counts in favor of more intensive approaches to counting. However, over the past few years a variety of statistical and methodological developments have begun to provide practical ways of overcoming some of the problems with point counts. We describe some of these approaches, and show how they can be integrated into standard point count protocols to greatly enhance the quality of the information. Several tools now exist for estimation of detection probability of birds during counts, including distance sampling, double observer methods, time-depletion (removal) methods, and hybrid methods that combine these approaches. Many counts are conducted in habitats that make auditory detection of birds much more likely than visual detection. As a framework for understanding detection probability during such counts, we propose separating two components of the probability a bird is detected during a count into (1) the probability a bird vocalizes during the count and (2) the probability this vocalization is detected by an observer. In addition, we propose that some measure of the area sampled during a count is necessary for valid inferences about bird populations. This can be done by employing fixed-radius counts or more sophisticated distance-sampling models. We recommend any studies employing point counts be designed to estimate detection probability and to include a measure of the area sampled.
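One of the detection-probability tools mentioned above, the removal (time-depletion) method, has a particularly simple two-interval form: birds detected in the first interval are "removed", and the decline in new detections between intervals estimates per-interval detectability. The counts below are hypothetical, chosen only to make the arithmetic visible:

```python
# Two-interval removal estimator: if p is the per-interval detection
# probability and N the true abundance, the expected new detections are
# n1 = N*p and n2 = N*(1-p)*p, so p = 1 - n2/n1 and N = n1**2 / (n1 - n2).
def removal_estimate(n1: int, n2: int):
    """Return (per-interval detection probability, abundance estimate).
    Requires n1 > n2; counts are hypothetical illustration values."""
    p = 1.0 - n2 / n1
    n_hat = n1 * n1 / (n1 - n2)
    return p, n_hat

p, n_hat = removal_estimate(60, 24)  # 60 new birds, then 24 new birds
print(p, n_hat)                      # 0.6 100.0
```

Real applications use more intervals and maximum-likelihood fitting, but the logic is the same: the rate of depletion identifies the fraction of birds being missed.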
A habitat-based point-count protocol for terrestrial birds, emphasizing Washington and Oregon.
Mark H. Huff; Kelly A. Bettinger; Howard L. Ferguson; Martin J. Brown; Bob. Altman
2000-01-01
We describe a protocol and provide a summary for point-count monitoring of landbirds that is designed for habitat-based objectives. Presentation is in four steps: preparation and planning, selecting monitoring sites, establishing monitoring stations, and conducting point counts. We describe the basis for doing habitat-based point counts, how they are organized, and how...
Mapping of bird distributions from point count surveys
Sauer, J.R.; Pendleton, G.W.; Orsillo, Sandra; Ralph, C.J.; Sauer, J.R.; Droege, S.
1995-01-01
Maps generated from bird survey data are used for a variety of scientific purposes, but little is known about their bias and precision. We review methods for preparing maps from point count data and appropriate sampling methods for maps based on point counts. Maps based on point counts can be affected by bias associated with incomplete counts, primarily due to changes in proportion counted as a function of observer or habitat differences. Large-scale surveys also generally suffer from regional and temporal variation in sampling intensity. A simulated surface is used to demonstrate sampling principles for maps.
Automatic image acquisition processor and method
Stone, William J.
1986-01-01
A computerized method and point location system apparatus is disclosed for ascertaining the center of a primitive or fundamental object whose shape and approximate location are known. The technique involves obtaining an image of the object, selecting a trial center, and generating a locus of points having a predetermined relationship with the center. Such a locus of points could include a circle. The number of points overlying the object in each quadrant is obtained and the counts of these points per quadrant are compared. From this comparison, error signals are provided to adjust the relative location of the trial center. This is repeated until the trial center overlies the geometric center within the predefined accuracy limits.
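The quadrant-balancing idea in the record above can be sketched in a few lines: probe points on a circle around a trial centre are tested against the object, per-quadrant hit counts are compared, and the imbalance drives the centre update. This is a rough illustration, not the patented implementation; the probe radius, step size, and test object are invented:

```python
import numpy as np

def find_center(inside, start, radius=8.0, n_pts=360, step=0.5, iters=200):
    """Iteratively move a trial centre until a circular locus of probe points
    overlies the object equally in all four quadrants."""
    c = np.array(start, float)
    ang = np.linspace(0, 2 * np.pi, n_pts, endpoint=False)
    probe = radius * np.column_stack([np.cos(ang), np.sin(ang)])
    for _ in range(iters):
        hits = np.array([inside(p) for p in c + probe])  # probe points on object
        qx, qy = np.sign(probe[:, 0]), np.sign(probe[:, 1])
        ex = (hits & (qx > 0)).sum() - (hits & (qx < 0)).sum()  # left/right imbalance
        ey = (hits & (qy > 0)).sum() - (hits & (qy < 0)).sum()  # up/down imbalance
        if ex == 0 and ey == 0:
            break                     # balanced: trial centre is (near) the centre
        c += step * np.array([np.sign(ex), np.sign(ey)])
    return c

# Hypothetical object: a disk of radius 8 centred at (12.3, 7.8).
disk = lambda p: (p[0] - 12.3) ** 2 + (p[1] - 7.8) ** 2 <= 8.0 ** 2
print(find_center(disk, (10.0, 10.0)))  # converges to within ~step of the true centre
```

With a fixed step the trial centre ends up oscillating around the true centre, so the achievable accuracy is on the order of the step size, matching the patent's "predefined accuracy limits".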
Poka-yoke process controller: designed for individuals with cognitive impairments.
Erlandson, R F; Sant, D
1998-01-01
Poka-yoke is a Japanese term meaning "error proofing." Poka-yoke techniques were developed to achieve zero defects in manufacturing and assembly processes. The application of these techniques tends to reduce both the physical and cognitive demands of tasks and thereby make them more accessible. Poka-yoke interventions create a dialogue between the worker and the process, and this dialogue provides the feedback necessary for workers to prevent errors. For individuals with cognitive impairments, weighing and counting tasks can be difficult or impossible. Interventions that provide sufficient feedback to workers without disabilities tend to be too subtle for workers with cognitive impairments; hence, the feedback must be enhanced. The Poka-Yoke Controller (PYC) was designed to assist individuals with counting and weighing tasks. The PYC interfaces to an Ohaus CT6000 digital scale for weighing parts and for counting parts by weight. It also interfaces to sensors and switches for object counting tasks. The PYC interfaces to a variety of programmable voice output devices so that voice feedback or prompting can be provided at specific points in the weighing or counting process. The PYC can also be interfaced to conveyor systems, indexed turntables, and other material handling systems for coordinated counting and material handling operations. In all of our applications to date, we have observed improved worker performance, improved process quality, and greater worker independence. These observed benefits have also significantly reduced the need for staff intervention. The process controller is described and three applications are presented: a weighing task and two counting applications.
Fractal analysis of multiscale spatial autocorrelation among point data
De Cola, L.
1991-01-01
The analysis of spatial autocorrelation among point-data quadrats is a well-developed technique that has made limited but intriguing use of the multiscale aspects of pattern. In this paper are presented theoretical and algorithmic approaches to the analysis of aggregations of quadrats at or above a given density, in which these sets are treated as multifractal regions whose fractal dimension, D, may vary with phenomenon intensity, scale, and location. The technique is illustrated with Matui's quadrat house-count data, which yield measurements consistent with a nonautocorrelated simulated Poisson process but not with an orthogonal unit-step random walk. The paper concludes with a discussion of the implications of such analysis for multiscale geographic analysis systems.
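The fractal dimension D of a point set is commonly estimated by box counting: cover the points with boxes of side s, count the occupied boxes N(s), and read D off the slope of log N(s) versus log(1/s). A sketch on simulated point sets (not Matui's data; the sizes and sample counts are illustrative):

```python
import numpy as np

def box_count_dimension(points, sizes):
    """Box-counting estimate of fractal dimension: N(s) ~ s**-D, so D is the
    slope of log N(s) against log(1/s)."""
    counts = [len(set(map(tuple, np.floor(points / s).astype(int)))) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)), np.log(counts), 1)
    return slope

rng = np.random.default_rng(2)
plane = rng.uniform(0, 1, (20_000, 2))  # area-filling points: D ~ 2
t = rng.uniform(0, 1, 20_000)
line = np.column_stack([t, t])          # points confined to a line: D ~ 1

sizes = [1 / 8, 1 / 16, 1 / 32, 1 / 64]
d_plane = box_count_dimension(plane, sizes)
d_line = box_count_dimension(line, sizes)
print(d_plane, d_line)
```

Repeating the estimate at different density thresholds, as the paper describes, yields the multifractal picture in which D varies with intensity and scale.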
Wheat Ear Detection in Plots by Segmenting Mobile Laser Scanner Data
NASA Astrophysics Data System (ADS)
Velumani, K.; Oude Elberink, S.; Yang, M. Y.; Baret, F.
2017-09-01
The use of Light Detection and Ranging (LiDAR) to study agricultural crop traits is becoming popular. Wheat plant traits such as crop height, biomass fractions and plant population are of interest to agronomists and biologists for the assessment of a genotype's performance in the environment. Among these performance indicators, plant population in the field is still widely estimated through manual counting, which is a tedious and labour-intensive task. The goal of this study is to explore the suitability of LiDAR observations to automate the counting process by the individual detection of wheat ears in the agricultural field. However, this is a challenging task owing to the random cropping pattern and noisy returns present in the point cloud. The goal is achieved by first segmenting the 3D point cloud followed by the classification of segments into ears and non-ears. In this study, two segmentation techniques: a) voxel-based segmentation and b) mean shift segmentation were adapted to suit the segmentation of plant point clouds. An ear classification strategy was developed to distinguish the ear segments from leaves and stems. Finally, the ears extracted by the automatic methods were compared with reference ear segments prepared by manual segmentation. Both methods had an average detection rate of 85%, aggregated over different flowering stages. The voxel-based approach performed well for late flowering stages (wheat crops aged 210 days or more) with a mean percentage accuracy of 94% and takes less than 20 seconds to process 50,000 points with an average point density of 16 points/cm². Meanwhile, the mean shift approach showed comparatively better counting accuracy of 95% for the early flowering stage (crops aged below 225 days) and takes approximately 4 minutes to process 50,000 points.
Klingbeil, Brian T; Willig, Michael R
2015-01-01
Effective monitoring programs for biodiversity are needed to assess trends in biodiversity and evaluate the consequences of management. This is particularly true for birds and faunas that occupy interior forest and other areas of low human population density, as these are frequently under-sampled compared to other habitats. For birds, Autonomous Recording Units (ARUs) have been proposed as a supplement or alternative to point counts made by human observers to enhance monitoring efforts. We employed two strategies (i.e., simultaneous-collection and same-season) to compare point count and ARU methods for quantifying species richness and composition of birds in temperate interior forests. The simultaneous-collection strategy compares surveys by ARUs and point counts, with methods matched in time, location, and survey duration such that the person and machine simultaneously collect data. The same-season strategy compares surveys from ARUs and point counts conducted at the same locations throughout the breeding season, but methods differ in the number, duration, and frequency of surveys. This second strategy more closely follows the ways in which monitoring programs are likely to be implemented. Site-specific estimates of richness (but not species composition) differed between methods; however, the nature of the relationship was dependent on the assessment strategy. Estimates of richness from point counts were greater than estimates from ARUs in the simultaneous-collection strategy. Woodpeckers in particular, were less frequently identified from ARUs than point counts with this strategy. Conversely, estimates of richness were lower from point counts than ARUs in the same-season strategy. Moreover, in the same-season strategy, ARUs detected the occurrence of passerines at a higher frequency than did point counts. Differences between ARU and point count methods were only detected in site-level comparisons. 
Importantly, both methods provide similar estimates of species richness and composition for the region. Consequently, if single visits to sites or short-term monitoring are the goal, point counts will likely perform better than ARUs, especially if species are rare or vocalize infrequently. However, if seasonal or annual monitoring of sites is the goal, ARUs offer a viable alternative to standard point-count methods, especially in the context of large-scale or long-term monitoring of temperate forest birds.
Pendleton, G.W.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation in detection probabilities and lack of independence among sample points can bias estimates and measures of precision. All of these factors should be con-sidered when using point count methods.
Evaluating the ability of regional models to predict local avian abundance
LeBrun, Jaymi J.; Thogmartin, Wayne E.; Miller, James R.
2012-01-01
Spatial modeling over broad scales can potentially direct conservation efforts to areas with high species-specific abundances. We examined the performance of regional models for predicting bird abundance at spatial scales typically addressed in conservation planning. Specifically, we used point count data on wood thrush (Hylocichla mustelina) and blue-winged warbler (Vermivora cyanoptera) from 2 time periods (1995-1998 and 2006-2007) to evaluate the ability of regional models derived via Bayesian hierarchical techniques to predict bird abundance. We developed models for each species within Bird Conservation Region (BCR) 23 in the upper midwestern United States at 800-ha, 8,000-ha, and approximately 80,000-ha scales. We obtained count data from the Breeding Bird Survey and land cover data from the National Land Cover Dataset (1992). We evaluated predictions from the best models, as defined by an information-theoretic criterion, using point count data collected within an ecological subregion of BCR 23 at 131 count stations in the 1990s and again in 2006-2007. Predictions from competing models (ranked by Deviance Information Criterion) correlated with point count data (rs = 0.57; P = 0.14) from the survey period that most closely aligned with the time period of data used for regional model construction. Wood thrush models exhibited positive correlations with point count data for all survey areas and years combined (rs = 0.58, P ≤ 0.001). In comparison, blue-winged warbler models performed worse as time increased between the point count surveys and the vintage of the model-building data (rs = 0.03, P = 0.92 for Iowa and rs = 0.13, P = 0.51 for all areas, 2006-2007), likely related to the ephemeral nature of their preferred early successional habitat. Species abundance and sensitivity to changing habitat conditions seem to be important factors in determining the predictive ability of regional models.
Hierarchical models can be a useful tool for concentrating efforts at the scale of management units and should be one of many tools used by land managers, but we caution that the utility of such models may decrease over time for species preferring relatively ephemeral habitats if model inputs are not updated accordingly.
Evaluating point count efficiency relative to territory mapping in cropland birds
Andre Cyr; Denis Lepage; Kathryn Freemark
1995-01-01
Species richness, composition, and abundance of farmland birds were compared between point counts (50-m, 100-m, and 150-m radius half circles) and territory mapping on three 40-ha plots in Québec, Canada. Point counts of smaller radii tended to have larger density estimates than counts of larger radii. Territory mapping detected 10 species more than 150-m...
On the use of positron counting for radio-assay in nuclear pharmaceutical production.
Maneuski, D; Giacomelli, F; Lemaire, C; Pimlott, S; Plenevaux, A; Owens, J; O'Shea, V; Luxen, A
2017-07-01
Current techniques for the measurement of radioactivity at various points during PET radiopharmaceutical production and R&D are based on the detection of the annihilation gamma rays from the radionuclide in the labelled compound. The detection systems to measure these gamma rays are usually variations of NaI or CsF scintillation based systems requiring costly and heavy lead shielding to reduce background noise. These detectors inherently suffer from low detection efficiency, high background noise and very poor linearity. They are also unable to provide any reasonably useful position information. A novel positron counting technique is proposed for the radioactivity assay during radiopharmaceutical manufacturing that overcomes these limitations. Detection of positrons instead of gammas offers an unprecedented level of position resolution of the radiation source (down to sub-mm) thanks to the nature of the positron interaction with matter. Counting capability instead of charge integration in the detector brings the sensitivity down to the statistical limits at the same time as offering very high dynamic range and linearity from zero to any arbitrarily high activity. This paper reports on a quantitative comparison between conventional detector systems and the proposed positron counting detector.
Can Detectability Analysis Improve the Utility of Point Counts for Temperate Forest Raptors?
Temperate forest breeding raptors are poorly represented in typical point count surveys because these birds are cryptic and typically breed at low densities. In recent years, many new methods for estimating detectability during point counts have been developed, including distanc...
Super resolution imaging of HER2 gene amplification
NASA Astrophysics Data System (ADS)
Okada, Masaya; Kubo, Takuya; Masumoto, Kanako; Iwanaga, Shigeki
2016-02-01
HER2 positive breast cancer is currently examined by counting HER2 genes using fluorescence in situ hybridization (FISH)-stained breast carcinoma samples. In this research, two-dimensional super resolution fluorescence microscopy based on stochastic optical reconstruction microscopy (STORM), with a spatial resolution of approximately 20 nm in the lateral direction, was used to more precisely distinguish and count HER2 genes in a FISH-stained tissue section. Furthermore, by introducing double-helix point spread function (DH-PSF), an optical phase modulation technique, to super resolution microscopy, three-dimensional images were obtained of HER2 in a breast carcinoma sample approximately 4 μm thick.
Determining the Uncertainty of X-Ray Absorption Measurements
Wojcik, Gary S.
2004-01-01
X-ray absorption (or more properly, x-ray attenuation) techniques have been applied to study the moisture movement in and moisture content of materials like cement paste, mortar, and wood. An increase in the number of x-ray counts with time at a location in a specimen may indicate a decrease in moisture content. The uncertainty of measurements from an x-ray absorption system, which must be known to properly interpret the data, is often assumed to be the square root of the number of counts, as in a Poisson process. No detailed studies have heretofore been conducted to determine the uncertainty of x-ray absorption measurements or the effect of averaging data on the uncertainty. In this study, the Poisson estimate was found to adequately approximate normalized root mean square errors (a measure of uncertainty) of counts for point measurements and profile measurements of water specimens. The Poisson estimate, however, was not reliable in approximating the magnitude of the uncertainty when averaging data from paste and mortar specimens. Changes in uncertainty from differing averaging procedures were well-approximated by a Poisson process. The normalized root mean square errors decreased when the x-ray source intensity, integration time, collimator size, and number of scanning repetitions increased. Uncertainties in mean paste and mortar count profiles were kept below 2 % by averaging vertical profiles at horizontal spacings of 1 mm or larger with counts per point above 4000. Maximum normalized root mean square errors did not exceed 10 % in any of the tests conducted. PMID:27366627
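The Poisson estimate discussed above is easy to verify by simulation: repeated count measurements with mean N should scatter with a standard deviation of sqrt(N), so at the paper's guideline of more than 4000 counts per point the normalized uncertainty stays under 2%. A minimal sketch with simulated (not measured) counts:

```python
import numpy as np

rng = np.random.default_rng(3)

# Repeated x-ray count measurements at a single point: Poisson statistics
# predict an uncertainty (standard deviation) of sqrt(N) for a mean of N counts.
mean_counts = 4000                 # per-point counts, matching the >4000 guideline above
samples = rng.poisson(mean_counts, 20_000)

observed_sd = samples.std()
poisson_sd = np.sqrt(mean_counts)  # the sqrt(N) estimate, ~63.2
nrmse = observed_sd / mean_counts  # normalized uncertainty, ~1.6%, below the 2% target
print(observed_sd, poisson_sd, nrmse)
```

Averaging k independent measurements shrinks this uncertainty by a further factor of sqrt(k), which is why the paper's profile-averaging procedures remain well approximated by a Poisson process.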
Temporal differences in point counts of bottomland forest landbirds
Smith, W.P.; Twedt, D.J.
1999-01-01
We compared the number of avian species and individuals in morning and evening point counts during the breeding season and during winter in a bottomland hardwood forest in west-central Mississippi, USA. In both seasons, more species and individuals were recorded during morning counts than during evening counts. We also compared morning and evening detections for 18 species during the breeding season and 9 species during winter. Blue Jay (Cyanocitta cristata), Mourning Dove (Zenaida macroura), and Red-bellied Woodpecker (Melanerpes carolinus) were detected significantly more often in morning counts than in evening counts during the breeding season. Tufted Titmouse (Baeolophus bicolor) was recorded more often in morning counts than in evening counts during both the breeding season and winter. No species was detected more often in evening counts. Thus, evening point counts of birds during either the breeding season or winter will likely underestimate species richness, overall avian abundance, and the abundance of some individual species in bottomland hardwood forests.
Lens-free microscopy of cerebrospinal fluid for the laboratory diagnosis of meningitis
NASA Astrophysics Data System (ADS)
Delacroix, Robin; Morel, Sophie Nhu An; Hervé, Lionel; Bordy, Thomas; Blandin, Pierre; Dinten, Jean-Marc; Drancourt, Michel; Allier, Cédric
2018-02-01
The cytology of cerebrospinal fluid is traditionally performed by an operator (physician, biologist) by means of a conventional light microscope. The operator visually counts the leukocytes (white blood cells) present in a sample of cerebrospinal fluid (10 μl). It is a tedious job and the result is operator-dependent. Here, in order to circumvent the limitations of manual counting, we approach the question of numeration of erythrocytes and leukocytes for the cytological diagnosis of meningitis by means of lens-free microscopy. In a first step, a prospective count of leukocytes was performed by five different operators using conventional optical microscopy. The visual counting yielded an overall 16.7% misclassification of 72 cerebrospinal fluid specimens into meningitis/non-meningitis categories using a 10 leukocyte/μL cut-off. In a second step, the lens-free microscopy algorithm was adapted step-by-step for counting cerebrospinal fluid cells and discriminating leukocytes from erythrocytes. The optimization of the automatic lens-free counting was based on the prospective analysis of 215 cerebrospinal fluid specimens. The optimized algorithm yielded 100% sensitivity and 86% specificity compared to confirmed diagnoses. In a third step, a blind lens-free microscopic analysis of 116 cerebrospinal fluid specimens, including six cases of microbiologically confirmed infectious meningitis, yielded 100% sensitivity and 79% specificity. Adapted lens-free microscopy is thus emerging as an operator-independent technique for the rapid numeration of leukocytes and erythrocytes in cerebrospinal fluid. In particular, this technique is well suited to the rapid diagnosis of meningitis at point-of-care laboratories.
Comparison of Point Count Sampling Regimes for Monitoring Forest Birds
William H. Buskirk; Jennifer L. McDonald
1995-01-01
A set of 255 counts was compiled for 13 points using 10-minute periods subtallied at 3 and 6 minutes. The data from each point were subsampled using combinations of count periods, numbers, and schedules to compare the effectiveness of these different regimes at per-point coverage. Interspecifically, detection frequencies differed in level and pattern as a function of...
Smith, W.P.; Wiedenfeld, D.A.; Hamel, P.B.; Twedt, D.J.; Ford, R.P.; Cooper, R.J.
1993-01-01
To quantify the efficacy of point count sampling in bottomland hardwood forests, we examined the influence of point count duration on corresponding estimates of the number of individuals and species recorded. To accomplish this we conducted a total of 82 point counts (7 May-16 May 1992) distributed among three habitats (Wet, Mesic, Dry) in each of three regions within the lower Mississippi Alluvial Valley (MAV). Each point count consisted of recording the number of individual birds (all species) seen or heard during the initial three minutes and during each minute thereafter, for a period totaling ten minutes. In addition, we included 384 point counts recorded during an 8-week period in each of 3 years (1985-1987) among 56 randomly selected forest patches within the bottomlands of western Tennessee. Each point count consisted of recording the number of individuals (excluding migrating species) during each of four 5-minute intervals, for a period totaling 20 minutes. To estimate minimum sample size, we determined sampling variation at each level (region, habitat, and locality) with the 82 point counts from the lower MAV and applied the procedures of Neter and Wasserman (1974:493; Applied linear statistical models). Neither the cumulative number of individuals nor the number of species per sampling interval attained an asymptote after 10 or 20 minutes of sampling. For western Tennessee bottomlands, total individual and species counts relative to point count duration were similar among years and comparable to the pattern observed throughout the lower MAV. Across the MAV, we recorded a total of 1,621 birds distributed among 52 species, with the majority (872 of 1,621) representing 8 species. More birds were recorded within 25-50 m than in either of the other distance categories. There was significant variation in numbers of individuals and species among point counts.
For both, significant differences between region and patch (nested within region) occurred; neither habitat nor the interaction between habitat and region was significant. For α = 0.05 and β = 0.10, minimum sample size estimates (per factor level) varied by orders of magnitude depending upon the observed or specified range of desired detectable difference. For observed regional variation, 20 and 40 point counts were required to accommodate variability in total birds (MSE = 9.28) and species (MSE = 3.79), respectively; a detectable difference of 25 percent of the mean could be achieved with 5 counts per factor level. Corresponding sample sizes required to detect differences for rarer species (e.g., Wood Thrush) were 500; for common species (e.g., Northern Cardinal) this same level of precision could be achieved with 100 counts.
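The minimum-sample-size computation described above can be sketched with a simplified two-sample normal approximation in the spirit of Neter and Wasserman's procedure. The MSE values come from the abstract; the detectable differences (2 birds, 1 species) are illustrative assumptions, not values from the study:

```python
from math import ceil
from statistics import NormalDist

def min_samples_per_level(mse, delta, alpha=0.05, beta=0.10):
    """Approximate per-level sample size needed to detect a mean
    difference `delta` between two factor levels, given the within-level
    mean squared error `mse` from an ANOVA (normal approximation)."""
    z = NormalDist()
    z_a = z.inv_cdf(1.0 - alpha / 2.0)  # two-sided type I error
    z_b = z.inv_cdf(1.0 - beta)         # power = 1 - beta
    return ceil(2.0 * (z_a + z_b) ** 2 * mse / delta ** 2)

# MSE values reported in the abstract; the detectable differences
# (2 birds, 1 species) are illustrative choices
print(min_samples_per_level(9.28, 2.0))  # → 49
print(min_samples_per_level(3.79, 1.0))  # → 80
```

Halving the detectable difference quadruples the required number of counts, which is why the abstract's estimates vary by orders of magnitude across detectable-difference ranges.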
PID techniques: Alternatives to RICH Methods
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vavra, J. (SLAC)
2011-03-01
In this review article we discuss recent progress in PID techniques other than RICH methods. In particular we mention recent progress in Transition Radiation Detector (TRD), dE/dx cluster counting, and Time Of Flight (TOF) techniques. The TRD technique is mature and has been tried in many hadron colliders. It needs space, though: about 20 cm of detector radial space for every factor of 10 in the π/e rejection power, and this tends to make such detectors large. Although the cluster counting technique is an old idea, it was never tried in a real physics experiment. Recently, there are efforts to revive it for the SuperB experiment using He-based gases and waveform-digitizing electronics. A factor of almost 2 improvement, compared to the classical dE/dx performance, is possible in principle. However, the complexity of the data analysis will be substantial. The TOF technique is well established, but the introduction of new fast MCP-PMT and G-APD detectors creates new possibilities. It seems that resolutions below 20-30 ps may be possible at some point in the future with relatively small systems, and perhaps this could be pushed down to 10-15 ps with very small systems, assuming that one can solve many systematic issues. However, the cost, rate limitation, aging, and cross-talk in multi-anode devices at high bandwidth are problems. There are several groups working on these issues, so progress is likely. Table 6 summarizes the author's opinion of the pros and cons of the various detectors presented in this paper based on their operational capabilities. We refer the reader to Ref. 40 for discussion of other, more general limits from the PID point of view.
Point count length and detection of forest neotropical migrant birds
Dawson, D.K.; Smith, D.R.; Robbins, C.S.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
Comparisons of bird abundances among years or among habitats assume that the rates at which birds are detected and counted are constant within species. We use point count data collected in forests of the Mid-Atlantic states to estimate detection probabilities for Neotropical migrant bird species as a function of count length. For some species, significant differences existed among years or observers in both the probability of detecting the species and in the rate at which individuals are counted. We demonstrate the consequence that variability in species' detection probabilities can have on estimates of population change, and discuss ways for reducing this source of bias in point count studies.
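A minimal sketch of how detection probability scales with count length, assuming detections arrive as a constant-rate Poisson process (the removal-model view underlying analyses like the one above). The 65%-by-3-minutes figure is purely illustrative, not a value from the study:

```python
import math

def detection_prob(phi, minutes):
    """Probability that a bird present is detected at least once during
    a count of the given length, assuming detections arrive as a Poisson
    process with constant per-minute rate phi (removal-model view)."""
    return 1.0 - math.exp(-phi * minutes)

def rate_from_prob(p, minutes):
    """Per-minute detection rate implied by detecting a fraction p of
    individuals within the first `minutes` minutes."""
    return -math.log(1.0 - p) / minutes

# Illustrative: if 65% of individuals are detected within 3 minutes,
# project detectability for longer counts
phi = rate_from_prob(0.65, 3.0)
print(round(detection_prob(phi, 5.0), 3))   # ≈ 0.83
print(round(detection_prob(phi, 10.0), 3))  # ≈ 0.97
```

In this model the per-species rate phi is what varies among years and observers, which is exactly the source of bias the abstract warns about when comparing raw counts.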
Frank R. Thompson; Monica J. Schwalbach
1995-01-01
We report results of a point count survey of breeding birds on Hoosier National Forest in Indiana. We determined sample size requirements to detect differences in means and the effects of count duration and plot size on individual detection rates. Sample size requirements ranged from 100 to >1000 points with Type I and II error rates of <0.1 and 0.2. Sample...
Jean-Pierre L. Savard; Tracey D. Hooper
1995-01-01
We examine the effect of survey length and radius on the results of point count surveys for grassland birds at Williams Lake, British Columbia. Four- and 8-minute counts detected on average 68 percent and 85 percent of the number of birds detected during 12-minute counts. The most efficient sampling duration was 4 minutes, as long as travel time between points was...
Influence of Point Count Length and Repeated Visits on Habitat Model Performance
Randy Dettmers; David A. Buehler; John G. Bartlett; Nathan A. Klaus
1999-01-01
Point counts are commonly used to monitor bird populations, and a substantial amount of research has investigated how conducting counts for different lengths of time affects the accuracy of these counts and the subsequent ability to monitor changes in population trends. However, little work has been done io assess how changes in count duration affect bird-habitat...
Optimizing the duration of point counts for monitoring trends in bird populations
Jared Verner
1988-01-01
Minute-by-minute analysis of point counts of birds in mixed-conifer forests in the Sierra National Forest, central California, showed that cumulative counts of species and individuals increased in a curvilinear fashion but did not reach asymptotes after 10 minutes of counting. Comparison of the expected number of individuals counted per hour with various combinations...
George L. Farnsworth; James D. Nichols; John R. Sauer; Steven G. Fancy; Kenneth H. Pollock; Susan A. Shriner; Theodore R. Simons
2005-01-01
Point counts are a standard sampling procedure for many bird species, but lingering concerns still exist about the quality of information produced from the method. It is well known that variation in observer ability and environmental conditions can influence the detection probability of birds in point counts, but many biologists have been reluctant to abandon point...
Ammersbach, Mélanie; Beaufrère, Hugues; Gionet Rollick, Annick; Tully, Thomas
2015-03-01
While hematologic reference intervals (RI) are available for multiple raptorial species of the orders Accipitriformes and Falconiformes, there is a lack of valuable hematologic information in Strigiformes that can be used for diagnostic and health monitoring purposes. The objective was to report RI in Strigiformes for hematologic variables and to assess agreement between manual cell counting techniques. A multi-center prospective study was designed to assess hematologic RI and blood cell morphology in owl species. Samples were collected from individuals representing 13 Strigiformes species, including Great Horned Owl, Snowy Owl, Eurasian Eagle Owl, Barred Owl, Great Gray Owl, Ural Owl, Northern Saw-Whet Owl, Northern Hawk Owl, Spectacled Owl, Barn Owl, Eastern Screech Owl, Long-Eared Owl, and Short-Eared Owl. Red blood cell count was determined manually using a hemocytometer. White blood cell count was determined using 3 manual counting techniques: (1) phloxine B technique, (2) Natt and Herrick technique, and (3) estimation from the smear. Differential counts and blood cell morphology were determined on smears. Reference intervals were determined and agreement between methods was calculated. Important species-specific differences were observed in blood cell counts and granulocyte morphology. Differences in WBC count between species did not appear to be predictable based on phylogenetic relationships. Overall, most boreal owl species exhibited a lower WBC count than other species. Important disagreements were found between the different manual WBC counting techniques, suggesting that technique-specific RI should be used in Strigiformes. © 2015 American Society for Veterinary Clinical Pathology.
Poisson and negative binomial item count techniques for surveys with sensitive questions.
Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin
2017-04-01
Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected due to a defect in its original design. In this article, we propose two new survey designs (namely the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
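A small simulation sketch of the Poisson item count design described above, under the simplifying assumption that each respondent reports a Poisson innocuous count and that treatment-group respondents with the sensitive trait add one to their report; the difference in group means then estimates the sensitive proportion. (The paper's actual estimators and variance formulas are more refined; all numbers below are illustrative.)

```python
import math
import random

def simulate_poisson_ict(n, lam, pi, seed=1):
    """Simulate a Poisson item count survey: every respondent reports
    a Poisson(lam) innocuous count; in the treatment group, respondents
    with the sensitive trait (prevalence pi) add 1 to their report.
    The difference in group means estimates pi."""
    rng = random.Random(seed)

    def poisson(l):
        # Knuth's multiplication method (fine for small lambda)
        threshold, k, p = math.exp(-l), 0, 1.0
        while True:
            p *= rng.random()
            if p <= threshold:
                return k
            k += 1

    control = [poisson(lam) for _ in range(n)]
    treat = [poisson(lam) + (rng.random() < pi) for _ in range(n)]
    return sum(treat) / n - sum(control) / n

est = simulate_poisson_ict(40000, lam=2.0, pi=0.15)
print(round(est, 2))  # close to the true prevalence 0.15
```

Because every report is a plausible Poisson draw, no individual answer reveals the trait, which is the privacy property the design aims for.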
NASA Astrophysics Data System (ADS)
Bashkov, O. V.; Bryansky, A. A.; Panin, S. V.; Zaikov, V. I.
2016-11-01
Strength properties of glass fiber reinforced polymers (GFRP) fabricated by vacuum and vacuum-autoclave molding techniques were analyzed. Measurements of the porosity of GFRP parts manufactured by the various molding techniques were conducted with the help of optical microscopy. On the basis of experimental data obtained by means of an acoustic emission hardware/software setup, a technique for running diagnostics and forecasting the bearing capacity of polymeric composite materials from the results of three-point bending tests has been developed. The operating principle of the technique is based on evaluating the change in the power-function index of the dependence of total acoustic emission counts on loading stress.
Setting up a Rayleigh Scattering Based Flow Measuring System in a Large Nozzle Testing Facility
NASA Technical Reports Server (NTRS)
Panda, Jayanta; Gomez, Carlos R.
2002-01-01
A molecular Rayleigh scattering based air density measurement system has been built in a large nozzle testing facility at NASA Glenn Research Center. The technique depends on the light scattering by gas molecules present in air; no artificial seeding is required. Light from a single-mode, continuous-wave laser was transmitted to the nozzle facility by optical fiber, and light scattered by gas molecules at various points along the laser beam was collected and measured by photon-counting electronics. By placing the laser beam and collection optics on synchronized traversing units, the point measurement technique is made effective for surveying density variation over a cross-section of the nozzle plume. Various difficulties associated with dust particles, stray light, high noise level, and vibration are discussed. Finally, a limited amount of data from an underexpanded jet is presented and compared with expected variations to validate the technique.
Reviving common standards in point-count surveys for broad inference across studies
Matsuoka, Steven M.; Mahon, C. Lisa; Handel, Colleen M.; Solymos, Peter; Bayne, Erin M.; Fontaine, Patricia C.; Ralph, C.J.
2014-01-01
We revisit the common standards recommended by Ralph et al. (1993, 1995a) for conducting point-count surveys to assess the relative abundance of landbirds breeding in North America. The standards originated from discussions among ornithologists in 1991 and were developed so that point-count survey data could be broadly compared and jointly analyzed by national data centers with the goals of monitoring populations and managing habitat. Twenty years later, we revisit these standards because (1) they have not been universally followed and (2) new methods allow estimation of absolute abundance from point counts, but these methods generally require data beyond the original standards to account for imperfect detection. Lack of standardization and the complications it introduces for analysis become apparent from aggregated data. For example, only 3% of 196,000 point counts conducted during the period 1992-2011 across Alaska and Canada followed the standards recommended for the count period and count radius. Ten-minute, unlimited-count-radius surveys increased the number of birds detected by >300% over 3-minute, 50-m-radius surveys. This effect size, which could be eliminated by standardized sampling, was ≥10 times the published effect sizes of observers, time of day, and date of the surveys. We suggest that the recommendations by Ralph et al. (1995a) continue to form the common standards when conducting point counts. This protocol is inexpensive and easy to follow but still allows the surveys to be adjusted for detection probabilities. Investigators might optionally collect additional information so that they can analyze their data with more flexible forms of removal and time-of-detection models, distance sampling, multiple-observer methods, repeated counts, or combinations of these methods. 
Maintaining the common standards as a base protocol, even as these study-specific modifications are added, will maximize the value of point-count data, allowing compilation and analysis by regional and national data centers.
MCNP-REN - A Monte Carlo Tool for Neutron Detector Design Without Using the Point Model
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.E.; Baker, M.C.
1999-07-25
The development of neutron detectors makes extensive use of predictions of detector response through the use of Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model fails to accurately predict detector response in common applications. For this reason, the general Monte Carlo N-Particle code (MCNP) was modified to simulate the pulse streams that would be generated by a neutron detector and normally analyzed by a shift register. This modified code, MCNP - Random Exponentially Distributed Neutron Source (MCNP-REN), along with the Time Analysis Program (TAP), predicts neutron detector response without using the point reactor model, making it unnecessary for the user to decide whether or not the assumptions of the point model are met for their application. MCNP-REN is capable of simulating standard neutron coincidence counting as well as neutron multiplicity counting. Measurements of MOX fresh fuel made using the Underwater Coincidence Counter (UWCC) as well as measurements of HEU reactor fuel using the active neutron Research Reactor Fuel Counter (RRFC) are compared with calculations. The method used in MCNP-REN is demonstrated to be fundamentally sound and shown to eliminate the need to use the point model for detector performance predictions.
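As a rough illustration of the pulse-train analysis that MCNP-REN feeds to a shift register, the sketch below counts, for each trigger pulse, the pulses falling in a coincidence gate opened after a predelay. The gate settings and toy pulse train are hypothetical, and a real shift-register analysis also forms a delayed accidentals-only gate, which is omitted here:

```python
import bisect

def shift_register_counts(times, predelay=4.5e-6, gate=64e-6):
    """Toy shift-register analysis of a neutron pulse train (sorted
    arrival times in seconds): every pulse is a trigger, and the R+A
    gate counts pulses arriving in [t + predelay, t + predelay + gate].
    Correlated (fission-chain) pulses inflate the gate counts."""
    singles = len(times)
    reals_plus_accidentals = 0
    for t in times:
        lo = bisect.bisect_right(times, t + predelay)
        hi = bisect.bisect_right(times, t + predelay + gate)
        reals_plus_accidentals += hi - lo
    return singles, reals_plus_accidentals

# Pulses inside a trigger's gate window are counted; pulses closer than
# the predelay, or isolated pulses, are not
pulses = [0.0, 2e-6, 10e-6, 1.0, 1.000002, 2.0]
print(shift_register_counts(pulses))  # → (6, 2)
```

Simulating the pulse stream directly, as MCNP-REN does, lets this kind of gate logic be applied without assuming the point-model relationship between rates and multiplicities.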
Sample size and allocation of effort in point count sampling of birds in bottomland hardwood forests
Smith, W.P.; Twedt, D.J.; Cooper, R.J.; Wiedenfeld, D.A.; Hamel, P.B.; Ford, R.P.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
To examine sample size requirements and optimum allocation of effort in point count sampling of bottomland hardwood forests, we computed minimum sample sizes from variation recorded during 82 point counts (May 7-May 16, 1992) from three localities containing three habitat types across three regions of the Mississippi Alluvial Valley (MAV). Also, we estimated the effect of increasing the number of points or visits by comparing results of 150 four-minute point counts obtained from each of four stands on Delta Experimental Forest (DEF) during May 8-May 21, 1991 and May 30-June 12, 1992. For each stand, we obtained bootstrap estimates of the mean cumulative number of species each year from all possible combinations of six points and six visits. ANOVA was used to model cumulative species as a function of the number of points visited, the number of visits to each point, and the interaction of points and visits. There was significant variation in numbers of birds and species between regions and localities (nested within region); neither habitat, nor the interaction between region and habitat, was significant. For α = 0.05 and β = 0.10, minimum sample size estimates (per factor level) varied by orders of magnitude depending upon the observed or specified range of desired detectable difference. For observed regional variation, 20 and 40 point counts were required to accommodate variability in total individuals (MSE = 9.28) and species (MSE = 3.79), respectively, whereas a detectable difference of 25 percent of the mean could be achieved with five counts per factor level. Sample size sufficient to detect actual differences for Wood Thrush (Hylocichla mustelina) was >200, whereas the Prothonotary Warbler (Protonotaria citrea) required <10 counts. Differences in mean cumulative species were detected among numbers of points visited and among numbers of visits to a point. In the lower MAV, mean cumulative species increased with each added point through five points and with each additional visit through four visits.
Although no interaction was detected between number of points and number of visits, when paired reciprocals were compared, more points invariably yielded a significantly greater cumulative number of species than more visits to a point. Still, 36 point counts per stand during each of two breeding seasons detected only 52 percent of the known available species pool in DEF.
Monitoring bird populations by point counts
C. John Ralph; John R. Sauer; Sam Droege
1995-01-01
This volume contains in part papers presented at the Symposium on Monitoring Bird Population Trends by Point Counts, which was held November 6-7, 1991, in Beltsville, Md., in response to the need for standardization of methods to monitor bird populations by point counts. Data from various investigators working under a wide variety of conditions are presented, and...
Richard L. Hutto; Sallie J. Hejl; Jeffrey F. Kelly; Sandra M. Pletschet
1995-01-01
We conducted a series of 275 paired (on- and off-road) point counts within 4 distinct vegetation cover types in northwestern Montana. Roadside counts generated a bird list that was essentially the same as the list generated from off-road counts within the same vegetation cover type. Species that were restricted to either on- or off-road counts were rare, suggesting...
Methods of detecting and counting raptors: A review
Fuller, M.R.; Mosher, J.A.; Ralph, C. John; Scott, J. Michael
1981-01-01
Most raptors are wide-ranging, secretive, and occur at relatively low densities. These factors, in conjunction with the nocturnal activity of owls, cause the counting of raptors by most standard census and survey efforts to be very time consuming and expensive. This paper reviews the most common methods of detecting and counting raptors. It is hoped that it will be of use to the ever-increasing number of biologists, land-use planners, and managers who must determine the occurrence, density, or population dynamics of raptors. Road counts of fixed station or continuous transect design are often used to sample large areas. Detection of spontaneous or elicited vocalizations, especially those of owls, provides a means of detecting and estimating raptor numbers. Searches for nests are accomplished from foot surveys, observations from automobiles and boats, or from aircraft when nest structures are conspicuous (e.g., Osprey). Knowledge of nest habitat, historic records, and inquiries of local residents are useful for locating nests. Often several of these techniques are combined to help find nest sites. Aerial searches have also been used to locate or count large raptors (e.g., eagles), or those that may be conspicuous in open habitats (e.g., tundra). Counts of birds entering or leaving nest colonies or colonial roosts have been attempted on a limited basis. Results from Christmas Bird Counts have provided an index of the abundance of some species. Trapping and banding have generally proven to be inefficient methods of detecting raptors or estimating their populations. Concentrations of migrants at strategically located points around the world afford the best opportunity to count many raptors in a relatively short period of time, but the influence of many unquantified variables has inhibited extensive interpretation of these counts. Few data exist to demonstrate the effectiveness of these methods. 
We believe more research on sampling techniques, rather than complete counts or intensive searches, will provide adequate yet affordable estimates of raptor numbers in addition to providing methods for detecting the presence of raptors on areas of interest to researchers and managers.
Comparison of birds detected from roadside and off-road point counts in the Shenandoah National Park
Keller, C.M.E.; Fuller, M.R.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
Roadside point counts are generally used for large surveys to increase the number of samples. We examined differences in species detected from roadside versus off-road (200-m and 400-m) point counts in the Shenandoah National Park. We also compared the list of species detected in the first 3 minutes to those detected in 10 minutes for potential species biases. Results from 81 paired roadside and off-road counts indicated that roadside counts had higher numbers of several edge species but did not have lower numbers of nonedge forest species. More individuals and species were detected from roadside points because of this increase in edge species. Sixty-five percent of the species detected in 10 minutes were recorded in the first 3 minutes.
Grey W. Pendleton
1995-01-01
Many factors affect the use of point counts for monitoring bird populations, including sampling strategies, variation in detection rates, and independence of sample points. The most commonly used sampling plans are stratified sampling, cluster sampling, and systematic sampling. Each of these might be most useful for different objectives or field situations. Variation...
Bayesian analyses of time-interval data for environmental radiation monitoring.
Luo, Peng; Sharp, Julia L; DeVol, Timothy A
2013-01-01
Time-interval (time difference between two consecutive pulses) analysis based on the principles of Bayesian inference was investigated for online radiation monitoring. Using experimental and simulated data, Bayesian analysis of time-interval data [Bayesian (ti)] was compared with Bayesian and a conventional frequentist analysis of counts in a fixed count time [Bayesian (cnt) and single interval test (SIT), respectively]. The performances of the three methods were compared in terms of average run length (ARL) and detection probability for several simulated detection scenarios. Experimental data were acquired with a DGF-4C system in list mode. Simulated data were obtained using Monte Carlo techniques to obtain a random sampling of the Poisson distribution. All statistical algorithms were developed using the R Project for statistical computing. Bayesian analysis of time-interval information provided a similar detection probability as Bayesian analysis of count information, but the authors were able to make a decision with fewer pulses at relatively higher radiation levels. In addition, for the cases with very short presence of the source (< count time), time-interval information is more sensitive to detect a change than count information since the source data is averaged by the background data over the entire count time. The relationships of the source time, change points, and modifications to the Bayesian approach for increasing detection probability are presented.
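The time-interval idea can be sketched as a sequential Bayesian test between two Poisson-process hypotheses, background only versus background plus source, since pulse inter-arrival times are then exponentially distributed. The rates, prior, and pulse trains below are illustrative assumptions, not the paper's data:

```python
import math

def posterior_source_present(intervals, bkg_rate, src_rate, prior=0.5):
    """Sequential Bayesian update of the probability that a source is
    present, from pulse inter-arrival times (seconds). Under a Poisson
    counting process the intervals are exponentially distributed with
    rate bkg_rate (background only) or bkg_rate + src_rate (source)."""
    lam0, lam1 = bkg_rate, bkg_rate + src_rate
    log_odds = math.log(prior / (1.0 - prior))
    for t in intervals:
        # log-likelihood ratio of a single exponential interval
        log_odds += math.log(lam1 / lam0) - (lam1 - lam0) * t
    return 1.0 / (1.0 + math.exp(-log_odds))

# Illustrative pulse trains: short intervals favour the source hypothesis
fast = [0.02] * 50  # ~50 counts/s observed
slow = [0.10] * 50  # ~10 counts/s observed (background level)
print(posterior_source_present(fast, bkg_rate=10.0, src_rate=40.0))  # ≈ 1
print(posterior_source_present(slow, bkg_rate=10.0, src_rate=40.0))  # ≈ 0
```

Because the posterior is updated pulse by pulse rather than once per fixed count time, a decision can be reached after fewer pulses at elevated rates, which is the advantage the abstract reports.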
Data analysis in emission tomography using emission-count posteriors
NASA Astrophysics Data System (ADS)
Sitek, Arkadiusz
2012-11-01
A novel approach to the analysis of emission tomography data using the posterior probability of the number of emissions per voxel (emission count) conditioned on acquired tomographic data is explored. The posterior is derived from the prior and the Poisson likelihood of the emission-count data by marginalizing voxel activities. Based on emission-count posteriors, examples of Bayesian analysis including estimation and classification tasks in emission tomography are provided. The application of the method to computer simulations of 2D tomography is demonstrated. In particular, the minimum-mean-square-error point estimator of the emission count is demonstrated. The process of finding this estimator can be considered as a tomographic image reconstruction technique since the estimates of the number of emissions per voxel divided by voxel sensitivities and acquisition time are the estimates of the voxel activities. As an example of a classification task, a hypothesis stating that some region of interest (ROI) emitted at least or at most r-times the number of events in some other ROI is tested. The ROIs are specified by the user. The analysis described in this work provides new quantitative statistical measures that can be used in decision making in diagnostic imaging using emission tomography.
Yong Wang; Deborah M. Finch
2002-01-01
We compared consistency of species richness and relative abundance data collected concurrently using mist netting and point counts during migration in riparian habitats along the middle Rio Grande of central New Mexico. Mist netting detected 74% and point counts detected 82% of the 197 species encountered during the study. Species that mist netting failed to capture...
Bui, H N; Bogers, J P A M; Cohen, D; Njo, T; Herruer, M H
2016-12-01
We evaluated the performance of the HemoCue WBC DIFF, a point-of-care device for total and differential white cell counts, primarily to test its suitability for the mandatory white blood cell monitoring in clozapine use. Leukocyte counting and 5-part differentiation were performed by the point-of-care device and by the routine laboratory method in venous EDTA-blood samples from 20 clozapine users, 20 neutropenic patients, and 20 healthy volunteers. From the volunteers, a capillary sample was also drawn. Intra-assay reproducibility and drop-to-drop variation were tested. The correlation between the two methods in venous samples was r > 0.95 for leukocyte, neutrophil, and lymphocyte counts. The correlation between the point-of-care (capillary sample) and routine (venous sample) methods for these cells was 0.772, 0.817, and 0.798, respectively. Intra-assay reproducibility was sufficient only for leukocyte and neutrophil counts. The point-of-care device can be used to screen for leukocyte and neutrophil counts. Because of the relatively high measurement uncertainty and poor correlation with venous samples, we recommend repeating the measurement with a venous sample if cell counts are in the lower reference range. In the case of clozapine therapy, neutropenia can probably be excluded if high neutrophil counts are found, and patients can continue their therapy. © 2016 John Wiley & Sons Ltd.
Newell, Felicity L.; Sheehan, James; Wood, Petra Bohall; Rodewald, Amanda D.; Buehler, David A.; Keyser, Patrick D.; Larkin, Jeffrey L.; Beachy, Tiffany A.; Bakermans, Marja H.; Boves, Than J.; Evans, Andrea; George, Gregory A.; McDermott, Molly E.; Perkins, Kelly A.; White, Matthew; Wigley, T. Bently
2013-01-01
Point counts are commonly used to assess changes in bird abundance, including analytical approaches such as distance sampling that estimate density. Point-count methods have come under increasing scrutiny because effects of detection probability and field error are difficult to quantify. For seven forest songbirds, we compared fixed-radii counts (50 m and 100 m) and density estimates obtained from distance sampling to known numbers of birds determined by territory mapping. We applied point-count analytic approaches to a typical forest management question and compared results to those obtained by territory mapping. We used a before–after control impact (BACI) analysis with a data set collected across seven study areas in the central Appalachians from 2006 to 2010. Using a 50-m fixed radius, variance in error was at least 1.5 times that of the other methods, whereas a 100-m fixed radius underestimated actual density by >3 territories per 10 ha for the most abundant species. Distance sampling improved accuracy and precision compared to fixed-radius counts, although estimates were affected by birds counted outside 10-ha units. In the BACI analysis, territory mapping detected an overall treatment effect for five of the seven species, and effects were generally consistent each year. In contrast, all point-count methods failed to detect two treatment effects due to variance and error in annual estimates. Overall, our results highlight the need for adequate sample sizes to reduce variance, and skilled observers to reduce the level of error in point-count data. Ultimately, the advantages and disadvantages of different survey methods should be considered in the context of overall study design and objectives, allowing for trade-offs among effort, accuracy, and power to detect treatment effects.
Power counting to better jet observables
NASA Astrophysics Data System (ADS)
Larkoski, Andrew J.; Moult, Ian; Neill, Duff
2014-12-01
Optimized jet substructure observables for identifying boosted topologies will play an essential role in maximizing the physics reach of the Large Hadron Collider. Ideally, the design of discriminating variables would be informed by analytic calculations in perturbative QCD. Unfortunately, explicit calculations are often not feasible due to the complexity of the observables used for discrimination, and so many validation studies rely heavily, and solely, on Monte Carlo. In this paper we show how methods based on the parametric power counting of the dynamics of QCD, familiar from effective theory analyses, can be used to design, understand, and make robust predictions for the behavior of jet substructure variables. As a concrete example, we apply power counting for discriminating boosted Z bosons from massive QCD jets using observables formed from the n-point energy correlation functions. We show that power counting alone gives a definite prediction for the observable that optimally separates the background-rich from the signal-rich regions of phase space. Power counting can also be used to understand effects of phase space cuts and the effect of contamination from pile-up, which we discuss. As these arguments rely only on the parametric scaling of QCD, the predictions from power counting must be reproduced by any Monte Carlo, which we verify using Pythia 8 and Herwig++. We also use the example of quark versus gluon discrimination to demonstrate the limits of the power counting technique.
Direct measurement of carbon-14 in carbon dioxide by liquid scintillation counting
NASA Technical Reports Server (NTRS)
Horrocks, D. L.
1969-01-01
Liquid scintillation counting technique is applied to the direct measurement of carbon-14 in carbon dioxide. This method has high counting efficiency and eliminates many of the basic problems encountered with previous techniques. The technique can be used to achieve a percent substitution reaction and is of interest as an analytical technique.
Effect of distance-related heterogeneity on population size estimates from point counts
Efford, Murray G.; Dawson, Deanna K.
2009-01-01
Point counts are used widely to index bird populations. Variation in the proportion of birds counted is a known source of error, and for robust inference it has been advocated that counts be converted to estimates of absolute population size. We used simulation to assess nine methods for the conduct and analysis of point counts when the data included distance-related heterogeneity of individual detection probability. Distance from the observer is a ubiquitous source of heterogeneity, because nearby birds are more easily detected than distant ones. Several recent methods (dependent double-observer, time of first detection, time of detection, independent multiple-observer, and repeated counts) do not account for distance-related heterogeneity, at least in their simpler forms. We assessed bias in estimates of population size by simulating counts with fixed radius w over four time intervals (occasions). Detection probability per occasion was modeled as a half-normal function of distance with scale parameter sigma and intercept g(0) = 1.0. Bias varied with sigma/w; for values of sigma inferred from published studies, bias often exceeded 50% for a 100-m fixed-radius count. More critically, the bias of adjusted counts sometimes varied more than that of unadjusted counts, and inference from adjusted counts would be less robust. The problem was not solved by using mixture models or including distance as a covariate. Conventional distance sampling performed well in simulations, but its assumptions are difficult to meet in the field. We conclude that no existing method allows effective estimation of population size from point counts.
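The simulation design described (fixed radius w, four occasions, half-normal detection with scale sigma and g(0) = 1.0) is straightforward to reproduce. A minimal Python sketch, with parameter values chosen for illustration rather than taken from the paper:

```python
import math
import random

def simulate_point_count(n_birds=100, w=100.0, sigma=50.0, g0=1.0,
                         occasions=4, seed=1):
    """Count birds detected on at least one of `occasions` visits when
    per-occasion detection is half-normal in distance:
    p(r) = g0 * exp(-r^2 / (2 * sigma^2))."""
    rng = random.Random(seed)
    detected = 0
    for _ in range(n_birds):
        # Uniform position in a disc of radius w => r = w * sqrt(U)
        r = w * math.sqrt(rng.random())
        p = g0 * math.exp(-r * r / (2.0 * sigma * sigma))
        if rng.random() < 1.0 - (1.0 - p) ** occasions:
            detected += 1
    return detected

def expected_detected_fraction(w, sigma, g0=1.0, occasions=4, steps=10000):
    """Numerically average P(detected on >=1 occasion) over the disc."""
    total = 0.0
    for i in range(steps):
        r = w * math.sqrt((i + 0.5) / steps)  # quantiles of the radial density
        p = g0 * math.exp(-r * r / (2.0 * sigma * sigma))
        total += 1.0 - (1.0 - p) ** occasions
    return total / steps
```

As sigma/w shrinks, the expected fraction detected falls well below 1, which is exactly the distance-related bias the authors quantify.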
Point Counts Modifications and Breeding Bird Abundances in Central Appalachian Forests
J. Edwards Gates
1995-01-01
The effects of point count duration and radius on detection of breeding birds were compared by recording all birds seen or heard within two consecutive 5-minute intervals and for fixed-radius (within 30 m) or unlimited radius counts. Counts were conducted on Green Ridge State Forest (GRSF) and Savage River State Forest (SRSF) in western Maryland. More than 70 percent...
Kathryn L. Purcell; Sylvia R. Mori; Mary K. Chase
2005-01-01
We used data from two oak-woodland sites in California to develop guidelines for the design of bird monitoring programs using point counts. We used power analysis to determine sample size adequacy when varying the number of visits, count stations, and years for examining trends in abundance. We assumed an overdispersed Poisson distribution for count data, with...
Monitoring trends in bird populations: addressing background levels of annual variability in counts
Jared Verner; Kathryn L. Purcell; Jennifer G. Turner
1996-01-01
Point counting has been widely accepted as a method for monitoring trends in bird populations. Using a rigorously standardized protocol at 210 counting stations at the San Joaquin Experimental Range, Madera Co., California, we have been studying sources of variability in point counts of birds. Vegetation types in the study area have not changed during the 11 years of...
Comparison of Birds Detected from Roadside and Off-Road Point Counts in the Shenandoah National Park
Cherry M.E. Keller; Mark R. Fuller
1995-01-01
Roadside point counts are generally used for large surveys to increase the number of samples. We examined differences in species detected from roadside versus off-road (200-m and 400-m) point counts in the Shenandoah National Park. We also compared the list of species detected in the first 3 minutes to those detected in 10 minutes for potential species biases. Results...
A miniaturized counting technique for anaerobic bacteria.
Sharpe, A N; Pettipher, G L; Lloyd, G R
1976-12-01
A miniaturized counting technique gave results as good as the pour-plate and Most Probable Number (MPN) techniques for enumeration of Clostridium spp. and anaerobic isolates from the gut. Highest counts were obtained when ascorbic acid (1%) and dithiothreitol (0.015%) were added to the reinforced clostridial medium used for counting; this minimized the effect of exposure to air before incubation. The miniature technique allowed up to 40 samples to be plated and incubated in one McIntosh-Fildes-type anaerobic jar, compared with 3 or 4 samples by the normal pour-plate method.
NASA Astrophysics Data System (ADS)
Takiue, Makoto; Fujii, Haruo; Ishikawa, Hiroaki
1984-12-01
2,5-Diphenyloxazole (PPO) has been proposed as a wavelength shifter for Cherenkov counting. Because PPO is not water-soluble, we introduced the fluor into water in micellar form using a PPO-ethanol system. This technique makes it possible to obtain a high Cherenkov counting efficiency under stable sample conditions, owing to the favorable spectrometric features of PPO. The 32P Cherenkov counting efficiency (68.4%) obtained with this technique is as large as that measured with a conventional Cherenkov technique.
Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study
Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger
2015-01-01
Study Design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to subtract out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their number and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations to qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001), and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3, and did not differ significantly from that of healthy IVDs (63.55 ms ± 5.88 ms at month 1 and 62.61 ms ± 5.02 ms at month 3). Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle-puncture model.
The mean NP T2-RT does not change significantly in needle-puncture induced degenerated IVDs. IVDs can be segmented into different tissue components according to their innate T2-RT. PMID:24384655
Identifying and counting point defects in carbon nanotubes.
Fan, Yuwei; Goldsmith, Brett R; Collins, Philip G
2005-12-01
The prevailing conception of carbon nanotubes and particularly single-walled carbon nanotubes (SWNTs) continues to be one of perfectly crystalline wires. Here, we demonstrate a selective electrochemical method that labels point defects and makes them easily visible for quantitative analysis. High-quality SWNTs are confirmed to contain one defect per 4 microm on average, with a distribution weighted towards areas of SWNT curvature. Although this defect density compares favourably to high-quality, silicon single-crystals, the presence of a single defect can have tremendous electronic effects in one-dimensional conductors such as SWNTs. We demonstrate a one-to-one correspondence between chemically active point defects and sites of local electronic sensitivity in SWNT circuits, confirming the expectation that individual defects may be critical to understanding and controlling variability, noise and chemical sensitivity in SWNT electronic devices. By varying the SWNT synthesis technique, we further show that the defect spacing can be varied over orders of magnitude. The ability to detect and analyse point defects, especially at very low concentrations, indicates the promise of this technique for quantitative process analysis, especially in nanoelectronics development.
Point counts are a common method for sampling avian distribution and abundance. Though methods for estimating detection probabilities are available, many analyses use raw counts and do not correct for detectability. We use a removal model of detection within an N-mixture approa...
NASA Astrophysics Data System (ADS)
Powless, Amy J.; Conley, Roxanna J.; Freeman, Karan A.; Muldoon, Timothy J.
2017-03-01
There exists a broad range of techniques that can be used to classify and count white blood cells in a point-of-care (POC) three-part leukocyte differential test. Improvements in lenses, light sources, and cameras for image-based POC systems have renewed interest in acridine orange (AO) as a contrast agent, whereby subpopulations of leukocytes can be differentiated by colorimetric analysis of AO fluorescence emission. We evaluated the effect on test accuracy of different AO staining and postprocessing methods in the context of an image-based POC colorimetric cell classification scheme. Thirty blood specimens were measured for percent cell counts using our POC system and a conventional hematology analyzer for comparison. Controlling the AO concentration used during whole-blood staining, the incubation time with AO, and the colorimetric ratios among the three populations of leukocytes yielded a percent deviation of 0.706%, -1.534%, and -0.645% for the lymphocytes, monocytes, and granulocytes, respectively. Overall, we observed a redshift in AO fluorescence at elevated AO concentrations, which led to reproducible inaccuracy in cell counts. This study demonstrates the need for strict control of the AO staining and postprocessing methods to improve test accuracy in these POC systems.
Cathryn H. Greenberg; Aimee Livings Tomcho; J. Drew Lanham; Thomas a. Waldrop; Jospeh Tomcho; Ross J. Phillips; Dean Simon
2007-01-01
We compared the effects of 3 fuel reduction techniques and a control on breeding birds during 2001–2005 using 50-m point counts. Four experimental units, each .14 ha, were contained within each of 3 replicate blocks at the Green River Game Land, Polk County, North Carolina, USA. Treatments were 1) prescribed burn, 2) mechanical understory reduction (chainsaw-felling...
Forcucci, Alessandra; Pawlowski, Michal E.; Majors, Catherine; Richards-Kortum, Rebecca; Tkaczyk, Tomasz S.
2015-01-01
Three-part differential white blood cell counts are used for disease diagnosis and monitoring at the point-of-care. A low-cost, miniature achromatic microscope was fabricated for identification of lymphocytes, monocytes, and granulocytes in samples of whole blood stained with acridine orange. The microscope was manufactured using rapid prototyping techniques of diamond turning and 3D printing and is intended for use at the point-of-care in low-resource settings. The custom-designed microscope requires no manual adjustment between samples and was successfully able to classify three white blood cell types (lymphocytes, granulocytes, and monocytes) using samples of peripheral whole blood stained with acridine orange. PMID:26601006
Comparison of UWCC MOX fuel measurements to MCNP-REN calculations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Abhold, M.; Baker, M.; Jie, R.
1998-12-31
The development of neutron coincidence counting has greatly improved the accuracy and versatility of neutron-based techniques to assay fissile materials. Today, the shift register analyzer connected to either a passive or active neutron detector is widely used by both domestic and international safeguards organizations. The continued development of these techniques and detectors makes extensive use of predictions of detector response obtained from Monte Carlo techniques in conjunction with the point reactor model. Unfortunately, the point reactor model, as it is currently used, fails to accurately predict detector response in highly multiplying media such as mixed-oxide (MOX) light water reactor fuel assemblies. For this reason, efforts have been made to modify the currently used Monte Carlo codes and to develop new analytical methods so that this model is not required to predict detector response. The authors describe their efforts to modify a widely used Monte Carlo code for this purpose and also compare calculational results with experimental measurements.
A double-observer approach for estimating detection probability and abundance from point counts
Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Fallon, F.W.; Fallon, J.E.; Heglund, P.J.
2000-01-01
Although point counts are frequently used in ornithological studies, basic assumptions about detection probabilities often are untested. We apply a double-observer approach developed to estimate detection probabilities for aerial surveys (Cook and Jacobson 1979) to avian point counts. At each point count, a designated 'primary' observer indicates to another ('secondary') observer all birds detected. The secondary observer records all detections of the primary observer as well as any birds not detected by the primary observer. Observers alternate primary and secondary roles during the course of the survey. The approach permits estimation of observer-specific detection probabilities and bird abundance. We developed a set of models that incorporate different assumptions about sources of variation (e.g. observer, bird species) in detection probability. Seventeen field trials were conducted, and models were fit to the resulting data using program SURVIV. Single-observer point counts generally miss varying proportions of the birds actually present, and observer and bird species were found to be relevant sources of variation in detection probabilities. Overall detection probabilities (probability of being detected by at least one of the two observers) estimated using the double-observer approach were very high (>0.95), yielding precise estimates of avian abundance. We consider problems with the approach and recommend possible solutions, including restriction of the approach to fixed-radius counts to reduce the effect of variation in the effective radius of detection among various observers and to provide a basis for using spatial sampling to estimate bird abundance on large areas of interest. We believe that most questions meriting the effort required to carry out point counts also merit serious attempts to estimate detection probabilities associated with the counts. The double-observer approach is a method that can be used for this purpose.
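The paper fits its detection models in program SURVIV; the underlying moment logic, though, can be sketched from the four observable tallies. In the notation below (variable names are mine, not the paper's), x11 and x21 are primary-observer detections in the two role assignments, and x12 and x22 are the birds the secondary observer adds. The closed forms follow from E[x12/x11] ≈ (1-p1)p2/p1 and its mirror, so treat this as an illustrative moment-based sketch rather than the authors' estimator:

```python
def double_observer_estimates(x11, x12, x21, x22):
    """Moment estimators for a dependent double-observer survey.

    x11: birds detected by observer 1 while primary
    x12: additional birds detected by observer 2 while secondary (missed by 1)
    x21: birds detected by observer 2 while primary
    x22: additional birds detected by observer 1 while secondary (missed by 2)
    Returns (p1, p2, p_combined, n_hat).
    """
    r1 = x12 / x11  # odds that observer 1 misses a bird observer 2 then finds
    r2 = x22 / x21
    p_combined = 1.0 - r1 * r2            # P(detected by at least one observer)
    p1 = (1.0 - r1 * r2) / (1.0 + r1)
    p2 = (1.0 - r1 * r2) / (1.0 + r2)
    n_hat = (x11 + x12 + x21 + x22) / p_combined  # abundance estimate
    return p1, p2, p_combined, n_hat
```

The combined probability 1 - r1*r2 is the "detected by at least one of the two observers" quantity the abstract reports as >0.95 in its field trials.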
Tailoring point counts for inference about avian density: dealing with nondetection and availability
Johnson, Fred A.; Dorazio, Robert M.; Castellón, Traci D.; Martin, Julien; Garcia, Jay O.; Nichols, James D.
2014-01-01
Point counts are commonly used for bird surveys, but interpretation is ambiguous unless there is an accounting for the imperfect detection of individuals. We show how repeated point counts, supplemented by observation distances, can account for two aspects of the counting process: (1) detection of birds conditional on being available for observation and (2) the availability of birds for detection given presence. We propose a hierarchical model that permits the radius in which birds are available for detection to vary with forest stand age (or other relevant habitat features), so that the number of birds available at each location is described by a Poisson-gamma mixture. Conditional on availability, the number of birds detected at each location is modeled by a beta-binomial distribution. We fit this model to repeated point count data of Florida scrub-jays and found evidence that the area in which birds were available for detection decreased with increasing stand age. Estimated density was 0.083 (95%CI: 0.060–0.113) scrub-jays/ha. Point counts of birds have a number of appealing features. Based on our findings, however, an accounting for both components of the counting process may be necessary to ensure that abundance estimates are comparable across time and space. Our approach could easily be adapted to other species and habitats.
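The two-stage counting process the model addresses (Poisson-gamma availability, then detection conditional on availability) can be simulated in a few lines. Note the paper models detection as beta-binomial; this sketch substitutes a plain binomial with fixed p for brevity, and all parameter values are illustrative:

```python
import math
import random

def _poisson(rng, lam):
    # Knuth's multiplicative method (adequate for modest lam)
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def simulate_sites(n_sites, lam_mean, shape, p_detect, occasions, seed=0):
    """Simulate availability and detection at point-count locations:
    availability per site is a Poisson-gamma mixture (gamma-distributed
    Poisson mean with the given shape and mean lam_mean); each occasion's
    count is binomial given the number available."""
    rng = random.Random(seed)
    sites = []
    for _ in range(n_sites):
        lam = rng.gammavariate(shape, lam_mean / shape)  # E[lam] = lam_mean
        n_avail = _poisson(rng, lam)
        counts = [sum(rng.random() < p_detect for _ in range(n_avail))
                  for _ in range(occasions)]
        sites.append((n_avail, counts))
    return sites
```

With p_detect below 1, the repeated counts at a site vary around n_avail * p_detect, which is the variation the hierarchical model exploits.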
LOW LEVEL COUNTING TECHNIQUES WITH SPECIAL REFERENCE TO BIOMEDICAL TRACER PROBLEMS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hosain, F.; Nag, B.D.
1959-12-01
Low-level counting techniques in tracer experiments are discussed with emphasis on the measurement of beta and gamma radiations with Geiger and scintillation counting methods. The basic principles of low-level counting are outlined. Screen-wall counters, internal gas counters, low-level beta counters, scintillation spectrometers, liquid scintillators, and big scintillation installations are described. Biomedical tracer investigations are discussed. Applications of low-level techniques in archaeological dating, biology, and other problems are listed. (M.C.G.)
SURVIVAL OF SALMONELLA SPECIES IN RIVER WATER.
The survival of four Salmonella strains in river water microcosms was monitored using culturing techniques, direct counts, whole-cell hybridization, scanning electron microscopy, and resuscitation techniques via the direct viable count method and flow cytometry. Plate counts of...
Machine Learning Based Single-Frame Super-Resolution Processing for Lensless Blood Cell Counting
Huang, Xiwei; Jiang, Yu; Liu, Xu; Xu, Hang; Han, Zhi; Rong, Hailong; Yang, Haiping; Yan, Mei; Yu, Hao
2016-01-01
A lensless blood cell counting system integrating microfluidic channel and a complementary metal oxide semiconductor (CMOS) image sensor is a promising technique to miniaturize the conventional optical lens based imaging system for point-of-care testing (POCT). However, such a system has limited resolution, making it imperative to improve resolution from the system-level using super-resolution (SR) processing. Yet, how to improve resolution towards better cell detection and recognition with low cost of processing resources and without degrading system throughput is still a challenge. In this article, two machine learning based single-frame SR processing types are proposed and compared for lensless blood cell counting, namely the Extreme Learning Machine based SR (ELMSR) and Convolutional Neural Network based SR (CNNSR). Moreover, lensless blood cell counting prototypes using commercial CMOS image sensors and custom designed backside-illuminated CMOS image sensors are demonstrated with ELMSR and CNNSR. When one captured low-resolution lensless cell image is input, an improved high-resolution cell image will be output. The experimental results show that the cell resolution is improved by 4×, and CNNSR has 9.5% improvement over the ELMSR on resolution enhancing performance. The cell counting results also match well with a commercial flow cytometer. Such ELMSR and CNNSR therefore have the potential for efficient resolution improvement in lensless blood cell counting systems towards POCT applications. PMID:27827837
Virus detection and quantification using electrical parameters
NASA Astrophysics Data System (ADS)
Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.
2014-10-01
Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
James F. Lynch
1995-01-01
Effects of count duration, time-of-day, and aural stimuli were studied in a series of unlimited-radius point counts conducted during winter in Quintana Roo, Mexico. The rate at which new species were detected was approximately three times higher during the first 5 minutes of each 15- minute count than in the final 5 minutes. The number of individuals and species...
Senftle, F.E.; Moxham, R.M.; Tanner, A.B.
1972-01-01
The recent availability of borehole logging sondes employing a source of neutrons and a Ge(Li) detector opens up the possibility of analyzing either decay or capture gamma rays. The most efficient method for a given element can be predicted by calculating the decay-to-capture count ratio for the most prominent peaks in the respective spectra. From a practical point of view, such a calculation must be slanted toward short irradiation and count times at each station in a borehole. A simplified method of computation is shown, and the decay-to-capture count ratio has been calculated and tabulated for the optimum value in the decay mode irrespective of the irradiation time, and also for a ten-minute irradiation time. Based on analysis of a single peak in each spectrum, the results indicate the preferred technique and the best decay or capture peak to observe for those elements of economic interest. © 1972.
Mutoh, Yoshikazu; Nishijima, Takeshi; Inaba, Yosuke; Tanaka, Noriko; Kikuchi, Yoshimi; Gatanaga, Hiroyuki; Oka, Shinichi
2018-03-02
The extent and duration of long-term recovery of CD4 count, CD4%, and CD4/CD8 ratio after initiation of combination antiretroviral therapy (cART) in patients with suppressed viral load are largely unknown. HIV-1-infected patients who started cART between January 2004 and January 2012 and showed persistent viral suppression (<200 copies/mL) for at least 4 years were followed up at the AIDS Clinical Center, Tokyo. Change point analysis was used to determine the time point where CD4 count recovery reaches a plateau, and a linear mixed model was applied to estimate CD4 count at the change point. Data of 752 patients were analyzed [93% males, median age 38, median baseline CD4 count 172/µL (IQR, 62-253), CD4% 13.8% (IQR, 7.7-18.5), and CD4/CD8 ratio 0.23 (IQR, 0.12-0.35)]. The median follow-up period was 81.2 months, and 91 (12.1%) patients were followed for >10 years. Change point analysis showed that CD4 count, CD4%, and CD4/CD8 ratio continued to increase until 78.6, 62.2, and 64.3 months, respectively, with adjusted means of 590/µL (95%CI 572-608), 29.5% (29-30.1), and 0.89 (0.86-0.93), respectively, at the change point. Although 73.8% of the study patients achieved CD4 count ≥500/μL, 48.2% of the patients with baseline CD4 count <100/μL did not achieve CD4 count ≥500/μL. Neither CD4% nor CD4/CD8 ratio normalized in a majority of patients. The results showed a lack of normalization of CD4 count, CD4%, and CD4/CD8 ratio to the levels seen in healthy individuals even after long-term successful cART in patients with suppressed viral load.
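The change-point step can be illustrated with a least-squares grid search over a "rise then plateau" model y = a + b*min(t, tau). This is a simplified stand-in for the paper's change-point analysis and linear mixed model (no random effects, and the data below are synthetic):

```python
def fit_plateau_changepoint(t, y):
    """Grid-search least-squares fit of y = a + b * min(t, tau).
    Candidate change points are the interior observed times.
    Returns (tau, a, b, sse) at the best-fitting change point."""
    best = None
    for tau in t[1:-1]:
        x = [min(ti, tau) for ti in t]
        # Ordinary least squares for y = a + b*x
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        sxx = sum((xi - mx) ** 2 for xi in x)
        sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
        b = sxy / sxx
        a = my - b * mx
        sse = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
        if best is None or sse < best[3]:
            best = (tau, a, b, sse)
    return best
```

Fitting synthetic CD4 trajectories that rise and then flatten recovers the change point; the paper's analogous estimates are a plateau at 78.6 months with an adjusted mean of 590/µL.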
NASA Astrophysics Data System (ADS)
Bilalic, Rusmir
A novel application of support vector machines (SVMs), artificial neural networks (ANNs), and Gaussian processes (GPs) for machine learning (GPML) to model microcontroller unit (MCU) upset due to intentional electromagnetic interference (IEMI) is presented. In this approach, an MCU performs a counting operation (0-7) while electromagnetic interference in the form of a radio frequency (RF) pulse is direct-injected into the MCU clock line. Injection times with respect to the clock signal are the clock low, clock rising edge, clock high, and clock falling edge periods in the clock window during which the MCU is performing initialization and executing the counting procedure. The intent is to cause disruption in the counting operation and model the probability of effect (PoE) using machine learning tools. Five experiments were executed as part of this research, each of which contained a set of 38,300 training points and 38,300 test points, for a total of 383,000 points, with the following experiment variables: injection times with respect to the clock signal, injected RF power, injected RF pulse width, and injected RF frequency. For the 191,500 training points, the average training error was 12.47%, while for the 191,500 test points the average test error was 14.85%, meaning that on average, the machine was able to predict MCU upset with an 85.15% accuracy. Leaving out the results for the worst-performing model (SVM with a linear kernel), the test prediction accuracy for the remaining machines is almost 89%. All three machine learning methods (ANNs, SVMs, and GPML) showed excellent and consistent results in their ability to model and predict the PoE on an MCU due to IEMI. The GP approach performed best during training with a 7.43% average training error, while the ANN technique was most accurate during the test with a 10.80% error.
Establishing school day pedometer step count cut-points using ROC curves in low-income children.
Burns, Ryan D; Brusseau, Timothy A; Fu, You; Hannon, James C
2016-05-01
Previous research has not established pedometer step count cut-points that discriminate children who meet school-day physical activity recommendations using a tri-axial ActiGraph accelerometer criterion. The purpose of this study was to determine step count cut-points associated with 30 min of school-day moderate-to-vigorous physical activity (MVPA) in school-aged children. Participants included 1053 school-aged children (mean age = 8.4 ± 1.8 years) recruited from three low-income schools in the state of Utah in the U.S. Physical activity was assessed using Yamax DigiWalker CW600 pedometers and ActiGraph wGT3X-BT triaxial accelerometers worn concurrently during school hours. Data were collected at each school during the 2014-2015 school year. Receiver operating characteristic (ROC) curves were used to determine pedometer step count cut-points associated with at least 30 min of MVPA during school hours. Cut-points were determined using the maximum Youden's J statistic (Jmax). For the total sample, the area under the curve (AUC) was 0.77 (p<0.001) with a pedometer cut-point of 5505 steps (Jmax = 0.46, sensitivity = 63%, specificity = 84%, accuracy = 76%). Step counts showed greater diagnostic ability in girls (AUC = 0.81, p<0.001; cut-point = 5306 steps; accuracy = 78.8%) than in boys (AUC = 0.72, p<0.01; cut-point = 5786 steps; accuracy = 71.4%). Pedometer step counts showed good diagnostic ability in girls and fair diagnostic ability in boys for discriminating children who met at least 30 min of MVPA during school hours.
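The cut-point selection reduces to maximizing Youden's J = sensitivity + specificity - 1 over candidate thresholds. A self-contained sketch (toy data; the study's 5505-step result comes from its own sample):

```python
def youden_cutpoint(steps, meets_guideline):
    """Find the step-count threshold maximizing Youden's J.
    steps: per-child school-day step counts.
    meets_guideline: True if that child accumulated >=30 min of MVPA
    (the accelerometer criterion). Classify positive when steps >= cut."""
    pos = [s for s, m in zip(steps, meets_guideline) if m]
    neg = [s for s, m in zip(steps, meets_guideline) if not m]
    best_j, best_cut = -1.0, None
    for cut in sorted(set(steps)):
        sens = sum(s >= cut for s in pos) / len(pos)   # true-positive rate
        spec = sum(s < cut for s in neg) / len(neg)    # true-negative rate
        j = sens + spec - 1.0
        if j > best_j:
            best_j, best_cut = j, cut
    return best_cut, best_j
```

On perfectly separable toy data the maximum J is 1.0 at the smallest step count among the children who met the guideline.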
Tender point count, pain, and mobility in the older population: the mobilize Boston study.
Eggermont, Laura H P; Shmerling, Robert H; Leveille, Suzanne G
2010-01-01
Prevalence of tender points (TP), widespread pain, and fibromyalgia, as well as the relationship between TP, widespread pain, and mobility, was examined in 585 community-dwelling older adults (mean age 78.2 years, 63.4% female). Pain was classified by location (none, single site, multisite, widespread). Mobility was measured by the Short Physical Performance Battery (SPPB), gait speed, and self-reported (S-R) mobility difficulty. Tender-point count and health characteristics (i.e., BMI, chronic conditions, analgesic use, number of medications, depression, and blocks walked per week) were assessed. Many participants had 3 or more TP (22.1%), although prevalence of criteria-based fibromyalgia was low (0.3%). Mobility was more limited in persons with higher tender-point counts. After adjustment for pain and other risk factors, higher tender-point count was associated with poorer SPPB performance (score < 10, aOR = 1.09 per TP, 95% CI 1.01-1.17) and slow gait speed (< 0.784 m/sec, aOR = 1.14 per TP, 95% CI 1.05-1.24), but not with S-R mobility difficulty. S-R mobility difficulty was associated with more disseminated pain (multisite pain, aOR = 2.01, 95% CI 1.21-3.34; widespread pain, aOR = 2.47, 95% CI 1.09-5.62). These findings portray a significant mobility burden related to tender-point count and to multisite and widespread pain in the older population. Future studies using longitudinal methods are warranted. Higher tender-point count, multisite pain, and widespread pain are common in community-dwelling older adults and are associated with mobility problems. Both the manual tender-point exam and the McGill Pain Map may provide important yet different information about risks for mobility disability in older individuals.
NASA Technical Reports Server (NTRS)
Haskin, Larry A.; Wang, Alian; Rockow, Kaylynn M.; Jolliff, Bradley L.; Korotev, Randy L.; Viskupic, Karen M.
1997-01-01
Quantification of mineral proportions in rocks and soils by Raman spectroscopy on a planetary surface is best done by taking many narrow-beam spectra from different locations on the rock or soil, with each spectrum yielding peaks from only one or two minerals. The proportion of each mineral in the rock or soil can then be determined from the fraction of the spectra that contain its peaks, in analogy with the standard petrographic technique of point counting. The method can also be used for nondestructive laboratory characterization of rock samples. Although Raman peaks for different minerals seldom overlap each other, it is impractical to obtain proportions of constituent minerals by Raman spectroscopy through analysis of peak intensities in a spectrum obtained by broad-beam sensing of a representative area of the target material. That is because the Raman signal strength produced by a mineral in a rock or soil is not related in a simple way through the Raman scattering cross section of that mineral to its proportion in the rock, and the signal-to-noise ratio of a Raman spectrum is poor when a sample is stimulated by a low-power laser beam of broad diameter. Results obtained by the Raman point-count method are demonstrated for a lunar thin section (14161,7062) and a rock fragment (15273,7039). Major minerals (plagioclase and pyroxene), minor minerals (cristobalite and K-feldspar), and accessory minerals (whitlockite, apatite, and baddeleyite) were easily identified. Identification of the rock types, KREEP basalt or melt rock, from the 100-location spectra was straightforward.
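The Raman point-count idea above reduces to a simple tally: the proportion of each mineral is estimated as the fraction of narrow-beam spectra whose peaks include it. A minimal sketch, with hypothetical spectrum identifications rather than data from the cited lunar samples:

```python
# Hedged sketch of the "Raman point-count": estimate each mineral's proportion as the
# fraction of narrow-beam spectra in which its peaks appear. The spectra below are
# hypothetical identifications, not data from samples 14161,7062 or 15273,7039.

from collections import Counter

def point_count_proportions(spectra):
    """spectra: list of sets of minerals identified in each narrow-beam spectrum."""
    hits = Counter()
    for minerals in spectra:
        for m in minerals:
            hits[m] += 1
    n = len(spectra)
    return {m: c / n for m, c in hits.items()}

spectra = [{"plagioclase"}, {"pyroxene"}, {"plagioclase", "pyroxene"},
           {"plagioclase"}, {"pyroxene"}, {"plagioclase"}]
props = point_count_proportions(spectra)
```

Because a single spectrum can show peaks from one or two minerals, these raw fractions can sum to more than one; how to apportion multi-mineral spectra is a normalization choice the abstract does not specify.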
NASA Astrophysics Data System (ADS)
Bruggeman, M.; Baeten, P.; De Boeck, W.; Carchon, R.
1996-02-01
Neutron coincidence counting is commonly used for the non-destructive assay of plutonium bearing waste or for safeguards verification measurements. A major drawback of conventional coincidence counting is related to the fact that a valid calibration is needed to convert a neutron coincidence count rate to a 240Pu equivalent mass (240Pu-eq). In waste assay, calibrations are made for representative waste matrices and source distributions. The actual waste, however, may have quite different matrices and source distributions compared to the calibration samples. This often results in a bias of the assay result. This paper presents a new neutron multiplicity sensitive coincidence counting technique including an auto-calibration of the neutron detection efficiency. The coincidence counting principle is based on the recording of one- and two-dimensional Rossi-alpha distributions triggered respectively by pulse pairs and by pulse triplets. Rossi-alpha distributions allow an easy discrimination between real and accidental coincidences and are aimed at being measured by a PC-based fast time interval analyser. The Rossi-alpha distributions can be easily expressed in terms of a limited number of factorial moments of the neutron multiplicity distributions. The presented technique allows an unbiased measurement of the 240Pu-eq mass. The presented theory, which will be indicated as Time Interval Analysis (TIA), is complementary to Time Correlation Analysis (TCA) theories which were developed in the past, but is from the theoretical point of view much simpler and allows a straightforward calculation of deadtime corrections and error propagation. Analytical expressions are derived for the Rossi-alpha distributions as a function of the factorial moments of the efficiency dependent multiplicity distributions. The validity of the proposed theory is demonstrated and verified via Monte Carlo simulations of pulse trains and the subsequent analysis of the simulated data.
Population trends, survival, and sampling methodologies for a population of Rana draytonii
Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A.W.; Halstead, Brian J.
2017-01-01
Estimating population trends provides valuable information for resource managers, but monitoring programs face trade-offs between the quality and quantity of information gained and the number of sites surveyed. We compared the effectiveness of monitoring techniques for estimating population trends of Rana draytonii (California Red-legged Frog) at Point Reyes National Seashore, California, USA, over a 13-yr period. Our primary goals were to: 1) estimate trends for a focal pond at Point Reyes National Seashore, and 2) evaluate whether egg mass counts could reliably estimate an index of abundance relative to more-intensive capture–mark–recapture methods. Capture–mark–recapture (CMR) surveys of males indicated a stable population from 2005 to 2009, despite low annual apparent survival (26.3%). Egg mass counts from 2000 to 2012 indicated that despite some large fluctuations, the breeding female population was generally stable or increasing, with annual abundance varying between 26 and 130 individuals. Minor modifications to egg mass counts, such as marking egg masses, can allow estimation of egg mass detection probabilities necessary to convert counts to abundance estimates, even when closure of egg mass abundance cannot be assumed within a breeding season. High egg mass detection probabilities (mean per-survey detection probability = 0.98 [0.89–0.99]) indicate that egg mass surveys can be an efficient and reliable method for monitoring population trends of federally threatened R. draytonii. Combining egg mass surveys to estimate trends at many sites with CMR methods to evaluate factors affecting adult survival at focal populations is likely a profitable path forward to enhance understanding and conservation of R. draytonii.
NASA Astrophysics Data System (ADS)
Di Mauro, M.; Manconi, S.; Zechlin, H.-S.; Ajello, M.; Charles, E.; Donato, F.
2018-04-01
The Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indices of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ∼10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.
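The final step of the abstract (integrating the source-count distribution to get the blazar flux contribution) can be sketched as a numerical integral of S · dN/dS over a broken power law. The break and slope values follow the abstract; the normalization K is an arbitrary placeholder, not the paper's fitted value.

```python
# Sketch: integrate a broken power-law source-count distribution dN/dS to obtain the
# total flux contributed by sources above a limit. Break and slopes follow the abstract;
# the normalization k is an arbitrary placeholder, not the paper's fitted value.

import math

def dnds(s, k=1.0, s_b=3.5e-11, g1=2.09, g2=1.07):
    """Broken power law dN/dS: slope -g1 above the break s_b, -g2 below."""
    if s >= s_b:
        return k * (s / s_b) ** (-g1)
    return k * (s / s_b) ** (-g2)

def total_flux(s_min, s_max=1e-8, n=20000, **kw):
    """Trapezoidal integral of S * dN/dS on a logarithmic grid from s_min to s_max."""
    xs = [math.exp(math.log(s_min) + i * (math.log(s_max) - math.log(s_min)) / n)
          for i in range(n + 1)]
    ys = [s * dnds(s, **kw) for s in xs]
    return sum(0.5 * (ys[i] + ys[i + 1]) * (xs[i + 1] - xs[i]) for i in range(n))

flux = total_flux(7.5e-12)   # flux above the quoted sensitivity, arbitrary units
```

Because the slope below the break (1.07) is shallower than 2, the integral converges at the faint end, which is why the measured break matters for the extragalactic-background budget.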
Can reliable sage-grouse lek counts be obtained using aerial infrared technology?
Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.
2013-01-01
More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-winged aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative to conventional ground-based methods, but further research is needed. We discuss multiple advantages of aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that frees time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.
Avalanche photodiode photon counting receivers for space-borne lidars
NASA Technical Reports Server (NTRS)
Sun, Xiaoli; Davidson, Frederic M.
1991-01-01
Avalanche photodiodes (APD) are studied for use as photon counting detectors in spaceborne lidars. Non-breakdown APD photon counters, in which the APDs are biased below the breakdown point, are shown to outperform both conventional APD photon counters biased above the breakdown point and APDs operated in analog mode when the received optical signal is extremely weak. Non-breakdown APD photon counters were shown experimentally to achieve an effective photon counting quantum efficiency of 5.0 percent at lambda = 820 nm with a dead time of 15 ns and a dark count rate of 7000/s, which agreed with the theoretically predicted values. The interarrival times of the counts followed an exponential distribution and the counting statistics appeared to follow a Poisson distribution with no afterpulsing. It is predicted that the effective photon counting quantum efficiency can be improved to 18.7 percent at lambda = 820 nm and 1.46 percent at lambda = 1060 nm with a dead time of a few nanoseconds by using more advanced commercially available electronic components.
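The 15 ns dead time quoted above caps the usable count rate of such a counter. A standard way to see its effect is the generic non-paralyzable dead-time correction, sketched here with illustrative numbers (this formula is a textbook relation, not the cited paper's analysis):

```python
# Generic non-paralyzable dead-time correction: true_rate = measured / (1 - measured * tau).
# Illustrates how a 15 ns dead time limits a photon counter; numbers are illustrative,
# not results from the cited experiment.

def true_rate(measured_rate, tau):
    """Recover the true event rate from the measured rate of a counter with dead time tau."""
    loss = measured_rate * tau
    if loss >= 1.0:
        raise ValueError("measured rate saturates the counter")
    return measured_rate / (1.0 - loss)

tau = 15e-9        # 15 ns dead time, as quoted in the abstract
r_meas = 1.0e6     # 1 Mcount/s measured (illustrative)
r_true = true_rate(r_meas, tau)
```

At 1 Mcount/s the correction is about 1.5%; it grows rapidly as the rate approaches 1/tau, which is why shorter dead times directly improve effective quantum efficiency at high signal levels.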
Fixed-Radius Point Counts in Forests: Factors Influencing Effectiveness and Efficiency
Daniel R. Petit; Lisa J. Petit; Victoria A. Saab; Thomas E. Martin
1995-01-01
The effectiveness of fixed-radius point counts in quantifying abundance and richness of bird species in oak-hickory, pine-hardwoods, mixed-mesophytic, beech-maple, and riparian cottonwood forests was evaluated in Arkansas, Ohio, Kentucky, and Idaho. Effects of count duration and numbers of stations and visits per stand were evaluated in May to July 1991 by conducting...
Point counts from clustered populations: Lessons from an experiment with Hawaiian crows
Hayward, G.D.; Kepler, C.B.; Scott, J.M.
1991-01-01
We designed an experiment to identify factors contributing most to error in counts of Hawaiian Crow or Alala (Corvus hawaiiensis) groups that are detected aurally. Seven observers failed to detect calling Alala on 197 of 361 3-min point counts on four transects extending from cages with captive Alala. A detection curve describing the relation between frequency of flock detection and distance typified the distribution expected in transect or point counts. Failure to detect calling Alala was affected most by distance, observer, and Alala calling frequency. The number of individual Alala calling was not important in detection rate. Estimates of the number of Alala calling (flock size) were biased and imprecise: average difference between number of Alala calling and number heard was 3.24 (±0.277). Distance, observer, number of Alala calling, and Alala calling frequency all contributed to errors in estimates of group size (P < 0.0001). Multiple regression suggested that number of Alala calling contributed most to errors. These results suggest that well-designed point counts may be used to estimate the number of Alala flocks but cast doubt on attempts to estimate flock size when individuals are counted aurally.
NASA Astrophysics Data System (ADS)
Dudak, J.; Zemlicka, J.; Karch, J.; Hermanova, Z.; Kvacek, J.; Krejci, F.
2017-01-01
Photon counting detectors Timepix are known for their unique properties enabling X-ray imaging with an extremely high contrast-to-noise ratio. Their applicability has recently been further improved, since a dedicated technique for assembling large-area Timepix detector arrays was introduced. Although the sensitive area of Timepix detectors has been significantly increased, the pixel pitch is kept unchanged (55 microns). This value is much larger compared to widely used and popular X-ray imaging cameras utilizing scintillation crystals and CCD-based read-out. On the other hand, photon counting detectors provide a steeper point-spread function (PSF). Therefore, for a given effective pixel size of an acquired radiograph, Timepix detectors provide higher spatial resolution than X-ray cameras with scintillation-based devices, unless the image is affected by penumbral blur. In this paper we take advantage of the steep PSF of photon counting detectors and test the possibility of improving the quality of computed tomography reconstruction using finer sampling of the reconstructed voxel space. The achieved results are presented in comparison with data acquired under the same conditions using a commercially available state-of-the-art CCD X-ray camera.
Parallel image logical operations using cross correlation
NASA Technical Reports Server (NTRS)
Strong, J. P., III
1972-01-01
Methods are presented for counting areas in an image in a parallel manner using noncoherent optical techniques. The techniques presented include the Levialdi algorithm for counting, optical techniques for binary operations, and cross-correlation.
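The counting task addressed above (counting areas, i.e. connected regions, in a binary image) can be illustrated with a plain serial component count. This is only the task definition; the optical, parallel Levialdi/cross-correlation machinery the abstract describes is not reproduced here.

```python
# Serial illustration of "counting areas in an image": count 4-connected components
# in a binary image via breadth-first search. This shows the counting task itself,
# not the parallel noncoherent-optical implementation of the abstract.

from collections import deque

def count_areas(img):
    """Count 4-connected regions of 1-pixels in a binary image (list of lists)."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] and not seen[r][c]:
                count += 1                      # new region found
                q = deque([(r, c)])
                seen[r][c] = True
                while q:                        # flood-fill the whole region
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < rows and 0 <= nx < cols and img[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

img = [[1, 1, 0, 0],
       [0, 1, 0, 1],
       [0, 0, 0, 1],
       [1, 0, 0, 0]]
n = count_areas(img)
```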
Validation of FFM PD counts for screening personality pathology and psychopathy in adolescence.
Decuyper, Mieke; De Clercq, Barbara; De Bolle, Marleen; De Fruyt, Filip
2009-12-01
Miller and colleagues (Miller, Bagby, Pilkonis, Reynolds, & Lynam, 2005) recently developed a Five-Factor Model (FFM) personality disorder (PD) count technique for describing and diagnosing PDs and psychopathy in adulthood. This technique conceptualizes PDs relying on general trait models and uses facets from the expert-generated PD prototypes to score the FFM PDs. The present study builds on the work of Miller and colleagues (2005) and investigates in Study 1 whether the PD count technique shows discriminant validity for describing PDs in adolescence. Study 2 extends this objective to psychopathy. Results suggest that the FFM PD count technique is as successful in adolescence as in adulthood at describing PD symptoms, supporting the use of this descriptive method in adolescence. The normative data and accompanying PD count benchmarks enable the use of FFM scores for PD screening purposes in adolescence.
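The core of a "count" technique of this kind is summing prototype-relevant facet scores, reverse-keying facets the prototype expects to be low. A hedged sketch follows; the facet names and the prototype keys below are illustrative placeholders, not Miller and colleagues' actual scoring keys.

```python
# Hedged sketch of an FFM PD "count" score: sum prototype-relevant facet scores,
# reverse-keying facets the prototype expects to be low. The facet names and
# prototype below are illustrative placeholders, not the published scoring keys.

def pd_count(facets, prototype, scale_max=5):
    """facets: {facet: score}; prototype: {facet: +1 (expected high) or -1 (expected low)}."""
    total = 0
    for facet, key in prototype.items():
        score = facets[facet]
        total += score if key > 0 else (scale_max + 1 - score)  # reverse-key low facets
    return total

facets = {"anxiety": 4, "impulsiveness": 5, "warmth": 2, "deliberation": 1}
borderline_like = {"anxiety": +1, "impulsiveness": +1, "warmth": -1, "deliberation": -1}
score = pd_count(facets, borderline_like)
```

Screening then amounts to comparing such scores against normative benchmarks like those the abstract mentions.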
Skoruppa, M.K.; Woodin, M.C.; Blacklock, G.
2009-01-01
The segment of the Rio Grande between International Falcon Reservoir and Del Rio, Texas (distance ca. 350 km), remains largely unexplored ornithologically. We surveyed nocturnal birds monthly during February-June 1998 at 19 stations along the Rio Grande (n = 6) and at upland stock ponds (n = 13) in Webb County, Texas. We conducted 10-min point counts (n = 89) after sunset and before moonset. Four species of owls and five species of nightjars were detected. Nightjars, as a group, were nearly five times more abundant (mean number/count = 2.63) than owls (mean number = 0.55). The most common owl, the great horned owl (Bubo virginianus), had a mean number of 0.25/point count. The mean for elf owls (Micrathene whitneyi) was 0.16/point count. The most common nightjars were the common poorwill (Phalaenoptilus nuttallii; 1.21/point count) and lesser nighthawk (Chordeiles acutipennis; 1.16/point count). Survey sites on the river supported more species (mean = 2.2) than did upland stock ponds (mean = 1.4). However, only one species (common pauraque, Nyctidromus albicollis) showed a preference for the river sites. Our results establish this segment of the Rio Grande in southern Texas as an area of high diversity of nightjars in the United States, matched (in numbers of species) only by southeastern Arizona and southwestern New Mexico.
ACHCAR, J. A.; MARTINEZ, E. Z.; RUFFINO-NETTO, A.; PAULINO, C. D.; SOARES, P.
2008-01-01
We considered a Bayesian analysis for the prevalence of tuberculosis cases in New York City from 1970 to 2000. This counting dataset presented two change-points during this period. We modelled the dataset using non-homogeneous Poisson processes in the presence of the two change-points. A Bayesian analysis of the data was carried out using Markov chain Monte Carlo methods. Simulated Gibbs samples for the parameters of interest were obtained using the WinBUGS software. PMID:18346287
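The change-point idea can be illustrated without the Bayesian machinery: with a piecewise-constant Poisson rate, the two change-points can be located by maximizing the log-likelihood over all break pairs. This is a simplified maximum-likelihood sketch with synthetic yearly counts, not the paper's Gibbs-sampling analysis or the NYC tuberculosis data.

```python
# Illustrative (non-Bayesian) sketch of a counting process with two change-points:
# fit a piecewise-constant Poisson rate by maximizing the log-likelihood over all
# change-point pairs. The cited paper instead uses Gibbs sampling in WinBUGS;
# the synthetic yearly counts below are placeholders, not the tuberculosis data.

import math

def loglik(counts, a, b):
    """Log-likelihood with constant rates on [0,a), [a,b), [b,n); log(c!) terms dropped."""
    ll = 0.0
    for seg in (counts[:a], counts[a:b], counts[b:]):
        lam = sum(seg) / len(seg)               # MLE rate for the segment
        ll += sum(c * math.log(lam) - lam for c in seg)
    return ll

def fit_changepoints(counts):
    n = len(counts)
    return max(((a, b) for a in range(1, n - 1) for b in range(a + 1, n)),
               key=lambda ab: loglik(counts, *ab))

counts = [10, 11, 9, 10, 30, 29, 31, 30, 5, 6, 4, 5]
a, b = fit_changepoints(counts)
```

The Bayesian version replaces this grid search with priors on the change-points and rates and explores the posterior by MCMC, which also yields uncertainty intervals the point estimate above lacks.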
Muramatsu, Keita; Matsuo, Koichiro; Kawai, Yusuke; Yamamoto, Tsukasa; Hara, Yoshitaka; Shimomura, Yasuyo; Yamashita, Chizuru; Nishida, Osamu
2018-06-26
Endotracheal intubation of critically ill patients increases the risk of aspiration pneumonia, which can be reduced by regular oral care. However, rinsing away residual oral contaminants after mechanical cleaning carries the risk of aspirating the residue during the intubation period. Removing the contaminants by wiping with mouth wipes could be an alternative to rinsing with water because it introduces no additional fluid. This study tested: (i) the amount of oral bacteria during endotracheal intubation and after extubation; and (ii) the changes in the bacterial count during oral care procedures. Thirty-five mechanically ventilated patients in the intensive care unit were enrolled. The amount of bacteria on the dorsal tongue surface was counted before and following oral care, and then after the elimination of contaminants either by rinsing with water and suctioning or by wiping with mouth wipes. The oral bacterial amount was compared statistically between the intubation and extubation status and among set time points during the oral care procedure. The oral bacterial count was significantly decreased after extubation. During the oral care procedure, the oral bacterial amount was significantly lower after eliminating the contaminants either by rinsing or wiping, with no remarkable difference between the elimination techniques. The findings suggest that the oral bacterial amount is elevated during endotracheal intubation, which could increase the risk of aspiration pneumonia. The significant reduction in the bacterial count by wiping indicates that it might be a suitable alternative to rinsing for mechanically ventilated patients. © 2018 Japan Academy of Nursing Science.
John T. Rotenberry; Steven T. Knick
1995-01-01
Breeding passerine abundances in Great Basin shrubsteppe and grassland habitats were surveyed in southwestern Idaho by using 73 pairs of 200-m radius circular point counts. Points were placed along roads and paired with points 400 m away from roads but in similar habitat. Grassland species such as Horned Larks (Eremophila alpestris) and Western...
An Automated Statistical Technique for Counting Distinct Multiple Sclerosis Lesions.
Dworkin, J D; Linn, K A; Oguz, I; Fleishman, G M; Bakshi, R; Nair, G; Calabresi, P A; Henry, R G; Oh, J; Papinutto, N; Pelletier, D; Rooney, W; Stern, W; Sicotte, N L; Reich, D S; Shinohara, R T
2018-04-01
Lesion load is a common biomarker in multiple sclerosis, yet it has historically shown modest association with clinical outcome. Lesion count, which encapsulates the natural history of lesion formation and is thought to provide complementary information, is difficult to assess in patients with confluent (i.e., spatially overlapping) lesions. We introduce a statistical technique for cross-sectionally counting pathologically distinct lesions. MR imaging was used to assess the probability of a lesion at each location. The texture of this map was quantified using a novel technique, and clusters resembling the center of a lesion were counted. Validity compared with a criterion standard count was demonstrated in 60 subjects observed longitudinally, and reliability was determined using 14 scans of a clinically stable subject acquired at 7 sites. The proposed count and the criterion standard count were highly correlated (r = 0.97, P < .001) and not significantly different (t59 = -0.83, P = .41), and the variability of the proposed count across repeat scans was equivalent to that of lesion load. After accounting for lesion load and age, lesion count was negatively associated (t58 = -2.73, P < .01) with the Expanded Disability Status Scale. Average lesion size had a higher association with the Expanded Disability Status Scale (r = 0.35, P < .01) than lesion load (r = 0.10, P = .44) or lesion count (r = -0.12, P = .36) alone. This study introduces a novel technique for counting pathologically distinct lesions using cross-sectional data and demonstrates its ability to recover obscured longitudinal information. The proposed count allows more accurate estimation of lesion size, which correlated more closely with disability scores than either lesion load or lesion count alone. © 2018 by American Journal of Neuroradiology.
Sur, Maitreyi; Belthoff, James R.; Bjerre, Emily R.; Millsap, Brian A.; Katzner, Todd
2018-01-01
Wind energy development is rapidly expanding in North America, often accompanied by requirements to survey potential facility locations for existing wildlife. Within the USA, golden eagles (Aquila chrysaetos) are among the most high-profile species of birds that are at risk from wind turbines. To minimize golden eagle fatalities in areas proposed for wind development, modified point count surveys are usually conducted to estimate use by these birds. However, it is not always clear what drives variation in the relationship between on-site point count data and actual use by eagles of a wind energy project footprint. We used existing GPS-GSM telemetry data, collected at 15 min intervals from 13 golden eagles in 2012 and 2013, to explore the relationship between point count data and eagle use of an entire project footprint. To do this, we overlaid the telemetry data on hypothetical project footprints and simulated a variety of point count sampling strategies for those footprints. We compared the time an eagle was found in the sample plots with the time it was found in the project footprint using a metric we called “error due to sampling”. Error due to sampling for individual eagles appeared to be influenced by interactions between the size of the project footprint (20, 40, 90, or 180 km²) and the sampling type (random, systematic, or stratified) and was greatest on 90 km² plots. However, use of random sampling resulted in the lowest error due to sampling within intermediate-sized plots. In addition, sampling intensity and sampling frequency both influenced the effectiveness of point count sampling. Although our work focuses on individual eagles (not the eagle populations typically surveyed in the field), our analysis shows both the utility of simulations to identify specific influences on error and also potential improvements to sampling that consider the context-specific manner in which point counts are laid out on the landscape.
Estimation of bearing contact angle in-situ by X-ray kinematography
NASA Technical Reports Server (NTRS)
Fowler, P. H.; Manders, F.
1982-01-01
The mounted, preloaded contact angle of the structural bearings in the assembled design mechanical assembly was measured. A modification of the Turns method is presented, based upon the clarity and definition of moving parts achieved with the X-ray technique and cinematic display. Contact angle is estimated by counting the number of balls passing a given point as a function of the number of turns of the shaft. Ball and pitch diameter variations are discussed. Ball train and shaft angle uncertainties are also discussed.
NASA Astrophysics Data System (ADS)
Chhetri, R.; Ekers, R. D.; Morgan, J.; Macquart, J.-P.; Franzen, T. M. O.
2018-06-01
We use Murchison Widefield Array observations of interplanetary scintillation (IPS) to determine the source counts of point (<0.3 arcsecond extent) sources and of all sources with some subarcsecond structure, at 162 MHz. We have developed the methodology to derive these counts directly from the IPS observables, while taking into account changes in sensitivity across the survey area. The counts of sources with compact structure follow the behaviour of the dominant source population above ~3 Jy, but below this they show Euclidean behaviour. We compare our counts to those predicted by simulations and find a good agreement for our counts of sources with compact structure, but significant disagreement for point source counts. Using low radio frequency SEDs from the GLEAM survey, we classify point sources as Compact Steep-Spectrum (CSS), flat spectrum, or peaked. If we consider the CSS sources to be the more evolved counterparts of the peaked sources, the two categories combined comprise approximately 80% of the point source population. We calculate densities of potential calibrators brighter than 0.4 Jy at low frequencies and find 0.2 sources per square degree for point sources, rising to 0.7 sources per square degree if sources with more complex arcsecond structure are included. We extrapolate to estimate 4.6 sources per square degree at 0.04 Jy. We find that a peaked spectrum is an excellent predictor of compactness at low frequencies, increasing the number of good calibrators by a factor of three compared to the usual flat-spectrum criterion.
Counting Tree Growth Rings Moderately Difficult to Distinguish
C. B. Briscoe; M. Chudnoff
1964-01-01
There is an extensive literature dealing with techniques and gadgets to facilitate counting tree growth rings. A relatively simple method is described below, satisfactory for species too difficult to count in the field, but not sufficiently difficult to require the preparation of microscope slides or staining techniques.
Absolute calibration of ultraviolet filter photometry
NASA Technical Reports Server (NTRS)
Bless, R. C.; Fairchild, T.; Code, A. D.
1972-01-01
The essential features of the calibration procedure can be divided into three parts. First, the shape of the bandpass of each photometer was determined by measuring the transmissions of the individual optical components and also by measuring the response of the photometer as a whole. Second, each photometer was placed in the essentially collimated synchrotron radiation bundle maintained at a constant intensity level, and the output signal was determined from about 100 points on the objective. Finally, two or three points on the objective were illuminated by synchrotron radiation at several different intensity levels covering the dynamic range of the photometers. The output signals were placed on an absolute basis by the electron counting technique described earlier.
Molecular analyses of two bacterial sampling methods in ligature-induced periodontitis in rats.
Fontana, Carla Raquel; Grecco, Clovis; Bagnato, Vanderlei Salvador; de Freitas, Laura Marise; Boussios, Constantinos I; Soukos, Nikolaos S
2018-02-01
The prevalence profile of periodontal pathogens in dental plaque can vary as a function of the detection method; however, the sampling technique may also play a role in determining dental plaque microbial profiles. We sought to determine the bacterial composition obtained with two sampling methods: one well established and a new one proposed here. In this study, a ligature-induced periodontitis model was used in 30 rats. Twenty-seven days later, ligatures were removed and microbiological samples were obtained directly from the ligatures as well as from the periodontal pockets using absorbent paper points. Microbial analysis was performed using DNA probes to a panel of 40 periodontal species in the checkerboard assay. The bacterial composition patterns were similar for both sampling methods. However, detection levels for all species were markedly higher for ligatures compared with paper points. Ligature samples provided higher bacterial counts than paper points, suggesting that the technique for induction of periodontitis could also be applied for sampling in rats. Our findings may be helpful in designing studies of induced periodontal disease-associated microbiota.
Detectability of Forest Birds from Stationary Points in Northern Wisconsin
Amy T. Wolf; Robert W. Howe; Gregory J. Davis
1995-01-01
Estimation of avian densities from point counts requires information about the distance at which birds can be detected by the observer. Detection distances also are important for designing the spacing of point counts in a regional sampling scheme. We examined the relationship between distance and detectability for forest songbirds in northern Wisconsin. Like previous...
Modification of Point Counts for Surveying Cropland Birds
Kathryn Freemark; Catherine Rogers
1995-01-01
As part of a comparative study of agricultural impacts on wildlife, modifications to the point count method were evaluated for surveying birds in, and adjacent to, cropland during the breeding season (May to early July) in Ontario. Location in the field, observation direction and distance, number of visits, and number of study sites per farm were examined using point...
NASA Astrophysics Data System (ADS)
Shields, C. A.; Ullrich, P. A.; Rutz, J. J.; Wehner, M. F.; Ralph, M.; Ruby, L.
2017-12-01
Atmospheric rivers (ARs) are long, narrow filamentary structures that transport large amounts of moisture in the lower layers of the atmosphere, typically from subtropical regions to mid-latitudes. ARs play an important role in regional hydroclimate by supplying significant amounts of precipitation that can alleviate drought, or in extreme cases, produce dangerous floods. Accurately detecting, or tracking, ARs is important not only for weather forecasting, but is also necessary to understand how these events may change under global warming. Detection algorithms are used on both regional and global scales, most accurately with high-resolution datasets or model output. Different detection algorithms can produce different answers. Detection algorithms found in the current literature fall broadly into two categories: "time-stitching", where the AR is tracked with a Lagrangian approach through time and space; and "counting", where ARs are identified for a single point in time for a single location. Counting routines can be further subdivided, ranging from algorithms that use absolute thresholds with specific geometry, to algorithms that use relative thresholds, to algorithms based on statistics, to pattern recognition and machine learning techniques. With such a large diversity in detection code, differences in AR tracking and "counts" can vary widely from technique to technique. Uncertainty increases for future climate scenarios, where the difference between relative and absolute thresholding produces vastly different counts, simply due to the moister background state in a warmer world. In an effort to quantify the uncertainty associated with tracking algorithms, the AR detection community has come together to participate in ARTMIP, the Atmospheric River Tracking Method Intercomparison Project. Each participant will provide AR metrics to the greater group by applying their code to a common reanalysis dataset.
MERRA-2 reanalysis data were chosen for their temporal and spatial resolution. After completion of this first phase, Tier 1, ARTMIP participants may choose to contribute to Tier 2, which will range from reanalysis uncertainty to analysis of future climate scenarios from high-resolution model output. ARTMIP's experimental design, techniques, and preliminary metrics will be presented.
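The absolute-versus-relative threshold distinction above can be made concrete: flag grid cells where integrated vapor transport (IVT) exceeds a fixed value, versus where it exceeds a climatological percentile. A minimal sketch with synthetic IVT values; the 250 kg m⁻¹ s⁻¹ cut and 85th percentile are common illustrative choices, not any specific ARTMIP participant's algorithm.

```python
# Sketch of the absolute- vs relative-threshold distinction for AR "counting":
# flag grid cells where IVT exceeds a fixed value (absolute) versus where it
# exceeds a climatological percentile (relative). IVT values here are synthetic.

def flag_absolute(ivt, threshold=250.0):            # kg m^-1 s^-1, a common choice
    return [v >= threshold for v in ivt]

def flag_relative(ivt, climatology, pct=0.85):
    ranked = sorted(climatology)
    cut = ranked[int(pct * (len(ranked) - 1))]      # simple percentile estimate
    return [v >= cut for v in ivt]

ivt = [120.0, 260.0, 310.0, 240.0]
climatology = [float(v) for v in range(100, 401, 10)]   # synthetic background state
abs_flags = flag_absolute(ivt)
rel_flags = flag_relative(ivt, climatology)
```

In a warmer, moister climatology the relative cut rises while the absolute cut stays fixed, which is exactly why the two families of algorithms diverge in future-climate counts.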
NASA Astrophysics Data System (ADS)
Regmi, Raju; Mohan, Kavya; Mondal, Partha Pratim
2014-09-01
Visualization of intracellular organelles is achieved using a newly developed high-throughput imaging cytometry system. This system interrogates the microfluidic channel using a sheet of light rather than the existing point-based scanning techniques. The advantages of the developed system are many, including single-shot scanning of specimens flowing through the microfluidic channel at flow rates ranging from micro- to nano-litres per minute. Moreover, this opens up in vivo imaging of sub-cellular structures and simultaneous cell counting in an imaging cytometry system. We recorded a maximum count of 2400 cells/min at a flow rate of 700 nl/min, and simultaneous visualization of the fluorescently-labeled mitochondrial network in HeLa cells during flow. The developed imaging cytometry system may find immediate application in biotechnology, fluorescence microscopy and nano-medicine.
Improved confidence intervals when the sample is counted an integer times longer than the blank.
Potter, William Edward; Strzelczyk, Jadwiga Jodi
2011-05-01
Past computer solutions for confidence intervals in paired counting are extended to the case where the ratio of the sample count time to the blank count time is taken to be an integer, IRR. Previously, these confidence intervals have been called Neyman-Pearson confidence intervals; more correctly, they should be called Neyman confidence intervals, or simply confidence intervals. The technique mimics one used by Pearson and Hartley to tabulate confidence intervals for the expected value of the discrete Poisson and binomial distributions. The blank count and the contribution of the sample to the gross count are assumed to be Poisson distributed. The expected value of the blank count, in the sample count time, is assumed known. The net count, OC, is taken to be the gross count minus the product of IRR with the blank count. The probability density function (PDF) for the net count can then be determined in a straightforward manner.
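The net-count distribution described above can be sketched numerically. Below is a minimal Python sketch (not the authors' code) that builds the PMF of the net count OC = G - IRR*B by summing over blank outcomes, with G the gross count and B the blank count, both Poisson:

```python
import math

def poisson_pmf(k, mu):
    """Poisson probability mass function, computed via lgamma for stability."""
    if k < 0:
        return 0.0
    return math.exp(k * math.log(mu) - mu - math.lgamma(k + 1))

def net_count_pmf(oc, mu_s, mu_b, irr):
    """P(OC = oc) where OC = G - irr*B, with G ~ Poisson(mu_s + irr*mu_b)
    (gross count in the sample count time) and B ~ Poisson(mu_b)
    (blank count in the blank count time), counted independently."""
    mu_g = mu_s + irr * mu_b
    total = 0.0
    # Sum over blank outcomes until the Poisson tail is negligible.
    b_max = int(mu_b + 10 * math.sqrt(mu_b) + 20)
    for b in range(b_max + 1):
        total += poisson_pmf(b, mu_b) * poisson_pmf(oc + irr * b, mu_g)
    return total
```

Note that the expected value of OC is the expected sample contribution mu_s, since the blank term cancels in the mean.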
Ertekin, T; Değermenci, M; Nisari, M; Unur, E; Coşkun, A
2016-01-01
The anatomy of the human nasal cavity (NC) is complex, and its structures are closely related to the functions of the NC. Studies assessing the mean volumes of the NC and conchae are very infrequent. The purpose of the current study was to investigate the development of the NC and conchae according to age and sex using a stereological method. This retrospective volumetric study was carried out on 342 individuals (166 females and 176 males) between 0 and 18 years old with no pathological conditions or medical procedures affecting the skeletal morphology of the NC. Volumetric estimations were determined on computed tomography (CT) images using the point-counting approach of stereological methods. NC, inferior nasal conchae (INC) and middle nasal conchae (MNC) volume measurements obtained using the point-counting method increased with age in both sexes until 15 years of age. Regardless of gender, no significant difference was determined between the left and right values for NC volumes, conchae volumes, or choanae measurements. Generally, significant differences were determined in NC and INC volumes according to gender after the subjects reached the maximum growth period. According to age, the volume ratios of INC to NC and MNC to NC ranged from 18% to 32% and from 9% to 15%, respectively. The current study demonstrated that the point-counting method is effective in determining volume estimates of the NC and is well suited for CT studies. Our results could provide volumetric indexes for the NC and conchae, which could help the physician both in patient selection for surgery and in the assessment of any surgical technique used to treat nasal obstruction.
NASA Technical Reports Server (NTRS)
Herzfeld, Ute Christina; McDonald, Brian W.; Neumann, Thomas Allen; Wallin, Bruce F.; Neumann, Thomas A.; Markus, Thorsten; Brenner, Anita; Field, Christopher
2014-01-01
NASA's Ice, Cloud and Land Elevation Satellite-II (ICESat-2) mission is a decadal survey mission (2016 launch). The mission objectives are to measure land ice elevation, sea ice freeboard, and changes in these variables, as well as to collect measurements over vegetation to facilitate canopy height determination. Two innovative components will characterize the ICESat-2 lidar: 1) collection of elevation data by a multibeam system and 2) application of micropulse lidar (photon-counting) technology. A photon-counting altimeter yields clouds of discrete points, resulting from returns of individual photons, and hence new data analysis techniques are required for elevation determination and association of the returned points to reflectors of interest. The objective of this paper is to derive an algorithm that allows detection of ground under dense canopy and identification of ground and canopy levels in simulated ICESat-2 data, based on airborne observations with a Sigma Space micropulse lidar. The mathematical algorithm uses spatial statistical and discrete mathematical concepts, including radial basis functions, density measures, geometrical anisotropy, eigenvectors, and geostatistical classification parameters and hyperparameters. Validation shows that ground and canopy elevation, and hence canopy height, can be expected to be observable with high accuracy by ICESat-2 for all expected beam energies considered for instrument design (93.01%-99.57% correctly selected points for a beam with an expected return of 0.93 mean signals per shot (msp), and 72.85%-98.68% for 0.48 msp). The algorithm derived here is generally applicable for elevation determination from photon-counting lidar altimeter data collected over forested areas, land ice, sea ice, and land surfaces, as well as for cloud detection.
Di Mauro, M.; Manconi, S.; Zechlin, H. -S.; ...
2018-03-29
Here, the Fermi Large Area Telescope (LAT) Collaboration has recently released the Third Catalog of Hard Fermi-LAT Sources (3FHL), which contains 1556 sources detected above 10 GeV with seven years of Pass 8 data. Building upon the 3FHL results, we investigate the flux distribution of sources at high Galactic latitudes (|b| > 20°), which are mostly blazars. We use two complementary techniques: (1) a source-detection efficiency correction method and (2) an analysis of pixel photon count statistics with the one-point probability distribution function (1pPDF). With the first method, using realistic Monte Carlo simulations of the γ-ray sky, we calculate the efficiency of the LAT to detect point sources. This enables us to find the intrinsic source-count distribution at photon fluxes down to 7.5 × 10⁻¹² ph cm⁻² s⁻¹. With this method, we detect a flux break at (3.5 ± 0.4) × 10⁻¹¹ ph cm⁻² s⁻¹ with a significance of at least 5.4σ. The power-law indices of the source-count distribution above and below the break are 2.09 ± 0.04 and 1.07 ± 0.27, respectively. This result is confirmed with the 1pPDF method, which has a sensitivity reach of ~10⁻¹¹ ph cm⁻² s⁻¹. Integrating the derived source-count distribution above the sensitivity of our analysis, we find that (42 ± 8)% of the extragalactic γ-ray background originates from blazars.
Validation of a quantitative Eimeria spp. PCR for fresh droppings of broiler chickens.
Peek, H W; Ter Veen, C; Dijkman, R; Landman, W J M
2017-12-01
A quantitative polymerase chain reaction (qPCR) for the seven chicken Eimeria spp. was modified and validated for direct use on fresh droppings. The analytical specificity of the qPCR on droppings was 100%. Its analytical sensitivity (non-sporulated oocysts/g droppings) was 41 for E. acervulina, ≤2900 for E. brunetti, 710 for E. praecox, 1500 for E. necatrix, 190 for E. tenella, 640 for E. maxima, and 1100 for E. mitis. Field validation of the qPCR was done using droppings with non-sporulated oocysts from 19 broiler flocks. To reduce the number of qPCR tests, five grams of each pooled sample (consisting of ten fresh droppings) per time point were blended into one mixed sample. Comparison of the oocysts per gram (OPG)-counting method with the qPCR using pooled samples (n = 1180) yielded a Pearson's correlation coefficient of 0.78 (95% CI: 0.76-0.80) and a Pearson's correlation coefficient of 0.76 (95% CI: 0.70-0.81) using mixed samples (n = 236). Comparison of the average of the OPG-counts of the five pooled samples with the mixed sample per time point (n = 236) showed a Pearson's correlation coefficient (R) of 0.94 (95% CI: 0.92-0.95) for the OPG-counting method and 0.87 (95% CI: 0.84-0.90) for the qPCR. This indicates that mixed samples are practically equivalent to the mean of five pooled samples. The good correlation between the OPG-counting method and the qPCR was further confirmed by the visual agreement between the total oocyst/g shedding patterns measured with both techniques in the 19 broiler flocks using the mixed samples.
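Confidence intervals of the kind quoted above are commonly obtained with the Fisher z-transformation; a short sketch (this is an assumption about the construction, not taken from the paper):

```python
import math

def pearson_ci(r, n, z_crit=1.96):
    """Approximate 95% confidence interval for a Pearson correlation
    coefficient via the Fisher z-transformation (assumes approximate
    bivariate normality and n > 3 paired observations)."""
    z = math.atanh(r)           # variance-stabilizing transform
    se = 1.0 / math.sqrt(n - 3) # standard error on the z scale
    return math.tanh(z - z_crit * se), math.tanh(z + z_crit * se)
```

For r = 0.78 and n = 1180 this returns roughly (0.76, 0.80), consistent with the pooled-sample interval reported in the abstract.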
Incorporating availability for detection in estimates of bird abundance
Diefenbach, D.R.; Marshall, M.R.; Mattice, J.A.; Brauning, D.W.
2007-01-01
Several bird-survey methods have been proposed that provide an estimated detection probability so that bird-count statistics can be used to estimate bird abundance. However, some of these estimators adjust counts of birds observed by the probability that a bird is detected and assume that all birds are available to be detected at the time of the survey. We marked male Henslow's Sparrows (Ammodramus henslowii) and Grasshopper Sparrows (A. savannarum) and monitored their behavior during May-July 2002 and 2003 to estimate the proportion of time they were available for detection. We found that the availability of Henslow's Sparrows declined in late June to <10% for 5- or 10-min point counts when a male had to sing and be visible to the observer; but during 20 May-19 June, males were available for detection 39.1% (SD = 27.3) of the time for 5-min point counts and 43.9% (SD = 28.9) of the time for 10-min point counts (n = 54). We detected no temporal changes in availability for Grasshopper Sparrows, but estimated availability to be much lower for 5-min point counts (10.3%, SD = 12.2) than for 10-min point counts (19.2%, SD = 22.3) when males had to be visible and sing during the sampling period (n = 80). For distance sampling, we estimated the availability of Henslow's Sparrows to be 44.2% (SD = 29.0) and the availability of Grasshopper Sparrows to be 20.6% (SD = 23.5). We show how our estimates of availability can be incorporated in the abundance and variance estimators for distance sampling and modify the abundance and variance estimators for the double-observer method. Methods that directly estimate availability from bird counts but also incorporate detection probabilities need further development and will be important for obtaining unbiased estimates of abundance for these species.
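The core correction the paper argues for can be written compactly. A hedged sketch with illustrative names (the paper's actual estimators for distance sampling and double-observer designs carry additional design-specific terms):

```python
def adjusted_abundance(count, p_detect, p_avail):
    """Point estimate of abundance when a raw count must be corrected both
    for detection probability and for availability (the fraction of time
    birds can be detected at all). Illustrative sketch only."""
    return count / (p_detect * p_avail)

def delta_var(count, p_detect, p_avail, se_detect, se_avail):
    """First-order (delta-method) variance sketch, assuming independent
    errors in the two probabilities and a Poisson-distributed count."""
    n_hat = adjusted_abundance(count, p_detect, p_avail)
    cv2 = 1.0 / count + (se_detect / p_detect) ** 2 + (se_avail / p_avail) ** 2
    return n_hat ** 2 * cv2
```

With, say, 40 males counted, a detection probability of 0.8, and the 43.9% availability estimated for Henslow's Sparrows on 10-min counts, the adjusted estimate is roughly 114 birds, nearly triple the raw count, which is the paper's central point.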
A quartz nanopillar hemocytometer for high-yield separation and counting of CD4+ T lymphocytes
NASA Astrophysics Data System (ADS)
Kim, Dong-Joo; Seol, Jin-Kyeong; Wu, Yu; Ji, Seungmuk; Kim, Gil-Sung; Hyung, Jung-Hwan; Lee, Seung-Yong; Lim, Hyuneui; Fan, Rong; Lee, Sang-Kwon
2012-03-01
We report the development of a novel quartz nanopillar (QNP) array cell separation system capable of selectively capturing and isolating a single cell population including primary CD4+ T lymphocytes from the whole pool of splenocytes. Integrated with a photolithographically patterned hemocytometer structure, the streptavidin (STR)-functionalized-QNP (STR-QNP) arrays allow for direct quantitation of captured cells using high content imaging. This technology exhibits an excellent separation yield (efficiency) of ~95.3 ± 1.1% for the CD4+ T lymphocytes from the mouse splenocyte suspensions and good linear response for quantitating captured CD4+ T-lymphoblasts, which is comparable to flow cytometry and outperforms any non-nanostructured surface capture techniques, i.e. cell panning. This nanopillar hemocytometer represents a simple, yet efficient cell capture and counting technology and may find immediate applications for diagnosis and immune monitoring in the point-of-care setting.
Electronic supplementary information (ESI) available. See DOI: 10.1039/c2nr11338d
Tender Point Count, Pain, and Mobility in the Older Population: The MOBILIZE Boston Study
Eggermont, Laura H.P.; Shmerling, Robert H.; Leveille, Suzanne G.
2011-01-01
Prevalence of tender points (TP), widespread pain, and fibromyalgia, as well as the relationship between TP, widespread pain, and mobility, was examined in 585 community-dwelling older adults (mean age 78.2 years, 63.4% female). Pain was classified by location (none, single site, multisite, widespread). Mobility was measured by the Short Physical Performance Battery (SPPB), gait speed, and self-reported (S-R) mobility difficulty. Tender point count and health characteristics (i.e. BMI, chronic conditions, analgesic use, number of medications, depression, and blocks walked per week) were assessed. Results: A notable proportion of participants (22.1%) had 3 or more TP, although the prevalence of criteria-based fibromyalgia was low (0.3%). Mobility was more limited in persons with higher tender point counts. After adjustment for pain and other risk factors, a higher tender point count was associated with poorer SPPB performance (score <10, aOR=1.09 per TP, 95%CI, 1.01-1.17) and slow gait speed (<0.784 m/sec, aOR=1.14 per TP, 95%CI, 1.05-1.24), but not with S-R mobility difficulty. S-R mobility difficulty was associated with more disseminated pain (multisite pain, aOR=2.01, 95%CI, 1.21-3.34; widespread pain, aOR=2.47, 95%CI, 1.09-5.62). These findings portray a significant mobility burden related to tender point count and to multisite and widespread pain in the older population. Future studies using longitudinal methods are warranted. PMID:19665937
Considerations for monitoring raptor population trends based on counts of migrants
Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.
1989-01-01
Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Inconsistent data recording and missing data hamper the coding of data and their use with modern analytical techniques. Coefficients of variation in counts among years averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed, including regression, non-parametric rank-correlation trend analysis, and moving averages.
NASA Astrophysics Data System (ADS)
Damiani, F.; Maggio, A.; Micela, G.; Sciortino, S.
1997-07-01
We apply to the specific case of images taken with the ROSAT PSPC detector our wavelet-based X-ray source detection algorithm presented in a companion paper. Such images are characterized by the presence of detector ``ribs,'' strongly varying point-spread function, and vignetting, so that their analysis provides a challenge for any detection algorithm. First, we apply the algorithm to simulated images of a flat background, as seen with the PSPC, in order to calibrate the number of spurious detections as a function of significance threshold and to ascertain that the spatial distribution of spurious detections is uniform, i.e., unaffected by the ribs; this goal was achieved using the exposure map in the detection procedure. Then, we analyze simulations of PSPC images with a realistic number of point sources; the results are used to determine the efficiency of source detection and the accuracy of output quantities such as source count rate, size, and position, upon a comparison with input source data. It turns out that sources with 10 photons or less may be confidently detected near the image center in medium-length (~10⁴ s), background-limited PSPC exposures. The positions of sources detected near the image center (off-axis angles < 15') are accurate to within a few arcseconds. Output count rates and sizes are in agreement with the input quantities, within a factor of 2 in 90% of the cases. The errors on position, count rate, and size increase with off-axis angle and for detections of lower significance. We have also checked that the upper limits computed with our method are consistent with the count rates of undetected input sources. Finally, we have tested the algorithm by applying it on various actual PSPC images, among the most challenging for automated detection procedures (crowded fields, extended sources, and nonuniform diffuse emission).
The performance of our method in these images is satisfactory and outperforms those of other current X-ray detection techniques, such as those employed to produce the MPE and WGA catalogs of PSPC sources, in terms of both detection reliability and efficiency. We have also investigated the theoretical limit for point-source detection, with the result that even sources with only 2-3 photons may be reliably detected using an efficient method in images with sufficiently high resolution and low background.
NASA Astrophysics Data System (ADS)
Lockhart, M.; Henzlova, D.; Croft, S.; Cutler, T.; Favalli, A.; McGahee, Ch.; Parker, R.
2018-01-01
Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher order count rates (i.e. quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead-time corrected results can then be used to assay SNM by inverting a set of extended point model equations which have also only recently been formulated. The current paper discusses and presents the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, ²⁵²Cf and SNM sources were measured in high efficiency neutron multiplicity counters at Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. In order to assess the DCF dead time correction, the corrected data are compared to the traditional dead time correction treatment within INCC. The DCF dead time correction is found to provide adequate dead time treatment for the broad range of count rates available in practical applications.
Effects of diatrizoate and iopamidol on spermatogenesis.
Yaghmai, V; Harapanhalli, R S; Patel, Y D; Baker, S R; Rao, D V
1993-12-01
The biological effects of iodinated contrast media were examined by using spermatogenesis in mouse testis as the experimental model. Spermhead survival and abnormality assays were used as the biological end points. Diatrizoate meglumine/diatrizoate sodium and iopamidol were administered intravenously at equal rates and concentrations. Testicular uptake and clearance of these contrast agents were examined by high-performance liquid chromatography techniques. Appropriate mannitol solutions were employed as osmolality controls. Intravenous administration of the contrast agent or its respective mannitol control resulted in approximately a 30% decrease in spermhead count. A dose-related experiment with mannitol demonstrated that the spermhead count decreased rapidly until 600 mOsm/kg was reached, beyond which this decrease was minimal. Clearance of both contrast media was complete in approximately 4 hours. No significant increase in the induction of spermhead abnormalities was observed. Osmotic substances, such as iodinated contrast agents, affect the process of spermatogenesis.
NASA Astrophysics Data System (ADS)
Ding, Xuemei; Wang, Bingyuan; Liu, Dongyuan; Zhang, Yao; He, Jie; Zhao, Huijuan; Gao, Feng
2018-02-01
During the past two decades there has been a dramatic rise in the use of functional near-infrared spectroscopy (fNIRS) as a neuroimaging technique in cognitive neuroscience research. Diffuse optical tomography (DOT) and optical topography (OT) can be employed as the optical imaging techniques for brain activity investigation. However, most current imagers with analogue detection are limited in sensitivity and dynamic range. Although photon-counting detection can significantly improve detection sensitivity, the intrinsic nature of sequential excitations reduces temporal resolution. To improve temporal resolution, sensitivity and dynamic range, we develop a multi-channel continuous-wave (CW) system for brain functional imaging based on a novel lock-in photon-counting technique. The system consists of 60 light-emitting diode (LED) sources at three wavelengths of 660 nm, 780 nm and 830 nm, which are modulated by current-stabilized square-wave signals at different frequencies, and 12 photomultiplier tubes (PMTs) operated with the lock-in photon-counting technique. This design combines the ultra-high sensitivity of the photon-counting technique with the parallelism of the digital lock-in technique. We can therefore acquire the diffused light intensity for all the source-detector pairs (SD-pairs) in parallel. The performance assessments of the system are conducted using phantom experiments, and demonstrate its excellent measurement linearity, negligible inter-channel crosstalk, strong noise robustness and high temporal resolution.
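The parallel scheme amounts to correlating one photon-count stream with each source's square-wave reference. A simplified, noise-free sketch (illustrative frequencies and amplitudes, not the instrument's actual parameters):

```python
import math

def square_ref(freq, t):
    """Square-wave reference in {+1, -1} at the given frequency."""
    return 1.0 if math.sin(2 * math.pi * freq * t) >= 0 else -1.0

def lockin_demodulate(counts, times, freq):
    """Digital lock-in: correlate the summed photon-count stream with one
    source's square-wave reference to recover that source's modulation
    amplitude, rejecting the other (orthogonally modulated) sources."""
    n = len(counts)
    return (2.0 / n) * sum(c * square_ref(freq, t)
                           for c, t in zip(counts, times))

# Two sources, on/off modulated at 4 Hz and 5 Hz, detected together.
fs, dur = 1000, 1.0
times = [(i + 0.5) / fs for i in range(int(fs * dur))]
a1, a2 = 100.0, 50.0
counts = [a1 * (1 + square_ref(4, t)) / 2 + a2 * (1 + square_ref(5, t)) / 2
          for t in times]
amp1 = lockin_demodulate(counts, times, 4)
amp2 = lockin_demodulate(counts, times, 5)
```

Because the two square waves share no harmonics over the one-second window, each demodulated output recovers its own source's amplitude with negligible crosstalk, which is the parallelism the system exploits.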
Betti numbers of holomorphic symplectic quotients via arithmetic Fourier transform.
Hausel, Tamás
2006-04-18
A Fourier transform technique is introduced for counting the number of solutions of holomorphic moment map equations over a finite field. This technique in turn gives information on Betti numbers of holomorphic symplectic quotients. As a consequence, simple unified proofs are obtained for formulas of Poincaré polynomials of toric hyperkähler varieties (recovering results of Bielawski-Dancer and Hausel-Sturmfels), Poincaré polynomials of Hilbert schemes of points and twisted Atiyah-Drinfeld-Hitchin-Manin (ADHM) spaces of instantons on C², and Poincaré polynomials of all Nakajima quiver varieties (recovering results of Nakajima-Yoshioka). As an application, a proof of a conjecture of Kac on the number of absolutely indecomposable representations of a quiver is announced.
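As a toy instance of the counting-over-finite-fields idea (far simpler than moment map equations), one can count points on a conic over the prime field F_p and compare with the closed form that a quadratic character sum, itself a finite Fourier transform, produces:

```python
def count_solutions(p):
    """Brute-force count of points on the conic x^2 + y^2 = 1 over F_p,
    p an odd prime. A quadratic character sum (a finite Fourier
    transform) gives the closed form p - (-1)^((p-1)/2)."""
    return sum(1 for x in range(p) for y in range(p)
               if (x * x + y * y) % p == 1)
```

For example, count_solutions(7) agrees with 7 + 1 = 8, since -1 is a quadratic non-residue mod 7. Point counts of this kind over F_p are exactly the data from which Betti numbers are extracted via the Weil conjectures in the polynomial-count setting.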
[Main concepts of preventive health care for the air staff of sea-based aviation].
Mel'nik, S G; Chulaevskiĭ, A O
2013-08-01
The authors studied the air staff and the complex of adverse factors not encountered by the air staff of land-based aircraft. It was determined that, with the exception of cardiovascular system diseases, adverse factors affect the air staff chiefly after 4-5 months of a blue-water sailing. After one month of a blue-water sailing, a hypotonic state was registered. Systolic blood pressure varied from 100-105 mm Hg and lower, and diastolic blood pressure varied from 60-65 mm Hg and lower. The lowest blood pressure values were registered three months after the beginning of the sailing. Thereafter, the hypotonic state, registered during the monthly medical examinations, remained until the end of the sailing. Normal average blood pressure values were restored within two weeks after the end of the sailing. A reduced red cell count (by more than 1100 points) was registered in 61.5% of patients, and (by more than 550 points) in 38.4% of patients. A reduced white cell count (by more than 4800 points) was registered in 33.3% of patients, (by more than 3300 points) in 41% of patients, and (by more than 1330 points) in 25% of patients. The baseline values were: red cell count 4250 points, white cell count 7300 points per 1 ml of blood. After the sailing, haematological indices were restored. The authors suggest guidelines for primary and secondary disease prevention.
NASA Astrophysics Data System (ADS)
Croft, Stephen; Favalli, Andrea
2017-10-01
Neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates (singles, doubles, and triples) are measured and, in combination with a simple analytical point-model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, but even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.
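The count rates of successive orders are tied to factorial moments of the measured multiplicity distribution. A minimal sketch of the moment extraction (illustrative only; the Dytlewski dead time correction itself is not shown):

```python
import math

def factorial_moments(hist, kmax=3):
    """Reduced factorial moments m_k = sum_n n(n-1)...(n-k+1) P(n) of a
    multiplicity histogram hist[n] = number of gates with n counts.
    Singles, doubles, triples (and, in the extended analysis, quads and
    pents) rates derive from moments of this kind."""
    total = sum(hist)
    return [sum(p * math.prod(n - i for i in range(k))
                for n, p in enumerate(hist)) / total
            for k in range(1, kmax + 1)]
```

As a sanity check, for a purely Poisson (uncorrelated) multiplicity distribution with mean mu the k-th factorial moment is mu**k, so correlated excesses above these values are what the doubles and triples rates encode.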
A COMPARISON OF GALAXY COUNTING TECHNIQUES IN SPECTROSCOPICALLY UNDERSAMPLED REGIONS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Specian, Mike A.; Szalay, Alex S., E-mail: mspecia1@jhu.edu, E-mail: szalay@jhu.edu
2016-11-01
Accurate measures of galactic overdensities are invaluable for precision cosmology. Obtaining these measurements is complicated when members of one’s galaxy sample lack radial depths, most commonly derived via spectroscopic redshifts. In this paper, we utilize the Sloan Digital Sky Survey’s Main Galaxy Sample to compare seven methods of counting galaxies in cells when many of those galaxies lack redshifts. These methods fall into three categories: assigning galaxies discrete redshifts, scaling the numbers counted using regions’ spectroscopic completeness properties, and employing probabilistic techniques. We split spectroscopically undersampled regions into three types—those inside the spectroscopic footprint, those outside but adjacent to it, and those distant from it. Through Monte Carlo simulations, we demonstrate that the preferred counting techniques are a function of region type, cell size, and redshift. We conclude by reporting optimal counting strategies under a variety of conditions.
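Two of the three counting categories can be caricatured in a few lines (illustrative helper names and inputs, not the paper's code):

```python
def completeness_scaled_count(n_spec, completeness):
    """Second-category estimator: scale the spectroscopic count in a cell
    by the region's spectroscopic completeness (the fraction of
    photometric targets that received redshifts)."""
    return n_spec / completeness

def probabilistic_count(n_spec, membership_probs):
    """Third-category estimator: each redshiftless galaxy contributes its
    probability of lying in the cell (probabilities assumed supplied,
    e.g. from a photometric redshift PDF)."""
    return n_spec + sum(membership_probs)
```

Both estimators return the same expected count when the membership probabilities average to the incompleteness fraction; they differ in variance and in how they behave outside the spectroscopic footprint, which is what the Monte Carlo comparison in the paper quantifies.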
Tacker, M; Hametner, C; Wepner, B
2002-01-01
Packaging materials are often considered a critical control point in HACCP systems of food companies. Methods for determining the microbial contamination rate of plastic cups, especially for dairy products, must reliably detect single moulds, yeasts or coliforms. In this study, a comparison of a specially adapted coating method, an impedance method, direct inoculation and the membrane filter technique was carried out to determine contamination with yeasts, moulds, coliforms and total bacterial counts, using the appropriate agar in each case. The coating method is recommended for determining yeasts, moulds and coliforms, as it allows both the localization of the microorganisms and the detection of single microorganisms. For total bacterial count, a direct inoculation technique is proposed. Employing simple measures in the production and transport of packaging materials, such as dust prevention or tight sealing in polyethylene bags, greatly reduces microbial contamination rates of the packaging material. To reduce contamination rates further, electron beam irradiation was applied: plastic cups sealed in polyethylene bags were treated with 4-5 kGy, a dose that already yields sterile polystyrene and polypropylene cups without influencing the mechanical characteristics of the packaging material.
NASA Astrophysics Data System (ADS)
Nohtomi, Akihiro; Wakabayashi, Genichiro
2015-11-01
We evaluated the accuracy of a self-activation method with iodine-containing scintillators in quantifying 128I generation in an activation detector; the self-activation method was recently proposed for photo-neutron on-line measurements around X-ray radiotherapy machines. Here, we consider the accuracy of determining the initial count rate R0, observed just after termination of neutron irradiation of the activation detector. The value R0 is directly related to the amount of activity generated by incident neutrons; the detection efficiency of radiation emitted from the activity should be taken into account for such an evaluation. Decay curves of 128I activity were numerically simulated by a computer program for various conditions including different initial count rates (R0) and background rates (RB), as well as counting statistical fluctuations. The data points sampled at minute intervals and integrated over the same period were fit by a non-linear least-squares fitting routine to obtain the value R0 as a fitting parameter with an associated uncertainty. The corresponding background rate RB was simultaneously calculated in the same fitting routine. Identical data sets were also evaluated by a well-known integration algorithm used for conventional activation methods and the results were compared with those of the proposed fitting method. When we fixed RB = 500 cpm, the relative uncertainty σR0 /R0 ≤ 0.02 was achieved for R0/RB ≥ 20 with 20 data points from 1 min to 20 min following the termination of neutron irradiation used in the fitting; σR0 /R0 ≤ 0.01 was achieved for R0/RB ≥ 50 with the same data points. Reasonable relative uncertainties to evaluate initial count rates were reached by the decay-fitting method using practically realistic sampling numbers. These results clarified the theoretical limits of the fitting method. 
The integration method was found to be potentially vulnerable to short-term variations in background levels, especially instantaneous contaminations by spike-like noise. The fitting method easily detects and removes such spike-like noise.
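The decay-fitting procedure described above can be sketched with a standard nonlinear least-squares routine. The model, random seed, and rates below are illustrative (R0/RB = 20 mirrors one case from the study), not the authors' code, and the instantaneous rate is used as a stand-in for the one-minute integrated counts.

```python
import numpy as np
from scipy.optimize import curve_fit

# 128I half-life is about 25 min → decay constant per minute
LAM = np.log(2) / 24.99

def decay_model(t, r0, rb):
    """Count rate: exponentially decaying 128I activity plus a flat
    background, R(t) = R0*exp(-lambda*t) + RB."""
    return r0 * np.exp(-LAM * t) + rb

rng = np.random.default_rng(0)
t = np.arange(1, 21)                      # 20 samples at 1-min intervals
true_r0, true_rb = 10000.0, 500.0         # R0/RB = 20, as in the study
obs = rng.poisson(decay_model(t, true_r0, true_rb))  # counting noise

popt, cov = curve_fit(decay_model, t, obs, p0=[5000.0, 100.0])
r0_fit, rb_fit = popt
sigma_r0 = np.sqrt(cov[0, 0])
print(r0_fit, rb_fit, sigma_r0 / r0_fit)  # fitted R0, RB, relative error
```

The fitted covariance gives the relative uncertainty sigma_R0/R0 directly, which is the quantity the abstract reports against R0/RB.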
Laamrani, Ahmed; Pardo Lara, Renato; Berg, Aaron A; Branson, Dave; Joosse, Pamela
2018-02-27
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor intensive, time-consuming and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones & tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (from here on referred to as the "app" method) was compared against two point counting approaches: an established digital photograph-grid method and a new automated residue counting script developed in MATLAB at the University of Guelph. Both photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both photograph-grid and script methods (R² = 0.86 and 0.84, respectively). The app underestimated residue coverage by 6.3% and 10.8% relative to the photograph-grid and script methods, respectively. With regard to residue type, soybean had a slightly lower bias than corn (-5.3% vs. -7.4%). For photos with residue <30%, the app-derived residue measurements were within ±5% (bias) of both photograph-grid- and script-derived residue measurements.
These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point counting methods, which are more time-consuming.
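The point-counting idea behind the photograph-grid and script methods - sampling a regular grid of 100 points over a classified image - can be sketched as below. The boolean residue mask and grid layout are assumptions for illustration, not the Guelph MATLAB script.

```python
import numpy as np

def grid_point_residue_cover(residue_mask, n=10):
    """Estimate percent residue cover by sampling an n x n grid of
    interior points (n*n = 100, as in the photograph-grid method) over
    a boolean mask where True marks residue pixels."""
    h, w = residue_mask.shape
    rows = np.linspace(0, h - 1, n + 2)[1:-1].round().astype(int)
    cols = np.linspace(0, w - 1, n + 2)[1:-1].round().astype(int)
    hits = residue_mask[np.ix_(rows, cols)].sum()
    return 100.0 * hits / (n * n)

# synthetic photo: left half covered by residue, right half bare soil
mask = np.zeros((300, 400), dtype=bool)
mask[:, :200] = True
print(grid_point_residue_cover(mask))  # → 50.0
```

Grid sampling trades per-pixel classification effort for a fast unbiased estimate whose precision grows with the number of grid points.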
Linear LIDAR versus Geiger-mode LIDAR: impact on data properties and data quality
NASA Astrophysics Data System (ADS)
Ullrich, A.; Pfennigbauer, M.
2016-05-01
LIDAR has become an indispensable technology for providing accurate 3D data quickly and reliably, even in adverse measurement situations and harsh environments. It provides highly accurate point clouds with a significant number of additional valuable attributes per point. LIDAR systems based on Geiger-mode avalanche photodiode arrays, also called single-photon avalanche photodiode arrays and earlier employed for military applications, now seek to enter the commercial market of 3D data acquisition, advertising higher point acquisition speeds from longer ranges compared to conventional techniques. Publications pointing out the advantages of these new systems refer to the other category of LIDAR as "linear LIDAR", because the prime receiver elements for detecting the laser echo pulses - avalanche photodiodes - are operated in a linear mode. We analyze the differences between the two LIDAR technologies and the fundamental differences in the data they provide. The limitations imposed by physics on both approaches are also addressed, and advantages of linear LIDAR over the photon-counting approach are discussed.
NASA Technical Reports Server (NTRS)
Wang, Alian; Kuebler, Karla E.; Jolliff, Brad L.
2000-01-01
Distributions of pyroxenes with different Mg' and olivines with different Fo in lithologies A and B were obtained. Three types of olivine, formed at different stages of rock formation, were identified by point-counting Raman measurements along linear traverses.
Triple-Label β Liquid Scintillation Counting
Bukowski, Thomas R.; Moffett, Tyler C.; Revkin, James H.; Ploger, James D.; Bassingthwaighte, James B.
2010-01-01
The detection of radioactive compounds by liquid scintillation has revolutionized modern biology, yet few investigators make full use of the power of this technique. Even though multiple isotope counting is considerably more difficult than single isotope counting, many experimental designs would benefit from using more than one isotope. The development of accurate isotope counting techniques enabling the simultaneous use of three β-emitting tracers has facilitated studies in our laboratory using the multiple tracer indicator dilution technique for assessing rates of transmembrane transport and cellular metabolism. The details of sample preparation, and of stabilizing the liquid scintillation spectra of the tracers, are critical to obtaining good accuracy. Reproducibility is enhanced by obtaining detailed efficiency/quench curves for each particular set of tracers and solvent media. The numerical methods for multiple-isotope quantitation depend on avoiding error propagation (inherent to successive subtraction techniques) by using matrix inversion. Experimental data obtained from triple-label β counting illustrate reproducibility and good accuracy even when the relative amounts of different tracers in samples of protein/electrolyte solutions, plasma, and blood are changed. PMID:1514684
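The matrix-inversion step the abstract describes can be sketched as a linear solve. The efficiency matrix below is hypothetical (real values come from measured efficiency/quench curves for the particular tracers and solvent media), but the structure of the calculation is the same: window count rates are a linear mixture of per-isotope activities.

```python
import numpy as np

# Hypothetical efficiency matrix: E[i, j] = counting efficiency of
# isotope j in energy window i (e.g., three beta emitters counted in
# three windows). Real values come from efficiency/quench curves.
E = np.array([
    [0.30, 0.05, 0.01],
    [0.02, 0.45, 0.10],
    [0.00, 0.03, 0.60],
])

def disintegration_rates(window_counts, efficiency):
    """Recover per-isotope activities (dpm) from per-window count rates
    (cpm) by solving the linear system counts = E @ activity, rather
    than propagating errors through successive subtractions."""
    return np.linalg.solve(efficiency, window_counts)

true_dpm = np.array([1000.0, 500.0, 200.0])
cpm = E @ true_dpm                     # what the counter would report
print(disintegration_rates(cpm, E))    # recovers [1000, 500, 200] dpm
```

Solving the full system in one step is what avoids the error propagation inherent to subtracting spillover contributions isotope by isotope.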
Aldinger, Kyle R.; Wood, Petra B.
2015-01-01
Detection probability during point counts and its associated variables are important considerations for bird population monitoring and have implications for conservation planning by influencing population estimates. During 2008–2009, we evaluated variables hypothesized to be associated with detection probability, detection latency, and behavioral responses of male Golden-winged Warblers in pastures in the Monongahela National Forest, West Virginia, USA. This is the first study of male Golden-winged Warbler detection probability, detection latency, or behavioral response based on point-count sampling with known territory locations and identities for all males. During 3-min passive point counts, detection probability decreased as distance to a male's territory and time since sunrise increased. During 3-min point counts with playback, detection probability decreased as distance to a male's territory increased, but remained constant as time since sunrise increased. Detection probability was greater when point counts included type 2 compared with type 1 song playback, particularly during the first 2 min of type 2 song playback. Golden-winged Warblers primarily use type 1 songs (often zee bee bee bee with a higher-pitched first note) in intersexual contexts and type 2 songs (strident, rapid stutter ending with a lower-pitched buzzy note) in intrasexual contexts. Distance to a male's territory, ordinal date, and song playback type were associated with the type of behavioral response to song playback. Overall, ~2 min of type 2 song playback may increase the efficacy of point counts for monitoring populations of Golden-winged Warblers by increasing the conspicuousness of males for visual identification and offsetting the consequences of surveying later in the morning. Because playback may interfere with the ability to detect distant males, it is important to follow playback with a period of passive listening. 
Our results indicate that even in relatively open pasture vegetation, detection probability of male Golden-winged Warblers is imperfect and highly variable.
Luksamijarulkul, Pipat; Suknongbung, Siranee; Vatanasomboon, Pisit; Sujirarut, Dusit
2017-03-01
A large number of migrants have moved to cities in Thailand seeking employment. These people may be at increased risk for environmental health problems. We studied the health status, environmental living conditions and microbial indoor air quality (IAQ) among selected groups of migrant workers and their households in Mueang District, Samut Sakhon, central Thailand. We conducted a cross-sectional study of 240 migrant workers and their households randomly selected by multistage sampling. The person responsible for hygiene at each studied household was interviewed using a structured questionnaire. Two indoor air samples were taken from each household (480 indoor air samples) to determine bacterial and fungal counts using a Millipore air tester; 240 outdoor air samples were collected for comparison. Ninety-nine point six percent of study subjects were Myanmar, 74.2% were aged 21-40 years, 91.7% had a primary school level education or lower and 53.7% had stayed in Thailand less than 5 years. Eight point three percent had a history of an underlying disease, and 20.8% had a recent history of pulmonary tuberculosis in a family member within the previous year. Forty-three point eight percent had a current illness related to IAQ during the previous month. Twenty-one point three percent were current cigarette smokers, 15.0% were current alcohol consumers, and 5.0% exercised ≥3 times per week. Forty-nine point two percent never opened the windows of their bedrooms or living rooms for ventilation, 45% never cleaned their window screens, and 38.3% never put their pillows or mattresses in the sunlight. The mean (±SD) air bacterial count was 230 (±229) CFU/m3 (outdoor air = 128±82 CFU/m3), and the mean fungal count was 630 (±842) CFU/m3 (outdoor air = 138±94 CFU/m3).
When the bacterial and fungal counts were compared with the guidelines of the American Conference of Governmental Industrial Hygienists, the bacterial counts in 6.5% of houses surveyed and the fungal counts in 28.8% of houses surveyed were higher than the recommended level (<500 CFU/m3). Bacterial and fungal counts in the sample households were not significantly correlated with household hygiene practice scores (p>0.05). There was a positive correlation between bacterial counts and fungal counts in household air samples (r=0.28, p<0.001).
Monosomy 3 by FISH in uveal melanoma: variability in techniques and results.
Aronow, Mary; Sun, Yang; Saunthararajah, Yogen; Biscotti, Charles; Tubbs, Raymond; Triozzi, Pierre; Singh, Arun D
2012-09-01
Tumor monosomy 3 confers a poor prognosis in patients with uveal melanoma. We critically review the techniques used for fluorescence in situ hybridization (FISH) detection of monosomy 3 in order to assess variability in practice patterns and to explain differences in results. Significant variability that has likely affected reported results was found in tissue sampling methods, selection of FISH probes, number of cells counted, and the cut-off point used to determine monosomy 3 status. Clinical parameters and specific techniques employed to report FISH results should be specified so as to allow meta-analysis of published studies. FISH-based detection of monosomy 3 in uveal melanoma has not been performed in a standardized manner, which limits conclusions regarding its clinical utility. FISH is a widely available, versatile technology, and when performed optimally has the potential to be a valuable tool for determining the prognosis of uveal melanoma. Copyright © 2012 Elsevier Inc. All rights reserved.
The endpoint detection technique for deep submicrometer plasma etching
NASA Astrophysics Data System (ADS)
Wang, Wei; Du, Zhi-yun; Zeng, Yong; Lan, Zhong-went
2009-07-01
The availability of reliable optical sensor technology provides opportunities to better characterize and control plasma etching processes in real time; such sensors can play an important role in endpoint detection, fault diagnostics, and process feedback control. The optical emission spectroscopy (OES) method becomes deficient in the case of deep submicrometer gate etching. In the newly developed high-density inductively coupled plasma (HD-ICP) etching system, interferometry endpoint (IEP) detection is introduced to obtain the endpoint. The IEP fringe-count algorithm is investigated to predict the endpoint, and its signal is then used to control the etching rate and, together with the OES signal, to call the endpoint in the over-etch (OE) process step. The experimental results show that IEP together with OES provides extra process-control margin for advanced devices with thinner gate oxides.
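A fringe-count endpoint algorithm of the general kind described can be sketched as follows. The zero-crossing counter, laser wavelength, and refractive index below are illustrative assumptions, not the actual HD-ICP system implementation: each full interference fringe in the reflected intensity corresponds to an etched-depth increment of lambda/(2n).

```python
import numpy as np

def count_fringes(signal):
    """Count interference fringes in a reflectance trace by counting
    rising zero-crossings of the mean-subtracted signal."""
    s = signal - signal.mean()
    return int(np.sum((s[:-1] < 0) & (s[1:] >= 0)))

def etched_depth(n_fringes, wavelength_nm, refractive_index):
    """Each full fringe corresponds to lambda/(2n) of removed film."""
    return n_fringes * wavelength_nm / (2.0 * refractive_index)

t = np.linspace(0, 1, 2000)
trace = np.cos(2 * np.pi * 5 * t)      # synthetic trace: 5 full fringes
n = count_fringes(trace)
print(n, etched_depth(n, 254.0, 1.6))  # 5 fringes → 396.875 nm
```

In practice the fringe count is compared against the depth expected from the film stack, and the OES signal is consulted near the predicted endpoint.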
ERIC Educational Resources Information Center
Walsh, Jeffrey A.; Braithwaite, Jeremy
2008-01-01
This work, drawing on the literature on alcohol consumption, sexual behavior, and researching sensitive topics, tests the efficacy of the unmatched-count technique (UCT) in establishing higher rates of truthful self-reporting when compared to traditional survey techniques. Traditional techniques grossly underestimate the scope of problems…
Designing and implementing a monitoring program and the standards for conducting point counts
C. John Ralph
1993-01-01
Choosing between the apparent plethora of methods for monitoring bird populations is a dilemma for a person contemplating beginning a monitoring program. Cooperrider et al. (1986) and Koskimies and Vaisanen (1991) describe many methods. In the Americas, three methods have been suggested as standard (Butcher 1992). They are: point counts for determining habitat...
High-mountain lakes provide a seasonal niche for migrant American dippers
J. M. Garwood; K. L. Pope; R. M. Bourque; M. D. Larson
2009-01-01
We studied summer use of high elevation lakes by American Dippers (Cinclus mexicanus) in the Trinity Alps Wilderness, California by conducting repeated point-count surveys at 16 study lakes coupled with a 5-year detailed survey of all available aquatic habitats in a single basin. We observed American Dippers during 36% of the point-count surveys...
Code of Federal Regulations, 2014 CFR
2014-07-01
... characteristics of anisotropic particles. Quantitative analysis involves the use of point counting. Point counting... 0.004. • Refractive Index Liquids for Dispersion Staining: high-dispersion series, 1.550, 1.605, 1... hand. Repeat the series. Collect the dispersed solids by centrifugation at 1000 rpm for 5 minutes. Wash...
Lockhart, M.; Henzlova, D.; Croft, S.; ...
2017-09-20
Over the past few decades, neutron multiplicity counting has played an integral role in Special Nuclear Material (SNM) characterization pertaining to nuclear safeguards. Current neutron multiplicity analysis techniques use singles, doubles, and triples count rates because a methodology to extract and dead-time correct higher order count rates (i.e., quads and pents) was not fully developed. This limitation is overcome by the recent extension of a popular dead time correction method developed by Dytlewski. This extended dead time correction algorithm, named Dytlewski-Croft-Favalli (DCF), is detailed in Croft and Favalli (2017), which gives an extensive explanation of the theory and implications of this new development. Dead time corrected results can then be used to assay SNM by inverting a set of extended point model equations which likewise have only recently been formulated. Here, we discuss and present the experimental evaluation of the practical feasibility of the DCF dead time correction algorithm to demonstrate its performance and applicability in nuclear safeguards applications. In order to test the validity and effectiveness of the dead time correction for quads and pents, 252Cf and SNM sources were measured in high efficiency neutron multiplicity counters at the Los Alamos National Laboratory (LANL) and the count rates were extracted up to the fifth order and corrected for dead time. To assess the DCF dead time correction, the corrected data were compared to the traditional dead time correction treatment within INCC. In conclusion, the DCF dead time correction is found to provide adequate dead time treatment for a broad range of count rates available in practical applications.
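The reduced factorial moments (singles through pents) that such dead-time corrections operate on can be computed directly from a multiplicity histogram. This sketch uses a toy distribution, not measured LANL data, and deliberately omits the dead-time correction itself.

```python
import numpy as np
from math import comb

def reduced_factorial_moments(multiplicity_hist, max_order=5):
    """Compute the first reduced factorial moments nu_r = sum_n C(n,r)P(n)
    of a neutron multiplicity distribution P(n), for r = 1..max_order
    (singles, doubles, triples, quads, pents). These raw moments are the
    quantities that dead-time corrections such as DCF operate on."""
    p = np.asarray(multiplicity_hist, dtype=float)
    p = p / p.sum()                        # normalize to a distribution
    return [sum(comb(k, r) * p[k] for k in range(r, p.size))
            for r in range(1, max_order + 1)]

# toy multiplicity distribution P(0..6), e.g. events per inspection gate
hist = [40, 30, 15, 8, 4, 2, 1]
m1, m2, m3, m4, m5 = reduced_factorial_moments(hist)
print(m1, m2, m3, m4, m5)
```

Inverting the point-model equations then maps these moments (after dead-time correction) to physical quantities such as mass, multiplication, and (alpha,n) rate.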
Mapping the acquisition of the number word sequence in the first year of school
NASA Astrophysics Data System (ADS)
Gould, Peter
2017-03-01
Learning to count and to produce the correct sequence of number words in English is not a simple process. In NSW government schools taking part in Early Action for Success, over 800 students in each of the first 3 years of school were assessed every 5 weeks over the school year to determine the highest correct oral count they could produce. Rather than displaying a steady increase in the accurate sequence of the number words produced, the kindergarten data reported here identified clear, substantial hurdles in the acquisition of the counting sequence. The large-scale, longitudinal data also provided evidence of learning to count through the teens being facilitated by the semi-regular structure of the number words in English. Instead of occurring as hurdles to starting the next counting sequence, number words corresponding to some multiples of ten (10, 20 and 100) acted as if they were rest points. These rest points appear to be artefacts of how the counting sequence is acquired.
Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions
NASA Astrophysics Data System (ADS)
Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong
2018-01-01
Layer count control and uniformity of two dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. To complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy spectrum (EELS) of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions with discrepancies within ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.
Galaxy evolution and large-scale structure in the far-infrared. I - IRAS pointed observations
NASA Astrophysics Data System (ADS)
Lonsdale, Carol J.; Hacking, Perry B.
1989-04-01
Redshifts for 66 galaxies were obtained from a sample of 93 60-micron sources detected serendipitously in 22 IRAS deep pointed observations, covering a total area of 18.4 sq deg. The flux density limit of this survey is 150 mJy, 4 times fainter than the IRAS Point Source Catalog (PSC). The luminosity function is similar in shape with those previously published for samples selected from the PSC, with a median redshift of 0.048 for the fainter sample, but shifted to higher space densities. There is evidence that some of the excess number counts in the deeper sample can be explained in terms of a large-scale density enhancement beyond the Pavo-Indus supercluster. In addition, the faintest counts in the new sample confirm the result of Hacking et al. (1989) that faint IRAS 60-micron source counts lie significantly in excess of an extrapolation of the PSC counts assuming no luminosity or density evolution.
Estimating pore and cement volumes in thin section
Halley, R.B.
1978-01-01
Point count estimates of pore, grain and cement volumes from thin sections are inaccurate, often by more than 100 percent, even though they may be surprisingly precise (reproducibility ±3 percent). Errors are produced by: 1) inclusion of submicroscopic pore space within solid volume and 2) edge effects caused by grain curvature within a 30-micron thick thin section. Submicroscopic porosity may be measured by various physical tests or may be visually estimated from scanning electron micrographs. Edge error takes the form of an envelope around grains and increases with decreasing grain size and sorting, increasing grain irregularity and tighter grain packing. Cements are greatly involved in edge error because of their position at grain peripheries and their generally small grain size. Edge error is minimized by methods which reduce the thickness of the sample viewed during point counting. Methods which effectively reduce thickness include use of ultra-thin thin sections or acetate peels, point counting in reflected light, or carefully focusing and counting on the upper surface of the thin section.
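The precision-versus-accuracy distinction the abstract draws can be illustrated with a small Monte Carlo. The 20% true porosity and 8% edge-effect bias are invented numbers chosen only to show how repeated counts can cluster tightly (precise) while all missing the true value (inaccurate).

```python
import numpy as np

rng = np.random.default_rng(1)
true_porosity = 0.20
edge_bias = 0.08   # assumed: pore fraction hidden by grain-edge effects

def point_count(n_points=300):
    """One simulated thin-section point count: each grid point hits
    visible pore space with probability (true porosity minus the
    systematic edge-effect bias)."""
    hits = rng.random(n_points) < (true_porosity - edge_bias)
    return hits.mean()

estimates = np.array([point_count() for _ in range(200)])
print(estimates.mean(), estimates.std())  # clusters near 0.12, not 0.20
```

The spread of repeated counts (the reproducibility) says nothing about the systematic offset, which is why physical tests or SEM estimates of submicroscopic porosity are needed to correct the bias.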
Global epidemiology and public health in the 21st century. Applications of new technology.
Laporte, R E; Barinas, E; Chang, Y F; Libman, I
1996-03-01
Epidemiology and public health need to change for the upcoming problems of the 21st century and beyond. We outline a four-point approach to produce this change. The first one is to take a systems approach to disease. The second approach discussed is the use of new techniques to "count" disease using capture-recapture. The third represents the application of telecommunications, especially the Internet, to public health. The fourth and final component represents the application, at the local health department level, of a total quality approach, as espoused by Deming, for the prevention of disease.
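The capture-recapture idea for "counting" disease can be sketched with the Chapman-corrected Lincoln-Petersen estimator; the registry numbers below are hypothetical.

```python
def lincoln_petersen(n1, n2, m):
    """Chapman-corrected capture-recapture estimate of total cases:
    n1 cases in registry A, n2 in registry B, m appearing in both."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# hypothetical: 150 cases in a clinic registry, 120 in lab reports,
# 60 found in both → the estimated total exceeds either list alone
print(lincoln_petersen(150, 120, 60))  # ≈ 298.5
```

The estimator exploits the overlap between incomplete surveillance lists: the smaller the overlap relative to the list sizes, the more cases both lists are missing.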
Nonequilibrium Phase Transition in a Periodically Driven XY Spin Chain
NASA Astrophysics Data System (ADS)
Prosen, Tomaž; Ilievski, Enej
2011-08-01
We present a general formulation of Floquet states of periodically time-dependent open Markovian quasifree fermionic many-body systems in terms of a discrete Lyapunov equation. Illustrating the technique, we analyze a periodically kicked XY spin-1/2 chain coupled to a pair of Lindblad reservoirs at its ends. A complex phase diagram is reported, with reentrant phases of long-range and exponentially decaying spin-spin correlations as some of the system's parameters are varied. The structure of the phase diagram is reproduced in terms of counting nontrivial stationary points of the Floquet quasiparticle dispersion relation.
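Counting stationary points of a sampled dispersion relation, as in the final sentence, can be done numerically by locating sign changes of the derivative. The toy band below is illustrative, not the paper's quasiparticle dispersion.

```python
import numpy as np

def count_stationary_points(omega, k):
    """Count interior stationary points of a sampled dispersion
    relation omega(k) by strict sign changes of the numerical
    derivative d(omega)/dk."""
    d = np.gradient(omega, k)
    return int(np.sum(d[:-1] * d[1:] < 0))

k = np.linspace(-np.pi, np.pi, 4000)
# toy band with competing harmonics: extrema at k=0 and where
# 1 + 1.2*cos(k) = 0, i.e. three interior stationary points
omega = np.cos(k) + 0.3 * np.cos(2 * k)
print(count_stationary_points(omega, k))  # → 3
```

As a band parameter is tuned, the stationary-point count changes discontinuously, which is the kind of counting used to delimit the phases.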
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, S.M.; Bayly, R.J.
1986-01-01
This book contains the following chapters: Some prerequisites to the use of radionuclides in haematology; Instrumentation and counting techniques; In vitro techniques; Cell labelling; Protein labelling; Autoradiography; Imaging and quantitative scanning; Whole body counting; Absorption and excretion studies; Blood volume studies; Plasma clearance studies; and Radionuclide blood cell survival studies.
Nathaniel E. Seavy; Suhel Quader; John D. Alexander; C. John Ralph
2005-01-01
The success of avian monitoring programs to effectively guide management decisions requires that studies be efficiently designed and data be properly analyzed. A complicating factor is that point count surveys often generate data with non-normal distributional properties. In this paper we review methods of dealing with deviations from normal assumptions, and we focus...
Habitat relationships of landbirds in the Northern Region, USDA Forest Service
Richard L. Hutto; Jock S. Young
1999-01-01
A series of first-generation habitat-relationship models was developed for 83 bird species detected during a 3-year study of point counts conducted in association with the USDA Forest Service's Northern Region Landbird Monitoring Program. The models depict probabilities of detection for each of the bird species on 100-m-radius, 10-minute point counts conducted across a series...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Santi, Peter Angelo; Cutler, Theresa Elizabeth; Favalli, Andrea
In order to improve the accuracy and capabilities of neutron multiplicity counting, additional quantifiable information is needed in order to address the assumptions that are present in the point model. Extracting and utilizing higher order moments (Quads and Pents) from the neutron pulse train represents the most direct way of extracting additional information from the measurement data to allow for an improved determination of the physical properties of the item of interest. The extraction of higher order moments from a neutron pulse train required the development of advanced dead time correction algorithms which could correct for dead time effects in all of the measurement moments in a self-consistent manner. In addition, advanced analysis algorithms have been developed to address specific assumptions that are made within the current analysis model, namely that all neutrons are created at a single point within the item of interest, and that all neutrons that are produced within an item are created with the same energy distribution. This report will discuss the current status of implementation and initial testing of the advanced dead time correction and analysis algorithms that have been developed in an attempt to utilize higher order moments to improve the capabilities of correlated neutron measurement techniques.
NASA Astrophysics Data System (ADS)
Ofek, Eran O.; Zackay, Barak
2018-04-01
Detection of templates (e.g., sources) embedded in low-number count Poisson noise is a common problem in astrophysics. Examples include source detection in X-ray images, γ-rays, UV, neutrinos, and search for clusters of galaxies and stellar streams. However, the solutions in the X-ray-related literature are sub-optimal in some cases by considerable factors. Using the lemma of Neyman–Pearson, we derive the optimal statistics for template detection in the presence of Poisson noise. We demonstrate that, for known template shape (e.g., point sources), this method provides higher completeness, for a fixed false-alarm probability value, compared with filtering the image with the point-spread function (PSF). In turn, we find that filtering by the PSF is better than filtering the image using the Mexican-hat wavelet (used by wavdetect). For some background levels, our method improves the sensitivity of source detection by more than a factor of two over the popular Mexican-hat wavelet filtering. This filtering technique can also be used for fast PSF photometry and flare detection; it is efficient and straightforward to implement. We provide an implementation in MATLAB. The development of a complete code that works on real data, including the complexities of background subtraction and PSF variations, is deferred for future publication.
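The Neyman-Pearson statistic for Poisson data has a simple closed form: the log-likelihood ratio between "background plus scaled template" and "background only". The sketch below illustrates that form in one dimension with invented numbers (the flux, background level, and PSF width are all assumptions); a real detector scans this score over every candidate position, e.g. via FFT cross-correlation, rather than evaluating one position as here:

```python
import numpy as np

def poisson_template_score(counts, template, background, flux):
    """Poisson log-likelihood ratio for H1: counts ~ Poisson(b + f*p)
    versus H0: counts ~ Poisson(b), dropping data-independent constants:
    sum_i n_i * ln(1 + f*p_i / b) - f * sum_i p_i."""
    return float(np.sum(counts * np.log1p(flux * template / background))
                 - flux * np.sum(template))

rng = np.random.default_rng(0)
x = np.arange(15)
psf = np.exp(-((x - 7) ** 2) / (2 * 2.0 ** 2))
psf /= psf.sum()                      # unit-normalized point-source template
background = 1.0                      # counts per pixel (assumed)
with_source = rng.poisson(background + 50 * psf)
empty = rng.poisson(background, size=x.size)
score_source = poisson_template_score(with_source, psf, background, 50)
score_empty = poisson_template_score(empty, psf, background, 50)
```

The score is systematically larger where a source is present, which is what thresholding it at a fixed false-alarm probability exploits.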
A Unified Approach to Electron Counting in Main-Group Clusters
ERIC Educational Resources Information Center
McGrady, John E.
2004-01-01
A presentation of an extensive review of traditional approaches to teaching electron counting is given. The electron-precise clusters are usually taken as a reference point for rationalizing the structures of their electron-rich counterparts, which are characterized by valence electron counts greater than 5n.
Shapes on a plane: Evaluating the impact of projection distortion on spatial binning
Battersby, Sarah E.; Strebe, Daniel “daan”; Finn, Michael P.
2017-01-01
One method for working with large, dense sets of spatial point data is to aggregate the measure of the data into polygonal containers, such as political boundaries, or into regular spatial bins such as triangles, squares, or hexagons. When mapping these aggregations, the map projection must inevitably distort relationships. This distortion can impact the reader’s ability to compare count and density measures across the map. Spatial binning, particularly via hexagons, is becoming a popular technique for displaying aggregate measures of point data sets. Increasingly, we see questionable use of the technique without attendant discussion of its hazards. In this work, we discuss when and why spatial binning works and how mapmakers can better understand the limitations caused by distortion from projecting to the plane. We introduce equations for evaluating distortion’s impact on one common projection (Web Mercator) and discuss how the methods used generalize to other projections. While we focus on hexagonal binning, these same considerations affect spatial bins of any shape, and more generally, any analysis of geographic data performed in planar space.
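For Web Mercator specifically, the distortion has a simple closed form: the linear scale factor grows as the secant of latitude, so the ground area represented by a fixed-screen-size bin shrinks by the square of that factor. A minimal sketch of that calculation (the equations in the article are more general):

```python
import math

def web_mercator_area_inflation(lat_deg):
    """Areal inflation of the Web Mercator projection at a given
    latitude, relative to the equator. The linear scale factor is
    sec(latitude), so apparent area is inflated by sec^2(latitude)."""
    return 1.0 / math.cos(math.radians(lat_deg)) ** 2
```

A hexagonal bin drawn with constant on-screen size at 60 degrees latitude therefore covers only a quarter of the ground area of the same hexagon at the equator, which directly biases count and density comparisons across the map.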
ERIC Educational Resources Information Center
Denny, Paula J.; Test, David W.
1995-01-01
This study extended use of the One-More-Than technique by using a "cents-pile modification"; one-, five-, and ten-dollar bills; and mixed training of all dollar amounts. Three high school students with moderate mental retardation each learned to use the technique to count out nontrained amounts and to make community purchases. (Author/PB)
Probabilistic techniques for obtaining accurate patient counts in Clinical Data Warehouses
Myers, Risa B.; Herskovic, Jorge R.
2011-01-01
Proposal and execution of clinical trials, computation of quality measures and discovery of correlations between medical phenomena are all applications where an accurate count of patients is needed. However, existing sources of this type of patient information, including Clinical Data Warehouses (CDW), may be incomplete or inaccurate. This research explores applying probabilistic techniques, supported by the MayBMS probabilistic database, to obtain accurate patient counts from a clinical data warehouse containing synthetic patient data. We present a synthetic clinical data warehouse (CDW), and populate it with simulated data using a custom patient data generation engine. We then implement, evaluate and compare different techniques for obtaining patient counts. We model billing as a test for the presence of a condition. We compute billing’s sensitivity and specificity both by conducting a “Simulated Expert Review,” in which a representative sample of records is reviewed and labeled by experts, and by obtaining the ground truth for every record. We compute the posterior probability of a patient having a condition through a “Bayesian Chain”, using Bayes’ Theorem to calculate the probability of a patient having a condition after each visit. The second method is a “one-shot” approach that computes the probability of a patient having a condition based on whether the patient is ever billed for the condition. Our results demonstrate the utility of probabilistic approaches, which improve on the accuracy of raw counts. In particular, the simulated review paired with a single application of Bayes’ Theorem produces the best results, with an average error rate of 2.1% compared to 43.7% for the straightforward billing counts. Overall, this research demonstrates that Bayesian probabilistic approaches improve patient counts on simulated patient populations. We believe that total patient counts based on billing data are one of the many possible applications of our Bayesian framework.
Use of these probabilistic techniques will enable more accurate patient counts and better results for applications requiring this metric. PMID:21986292
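The "Bayesian Chain" described above is an iterated application of Bayes' Theorem, treating each visit's billing record as an imperfect diagnostic test whose output becomes the prior for the next visit. A minimal sketch with invented sensitivity and specificity values (the study estimates these from expert review of records):

```python
def chained_posterior(prior, sensitivity, specificity, billed_per_visit):
    """Update P(patient has the condition) after each visit, treating
    'billed for the condition' as a test with known sensitivity and
    specificity. The posterior after one visit is the prior for the next."""
    p = prior
    for billed in billed_per_visit:
        if billed:
            like_pos, like_neg = sensitivity, 1.0 - specificity
        else:
            like_pos, like_neg = 1.0 - sensitivity, specificity
        p = like_pos * p / (like_pos * p + like_neg * (1.0 - p))
    return p

# One visit with a billing code raises an invented 10% prior to ~67%
posterior = chained_posterior(0.1, 0.9, 0.95, [True])
```

Summing such posteriors across patients yields an expected patient count rather than a raw billing count.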
Chi-squared and C statistic minimization for low count per bin data
NASA Astrophysics Data System (ADS)
Nousek, John A.; Shue, David R.
1989-07-01
Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
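The two statistics are easy to state side by side: Pearson's chi-squared assumes approximately Gaussian errors in each bin, which fails at low counts, while the Cash statistic is derived from the Poisson likelihood and remains valid for arbitrarily few counts per bin. A sketch, using the likelihood-ratio form of the C statistic (which is zero when the model matches the data exactly):

```python
import numpy as np

def chi_squared(data, model):
    """Pearson chi-squared: appropriate only when counts per bin are
    large enough for the Gaussian approximation to hold."""
    data = np.asarray(data, dtype=float)
    model = np.asarray(model, dtype=float)
    return float(np.sum((data - model) ** 2 / model))

def cash_statistic(data, model):
    """Likelihood-ratio form of Cash's C statistic for Poisson data:
    2 * sum(m - n + n*ln(n/m)); bins with n = 0 contribute 2m."""
    data = np.asarray(data, dtype=float)
    model = np.asarray(model, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        terms = np.where(data > 0,
                         model - data + data * np.log(data / model),
                         model)
    return float(2.0 * np.sum(terms))
```

Both statistics are then minimized over model parameters (e.g., with Powell's method, as in the paper); in the low-count regime only the Cash statistic yields unbiased fits.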
Ward-Paige, Christine; Mills Flemming, Joanna; Lotze, Heike K.
2010-01-01
Background Increasingly, underwater visual censuses (UVC) are used to assess fish populations. Several studies have demonstrated the effectiveness of protected areas for increasing fish abundance or provided insight into the natural abundance and structure of reef fish communities in remote areas. Recently, high apex predator densities (>100,000 individuals·km−2) and biomasses (>4 tonnes·ha−1) have been reported for some remote islands, suggesting the occurrence of inverted trophic biomass pyramids. However, few studies have critically evaluated the methods used for sampling conspicuous and highly mobile fish such as sharks. Ideally, UVC are done instantaneously; however, researchers often count animals that enter the survey area after the survey has started, thus performing non-instantaneous UVC. Methodology/Principal Findings We developed a simulation model to evaluate counts obtained by divers deploying non-instantaneous belt-transect and stationary-point-count techniques. We assessed how fish speed and survey procedure (visibility, diver speed, survey time and dimensions) affect observed fish counts. Results indicate that the bias caused by fish speed alone is substantial, while survey procedures had varying effects. Because the fastest fishes tend to be the largest, this bias has significant implications for their estimated contribution to biomass. Therefore, caution is needed when describing abundance, biomass, and community structure based on non-instantaneous UVC, especially for highly mobile species such as sharks. Conclusions/Significance Based on our results, we urge that published literature state explicitly whether instantaneous counts were made and that survey procedures be accounted for when non-instantaneous counts are used. 
Using published density and biomass values of communities that include sharks we explore the effect of this bias and suggest that further investigation may be needed to determine pristine shark abundances and the existence of inverted biomass pyramids. Because such studies are used to make important management and conservation decisions, incorrect estimates of animal abundance and biomass have serious and significant implications. PMID:20661304
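The direction and rough scale of the fish-speed bias can be seen with a back-of-the-envelope model (this is not the authors' simulation): for animals moving isotropically in two dimensions, the rate at which individuals cross into a stationary count circle is density × speed × perimeter / π, so a non-instantaneous count grows with both fish speed and survey duration. The speed, duration, and radius below are invented:

```python
import math

def non_instantaneous_inflation(fish_speed, survey_time, radius):
    """Ratio of the expected non-instantaneous stationary point count
    (animals present at the start plus all later entries) to the true
    instantaneous count, per unit fish density. Entries are modeled
    with the 2-D isotropic boundary-crossing rate:
    density * speed * perimeter / pi."""
    instantaneous = math.pi * radius ** 2
    entries = fish_speed * (2.0 * math.pi * radius / math.pi) * survey_time
    return (instantaneous + entries) / instantaneous

# Invented values: 1 m/s swimming speed, 5-minute count, 7.5 m radius
inflate = non_instantaneous_inflation(1.0, 300.0, 7.5)
```

Even this crude model inflates the count many-fold for fast swimmers, consistent with the abstract's warning about sharks.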
JoAnn M. Hanowski; Gerald J. Niemi
1995-01-01
We established bird monitoring programs in two regions of Minnesota: the Chippewa National Forest and the Superior National Forest. The experimental design defined forest cover types as strata in which samples of forest stands were randomly selected. Subsamples (3 point counts) were placed in each stand to maximize field effort and to assess within-stand and between-...
UAS-based automatic bird count of a common gull colony
NASA Astrophysics Data System (ADS)
Grenzdörffer, G. J.
2013-08-01
The standard procedure for counting birds is a manual one. However, a manual bird count is a time-consuming and cumbersome process, requiring several people going from nest to nest counting the birds and the clutches. High-resolution imagery generated with a UAS (Unmanned Aircraft System) offers an interesting alternative. Experiences and results of UAS surveys for automatic bird counts over the last two years are presented for the bird reserve island Langenwerder. In 2011, 1568 birds (±5%) were detected on the image mosaic, based on multispectral image classification and GIS-based post-processing. Building on the experience of 2011, the automatic bird count of 2012 became more efficient and more accurate: 1938 birds were counted, with an accuracy of approximately ±3%. Additionally, a separation of breeding and non-breeding birds was performed under the assumption that standing birds cast a visible shadow. The final section of the paper is devoted to the analysis of the 3D point cloud, which was used to determine the height of the vegetation and the extent and depth of closed sinks, which are unsuitable for breeding birds.
Spatial distribution of Legionella pneumophila MLVA-genotypes in a drinking water system.
Rodríguez-Martínez, Sarah; Sharaby, Yehonatan; Pecellín, Marina; Brettar, Ingrid; Höfle, Manfred; Halpern, Malka
2015-06-15
Bacteria of the genus Legionella cause water-based infections, resulting in severe pneumonia. To improve our knowledge about Legionella spp. ecology, its prevalence and its relationships with environmental factors were studied. Seasonal samples were taken from both water and biofilm at seven sampling points of a small drinking water distribution system in Israel. Representative isolates were obtained from each sample and identified to the species level. Legionella pneumophila was further determined to the serotype and genotype level. High resolution genotyping of L. pneumophila isolates was achieved by Multiple-Locus Variable number of tandem repeat Analysis (MLVA). Within the studied water system, Legionella plate counts were higher in summer and highly variable even between adjacent sampling points. Legionella was present in six out of the seven selected sampling points, with counts ranging from 1.0 × 10¹ to 5.8 × 10³ cfu/l. Water counts were significantly higher in points where Legionella was present in biofilms. The main fraction of the isolated Legionella was L. pneumophila serogroup 1. Serogroup 3 and Legionella sainthelensis were also isolated. Legionella counts were positively correlated with heterotrophic plate counts at 37 °C and negatively correlated with chlorine. Five MLVA-genotypes of L. pneumophila were identified at different buildings of the sampled area. The presence of a specific genotype, "MLVA-genotype 4", consistently co-occurred with high Legionella counts and seemed to "trigger" high Legionella counts in cold water. Our hypothesis is that both the presence of L. pneumophila in biofilm and the presence of specific genotypes may indicate and/or even lead to high Legionella concentrations in water. This observation deserves further study in a broad range of drinking water systems to assess its potential for general use in drinking water monitoring and management.
Assessment of frequency and duration of point counts when surveying for golden eagle presence
Skipper, Ben R.; Boal, Clint W.; Tsai, Jo-Szu; Fuller, Mark R.
2017-01-01
We assessed the utility of the recommended golden eagle (Aquila chrysaetos) survey methodology in the U.S. Fish and Wildlife Service 2013 Eagle Conservation Plan Guidance. We conducted 800-m radius, 1-hr point-count surveys broken into 20-min segments, during 2 sampling periods in 3 areas within the Intermountain West of the United States over 2 consecutive breeding seasons during 2012 and 2013. Our goal was to measure the influence of different survey time intervals and sampling periods on detectability and use estimates of golden eagles among different locations. Our results suggest that a less intensive effort (i.e., survey duration shorter than 1 hr and point-count survey radii smaller than 800 m) would likely be inadequate for rigorous documentation of golden eagle occurrence pre- or postconstruction of wind energy facilities. Results from a simulation analysis of detection probabilities and survey effort suggest that greater temporal and spatial effort could make point-count surveys more applicable for evaluating golden eagle occurrence in survey areas; however, increased effort would increase financial costs associated with additional person-hours and logistics (e.g., fuel, lodging). Future surveys can benefit from a pilot study and careful consideration of prior information about counts or densities of golden eagles in the survey area before developing a survey design. If information is lacking, survey planning may be best served by assuming low detection rates and increasing the temporal and spatial effort.
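The trade-off the simulation analysis explores follows from a simple relationship: with independent surveys, the chance of at least one detection grows as 1 − (1 − p)^k, so low per-survey detection rates can be offset by more temporal effort (at the financial cost the abstract notes). A sketch with invented probabilities:

```python
def prob_at_least_one_detection(p_single, n_surveys):
    """Probability of detecting an occupying eagle at least once over
    n independent surveys, each with per-survey detection probability
    p_single."""
    return 1.0 - (1.0 - p_single) ** n_surveys

# With an invented 20% per-survey detection rate, ten surveys are
# needed to push cumulative detection close to 90%
ten_survey_prob = prob_at_least_one_detection(0.2, 10)
```

This is why the authors suggest assuming low detection rates and increasing temporal and spatial effort when prior information is lacking.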
Anbar, Tag; Abdel-Rahman, Amal; Hegazy, Rehab; El-Khayyat, Mohamed; Ragaie, Maha
2017-01-01
Re-pigmentation and stabilization are the two ultimate goals of any re-pigmenting plan designed for vitiligo management. Furthermore, whether the improvement of some vitiligo lesions can be considered a guarantee of a similar response and/or stabilization of the remaining lesions remains to be clarified. To evaluate the behavior of non-segmental vitiligo (NSV) while on narrow-band ultraviolet B (NB-UVB) phototherapy, 25 patients with stable generalized NSV were included and received NB-UVB twice weekly. To ensure accuracy of follow-up, up to four lesions were randomly chosen in each patient and regularly measured using the point-counting technique. Overall, point counting across all included patients showed a significant reduction in lesion area (18.5 ± 8.4 cm² to 8.2 ± 3.1 cm²) after 6 months of therapy (p < .001). Nine patients (36%) showed a mixed response across lesions: improvement was documented in some lesions, while other lesions showed no response or even worsening. No significant correlations were detected between the behavior of vitiligo during NB-UVB and any of the demographic or clinical data of the patients. NB-UVB is a pillar in the management of vitiligo; however, close follow-up of the patient as a whole and of individual lesions, by both subjective and objective measures, is mandatory to detect activity as early as possible, as vitiligo often does not behave as one unit. Early detection of activity, and the subsequent change in treatment policy, may ultimately change the final outcome of treatment.
Clustering method for counting passengers getting in a bus with single camera
NASA Astrophysics Data System (ADS)
Yang, Tao; Zhang, Yanning; Shao, Dapei; Li, Ying
2010-03-01
Automatic counting of passengers is very important for both business and security applications. We present a single-camera-based vision system that is able to count passengers in a highly crowded situation at the entrance of a traffic bus. The unique characteristics of the proposed system are as follows. First, a novel feature-point-tracking and online-clustering-based passenger counting framework, which performs much better than background-modeling and foreground-blob-tracking-based methods. Second, a simple and highly accurate clustering algorithm that projects the high-dimensional feature-point trajectories into a 2-D feature space by their appearance and disappearance times and counts the number of people through online clustering. Finally, all test video sequences in the experiment were captured from a real traffic bus in Shanghai, China. The results show that the system can process two 320×240 video sequences at a frame rate of 25 fps simultaneously, and can count passengers reliably in various difficult scenarios with complex interaction and occlusion among people. The method achieves accuracy rates of up to 96.5%.
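The clustering step described in the abstract reduces each feature-point trajectory to a 2-D point, its appearance and disappearance times, and groups nearby points online; each resulting cluster is counted as one passenger. A simplified sketch of that idea (the trajectories and distance threshold below are invented, and the paper's actual algorithm may differ in its distance measure and update rule):

```python
def count_passengers(trajectories, threshold):
    """Online clustering of feature-point trajectories projected to
    (appearance_time, disappearance_time); each cluster is taken to be
    one passenger. A trajectory joins the first cluster whose centroid
    is within the threshold (L1 distance), else it starts a new one."""
    clusters = []  # each cluster is a list of (t_appear, t_disappear)
    for t_a, t_d in trajectories:
        for members in clusters:
            ca = sum(m[0] for m in members) / len(members)
            cd = sum(m[1] for m in members) / len(members)
            if abs(t_a - ca) + abs(t_d - cd) <= threshold:
                members.append((t_a, t_d))
                break
        else:
            clusters.append([(t_a, t_d)])
    return len(clusters)

# Invented data: three tracked points on one passenger, two on another
n = count_passengers([(0, 10), (1, 11), (0.5, 9.5), (30, 40), (31, 41)], 5.0)
```

The appeal of this projection is that feature points belonging to the same person enter and leave the camera's view at nearly the same times, so they collapse to a tight cluster regardless of occlusion within the frame.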
Livieratos, L; Stegger, L; Bloomfield, P M; Schafers, K; Bailey, D L; Camici, P G
2005-07-21
High-resolution cardiac PET imaging with emphasis on quantification would benefit from eliminating the problem of respiratory movement during data acquisition. Respiratory gating on the basis of list-mode data has been employed previously as one approach to reduce motion effects. However, it results in poor count statistics with degradation of image quality. This work reports on the implementation of a technique to correct for respiratory motion in the area of the heart at no extra cost for count statistics and with the potential to maintain ECG gating, based on rigid-body transformations on list-mode data event-by-event. A motion-corrected data set is obtained by assigning, after pre-correction for detector efficiency and photon attenuation, individual lines-of-response to new detector pairs with consideration of respiratory motion. Parameters of respiratory motion are obtained from a series of gated image sets by means of image registration. Respiration is recorded simultaneously with the list-mode data using an inductive respiration monitor with an elasticized belt at chest level. The accuracy of the technique was assessed with point-source data showing a good correlation between measured and true transformations. The technique was applied on phantom data with simulated respiratory motion, showing successful recovery of tracer distribution and contrast on the motion-corrected images, and on patient data with C15O and 18FDG. Quantitative assessment of preliminary C15O patient data showed improvement in the recovery coefficient at the centre of the left ventricle.
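Event-by-event correction amounts to applying the inverse of the measured rigid-body respiratory motion to both endpoints of each line of response; a full implementation then reassigns the transformed line to the nearest physical detector pair, which this sketch omits. The rotation and translation values below are invented:

```python
import numpy as np

def correct_lor(p1, p2, rotation, translation):
    """Map both detector endpoints of a line of response through a
    rigid-body transform (the per-event geometric step of list-mode
    motion correction): each endpoint becomes R @ p + t."""
    R = np.asarray(rotation, dtype=float)
    t = np.asarray(translation, dtype=float)
    return (R @ np.asarray(p1, dtype=float) + t,
            R @ np.asarray(p2, dtype=float) + t)

# Invented respiratory offset: a pure 10 mm axial shift, no rotation
q1, q2 = correct_lor([0.0, 0.0, 0.0], [0.0, 0.0, 100.0],
                     np.eye(3), [0.0, 0.0, -10.0])
```

Because every recorded event is transformed rather than discarded, no count statistics are lost, which is the advantage over gating that the abstract emphasizes.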
Herrmann, F; Hambsch, K; Wolf, T; Rother, P; Müller, P
1989-01-01
Several histometric methods exist for the morphological quantification of strong stimulating effects on the thyroid gland induced by drugs and/or other chemical substances, in dependence on dose and duration of application. However, these recording procedures differ considerably in technical and temporal expense as well as in diagnostic value. We therefore applied the three most commonly used methods in parallel (determination of thyroid epithelial cell height, determination of nuclear volume in thyrocytes, and estimation of relative volume fractions in the thyroid gland by the point-counting method) to the thyroid glands of methylthiouracil-(MTU)-stimulated rats and corresponding controls, in order to compare diagnostic value and time required. Nuclear volume determination required the most time, the point-counting method the least. In principle, all three procedures allow the detection of hypertrophic alterations, but only the point-counting method also reveals hyperplastic changes. By nuclear volume determination, we found significant differences between central and peripheral parts of the thyroid gland; to avoid subjective error it is therefore necessary to measure a large number of nuclei in many planes of the gland. Determination of epithelial cell height likewise amplifies subjective error because of the heterogeneous structure, especially of the unstimulated thyroid gland. If the number of counting points is exactly determined and sensibly limited, the point-counting method allows a nearly complete measurement of the whole object within an acceptable investigation time. In this way the heterogeneous structure of the thyroid gland is taken into account, and comparability and reproducibility are guaranteed at a high level.
Landis, Sarah; Suruki, Robert; Maskell, Joe; Bonar, Kerina; Hilton, Emma; Compton, Chris
2018-03-20
Blood eosinophil count may be a useful biomarker for predicting response to inhaled corticosteroids and exacerbation risk in chronic obstructive pulmonary disease (COPD) patients. The optimal cut point for categorizing blood eosinophil counts in these contexts remains unclear. We aimed to determine the distribution of blood eosinophil count in COPD patients and matched non-COPD controls, and to describe demographic and clinical characteristics at different cut points. We identified COPD patients within the UK Clinical Practice Research Database aged ≥40 years with a FEV₁/FVC < 0.7, and ≥1 blood eosinophil count recorded during stable disease between January 1, 2010 and December 31, 2012. COPD patients were matched on age, sex, and smoking status to non-COPD controls. Using all blood eosinophil counts recorded during a 12-month period, COPD patients were categorized as "always above," "fluctuating above and below," and "never above" cut points of 100, 150, and 300 cells/μL. The geometric mean blood eosinophil count was statistically significantly higher in COPD patients versus matched controls (196.6 cells/µL vs. 182.1 cells/µL; mean difference 8%, 95% CI: 6.8, 9.2), and in COPD patients with versus without a history of asthma (205.0 cells/µL vs. 192.2 cells/µL; mean difference 6.7%, 95% CI: 4.9, 8.5). About half of COPD patients had all blood eosinophil counts above 150 cells/μL; this persistent higher-eosinophil phenotype was associated with being male, higher body mass index, and a history of asthma. In conclusion, COPD patients demonstrated higher blood eosinophil counts than non-COPD controls, although there was substantial overlap in the distributions. COPD patients with a history of asthma had significantly higher blood eosinophil counts than those without.
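The three-way categorization used in the study is straightforward to express: given all of a patient's stable-state counts over the 12-month window and a chosen cut point, the patient is "always above," "never above," or "fluctuating." A sketch (the counts below are invented):

```python
def eosinophil_category(counts, cut_point):
    """Classify a patient's series of blood eosinophil counts
    (cells/uL) relative to a cut point, as in the study's
    always/never/fluctuating scheme."""
    above = [c > cut_point for c in counts]
    if all(above):
        return "always above"
    if not any(above):
        return "never above"
    return "fluctuating"
```

Running the same classification at 100, 150, and 300 cells/μL reproduces the study's three candidate cut points.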
Learning to count begins in infancy: evidence from 18 month olds' visual preferences.
Slaughter, Virginia; Itakura, Shoji; Kutsuki, Aya; Siegal, Michael
2011-10-07
We used a preferential looking paradigm to evaluate infants' preferences for correct versus incorrect counting. Infants viewed a video depicting six fish. In the correct counting sequence, a hand pointed to each fish in turn, accompanied by verbal counting up to six. In the incorrect counting sequence, the hand moved between two of the six fish while there was still verbal counting to six, thereby violating the one-to-one correspondence principle of correct counting. Experiment 1 showed that Australian 18 month olds, but not 15 month olds, significantly preferred to watch the correct counting sequence. In experiment 2, Australian infants' preference for correct counting disappeared when the count words were replaced by beeps or by Japanese count words. In experiment 3, Japanese 18 month olds significantly preferred the correct counting video only when counting was in Japanese. These results show that infants start to acquire the abstract principles governing correct counting prior to producing any counting behaviour.
A comparison of point-count and mist-net detections of songbirds by habitat and time-of-season
Rich W. Pagen; Frank R. Thompson III; Dirk E. Burhans
2002-01-01
We compared the results of point-count and mist-net surveys during the breeding and post-breeding seasons in four Missouri Ozark habitats: mature upland forest, mature riparian forest, 9- to 10-yr-old upland forest and 3- to 4-yr-old upland forest created by clearcutting. We determined whether differences in abundance estimates among habitats or between breeding and...
How to Quantify Penile Corpus Cavernosum Structures with Histomorphometry: Comparison of Two Methods
Felix-Patrício, Bruno; De Souza, Diogo Benchimol; Gregório, Bianca Martins; Costa, Waldemar Silva; Sampaio, Francisco José
2015-01-01
The use of morphometrical tools in biomedical research permits the accurate comparison of specimens subjected to different conditions, and the surface density of structures is commonly used for this purpose. The traditional point-counting method is reliable but time-consuming, with computer-aided methods being proposed as an alternative. The aim of this study was to compare the surface density data of penile corpus cavernosum trabecular smooth muscle in different groups of rats, measured by two observers using the point-counting or color-based segmentation method. Ten normotensive and 10 hypertensive male rats were used in this study. Rat penises were processed to obtain smooth muscle immunostained histological slices and photomicrographs captured for analysis. The smooth muscle surface density was measured in both groups by two different observers by the point-counting method and by the color-based segmentation method. Hypertensive rats showed an increase in smooth muscle surface density by the two methods, and no difference was found between the results of the two observers. However, surface density values were higher by the point-counting method. The use of either method did not influence the final interpretation of the results, and both proved to have adequate reproducibility. However, as differences were found between the two methods, results obtained by either method should not be compared. PMID:26413547
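The two approaches in this abstract estimate the same quantity in different ways: point counting classifies only the pixels lying under a sparse test grid, while color-based segmentation classifies every pixel. On a clean synthetic image (below, an invented half-stained field) the two agree exactly; on real immunostained sections, threshold and grid choices introduce the kind of systematic offset between methods that the study reports:

```python
import numpy as np

def segmentation_fraction(image, threshold):
    """Color/intensity-based estimate of surface density: the fraction
    of all pixels classified as the stained component."""
    return float((image > threshold).mean())

def point_count_fraction(image, threshold, step):
    """Point-counting estimate: classify only the pixels lying under a
    regular test grid with the given spacing."""
    return float((image[::step, ::step] > threshold).mean())

# Synthetic field: the top half is 'smooth muscle' (intensity 1.0)
field = np.zeros((120, 120))
field[:60, :] = 1.0
seg = segmentation_fraction(field, 0.5)
grid = point_count_fraction(field, 0.5, 12)
```

This also illustrates why the methods are reproducible individually yet not interchangeable: each is an unbiased estimator of a slightly different operational quantity once staining and thresholding are imperfect.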
Laamrani, Ahmed; Branson, Dave; Joosse, Pamela
2018-01-01
Quantifying the amount of crop residue left in the field after harvest is a key issue for sustainability. Conventional assessment approaches (e.g., line-transect) are labor intensive, time-consuming and costly. Many proximal remote sensing devices and systems have been developed for agricultural applications such as cover crop and residue mapping. For instance, current mobile devices (smartphones & tablets) are usually equipped with digital cameras and global positioning systems and use applications (apps) for in-field data collection and analysis. In this study, we assess the feasibility and strength of a mobile device app developed to estimate crop residue cover. The performance of this novel technique (from here on referred to as “app” method) was compared against two point counting approaches: an established digital photograph-grid method and a new automated residue counting script developed in MATLAB at the University of Guelph. Both photograph-grid and script methods were used to count residue under 100 grid points. Residue percent cover was estimated using the app, script and photograph-grid methods on 54 vertical digital photographs (images of the ground taken from above at a height of 1.5 m) collected from eighteen fields (9 corn and 9 soybean, 3 samples each) located in southern Ontario. Results showed that residue estimates from the app method were in good agreement with those obtained from both photograph–grid and script methods (R2 = 0.86 and 0.84, respectively). This study has found that the app underestimates the residue coverage by −6.3% and −10.8% when compared to the photograph-grid and script methods, respectively. With regards to residue type, soybean has a slightly lower bias than corn (i.e., −5.3% vs. −7.4%). For photos with residue <30%, the app derived residue measurements are within ±5% difference (bias) of both photograph-grid- and script-derived residue measurements. 
These methods could therefore be used to track the recommended minimum soil residue cover of 30%, implemented to reduce farmland topsoil and nutrient losses that impact water quality. Overall, the app method was found to be a good alternative to the point counting methods, which are more time-consuming. PMID:29495497
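The agreement figures quoted above (R² and percent bias) come from standard paired comparisons, which are easy to reproduce for any two residue-estimation methods. A sketch with invented percent-cover pairs:

```python
def mean_bias(estimates_a, estimates_b):
    """Average signed difference (a - b) between paired percent-cover
    estimates; a negative value means method a reads lower than b."""
    return sum(a - b for a, b in zip(estimates_a, estimates_b)) / len(estimates_a)

def r_squared(x, y):
    """Coefficient of determination between two paired series
    (square of the Pearson correlation)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy * sxy / (sxx * syy)
```

Applied to the app-versus-grid estimates from each photograph, these two numbers summarize both how tightly the methods track each other and in which direction one systematically departs from the other.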
For Mole Problems, Call Avogadro: 602-1023.
ERIC Educational Resources Information Center
Uthe, R. E.
2002-01-01
Describes techniques to help introductory students become familiar with Avogadro's number and mole calculations. Techniques involve estimating numbers of common objects then calculating the length of time needed to count large numbers of them. For example, the immense amount of time required to count a mole of sand grains at one grain per second…
Firth, Jacqueline; Balraj, Vinohar; Muliyil, Jayaprakash; Roy, Sheela; Rani, Lilly Michael; Chandresekhar, R.; Kang, Gagandeep
2010-01-01
To assess water contamination and the relative effectiveness of three options for point-of-use water treatment in South India, we conducted a 6-month randomized, controlled intervention trial using chlorine, Moringa oleifera seeds, a closed valved container, and controls. One hundred twenty-six families participated. Approximately 70% of public drinking water sources had thermotolerant coliform counts > 100/100 mL. Neither M. oleifera seeds nor containers reduced coliform counts in water samples from participants' homes. Chlorine reduced thermotolerant coliform counts to potable levels, but was less acceptable to participants. Laboratory testing of M. oleifera seeds in water from the village confirmed the lack of reduction in coliform counts, in contrast to the improvement seen with Escherichia coli seeded distilled water. This discrepancy merits further study, as M. oleifera was effective in reducing coliform counts in other studies and compliance with Moringa use in this study was high. PMID:20439952
Ryde, S J; al-Agel, F A; Evans, C J; Hancock, D A
2000-05-01
The use of a hydrogen internal standard to enable the estimation of absolute mass during measurement of total body nitrogen by in vivo neutron activation is an established technique. Central to the technique is a determination of the H prompt gamma ray counts arising from the subject. In practice, interference counts from other sources--e.g., neutron shielding--are included. This study reports use of the Monte Carlo computer code, MCNP-4A, to investigate the interference counts arising from shielding both with and without a phantom containing a urea solution. Over a range of phantom size (depth 5 to 30 cm, width 20 to 40 cm), the counts arising from shielding increased by between 4% and 32% compared with the counts without a phantom. For any given depth, the counts increased approximately linearly with width. For any given width, there was little increase for depths exceeding 15 centimeters. The shielding counts comprised between 15% and 26% of those arising from the urea phantom. These results, although specific to the Swansea apparatus, suggest that extraneous hydrogen counts can be considerable and depend strongly on the subject's size.
Pettipher, Graham L.; Mansell, Roderick; McKinnon, Charles H.; Cousins, Christina M.
1980-01-01
Membrane filtration and epifluorescent microscopy were used for the direct enumeration of bacteria in raw milk. Somatic cells were lysed by treatment with trypsin and Triton X-100 so that 2 ml of milk containing up to 5 × 10⁶ somatic cells/ml could be filtered. The majority of the bacteria (ca. 80%) remained intact and were concentrated on the membrane. After being stained with acridine orange, the bacteria fluoresced under ultraviolet light and could easily be counted. The clump count of orange-fluorescing cells on the membrane correlated well (r = 0.91) with the corresponding plate count for farm, tanker, and silo milks. Differences between counts obtained by different operators and between the membrane clump count and plate count were not significant. The technique is rapid, taking less than 25 min, inexpensive, costing less than 50 cents per sample, and is suitable for milks containing 5 × 10³ to 5 × 10⁸ bacteria per ml. PMID:16345515
Scare, J A; Slusarewicz, P; Noel, M L; Wielgus, K M; Nielsen, M K
2017-11-30
Fecal egg counts are emphasized for guiding equine helminth parasite control regimens due to the rise of anthelmintic resistance. This, however, poses further challenges, since egg counting results are prone to issues such as operator dependency, method variability, equipment requirements, and time commitment. The use of image analysis software for performing fecal egg counts is promoted in recent studies to reduce the operator dependency associated with manual counts. In an attempt to remove the operator dependency associated with current methods, we developed a diagnostic system that utilizes a smartphone and employs image analysis to generate automated egg counts. The aims of this study were (1) to determine the precision of the first smartphone prototype, the modified McMaster, and ImageJ; and (2) to determine the precision, accuracy, sensitivity, and specificity of the second smartphone prototype, the modified McMaster, and Mini-FLOTAC techniques. Repeated counts on fecal samples naturally infected with equine strongyle eggs were performed using each technique to evaluate precision. Triplicate counts on 36 egg count negative samples and 36 samples spiked with strongyle eggs at 5, 50, 500, and 1000 eggs per gram were performed using the second smartphone system prototype, Mini-FLOTAC, and McMaster to determine technique accuracy. Precision across the techniques was evaluated using the coefficient of variation. With regard to the first aim of the study, the McMaster technique performed with significantly less variance than the first smartphone prototype and ImageJ (p<0.0001). The smartphone and ImageJ performed with equal variance. With regard to the second aim of the study, the second smartphone system prototype had significantly better precision than the McMaster (p<0.0001) and Mini-FLOTAC (p<0.0001) methods, and the Mini-FLOTAC was significantly more precise than the McMaster (p=0.0228).
Mean accuracies for the Mini-FLOTAC, McMaster, and smartphone system were 64.51%, 21.67%, and 32.53%, respectively. The Mini-FLOTAC was significantly more accurate than the McMaster (p<0.0001) and the smartphone system (p<0.0001), while the smartphone and McMaster counts did not have statistically different accuracies. Overall, the smartphone system compared favorably to the manual methods with regard to precision, and reasonably with regard to accuracy. With further refinement, this system could become useful in veterinary practice. Copyright © 2017 Elsevier B.V. All rights reserved.
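The precision and accuracy metrics used above can be illustrated with a short sketch. The replicate counts below are hypothetical numbers chosen for the example, not data from the study; the definitions (CV = sd/mean, accuracy = recovered fraction of the known spike level) are the standard ones.

```python
import statistics

def coefficient_of_variation(counts):
    """Precision metric: sample standard deviation over mean, as a percent."""
    return 100.0 * statistics.stdev(counts) / statistics.mean(counts)

def accuracy_pct(observed_epg, spiked_epg):
    """Accuracy: observed eggs-per-gram as a percent of the known spike."""
    return 100.0 * observed_epg / spiked_epg

# Hypothetical triplicate counts on a sample spiked at 500 eggs per gram:
replicates = [310, 340, 325]
print(round(coefficient_of_variation(replicates), 1))              # 4.6
print(round(accuracy_pct(statistics.mean(replicates), 500), 1))    # 65.0
```

A lower CV across repeated counts is what distinguishes the more precise Mini-FLOTAC and second smartphone prototype from the McMaster in the comparison above.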
Attention during active visual tasks: counting, pointing, or simply looking
Wilder, John D.; Schnitzer, Brian S.; Gersch, Timothy M.; Dosher, Barbara A.
2009-01-01
Visual attention and saccades are typically studied in artificial situations, with stimuli presented to the steadily fixating eye, or saccades made along specified paths. By contrast, in the real world saccadic patterns are constrained only by the demands of the motivating task. We studied attention during pauses between saccades made to perform 3 free-viewing tasks: counting dots, pointing to the same dots with a visible cursor, or simply looking at the dots using a freely chosen path. Attention was assessed by the ability to identify the orientation of a briefly presented Gabor probe. All primary tasks produced losses in identification performance, with counting producing the largest losses, followed by pointing and then looking-only. Looking-only resulted in a 37% increase in contrast thresholds in the orientation task. Counting produced more severe losses that were not overcome by increasing Gabor contrast. Detection and localization of the Gabor, unlike identification, were largely unaffected by any of the primary tasks. Taken together, these results show that attention is required to control saccades, even with freely chosen paths, but the attentional demands of saccades are less than those attached to tasks such as counting, which have a significant cognitive load. Counting proved to be a highly demanding task that either exhausted momentary processing capacity (e.g., working memory or executive functions) or, alternatively, encouraged a strategy of filtering out all signals irrelevant to counting itself. The fact that the attentional demands of saccades (as well as those of detection/localization) are relatively modest makes it possible to continually adjust both the spatial and temporal pattern of saccades so as to re-allocate attentional resources as needed to handle the complex and multifaceted demands of real-world environments. PMID:18649913
Protocol for monitoring forest-nesting birds in National Park Service parks
Dawson, Deanna K.; Efford, Murray G.
2013-01-01
These documents detail the protocol for monitoring forest-nesting birds in National Park Service parks in the National Capital Region Network (NCRN). In the first year of sampling, counts of birds should be made at 384 points on the NCRN spatially randomized grid, developed to sample terrestrial resources. Sampling should begin on or about May 20 and continue into early July; on each day the sampling period begins at sunrise and ends five hours later. Each point should be counted twice, once in the first half of the field season and once in the second half, with visits made by different observers, balancing the within-season coverage of points and their spatial coverage by observers, and allowing observer differences to be tested. Three observers, skilled in identifying birds of the region by sight and sound and with previous experience in conducting timed counts of birds, will be needed for this effort. Observers should be randomly assigned to ‘routes’ consisting of eight points, in close proximity and, ideally, in similar habitat, that can be covered in one morning. Counts are 10 minutes in length, subdivided into four 2.5-min intervals. Within each time interval, new birds (i.e., those not already detected) are recorded as within or beyond 50 m of the point, based on where first detected. Binomial distance methods are used to calculate annual estimates of density for species. The data are also amenable to estimation of abundance and detection probability via the removal method. Generalized linear models can be used to assess between-year changes in density estimates or unadjusted count data. This level of sampling is expected to be sufficient to detect a 50% decline in 10 years for approximately 50 bird species, including 14 of 19 species that are priorities for conservation efforts, if analyses are based on unadjusted count data, and for 30 species (6 priority species) if analyses are based on density estimates. 
The estimates of required sample sizes are based on the mean number of individuals detected per 10 minutes in available data from surveys in three NCRN parks. Once network-wide data from the first year of sampling are available, this and other aspects of the protocol should be re-assessed, and changes made as desired or necessary before the start of the second field season. Thereafter, changes should not be made to the field methods, and sampling should be conducted annually for at least ten years. NCRN staff should keep apprised of new analytical methods developed for analysis of point-count data.
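The removal-method estimation mentioned in the protocol above can be sketched as follows. This is a simplified illustration, not the network's analysis code: it assumes a constant per-interval detection probability p, fits p by grid search on the conditional multinomial likelihood of the removal pattern, and the four interval counts are hypothetical.

```python
import math

def removal_estimate(counts):
    """Removal-method sketch: estimate abundance N and a constant per-interval
    detection probability p from counts of NEW individuals recorded in
    successive equal intervals (e.g., four 2.5-min intervals)."""
    total = sum(counts)
    k = len(counts)
    best_p, best_ll = None, -math.inf
    for step in range(1, 1000):
        p = step / 1000.0
        # Pr(individual is first detected in interval i+1): p * (1-p)^i
        cell = [p * (1 - p) ** i for i in range(k)]
        p_det = sum(cell)  # Pr(detected in at least one interval)
        # Conditional (on detection) multinomial log-likelihood.
        ll = sum(c * math.log(pi / p_det) for c, pi in zip(counts, cell) if c)
        if ll > best_ll:
            best_p, best_ll = p, ll
    p_det = 1 - (1 - best_p) ** k
    return total / p_det, best_p  # Horvitz-Thompson-style N-hat, p-hat

# Hypothetical new-bird counts in four 2.5-min intervals:
n_hat, p_hat = removal_estimate([12, 6, 3, 2])
print(round(n_hat, 1), round(p_hat, 2))
```

The declining counts across intervals are what carry the information: the faster new detections dry up, the higher the estimated detection probability and the closer N-hat sits to the raw total.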
Experimental analysis of the auditory detection process on avian point counts
Simons, T.R.; Alldredge, M.W.; Pollock, K.H.; Wettroth, J.M.
2007-01-01
We have developed a system for simulating the conditions of avian surveys in which birds are identified by sound. The system uses a laptop computer to control a set of amplified MP3 players placed at known locations around a survey point. The system can realistically simulate a known population of songbirds under a range of factors that affect detection probabilities. The goals of our research are to describe the sources and range of variability affecting point-count estimates and to find applications of sampling theory and methodologies that produce practical improvements in the quality of bird-census data. Initial experiments in an open field showed that, on average, observers tend to undercount birds on unlimited-radius counts, though the proportion of birds counted by individual observers ranged from 81% to 132% of the actual total. In contrast to the unlimited-radius counts, when data were truncated at a 50-m radius around the point, observers overestimated the total population by 17% to 122%. Results also illustrate how detection distances decline and identification errors increase with increasing levels of ambient noise. Overall, the proportion of birds heard by observers decreased by 28 ± 4.7% under breezy conditions, 41 ± 5.2% with the presence of additional background birds, and 42 ± 3.4% with the addition of 10 dB of white noise. These findings illustrate some of the inherent difficulties in interpreting avian abundance estimates based on auditory detections, and why estimates that do not account for variations in detection probability will not withstand critical scrutiny.
Flórez, Ana Belén; Mayo, Baltasar
2015-12-02
This work reports the composition and succession of tetracycline- and erythromycin-resistant bacterial communities in a model cheese, monitored by polymerase chain reaction denaturing gradient gel electrophoresis (PCR-DGGE). Bacterial 16S rRNA genes were examined using this technique to detect structural changes in the cheese microbiota over manufacturing and ripening. Total bacterial genomic DNA, used as a template, was extracted from cultivable bacteria grown without and with tetracycline or erythromycin (both at 25 μg ml⁻¹) on a non-selective medium used for enumeration of total and viable cells (Plate Count agar with Milk; PCA-M), and from those grown on selective and/or differential agar media used for counting various bacterial groups; i.e., lactic acid bacteria (de Man, Rogosa and Sharpe agar; MRSA), micrococci and staphylococci (Baird-Parker agar; BPA), and enterobacteria (Violet Red Bile Glucose agar; VRBGA). Large numbers of tetracycline- and erythromycin-resistant bacteria were detected in cheese samples at all stages of ripening. Counts of antibiotic-resistant bacteria varied widely depending on the microbial group and the point of sampling. In general, resistant bacteria were 0.5-1.0 log10 units fewer in number than the corresponding susceptible bacteria. The PCR-DGGE profiles obtained with DNA isolated from the plates for total bacteria and the different bacterial groups suggested Escherichia coli, Lactococcus lactis, Enterococcus faecalis and Staphylococcus spp. as the microbial types resistant to both antibiotics tested. This study shows the suitability of the PCR-DGGE technique for rapidly identifying and tracking antibiotic-resistant populations in cheese and, by extension, in other foods. Copyright © 2015 Elsevier B.V. All rights reserved.
Yurt, Kıymet Kübra; Kivrak, Elfide Gizem; Altun, Gamze; Mohamed, Hamza; Ali, Fathelrahman; Gasmalla, Hosam Eldeen; Kaplan, Suleyman
2018-02-26
A quantitative description of a three-dimensional (3D) object based on two-dimensional images can be made using stereological methods. These methods involve unbiased approaches and provide reliable results with quantitative data. The quantitative morphology of the nervous system has been thoroughly researched in this context. In particular, various novel methods such as design-based stereological approaches have been applied in neuromorphological studies. The main foundations of these methods are systematic random sampling and a 3D approach to structures such as tissues and organs. One key point in these methods is that selected samples should represent the entire structure. Quantification of neurons, i.e. particles, is important for revealing degrees of neurodegeneration and regeneration in an organ or system. One of the most crucial morphometric parameters in biological studies is thus the "number". The disector counting method introduced by Sterio in 1984 is an efficient and reliable solution for particle number estimation. In order to obtain precise results by means of stereological analysis, the items to be counted must be clearly visible in the tissue; if an item cannot be seen, it cannot be analyzed, even with unbiased stereological techniques. Staining and sectioning processes therefore play a critical role in stereological analysis. The purpose of this review is to evaluate current neuroscientific studies using optical and physical disector counting methods and to discuss their definitions and methodological characteristics. Although the efficiency of the optical disector method in light microscopic studies has been demonstrated in recent years, the physical disector method is more easily performed in electron microscopic studies. We also offer readers summaries of some common staining and sectioning methods that can be used with stereological techniques. Copyright © 2018 Elsevier B.V. All rights reserved.
Techniques employed for detection of hot particles in the marine environment.
Pillsbury, G D
2007-09-01
During the decommissioning of the Maine Yankee nuclear plant, several methods were developed and employed to survey for hot particles in the marine environment surrounding the site. The methods used and the sensitivities achieved in the search for environmentally dispersed particles during the various decommissioning activities performed are described in detail. Surveys were performed on dry soil, exposed marine sediment and submerged marine sediment. Survey techniques ranged from the use of the basic NaI detector coupled to a count rate meter to an intrinsic germanium detector deployed in a submarine housing coupled to a multi-channel analyser. The initial surveys consisted of collecting samples of marine sediment, spreading them out over a 1 m2 surface in a thin layer, and scanning the deposited sediment by hand using a 5 cm by 5 cm NaI detector coupled to a standard count rate meter. This technique was later replaced by walkover scans with the 5 cm by 5 cm NaI detector moved in a serpentine pattern over the sediment surface. By coupling the detector to a 'smart meter', an alarm set point could be used to alert the surveyor to the presence of a particle within the instrument's field of view. A similar technique, with the detector mounted in a watertight housing secured to the end of a pole, was also employed to scan underwater locations. The most sensitive method developed for performing underwater surveys was the use of the intrinsic germanium detector placed in a submarine housing. Detailed descriptions of the methods employed and the results obtained are presented. This work demonstrates that there are several approaches to surveying for discrete particles in the marine environment and the relative merits of each are considered.
1986-01-01
[Garbled excerpt from a 1986 patient-classification document: it describes timing the assessment of fetal heart tones by stethoscope or Doppler, and a point-scoring rule under which femoral or pedal pulses, fetal heart tones, or tilt tests each count 2 points when performed every 4 hours or more frequently.]
Promotion of Physical Activity Using Point-of-Decision Prompts in Berlin Underground Stations
Müller-Riemenschneider, Falk; Nocon, Marc; Reinhold, Thomas; Willich, Stefan N.
2010-01-01
To evaluate point-of-decision prompts in the promotion of stair use in Germany, motivational posters were placed at three underground stations in Berlin. The proportion of passengers using the stairs was counted before and during poster installation, and again two weeks after removal of the posters. In total, 5,467 passersby were counted. Stair use increased significantly in women, but not in men. This pilot study thereby shows that the use of point-of-decision prompts is also feasible in Germany, and it provides some evidence of effectiveness. Methodologically rigorous studies are warranted to confirm these findings. PMID:20948947
Dittami, Gregory M; Sethi, Manju; Rabbitt, Richard D; Ayliffe, H Edward
2012-06-21
Particle and cell counting is used for a variety of applications including routine cell culture, hematological analysis, and industrial controls(1-5). A critical breakthrough in cell/particle counting technologies was the development of the Coulter technique by Wallace Coulter over 50 years ago. The technique involves the application of an electric field across a micron-sized aperture and hydrodynamically focusing single particles through the aperture. The resulting occlusion of the aperture by the particles yields a measurable change in electric impedance that can be directly and precisely correlated to cell size/volume. The recognition of the approach as the benchmark in cell/particle counting stems from the extraordinary precision and accuracy of its particle sizing and counts, particularly as compared to manual and imaging based technologies (accuracies on the order of 98% for Coulter counters versus 75-80% for manual and vision-based systems). This can be attributed to the fact that, unlike imaging-based approaches to cell counting, the Coulter Technique makes a true three-dimensional (3-D) measurement of cells/particles which dramatically reduces count interference from debris and clustering by calculating precise volumetric information about the cells/particles. Overall this provides a means for enumerating and sizing cells in a more accurate, less tedious, less time-consuming, and less subjective means than other counting techniques(6). Despite the prominence of the Coulter technique in cell counting, its widespread use in routine biological studies has been prohibitive due to the cost and size of traditional instruments. Although a less expensive Coulter-based instrument has been produced, it has limitations as compared to its more expensive counterparts in the correction for "coincidence events" in which two or more cells pass through the aperture and are measured simultaneously. 
Another limitation with existing Coulter technologies is the lack of metrics on the overall health of cell samples. Consequently, additional techniques must often be used in conjunction with Coulter counting to assess cell viability. This extends experimental setup time and cost since the traditional methods of viability assessment require cell staining and/or use of expensive and cumbersome equipment such as a flow cytometer. The Moxi Z mini automated cell counter, described here, is an ultra-small benchtop instrument that combines the accuracy of the Coulter Principle with a thin-film sensor technology to enable precise sizing and counting of particles ranging from 3-25 microns, depending on the cell counting cassette used. The Type M cassette can be used to count particles with average diameters of 4-25 microns (dynamic range 2-34 microns), and the Type S cassette can be used to count particles with average diameters of 3-20 microns (dynamic range 2-26 microns). Since the system uses a volumetric measurement method, the 4-25 micron range corresponds to a cell volume range of 34 - 8,180 fL and the 3-20 micron range corresponds to a cell volume range of 14 - 4200 fL, which is relevant when non-spherical particles are being measured. To perform mammalian cell counts using the Moxi Z, the cells to be counted are first diluted with ORFLO or similar diluent. A cell counting cassette is inserted into the instrument, and the sample is loaded into the port of the cassette. Thousands of cells are pulled, single file, through a "Cell Sensing Zone" (CSZ) in the thin-film membrane over 8-15 seconds. Following the run, the instrument uses proprietary curve-fitting in conjunction with a proprietary software algorithm to provide coincidence event correction along with an assessment of overall culture health by determining the ratio of the number of cells in the population of interest to the total number of particles.
The total particle counts include shrunken and broken-down dead cells, as well as other debris and contaminants. The results are presented in histogram format with an automatic curve fit, with gates that can be adjusted manually as needed. Ultimately, the Moxi Z enables counting with a precision and accuracy comparable to the Coulter Z2, the current gold standard, while providing additional culture health information. Furthermore, it achieves these results in less time, with a smaller footprint, with significantly easier operation and maintenance, and at a fraction of the cost of comparable technologies.
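The diameter-to-volume ranges quoted above follow directly from the sphere-volume relation, since 1 cubic micron equals 1 femtoliter; a quick check reproduces the abstract's numbers:

```python
import math

def sphere_volume_fl(diameter_um):
    """Volume of a sphere in femtoliters from its diameter in microns
    (V = pi/6 * d^3; 1 cubic micron = 1 fL), the conversion underlying
    the Coulter-style volumetric size ranges quoted above."""
    return math.pi / 6.0 * diameter_um ** 3

for d in (3, 4, 20, 25):
    print(d, round(sphere_volume_fl(d)))
# 3 -> 14 fL, 4 -> 34 fL, 20 -> 4189 fL, 25 -> 8181 fL,
# matching the 14 - 4200 fL and 34 - 8,180 fL ranges in the abstract.
```

This is also why the caveat about non-spherical particles matters: the instrument measures volume directly, and the diameter it reports is that of the equivalent sphere.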
Van den Broeck, Joke; Rossi, Gina; De Clercq, Barbara; Dierckx, Eva; Bastiaansen, Leen
2013-01-01
Research on the applicability of the five factor model (FFM) to capture personality pathology coincided with the development of a FFM personality disorder (PD) count technique, which has been validated in adolescent, young, and middle-aged samples. This study extends the literature by validating this technique in an older sample. Five alternative FFM PD counts based upon the Revised NEO Personality Inventory (NEO PI-R) are computed and evaluated in terms of both convergent and divergent validity with the Assessment of DSM-IV Personality Disorders Questionnaire (ADP-IV for short; DSM-IV: Diagnostic and Statistical Manual of Mental Disorders, Fourth Edition). For the best-working count for each PD, normative data are presented, from which cut-off scores are derived. The validity of these cut-offs and their usefulness as a screening tool are tested against both a categorical (i.e., the DSM-IV Text Revision) and a dimensional (i.e., the Dimensional Assessment of Personality Pathology; DAPP) measure of personality pathology. All but the Antisocial and Obsessive-Compulsive counts exhibited adequate convergent and divergent validity, supporting the use of this method in older adults. Using the ADP-IV and the DAPP Short Form as validation criteria, the results corroborate the use of the FFM PD count technique to screen for PDs in older adults, in particular for the Paranoid, Borderline, Histrionic, Avoidant, and Dependent PDs. Given the age-neutrality of the NEO PI-R and the considerable lack of valid personality assessment tools, the current findings appear promising for the assessment of personality pathology in older adults.
Using DNA to test the utility of pellet-group counts as an index of deer counts
T. J. Brinkman; D. K. Person; W. Smith; F. Stuart Chapin; K. McCoy; M. Leonawicz; K. Hundertmark
2013-01-01
Despite widespread use of fecal pellet-group counts as an index of ungulate density, techniques used to convert pellet-group numbers to ungulate numbers rarely are based on counts of known individuals, seldom evaluated across spatial and temporal scales, and precision is infrequently quantified. Using DNA from fecal pellets to identify individual deer, we evaluated the...
ERIC Educational Resources Information Center
Camos, Valerie; Barrouillet, Pierre; Fayol, Michel
2001-01-01
Tested in three experiments hypothesis that coordinating saying number-words and pointing to each object to count requires use of the central executive and that cost of coordination decreases with age. Found that for 5- and 9-year-olds and adults, manipulating difficulty of each component affected counting performance but did not make coordination…
Fazio, B B
1994-04-01
This study examined the counting abilities of preschool children with specific language impairment compared to language-matched and mental-age-matched peers. In order to determine the nature of the difficulties SLI children exhibited in counting, the subjects participated in a series of oral counting tasks and a series of gestural tasks that used an invented counting system based on pointing to body parts. Despite demonstrating knowledge of many of the rules associated with counting, SLI preschool children displayed marked difficulty in counting objects. On oral counting tasks, they showed difficulty with rote counting, displayed a limited repertoire of number terms, and miscounted sets of objects. However, on gestural counting tasks, SLI children's performance was significantly better. These findings suggest that SLI children have a specific difficulty with the rote sequential aspect of learning number words.
Point-of-care blood eosinophil count in a severe asthma clinic setting.
Heffler, Enrico; Terranova, Giovanni; Chessari, Carlo; Frazzetto, Valentina; Crimi, Claudia; Fichera, Silvia; Picardi, Giuseppe; Nicolosi, Giuliana; Porto, Morena; Intravaia, Rossella; Crimi, Nunzio
2017-07-01
One of the main severe asthma phenotypes is severe eosinophilic or eosinophilic refractory asthma, for which novel biologic agents are emerging as therapeutic options. In this context, blood eosinophil counts are one of the most reliable biomarkers. To evaluate the performance of a point-of-care peripheral blood counter in patients with severe asthma, the blood eosinophil counts of 76 patients with severe asthma were evaluated by point-of-care and standard analyzers. A significant correlation between blood eosinophils assessed by the 2 devices was found (R² = 0.854, P < .001); similar correlations were found for white blood cells, neutrophils, and lymphocytes. The point-of-care device was able to predict the blood eosinophil cutoffs used to select patients for biologic treatments for severe eosinophilic asthma, as well as the ELEN index, a composite score useful to predict sputum eosinophilia. The results of our study contribute to the validation of a point-of-care device to assess blood eosinophils and open the possibility of using this device for the management of severe asthma. Copyright © 2017 American College of Allergy, Asthma & Immunology. Published by Elsevier Inc. All rights reserved.
The displacement of the Sun from the Galactic plane using IRAS and FAUST source counts
NASA Technical Reports Server (NTRS)
Cohen, Martin
1995-01-01
I determine the displacement of the Sun from the Galactic plane by interpreting IRAS point-source counts at 12 and 25 microns in the Galactic polar caps using the latest version of the SKY model for the point-source sky (Cohen 1994). A value of z_Sun = 15.5 +/- 0.7 pc north of the plane provides the best match to the ensemble of useful IRAS data. Shallow K counts in the north Galactic pole are also best fitted by this offset, while limited FAUST far-ultraviolet counts at 1660 A near the same pole favor a value near 14 pc. Combining the many IRAS determinations with the few FAUST values suggests that a value of z_Sun = 15.0 +/- 0.5 pc (internal error only) would satisfy these high-latitude sets of data in both wavelength regimes, within the context of the SKY model.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Geist, William H.
2015-12-01
This set of slides begins by giving background and a review of neutron counting; three attributes of a verification item are discussed: 240Pu eff mass; α, the ratio of (α,n) neutrons to spontaneous fission neutrons; and leakage multiplication. It then takes up neutron detector systems – theory & concepts (coincidence counting, moderation, die-away time); detector systems – some important details (deadtime, corrections); introduction to multiplicity counting; multiplicity electronics and example distributions; singles, doubles, and triples from measured multiplicity distributions; and the point model: multiplicity mathematics.
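The multiplicity mathematics of the point model starts from the reduced factorial moments of the measured multiplicity distribution, from which singles, doubles, and triples rates are built. A minimal illustration of that bookkeeping (generic statistics, not code from the slides) is:

```python
from math import comb

def reduced_factorial_moments(histogram):
    """histogram[n] = number of counting gates in which exactly n
    neutrons were recorded.  Returns the first three reduced
    factorial moments, normalised by the total number of gates."""
    total = sum(histogram)
    return [sum(gates * comb(n, k) for n, gates in enumerate(histogram)) / total
            for k in (1, 2, 3)]

# toy multiplicity distribution: mostly empty gates, a few bursts
m1, m2, m3 = reduced_factorial_moments([600, 300, 80, 15, 5])
```

The first moment corresponds to the singles rate per gate; the higher moments carry the correlated (fission-chain) information that singles alone cannot provide.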
Measurement of total-body cobalt-57 vitamin B12 absorption with a gamma camera.
Cardarelli, J A; Slingerland, D W; Burrows, B A; Miller, A
1985-08-01
Previously described techniques for the measurement of the absorption of [57Co]vitamin B12 by total-body counting have required an iron room equipped with scanning or multiple detectors. The present study uses simplifying modifications which make the technique more available and include the use of static geometry, the measurement of body thickness to correct for attenuation, a simple formula to convert the capsule-in-air count to a 100% absorption count, and finally the use of an adequately shielded gamma camera obviating the need of an iron room.
Rhudy, Matthew B; Mahoney, Joseph M
2018-04-01
The goal of this work is to compare the differences between various step counting algorithms using both accelerometer and gyroscope measurements from wrist and ankle-mounted sensors. Participants completed four different conditions on a treadmill while wearing an accelerometer and gyroscope on the wrist and the ankle. Three different step counting techniques were applied to the data from each sensor type and mounting location. It was determined that using gyroscope measurements allowed for better performance than the typically used accelerometers, and that ankle-mounted sensors provided better performance than those mounted on the wrist.
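One common family of step counting techniques applied to such inertial signals is thresholded peak detection on the signal magnitude. The sketch below is a generic example of that family, not necessarily one of the three algorithms the study compared:

```python
def count_steps(signal, threshold, refractory=3):
    """Count local maxima exceeding `threshold`, ignoring any peak
    within `refractory` samples of the previously accepted step."""
    steps, last = 0, -refractory
    for i in range(1, len(signal) - 1):
        is_peak = (signal[i] > threshold
                   and signal[i] >= signal[i - 1]
                   and signal[i] > signal[i + 1])
        if is_peak and i - last >= refractory:
            steps += 1
            last = i
    return steps

# synthetic magnitude trace containing three strides
trace = [0, 1, 5, 1, 0, 1, 6, 1, 0, 0, 4, 0]
```

The refractory window is what keeps one stride's oscillation from being counted twice; in practice both the threshold and the window are tuned per sensor location, which is one reason wrist and ankle mountings perform differently.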
The contribution of simple random sampling to observed variations in faecal egg counts.
Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I
2012-09-10
It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown, from a theoretical perspective, to give variable results that inevitably arise from the random distribution of parasite eggs in a well-mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
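The width of those confidence intervals is easy to demonstrate. A McMaster count of k eggs, scaled by a multiplication factor (here assumed to be 50 eggs per gram for each egg counted), carries exact Poisson limits; the sketch below uses standard statistics, not the paper's code, finding the limits by bisection on the Poisson CDF:

```python
import math

def poisson_cdf(k, mu):
    return sum(math.exp(-mu) * mu ** i / math.factorial(i) for i in range(k + 1))

def poisson_ci(k, alpha=0.05):
    """Exact two-sided (1 - alpha) confidence limits for a Poisson
    mean given an observed count k, found by bisection."""
    def root(f):  # f is monotone increasing in mu
        lo, hi = 0.0, 10.0 * k + 20.0
        for _ in range(200):
            mid = (lo + hi) / 2
            lo, hi = (mid, hi) if f(mid) < 0 else (lo, mid)
        return (lo + hi) / 2
    lower = 0.0 if k == 0 else root(lambda mu: (1 - poisson_cdf(k - 1, mu)) - alpha / 2)
    upper = root(lambda mu: alpha / 2 - poisson_cdf(k, mu))
    return lower, upper

# 12 eggs on the slide at 50 epg per counted egg: nominal 600 epg, but...
lo, hi = poisson_ci(12)
epg_interval = (50 * lo, 50 * hi)   # roughly 310 to 1048 epg
```

The interval spans more than a factor of three, which is exactly the point the authors make about variability inherent in a single slide count.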
Dorazio, Robert M.; Martin, Julien; Edwards, Holly H.
2013-01-01
The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.
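The data-generating process the extension describes can be sketched as a simulation (the helper names and parameter values below are ours, not the authors'): a Bernoulli hurdle for occupancy, zero-truncated Poisson abundance at occupied sites, and beta-distributed per-survey detection probabilities giving the extra-binomial variation:

```python
import math
import random

def zero_truncated_poisson(lam, rng):
    while True:  # rejection sampling: redraw until a positive value
        k, p, cutoff = 0, 1.0, math.exp(-lam)
        while True:
            p *= rng.random()
            if p <= cutoff:
                break
            k += 1
        if k > 0:
            return k

def simulate_site(psi, lam, a, b, n_surveys, rng):
    """Counts y_1..y_J for one site under the hurdle/beta-binomial model."""
    if rng.random() > psi:            # hurdle: site unoccupied
        return [0] * n_surveys
    n = zero_truncated_poisson(lam, rng)
    counts = []
    for _ in range(n_surveys):
        p_j = rng.betavariate(a, b)   # survey-specific detectability
        counts.append(sum(rng.random() < p_j for _ in range(n)))
    return counts

rng = random.Random(42)
site_counts = simulate_site(psi=0.4, lam=3.0, a=2.0, b=2.0, n_surveys=4, rng=rng)
```

Simulating from the model in this way is also a standard check when fitting it: zero-inflation enters only through the hurdle, while survey-to-survey spread in the counts at occupied sites reflects the beta variation in detectability.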
Effect of storage temperature on the microbial composition of ready-to-use vegetables.
Caldera, L; Franzetti, L
2014-02-01
Four different salad preparations were investigated from a microbiological point of view: two were packaged in air and two under modified atmosphere. The samples were stored at 4 and 10 °C and analysed at established times. Total bacterial count (TBC) was taken as the most relevant index of their hygiene and quality at both temperatures. Lactic acid bacteria, yeasts and moulds were found only occasionally. In general, the most important factor was the packaging technique: TBC was lower when the product was packed under modified atmosphere. The packaging technique also influenced the microbial population: Gram-negative aerobic rods were dominant in air-packaged products, whilst the presence of Enterobacteriaceae became important in salads packaged under modified atmosphere. Pseudomonas fluorescens, with all its biovars, was the most frequently found species amongst the aerobic isolates, whilst for the Enterobacteriaceae strains there was no dominant species.
Analysis techniques for background rejection at the Majorana Demonstrator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cuestra, Clara; Rielage, Keith Robert; Elliott, Steven Ray
2015-06-11
The MAJORANA Collaboration is constructing the MAJORANA DEMONSTRATOR, an ultra-low background, 40-kg modular HPGe detector array to search for neutrinoless double beta decay in 76Ge. In view of the next generation of tonne-scale Ge-based 0νββ-decay searches that will probe the neutrino mass scale in the inverted-hierarchy region, a major goal of the MAJORANA DEMONSTRATOR is to demonstrate a path forward to achieving a background rate at or below 1 count/tonne/year in the 4 keV region of interest around the Q-value at 2039 keV. The background rejection techniques to be applied to the data include cuts based on data reduction, pulse shape analysis, event coincidences, and time correlations. The Point Contact design of the DEMONSTRATOR's germanium detectors allows for significant reduction of gamma background.
First oxygen from lunar basalt
NASA Technical Reports Server (NTRS)
Gibson, M. A.; Knudsen, C. W.; Brueneman, D. J.; Kanamori, H.; Ness, R. O.; Sharp, L. L.; Brekke, D. W.; Allen, C. C.; Morris, R. V.; Keller, L. P.
1993-01-01
The Carbotek/Shimizu process to produce oxygen from lunar soils has been successfully demonstrated on actual lunar samples in laboratory facilities at Carbotek with Shimizu funding and support. Apollo sample 70035 containing approximately 25 percent ilmenite (FeTiO3) was used in seven separate reactions with hydrogen varying temperature and pressure: FeTiO3 + H2 yields Fe + TiO2 + H2O. The experiments gave extremely encouraging results as all ilmenite was reduced in every experiment. The lunar ilmenite was found to be about twice as reactive as terrestrial ilmenite samples. Analytical techniques of the lunar and terrestrial ilmenite experiments performed by NASA Johnson Space Center include iron Mossbauer spectroscopy (FeMS), optical microscopy, SEM, TEM, and XRD. The Energy and Environmental Research Center at the University of North Dakota performed three SEM techniques (point count method, morphology determination, elemental mapping), XRD, and optical microscopy.
A conceptual guide to detection probability for point counts and other count-based survey methods
D. Archibald McCallum
2005-01-01
Accurate and precise estimates of numbers of animals are vitally needed both to assess population status and to evaluate management decisions. Various methods exist for counting birds, but most of those used with territorial landbirds yield only indices, not true estimates of population size. The need for valid density estimates has spawned a number of models for...
Comparison of methods for estimating density of forest songbirds from point counts
Jennifer L. Reidy; Frank R. Thompson; J. Wesley. Bailey
2011-01-01
New analytical methods have been promoted for estimating the probability of detection and density of birds from count data but few studies have compared these methods using real data. We compared estimates of detection probability and density from distance and time-removal models and survey protocols based on 5- or 10-min counts and outer radii of 50 or 100 m. We...
A microfluidic biochip for complete blood cell counts at the point-of-care
Hassan, U.; Reddy, B.; Damhorst, G.; Sonoiki, O.; Ghonge, T.; Yang, C.; Bashir, R.
2016-01-01
Complete blood cell counts (CBCs) are one of the most commonly ordered and informative blood tests in hospitals. The results from a CBC, which typically include white blood cell (WBC) counts with differentials, red blood cell (RBC) counts, platelet counts and hemoglobin measurements, can have implications for the diagnosis and screening of hundreds of diseases and treatments. Bulky and expensive hematology analyzers are currently used as a gold standard for acquiring CBCs. For nearly all CBCs performed today, the patient must travel to either a hospital with a large laboratory or to a centralized lab testing facility. There is a tremendous need for an automated, portable point-of-care blood cell counter that could yield results in a matter of minutes from a drop of blood without any trained professionals to operate the instrument. We have developed microfluidic biochips capable of a partial CBC using only a drop of whole blood. Total leukocyte and their 3-part differential count are obtained from 10 μL of blood after on-chip lysing of the RBCs and counting of the leukocytes electrically using microfabricated platinum electrodes. For RBCs and platelets, 1 μL of whole blood is diluted with PBS on-chip and the cells are counted electrically. The total time for measurement is under 20 minutes. We demonstrate a high correlation of blood cell counts compared to results acquired with a commercial hematology analyzer. This technology could potentially have tremendous applications in hospitals at the bedside, private clinics, retail clinics and the developing world. PMID:26909365
Comparison of McMaster and FECPAKG2 methods for counting nematode eggs in the faeces of alpacas.
Rashid, Mohammed H; Stevenson, Mark A; Waenga, Shea; Mirams, Greg; Campbell, Angus J D; Vaughan, Jane L; Jabbar, Abdul
2018-05-02
This study aimed to compare the FECPAKG2 and the McMaster techniques for counting gastrointestinal nematode eggs in the faeces of alpacas using two flotation solutions (saturated sodium chloride and sucrose solutions). Faecal egg counts from both techniques were compared using Lin's concordance correlation coefficient and Bland and Altman statistics. Results showed moderate to good agreement between the two methods, with better agreement achieved when saturated sugar was used as the flotation fluid, particularly when faecal egg counts were less than 1000 eggs per gram of faeces. To the best of our knowledge, this is the first study to assess agreement between the McMaster and FECPAKG2 methods for estimating faecal egg counts in South American camelids.
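Both agreement statistics have simple closed forms; a minimal stdlib version (the standard formulas, not the authors' analysis code, with made-up egg counts for illustration) is:

```python
import statistics as st

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between two methods."""
    mx, my = st.mean(x), st.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / len(x)
    return 2 * cov / (st.pvariance(x) + st.pvariance(y) + (mx - my) ** 2)

def bland_altman_limits(x, y):
    """Bland-Altman bias and 95% limits of agreement."""
    diffs = [a - b for a, b in zip(x, y)]
    bias, sd = st.mean(diffs), st.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

mcmaster = [150, 300, 450, 900, 1200]     # hypothetical epg values
fecpak   = [140, 320, 430, 880, 1260]
ccc = lins_ccc(mcmaster, fecpak)
```

Unlike a plain Pearson correlation, the CCC penalises both location and scale shifts between the two methods, which is why it is preferred for method-comparison studies like this one.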
Jericho, K W; O'Laney, G; Kozub, G C
1998-10-01
To enhance food safety and keeping quality, beef carcasses are cooled immediately after leaving the slaughter floor. Within hazard analysis and critical control point (HACCP) systems, this cooling process needs to be monitored by the industry and verified by regulatory agencies. This study assessed the usefulness of the temperature-function integration technique (TFIT) for the verification of the hygienic adequacy of two cooling processes for beef carcasses at one abattoir. The cooling process passes carcasses through a spray cooler for at least 17 h and a holding cooler for at least 7 h. The TFIT is faster and cheaper than culture methods. For spray cooler 1, the Escherichia coli generations predicted by TFIT for carcass surfaces (pelvic and shank sites) were compared to estimated E. coli counts from 120 surface excision samples (rump, brisket, and sacrum; 5 by 5 by 0.2 cm) before and after cooling. Counts of aerobic bacteria, coliforms, and E. coli were decreased after spray cooler 1 (P < or = 0.001). The number of E. coli generations (with lag) at the pelvic site calculated by TFIT averaged 0.85 +/- 0.19 and 0.15 +/- 0.04 after emerging from spray coolers 1 and 2, respectively. The TFIT (with lag) was considered convenient and appropriate for the inspection service to verify HACCP systems for carcass cooling processes.
Repeatability of paired counts.
Alexander, Neal; Bethony, Jeff; Corrêa-Oliveira, Rodrigo; Rodrigues, Laura C; Hotez, Peter; Brooker, Simon
2007-08-30
The Bland and Altman technique is widely used to assess the variation between replicates of a method of clinical measurement. It yields the repeatability, i.e. the value within which 95 per cent of repeat measurements lie. The valid use of the technique requires that the variance is constant over the data range. This is not usually the case for counts of items such as CD4 cells or parasites, nor is the log transformation applicable to zero counts. We investigate the properties of generalized differences based on Box-Cox transformations. For an example, in a data set of hookworm eggs counted by the Kato-Katz method, the square root transformation is found to stabilize the variance. We show how to back-transform the repeatability on the square root scale to the repeatability of the counts themselves, as an increasing function of the square mean root egg count, i.e. the square of the average of square roots. As well as being more easily interpretable, the back-transformed results highlight the dependence of the repeatability on the sample volume used.
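The back-transformation can be made concrete (the formulas below are reconstructed from the abstract's description, so treat the details as assumed): if repeat counts differ by at most r on the square-root scale, then on the count scale the difference is |√x1 − √x2|(√x1 + √x2) = 2r√s, where s is the square mean root:

```python
import math
import statistics as st

def repeatability_sqrt_scale(pairs):
    """95% repeatability (Bland-Altman style) of paired counts after
    a square-root transformation: 1.96 * SD of the paired differences."""
    diffs = [math.sqrt(a) - math.sqrt(b) for a, b in pairs]
    return 1.96 * st.stdev(diffs)

def repeatability_count_scale(r, square_mean_root):
    """Back-transform: repeatability of the raw counts, an increasing
    function of the square mean root s = ((sqrt(x1) + sqrt(x2)) / 2)**2."""
    return 2 * r * math.sqrt(square_mean_root)
```

The count-scale repeatability grows with the square root of s, which matches the abstract's point that the interpretable limit depends on the magnitude of the egg count rather than being a single constant.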
Elders Point East Marsh Island Restoration Monitoring Data Analysis
2017-09-21
Figure 13. Average biomass comparison between fertilizer treatment and non-fertilizer treatment at Elders East. Table 5. Count of benthic organisms. Table 6. Benthic Community Indices: True Taxa Richness, Total Organism Count.
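Benthic community indices of the kind tabulated in the report reduce to simple aggregations over a sample-by-taxon count table; a generic sketch (our own minimal definitions of taxa richness and total organism count, not the report's methodology) is:

```python
def taxa_richness(sample):
    """Number of taxa with at least one organism in the sample."""
    return sum(1 for count in sample.values() if count > 0)

def total_organism_count(sample):
    """Total organisms across all taxa in the sample."""
    return sum(sample.values())

# hypothetical benthic grab sample: taxon -> organism count
grab = {"polychaeta": 12, "amphipoda": 5, "bivalvia": 0, "gastropoda": 3}
richness = taxa_richness(grab)
total = total_organism_count(grab)
```

Richness deliberately ignores zero-count taxa, so the same code works whether or not the table lists taxa that were absent from a given station.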
Bowser, Jacquelyn E; Costa, Lais R R; Rodil, Alba U; Lopp, Christine T; Johnson, Melanie E; Wills, Robert W; Swiderski, Cyprianna E
2018-03-01
OBJECTIVE To evaluate the effect of 2 bronchoalveolar lavage (BAL) sampling techniques and the use of N-butylscopolammonium bromide (NBB) on the quantity and quality of BAL fluid (BALF) samples obtained from horses with the summer pasture endophenotype of equine asthma. ANIMALS 8 horses with the summer pasture endophenotype of equine asthma. PROCEDURES BAL was performed bilaterally (right and left lung sites) with a flexible videoendoscope passed through the left or right nasal passage. During lavage of the first lung site, a BALF sample was collected by means of either gentle syringe aspiration or mechanical suction with a pressure-regulated wall-mounted suction pump. The endoscope was then maneuvered into the contralateral lung site, and lavage was performed with the alternate fluid retrieval technique. For each horse, BAL was performed bilaterally once with and once without premedication with NBB (21-day interval). The BALF samples retrieved were evaluated for volume, total cell count, differential cell count, RBC count, and total protein concentration. RESULTS Use of syringe aspiration significantly increased total BALF volume (mean volume increase, 40 mL [approx 7.5% yield]) and decreased total RBC count (mean decrease, 142 cells/μL), compared with use of mechanical suction. The BALF nucleated cell count and differential cell count did not differ between BAL procedures. Use of NBB had no effect on BALF retrieval. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that retrieval of BALF by syringe aspiration may increase yield and reduce barotrauma in horses at increased risk of bronchoconstriction and bronchiolar collapse. Further studies to determine the usefulness of NBB and other bronchodilators during BAL procedures in horses are warranted.
Calibration and comparison of accelerometer cut points in preschool children.
van Cauwenberghe, Eveline; Labarque, Valery; Trost, Stewart G; de Bourdeaudhuij, Ilse; Cardon, Greet
2011-06-01
The present study aimed to develop accelerometer cut points to classify physical activities (PA) by intensity in preschoolers and to investigate discrepancies in PA levels when applying various accelerometer cut points. To calibrate the accelerometer, 18 preschoolers (5.8 ± 0.4 years) performed eleven structured activities and one free play session while wearing a GT1M ActiGraph accelerometer using 15 s epochs. The structured activities were chosen based on the direct observation system Children's Activity Rating Scale (CARS) while the criterion measure of PA intensity during free play was provided using a second-by-second observation protocol (modified CARS). Receiver Operating Characteristic (ROC) curve analyses were used to determine the accelerometer cut points. To examine the classification differences, accelerometer data of four consecutive days from 114 preschoolers (5.5 ± 0.3 years) were classified by intensity according to previously published and the newly developed accelerometer cut points. Differences in predicted PA levels were evaluated using repeated measures ANOVA and Chi Square test. Cut points were identified at 373 counts/15 s for light (sensitivity: 86%; specificity: 91%; Area under ROC curve: 0.95), 585 counts/15 s for moderate (87%; 82%; 0.91) and 881 counts/15 s for vigorous PA (88%; 91%; 0.94). Further, applying various accelerometer cut points to the same data resulted in statistically and biologically significant differences in PA. Accelerometer cut points were developed with good discriminatory power for differentiating between PA levels in preschoolers and the choice of accelerometer cut points can result in large discrepancies.
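The calibration step amounts to sweeping candidate count thresholds against the criterion intensity labels and keeping the threshold with the best sensitivity/specificity trade-off. The toy version below uses Youden's J as the selection rule; it is our simplification of a full ROC analysis, with made-up epoch data:

```python
def best_cut_point(counts, is_active):
    """counts: accelerometer counts per epoch; is_active: matching
    booleans from the criterion measure (direct observation).
    Returns the threshold maximising sensitivity + specificity - 1."""
    best_t, best_j = None, -1.0
    for t in sorted(set(counts)):
        tp = sum(1 for c, a in zip(counts, is_active) if c >= t and a)
        fn = sum(1 for c, a in zip(counts, is_active) if c < t and a)
        tn = sum(1 for c, a in zip(counts, is_active) if c < t and not a)
        fp = sum(1 for c, a in zip(counts, is_active) if c >= t and not a)
        j = tp / (tp + fn) + tn / (tn + fp) - 1
        if j > best_j:
            best_j, best_t = j, t
    return best_t

epochs = [120, 250, 310, 410, 520, 640]   # counts per 15 s, hypothetical
active = [False, False, False, True, True, True]
```

Because different studies make different trade-offs at this selection step (and use different criterion measures), distinct published cut points applied to the same data can disagree, which is the discrepancy the study quantifies.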
An integrated circuit floating point accumulator
NASA Technical Reports Server (NTRS)
Goldsmith, T. C.
1977-01-01
Goddard Space Flight Center has developed a large-scale integrated circuit (type 623) which can perform pulse counting, storage, floating point compression, and serial transmission using a single monolithic device. Counts of 27 or 19 bits can be converted to transmitted values of 12 or 8 bits, respectively. Use of the 623 has resulted in substantial savings in weight, volume, and dollar resources on at least 11 scientific instruments to be flown on 4 NASA spacecraft. The design, construction, and application of the 623 are described.
Background compensation for a radiation level monitor
Keefe, D.J.
1975-12-01
Background compensation in a device such as a hand and foot monitor is provided by digital means using a scaler. With no radiation level test initiated, a scaler is down-counted from zero according to the background measured. With a radiation level test initiated, the scaler is up-counted from the previous down-count position according to the radiation emitted from the monitored object, and an alarm is generated if, with the scaler having crossed zero in the positive-going direction, a particular number is exceeded in a specific time period after initiation of the test. If the test is initiated while the scaler is down-counting, the background count from the previous down-count stored in a memory is used as the initial starting point for the up-count.
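The counting logic can be paraphrased in a few lines; this is a behavioural sketch inferred from the description above, not the actual circuit:

```python
def radiation_test(background_counts, test_counts, alarm_number):
    """Signed scaler: down-counted by the measured background while
    idle, then up-counted during the test period; the alarm fires only
    if the net count crosses zero going positive and exceeds the
    preset alarm number."""
    scaler = 0 - background_counts   # down-count from zero while idle
    scaler += test_counts            # up-count during the test
    return scaler > 0 and scaler > alarm_number
```

Subtracting the stored background this way makes the alarm threshold refer to net counts above background, which is the whole point of the compensation scheme.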
Urbanová, Petra; Hejna, Petr; Jurda, Mikoláš
2015-05-01
Three-dimensional surface technologies, particularly close-range photogrammetry and optical surface scanning, have recently advanced into affordable, flexible and accurate techniques. Forensic postmortem investigation as performed on a daily basis, however, has not yet fully benefited from their potential. In the present paper, we tested two approaches to 3D external body documentation - digital camera-based photogrammetry combined with commercial Agisoft PhotoScan(®) software and stereophotogrammetry-based Vectra H1(®), a portable handheld surface scanner. Three human subjects were selected for the study: a living person, a 25-year-old female, and two forensic cases admitted for postmortem examination at the Department of Forensic Medicine, Hradec Králové, Czech Republic (both 63-year-old males), one dead from traumatic, self-inflicted injuries (suicide by hanging), the other diagnosed with heart failure. All three cases were photographed in a 360° manner with a Nikon 7000 digital camera and simultaneously documented with the handheld scanner. In addition to recording the pre-autopsy phase of the forensic cases, both techniques were employed at various stages of autopsy. The sets of collected digital images (approximately 100 per case) were further processed to generate point clouds and 3D meshes. Final 3D models (a pair per individual) were quantified in terms of numbers of points and polygons, then assessed visually and compared quantitatively using an ICP alignment algorithm and a point cloud comparison technique based on closest point-to-point distances. Both techniques proved easy to handle and equally laborious. While collecting the images at autopsy took around 20 min, the post-processing was much more time-demanding and required up to 10 h of computation time. Moreover, for full-body scanning the post-processing of the handheld scanner data required rather time-consuming manual image alignment.
In all instances the applied approaches produced high-resolution photorealistic, real sized or easy to calibrate 3D surface models. Both methods equally failed when the scanned body surface was covered with body hair or reflective moist areas. Still, it can be concluded that single camera close range photogrammetry and optical surface scanning using Vectra H1 scanner represent relatively low-cost solutions which were shown to be beneficial for postmortem body documentation in forensic pathology. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
Grizelle Gonzalez; Elianid Espinoza; Zhigang Liu; Xiaoming Zou
2006-01-01
We used a fluorescence technique to mark and re-count the invasive earthworm, Pontoscolex corethrurus from PVC tubes established in a forest and a recently abandoned pasture in Puerto Rico to test the effects of the labeling treatment on earthworm population survival over time. A fluorescent marker was injected into the earthworms in the middle third section of the...
Reliable enumeration of malaria parasites in thick blood films using digital image analysis.
Frean, John A
2009-09-23
Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. 
The requirements of a digital microscope camera, a personal computer, and good-quality staining of slides are reasonably easy to meet.
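A generic stand-in for the particle-counting step is a connected-component count with a size filter; the study used open-access image-analysis software rather than code like this, but the sketch illustrates how a size window adapts the count to parasite size and suppresses noise:

```python
def count_particles(mask, min_size, max_size):
    """mask: 2-D list of 0/1 pixels.  Count 4-connected blobs whose
    pixel area falls within [min_size, max_size]; the size window
    plays the role of the parasite-size adaptation described above."""
    h, w = len(mask), len(mask[0])
    seen = [[False] * w for _ in range(h)]
    hits = 0
    for r in range(h):
        for c in range(w):
            if mask[r][c] and not seen[r][c]:
                stack, size = [(r, c)], 0
                seen[r][c] = True
                while stack:              # flood fill one blob
                    y, x = stack.pop()
                    size += 1
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if min_size <= size <= max_size:
                    hits += 1
    return hits

binary_image = [[1, 1, 0, 0],
                [0, 0, 0, 1],
                [1, 0, 1, 1]]
```

Raising `min_size` above the area of single-pixel debris is the crude analogue of the signal/noise tuning the abstract describes; at very low parasite densities no size window can fully rescue the signal, matching the reported breakdown below 6 parasites per image.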
Toward CMOS image sensor based glucose monitoring.
Devadhasan, Jasmine Pramila; Kim, Sanghyo
2012-09-07
Complementary metal oxide semiconductor (CMOS) image sensors are a powerful tool for biosensing applications. In the present study, a CMOS image sensor was exploited to detect glucose levels from simple photon count variation with high sensitivity. Various concentrations of glucose (100 mg dL(-1) to 1000 mg dL(-1)) were added onto a simple poly-dimethylsiloxane (PDMS) chip and the oxidation of glucose was catalyzed by an enzymatic reaction. Oxidized glucose produces a brown color with the help of a chromogen during the enzymatic reaction, and the color density varies with the glucose concentration. Photons pass through the PDMS chip with varying color density and hit the sensor surface. The photon count recognized by the CMOS image sensor depends on the color density, and hence on the glucose concentration, and is converted into digital form. By correlating the obtained digital results with glucose concentration, it is possible to measure a wide range of blood glucose levels with good linearity based on the CMOS image sensor, and this technique will therefore promote convenient point-of-care diagnosis.
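The reported linearity implies a simple calibration line mapping photon count to concentration. A toy least-squares fit with entirely made-up numbers (darker colour means fewer transmitted photons) shows the shape of such a calibration:

```python
def fit_line(x, y):
    """Ordinary least squares for y = slope * x + intercept."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

glucose = [100, 250, 500, 750, 1000]        # mg/dL, hypothetical
photons = [9800, 9200, 8200, 7200, 6200]    # photon counts, hypothetical

slope, intercept = fit_line(glucose, photons)

def predict_glucose(count):
    """Invert the calibration: photon count -> estimated glucose."""
    return (count - intercept) / slope
```

In use, the sensor-side quantity is the photon count, so the fitted line is inverted at measurement time, as `predict_glucose` does.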
A heuristic statistical stopping rule for iterative reconstruction in emission tomography.
Ben Bouallègue, F; Crouzet, J F; Mariano-Goulart, D
2013-01-01
We propose a statistical stopping criterion for iterative reconstruction in emission tomography based on a heuristic statistical description of the reconstruction process. The method was assessed for MLEM reconstruction. Based on Monte-Carlo numerical simulations and using a perfectly modeled system matrix, our method was compared with classical iterative reconstruction followed by low-pass filtering in terms of Euclidian distance to the exact object, noise, and resolution. The stopping criterion was then evaluated with realistic PET data of a Hoffman brain phantom produced using the GATE platform for different count levels. The numerical experiments showed that compared with the classical method, our technique yielded significant improvement of the noise-resolution tradeoff for a wide range of counting statistics compatible with routine clinical settings. When working with realistic data, the stopping rule allowed a qualitatively and quantitatively efficient determination of the optimal image. Our method appears to give a reliable estimation of the optimal stopping point for iterative reconstruction. It should thus be of practical interest as it produces images with similar or better quality than classical post-filtered iterative reconstruction with a mastered computation time.
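For reference, the iterative loop whose stopping point the rule selects is the standard MLEM multiplicative update, shown here for a toy 1-D system (the textbook algorithm, not the authors' implementation or their stopping criterion):

```python
def mlem(system, counts, n_iter):
    """system[i][j]: probability that an emission in voxel j is
    detected in bin i; counts[i]: measured counts in bin i.
    Returns the image estimate after n_iter multiplicative updates."""
    nj = len(system[0])
    x = [1.0] * nj                                      # flat start image
    sens = [sum(row[j] for row in system) for j in range(nj)]
    for _ in range(n_iter):
        proj = [sum(row[j] * x[j] for j in range(nj)) for row in system]
        x = [x[j] / sens[j] * sum(system[i][j] * counts[i] / proj[i]
                                  for i in range(len(counts)))
             for j in range(nj)]
    return x

# with a perfectly resolving (identity) system the data are recovered
image = mlem([[1.0, 0.0], [0.0, 1.0]], [3.0, 5.0], n_iter=1)
```

In realistic systems the iterates first sharpen and then amplify noise, which is why a stopping rule (or post-filtering) is needed to pick the best point along this trajectory.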
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.
Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K
2016-07-20
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.
Anomaly Detection in Power Quality at Data Centers
NASA Technical Reports Server (NTRS)
Grichine, Art; Solano, Wanda M.
2015-01-01
The goal during my internship at the National Center for Critical Information Processing and Storage (NCCIPS) is to implement an anomaly detection method through the StruxureWare SCADA Power Monitoring system. The benefit of the anomaly detection mechanism is to provide the capability to detect and anticipate equipment degradation by monitoring power quality prior to equipment failure. First, a study is conducted that examines the existing techniques of power quality management. Based on these findings, and the capabilities of the existing SCADA resources, recommendations are presented for implementing effective anomaly detection. Since voltage, current, and total harmonic distortion demonstrate Gaussian distributions, effective set-points are computed using this model, while maintaining a low false positive count.
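The Gaussian set-point computation described above amounts to thresholds at a chosen number of standard deviations from the mean. A minimal sketch (independent of the StruxureWare/SCADA system itself):

```python
import statistics

def gaussian_setpoints(readings, k=3.0):
    """Alarm set-points at mean +/- k sigma, assuming the monitored power
    quantity (voltage, current, THD) is approximately Gaussian.

    With k = 3 the expected false-positive rate is roughly 0.3% per
    reading, which keeps the false positive count low.
    """
    mu = statistics.fmean(readings)
    sigma = statistics.stdev(readings)
    return mu - k * sigma, mu + k * sigma
```

Readings outside the returned band would be flagged as anomalies.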
Sensing and enumerating rare circulating cells with diffuse light
NASA Astrophysics Data System (ADS)
Zettergren, Eric; Vickers, Dwayne; Niedre, Mark
2011-02-01
Detection and quantification of circulating cells in live animals is a challenging and important problem in many areas of biomedical research. Current methods involve extraction of blood samples and counting of cells ex-vivo. Since only small blood volumes are analyzed at specific time points, monitoring of changes in cell populations over time is difficult and rare cells often escape detection. The goal of this research is to develop a method for enumerating very rare circulating cells in the bloodstream non-invasively. This would have many applications in biomedical research, including monitoring of cancer metastasis and tracking of hematopoietic stem cells. In this work we describe the optical configuration of our instrument which allows fluorescence detection of single cells in diffusive media at the mesoscopic scale. Our instrument design consists of two continuous wave laser diode sources and an 8-channel fiber coupled multi-anode photon counting PMT. Fluorescence detector fibers were arranged circularly around the target in a miniaturized ring configuration. Cell-simulating fluorescent microspheres and fluorescently-labeled cells were passed through a limb-mimicking phantom with similar optical properties and background fluorescence as a limb of a mouse. Our data show that we can detect and count these targets with high quantitative accuracy. Future work includes characterization of our instrument using fluorescently labeled cells in vivo. If successful, this technique would yield an in vivo detection sensitivity improvement of several orders of magnitude over current approaches.
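Counting single cells in a photon-count time trace can be reduced to counting threshold crossings: each contiguous burst above background is one candidate cell. A minimal sketch (the threshold logic is ours, not the instrument's actual detection algorithm):

```python
def count_threshold_peaks(trace, threshold):
    """Count fluorescence bursts in a photon-count time trace.

    Each contiguous run of samples at or above `threshold` is counted as
    one detected cell (or microsphere) transit.
    """
    count, above = 0, False
    for v in trace:
        if v >= threshold and not above:   # rising edge: new burst
            count += 1
        above = v >= threshold
    return count
```

In practice the threshold would be set from the background count statistics to bound the false-detection rate.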
Somershoe, S.G.; Twedt, D.J.; Reid, B.
2006-01-01
We combined Breeding Bird Survey point count protocol and distance sampling to survey spring migrant and breeding birds in Vicksburg National Military Park on 33 days between March and June of 2003 and 2004. For 26 of 106 detected species, we used program DISTANCE to estimate detection probabilities and densities from 660 3-min point counts in which detections were recorded within four distance annuli. For most species, estimates of detection probability, and thereby density estimates, were improved through incorporation of the proportion of forest cover at point count locations as a covariate. Our results suggest Breeding Bird Surveys would benefit from the use of distance sampling and a quantitative characterization of habitat at point count locations. During spring migration, we estimated that the most common migrant species accounted for a population of 5000-9000 birds in Vicksburg National Military Park (636 ha). Species with average populations of 300 individuals during migration were: Blue-gray Gnatcatcher (Polioptila caerulea), Cedar Waxwing (Bombycilla cedrorum), White-eyed Vireo (Vireo griseus), Indigo Bunting (Passerina cyanea), and Ruby-crowned Kinglet (Regulus calendula). Of 56 species that bred in Vicksburg National Military Park, we estimated that the most common 18 species accounted for 8150 individuals. The six most abundant breeding species, Blue-gray Gnatcatcher, White-eyed Vireo, Summer Tanager (Piranga rubra), Northern Cardinal (Cardinalis cardinalis), Carolina Wren (Thryothorus ludovicianus), and Brown-headed Cowbird (Molothrus ater), accounted for 5800 individuals.
Neutron Detection With Ultra-Fast Digitizer and Pulse Identification Techniques on DIII-D
NASA Astrophysics Data System (ADS)
Zhu, Y. B.; Heidbrink, W. W.; Piglowski, D. A.
2013-10-01
A prototype system for neutron detection with an ultra-fast digitizer and pulse identification techniques has been implemented on the DIII-D tokamak. The system consists of a cylindrical neutron fission chamber, a charge sensitive amplifier, and a GaGe Octopus 12-bit CompuScope digitizer card installed in a Linux computer. Digital pulse identification techniques have been successfully performed at maximum data acquisition rate of 50 MSPS with on-board memory of 2 GS. Compared to the traditional approach with fast nuclear electronics for pulse counting, this straightforward digital solution has many advantages, including reduced expense, improved accuracy, higher counting rate, and easier maintenance. The system also provides the capability of neutron-gamma pulse shape discrimination and pulse height analysis. Plans for the upgrade of the old DIII-D neutron counting system with these techniques will be presented. Work supported by the US Department of Energy under SC-G903402, and DE-FC02-04ER54698.
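The neutron-gamma pulse shape discrimination mentioned above is commonly done by charge comparison on the digitized waveform; this sketch shows that generic technique, not the DIII-D system's specific algorithm:

```python
def psd_tail_fraction(pulse, tail_start):
    """Charge-comparison pulse-shape discrimination figure of merit.

    Returns the fraction of the pulse integral in the tail; neutron
    events in organic scintillators carry relatively more charge in the
    tail than gamma events of the same total charge.
    """
    total = sum(pulse)
    return sum(pulse[tail_start:]) / total
```

Events are then classified by thresholding the tail fraction, with the cut placed between the neutron and gamma bands of a tail-vs-total scatter plot.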
Measuring Transmission Efficiencies Of Mass Spectrometers
NASA Technical Reports Server (NTRS)
Srivastava, Santosh K.
1989-01-01
Coincidence counts yield absolute efficiencies. System measures mass-dependent transmission efficiencies of mass spectrometers, using coincidence-counting techniques reminiscent of those used for many years in calibration of detectors for subatomic particles. Coincidences between detected ions and electrons producing them counted during operation of mass spectrometer. Under certain assumptions regarding inelastic scattering of electrons, electron/ion-coincidence count is direct measure of transmission efficiency of spectrometer. When fully developed, system compact, portable, and used routinely to calibrate mass spectrometers.
Class Counts: An Overview and Response to Mr. Cooper's Review
ERIC Educational Resources Information Center
Ornstein, Allan
2009-01-01
This article presents Allan Ornstein's response to highly respected scholar, Bruce Cooper's review of Ornstein's 2007 book, "Class Counts: Education, Inequality and the Shrinking Middle Class." Here Ornstein attempts to elaborate on a few points that he felt Cooper missed in his review.
Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions
Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...
2015-11-01
Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities for time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting, and from these data the counting distributions.
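The "remarkably simple Monte Carlo realization" of a fission chain can be illustrated with a toy branching process (no time dependence, no gamma rays — a much reduced sketch of the kind of simulation described, with parameter values chosen only for illustration):

```python
import random

def fission_chain(p_fission, nu, rng=random):
    """Simulate one fission chain started by a single neutron.

    Each neutron either induces a fission (probability p_fission),
    emitting nu() new neutrons, or leaks.  Returns (fissions, leaked).
    Subcritical, i.e. terminating, when p_fission * E[nu] < 1.
    """
    neutrons, fissions, leaked = 1, 0, 0
    while neutrons:
        neutrons -= 1
        if rng.random() < p_fission:
            fissions += 1
            neutrons += nu()
        else:
            leaked += 1
    return fissions, leaked
```

For p_fission = 0.3 and nu() ≡ 2, the expected neutrons per chain is 1/(1 − 0.6) = 2.5, so on average 0.75 fissions and 1.75 leaked neutrons; averaging many simulated chains recovers these first moments.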
Sheffield, L.M.; Gall, Adrian E.; Roby, D.D.; Irons, D.B.; Dugger, K.M.
2006-01-01
Least Auklets (Aethia pusilla (Pallas, 1811)) are the most abundant species of seabird in the Bering Sea and offer a relatively efficient means of monitoring secondary productivity in the marine environment. Counting auklets on surface plots is the primary method used to track changes in numbers of these crevice-nesters, but counts can be highly variable and may not be representative of the number of nesting individuals. We compared average maximum counts of Least Auklets on surface plots with density estimates based on mark–resight data at a colony on St. Lawrence Island, Alaska, during 2001–2004. Estimates of breeding auklet abundance from mark–resight averaged 8 times greater than those from maximum surface counts. Our results also indicate that average maximum surface counts are poor indicators of breeding auklet abundance and do not vary consistently with auklet nesting density across the breeding colony. Estimates of Least Auklet abundance from mark–resight were sufficiently precise to meet management goals for tracking changes in seabird populations. We recommend establishing multiple permanent banding plots for mark–resight studies on colonies selected for intensive long-term monitoring. Mark–resight is more likely to detect biologically significant changes in size of auklet breeding colonies than traditional surface count techniques.
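The study's mark-resight model is more elaborate, but the basic abundance calculation behind mark-resight can be sketched with Chapman's bias-corrected two-sample estimator (an illustrative stand-in, not the authors' estimator):

```python
def chapman_estimate(n_marked, n_sighted, n_marked_sighted):
    """Chapman's bias-corrected two-sample mark-resight abundance estimate.

    n_marked:         birds banded (first sample)
    n_sighted:        birds observed in the resight survey
    n_marked_sighted: observed birds carrying bands
    """
    return (n_marked + 1) * (n_sighted + 1) / (n_marked_sighted + 1) - 1
```

Intuitively, the fraction of marked birds among those resighted estimates the fraction of the whole colony that was banded.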
Hayes, Robert B; Peña, Adan M; Goff, Thomas E
2005-08-01
This paper demonstrates the utility of a portable alpha Continuous Air Monitor (CAM) as a benchtop scaler counter for multiple sample types. These include using the CAM to count fixed air sample filters and radiological smears. In counting radiological smears, the CAM is used very much like a gas flow proportional counter (GFPC), albeit with a lower efficiency. Due to the typically low background in this configuration, the minimum detectable activity for a 5-min count should be in the range of about 10 dpm, which is acceptably below the 20 dpm limit for transuranic isotopes. When counting fixed air sample filters, the CAM algorithm along with other measurable characteristics can be used to identify and quantify the presence of transuranic isotopes in the samples. When the radiological control technician wants to take credit for naturally occurring radioactive material contributions due to radon progeny producing higher energy peaks (as in the case of a fixed air sample filter), more elaborate techniques are required. The techniques presented here will generate a decision level of about 43 dpm for such applications. The calibration for this application should alternatively be done using the default values of channels 92-126 for region of interest 1. This can be done within 10 to 15 min, resulting in a method to rapidly evaluate air filters for transuranic activity. When compared to the previously described 1-h count technique, the present work demonstrates a method whereby more than two thirds of samples can be rapidly shown (within 10 to 15 min) to be within regulatory compliance limits. In both cases, however, spectral quality checks are required to ensure sample self-attenuation is not a significant bias in the activity estimates. This will allow the same level of confidence when using these techniques for activity quantification as is presently available for air monitoring activity quantification using CAMs.
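Decision levels and minimum detectable activities of the kind quoted above are conventionally computed with Currie's formulas; the sketch below shows those standard expressions (the paper's 43 dpm figure comes from its own CAM-specific analysis, which is not reproduced here):

```python
import math

def currie_dpm(background_counts, count_min, efficiency):
    """Currie decision level (Lc) and minimum detectable activity, in dpm.

    Assumes a paired count over a well-known background of B counts:
      Lc = 2.33 * sqrt(B)  counts,   Ld = 2.71 + 4.65 * sqrt(B)  counts,
    converted to dpm via the counting efficiency and count time.
    """
    b = float(background_counts)
    lc_counts = 2.33 * math.sqrt(b)           # decision level
    ld_counts = 2.71 + 4.65 * math.sqrt(b)    # detection limit
    scale = 1.0 / (efficiency * count_min)    # counts -> dpm
    return lc_counts * scale, ld_counts * scale
```

For example, 100 background counts in a 5-min count at 20% efficiency give a decision level of about 23 dpm and an MDA of about 49 dpm.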
Development of new photon-counting detectors for single-molecule fluorescence microscopy.
Michalet, X; Colyer, R A; Scalia, G; Ingargiola, A; Lin, R; Millaud, J E; Weiss, S; Siegmund, Oswald H W; Tremsin, Anton S; Vallerga, John V; Cheng, A; Levi, M; Aharoni, D; Arisaka, K; Villa, F; Guerrieri, F; Panzeri, F; Rech, I; Gulinatti, A; Zappa, F; Ghioni, M; Cova, S
2013-02-05
Two optical configurations are commonly used in single-molecule fluorescence microscopy: point-like excitation and detection to study freely diffusing molecules, and wide field illumination and detection to study surface immobilized or slowly diffusing molecules. Both approaches have common features, but also differ in significant aspects. In particular, they use different detectors, which share some requirements but also have major technical differences. Currently, two types of detectors best fulfil the needs of each approach: single-photon-counting avalanche diodes (SPADs) for point-like detection, and electron-multiplying charge-coupled devices (EMCCDs) for wide field detection. However, there is room for improvements in both cases. The first configuration suffers from low throughput owing to the analysis of data from a single location. The second, on the other hand, is limited to relatively low frame rates and loses the benefit of single-photon-counting approaches. During the past few years, new developments in point-like and wide field detectors have started addressing some of these issues. Here, we describe our recent progress towards increasing the throughput of single-molecule fluorescence spectroscopy in solution using parallel arrays of SPADs. We also discuss our development of large area photon-counting cameras achieving subnanosecond resolution for fluorescence lifetime imaging applications at the single-molecule level.
Hart, Teresa L; Brusseau, Timothy; Kulinna, Pamela Hodges; McClain, James J; Tudor-Locke, Catrine
2011-12-01
This study compared step counts detected by four, low-cost, objective, physical-activity-assessment instruments and evaluated their ability to detect moderate-to-vigorous physical activity (MVPA) compared to the ActiGraph accelerometer (AG). Thirty-six 10-11-year-old children wore the NL-1000, Yamax Digiwalker SW 200, Omron HJ-151, and Walk4Life MVP concurrently with the AG during school hours on a single day. AG MVPA was derived from activity count data using previously validated cut points. Two of the evaluated instruments provided similar group mean MVPA and step counts compared to AG (dependent on cut point). Low-cost instruments may be useful for measurement of both MVPA and steps in children's physical activity interventions and program evaluation.
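Deriving MVPA from accelerometer counts is a matter of applying a per-epoch cut point; a minimal sketch (the cut point value 574 and the 15-s epoch are hypothetical placeholders — published youth cut points vary by study, device, and epoch length):

```python
def mvpa_minutes(counts, epoch_s=15, cut_point=574):
    """Minutes of moderate-to-vigorous physical activity (MVPA).

    counts:    accelerometer activity counts, one value per epoch
    epoch_s:   epoch length in seconds
    cut_point: minimum counts/epoch classified as MVPA (placeholder value)
    """
    active_epochs = sum(1 for c in counts if c >= cut_point)
    return active_epochs * epoch_s / 60.0
```

The study's observation that agreement with the ActiGraph depends on the chosen cut point follows directly: changing `cut_point` reclassifies borderline epochs.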
Testing the importance of auditory detections in avian point counts
Brewster, J.P.; Simons, T.R.
2009-01-01
Recent advances in the methods used to estimate detection probability during point counts suggest that the detection process is shaped by the types of cues available to observers. For example, models of the detection process based on distance-sampling or time-of-detection methods may yield different results for auditory versus visual cues because of differences in the factors that affect the transmission of these cues from a bird to an observer or differences in an observer's ability to localize cues. Previous studies suggest that auditory detections predominate in forested habitats, but it is not clear how often observers hear birds prior to detecting them visually. We hypothesized that auditory cues might be even more important than previously reported, so we conducted an experiment in a forested habitat in North Carolina that allowed us to better separate auditory and visual detections. Three teams of three observers each performed simultaneous 3-min unlimited-radius point counts at 30 points in a mixed-hardwood forest. One team member could see, but not hear birds, one could hear, but not see, and the third was nonhandicapped. Of the total number of birds detected, 2.9% were detected by deafened observers, 75.1% by blinded observers, and 78.2% by nonhandicapped observers. Detections by blinded and nonhandicapped observers were the same only 54% of the time. Our results suggest that the detection of birds in forest habitats is almost entirely by auditory cues. Because many factors affect the probability that observers will detect auditory cues, the accuracy and precision of avian point count estimates are likely lower than assumed by most field ornithologists. © 2009 Association of Field Ornithologists.
NASA Technical Reports Server (NTRS)
1975-01-01
A photometer is examined which combines several features from separate instruments into a single package. The design presented has both point and area photometry capability with provision for inserting filters to provide spectral discrimination. The electronics provide for photon counting mode for the point detectors and both photon counting and analog modes for the area detector. The area detector also serves as a target locating device for the point detectors. Topics discussed include: (1) electronic equipment requirements, (2) optical properties, (3) structural housing for the instrument, (4) motors and other mechanical components, (5) ground support equipment, and (6) environment control for the instrument. Engineering drawings and block diagrams are shown.
Reliability of a rapid hematology stain for sputum cytology*
Gonçalves, Jéssica; Pizzichini, Emilio; Pizzichini, Marcia Margaret Menezes; Steidle, Leila John Marques; Rocha, Cristiane Cinara; Ferreira, Samira Cardoso; Zimmermann, Célia Tânia
2014-01-01
Objective: To determine the reliability of a rapid hematology stain for the cytological analysis of induced sputum samples. Methods: This was a cross-sectional study comparing the standard technique (May-Grünwald-Giemsa stain) with a rapid hematology stain (Diff-Quik). Of the 50 subjects included in the study, 21 had asthma, 19 had COPD, and 10 were healthy (controls). From the induced sputum samples collected, we prepared four slides: two were stained with May-Grünwald-Giemsa, and two were stained with Diff-Quik. The slides were read independently by two trained researchers blinded to the identification of the slides. The reliability for cell counting using the two techniques was evaluated by determining the intraclass correlation coefficients (ICCs) for intraobserver and interobserver agreement. Agreement in the identification of neutrophilic and eosinophilic sputum between the observers and between the stains was evaluated with kappa statistics. Results: In our comparison of the two staining techniques, the ICCs indicated almost perfect interobserver agreement for neutrophil, eosinophil, and macrophage counts (ICC: 0.98-1.00), as well as substantial agreement for lymphocyte counts (ICC: 0.76-0.83). Intraobserver agreement was almost perfect for neutrophil, eosinophil, and macrophage counts (ICC: 0.96-0.99), whereas it was moderate to substantial for lymphocyte counts (ICC = 0.65 and 0.75 for the two observers, respectively). Interobserver agreement for the identification of eosinophilic and neutrophilic sputum using the two techniques ranged from substantial to almost perfect (kappa range: 0.91-1.00). Conclusions: The use of Diff-Quik can be considered a reliable alternative for the processing of sputum samples. PMID:25029648
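The kappa statistic used above for agreement on sputum classification is standard; a minimal implementation:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa: chance-corrected agreement between two raters.

    labels_a, labels_b: equal-length lists of categorical labels
    (e.g. 'eosinophilic' vs 'neutrophilic' sputum per sample).
    """
    n = len(labels_a)
    cats = set(labels_a) | set(labels_b)
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_exp = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                for c in cats)
    return (p_obs - p_exp) / (1 - p_exp)
```

Kappa is 1 for perfect agreement and 0 for agreement no better than chance; the 0.91-1.00 range reported above sits in the "almost perfect" band of the usual interpretation scale.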
Concentration Independent Calibration of β-γ Coincidence Detector Using 131mXe and 133Xe
DOE Office of Scientific and Technical Information (OSTI.GOV)
McIntyre, Justin I.; Cooper, Matthew W.; Carman, April J.
Absolute efficiency calibration of radiometric detectors is frequently difficult and requires careful detector modeling and accurate knowledge of the radioactive source used. In the past we have calibrated the β-γ coincidence detector of the Automated Radioxenon Sampler/Analyzer (ARSA) using a variety of sources and techniques which have proven to be less than desirable.[1] A superior technique has been developed that uses the conversion-electron (CE) and x-ray coincidence of 131mXe to provide a more accurate absolute gamma efficiency of the detector. The 131mXe is injected directly into the beta cell of the coincident counting system and no knowledge of absolute source strength is required. In addition, 133Xe is used to provide a second independent means to obtain the absolute efficiency calibration. These two data points provide the necessary information for calculating the detector efficiency and can be used in conjunction with other noble gas isotopes to completely characterize and calibrate the ARSA nuclear detector. In this paper we discuss the techniques and results that we have obtained.
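The reason no absolute source strength is needed is the classic coincidence-counting identity: with singles rates eps_b·N and eps_g·N and coincidence rate eps_b·eps_g·N, the source strength N cancels in the efficiency ratios. A minimal sketch of that algebra (illustrative only; dead time, background, and branching corrections are omitted):

```python
def coincidence_calibration(n_beta, n_gamma, n_coinc):
    """Efficiencies and source strength from coincidence counting.

    Given n_beta = eb*N, n_gamma = eg*N, n_coinc = eb*eg*N:
      eg = n_coinc / n_beta,  eb = n_coinc / n_gamma,
      N  = n_beta * n_gamma / n_coinc.
    """
    eff_gamma = n_coinc / n_beta
    eff_beta = n_coinc / n_gamma
    activity = n_beta * n_gamma / n_coinc
    return eff_beta, eff_gamma, activity
```

Note the method even returns the (unneeded) source strength for free, which is why it is attractive for detectors like the ARSA system.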
NASA Astrophysics Data System (ADS)
Stogdale, Nick; Hollock, Steve; Johnson, Neil; Sumpter, Neil
2003-09-01
A 16x16 element un-cooled pyroelectric detector array has been developed which, when allied with advanced tracking and detection algorithms, has created a universal detector with multiple applications. Low-cost manufacturing techniques are used to fabricate a hybrid detector, intended for economic use in commercial markets. The detector has found extensive application in accurate people counting, detection, tracking, secure area protection, directional sensing and area violation; topics which are all pertinent to the provision of Homeland Security. The detection and tracking algorithms have, when allied with interpolation techniques, allowed a performance much higher than might be expected from a 16x16 array. This paper reviews the technology, with particular attention to the array structure, algorithms and interpolation techniques and outlines its application in a number of challenging market areas. Viewed from above, moving people are seen as 'hot blobs' moving through the field of view of the detector; background clutter or stationary objects are not seen and the detector works irrespective of lighting or environmental conditions. Advanced algorithms detect the people and extract size, shape, direction and velocity vectors allowing the number of people to be detected and their trajectories of motion to be tracked. Provision of virtual lines in the scene allows bi-directional counting of people flowing in and out of an entrance or area. Definition of a virtual closed area in the scene allows counting of the presence of stationary people within a defined area. Definition of 'counting lines' allows the counting of people, the ability to augment access control devices by confirming a 'one swipe one entry' judgement and analysis of the flow and destination of moving people. For example, passing the 'wrong way' up a denied passageway can be detected. 
Counting stationary people within a 'defined area' allows the behaviour and size of stationary groups to be analysed and counted; an alarm condition can also be generated when people stray into such areas.
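The bi-directional virtual-line counting described above reduces to detecting when a tracked centroid crosses a line between frames. A minimal sketch for a horizontal line (the actual product's tracking and interpolation algorithms are far more involved):

```python
def line_crossings(track, line_y):
    """Count crossings of a horizontal virtual line by one tracked blob.

    track:  sequence of (x, y) centroid positions, one per frame
    Returns (inward, outward) crossing counts, where 'inward' means
    moving from y < line_y to y >= line_y.
    """
    inward = outward = 0
    for (x0, y0), (x1, y1) in zip(track, track[1:]):
        if y0 < line_y <= y1:
            inward += 1
        elif y1 < line_y <= y0:
            outward += 1
    return inward, outward
```

Summing crossings over all tracked blobs gives the net flow through an entrance; a 'one swipe one entry' check would compare the inward count against access-control events.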
Matsche, Mark A; Arnold, Jill; Jenkins, Erin; Townsend, Howard; Rosemary, Kevin
2014-09-01
The imperiled status of Atlantic sturgeon (Acipenser oxyrinchus oxyrinchus), a large, long-lived, anadromous fish found along the Atlantic coast of North America, has prompted efforts at captive propagation for research and stock enhancement. The purpose of this study was to establish hematology and plasma chemistry reference intervals of captive Atlantic sturgeon maintained under different culture conditions. Blood specimens were collected from a total of 119 fish at 3 hatcheries: Lamar, PA (n = 36, ages 10-14 years); Chalk Point, MD (n = 40, siblings of Lamar); and Horn Point, Cambridge, MD (n = 43, mixed population from Chesapeake Bay). Reference intervals (using robust techniques), median, mean, and standard deviations were determined for WBC, RBC, thrombocytes, PCV, HGB, MCV, MCH, MCHC, and absolute counts for lymphocytes (L), neutrophils (N), monocytes, and eosinophils. Chemistry analytes included concentrations of total proteins, albumin, glucose, urea, calcium, phosphate, sodium, potassium, chloride, and globulins, AST, CK, and LDH activities, and osmolality. Mean concentrations of total proteins, albumin, and glucose were at or below the analytic range. Statistical comparisons showed significant differences among hatcheries for each remaining plasma chemistry analyte and for PCV, RBC, MCHC, MCH, eosinophil and monocyte counts, and N:L ratio throughout all 3 groups. Therefore, reference intervals were calculated separately for each population. Reference intervals for fish maintained under differing conditions should be established per population. © 2014 American Society for Veterinary Clinical Pathology and European Society for Veterinary Clinical Pathology.
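The study computes reference intervals with robust techniques; as a simple illustrative stand-in, the nonparametric 95% interval is just the 2.5th and 97.5th percentiles of the healthy-population values:

```python
def reference_interval(values):
    """Nonparametric 95% reference interval (2.5th-97.5th percentiles,
    linear interpolation).  A simplified stand-in for the robust methods
    used to establish clinical reference intervals.
    """
    s = sorted(values)
    def pct(p):
        k = p * (len(s) - 1)
        i = int(k)
        j = min(i + 1, len(s) - 1)
        return s[i] + (k - i) * (s[j] - s[i])
    return pct(0.025), pct(0.975)
```

The paper's conclusion follows naturally from this framing: if the value distributions differ between hatcheries, intervals computed from pooled data would misclassify normal fish, so each population needs its own interval.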
Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry
NASA Technical Reports Server (NTRS)
Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul
2003-01-01
Single-photon-counting laser altimetry is a new measurement technique offering significant advantages over analog or threshold-detection laser altimetry: better vertical resolution, reduced instrument size, mass, and power, and reduced laser complexity. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft-based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real-time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.
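The real-time processing burden comes from separating sparse signal photons from noise photons. A common approach, sketched here in plain Python purely for illustration (the MMLA's actual onboard algorithm is not described in this abstract), is to histogram photon arrival range bins and pick the bin where signal piles up:

```python
def find_surface_bin(photon_bins, n_bins):
    """Locate the surface return in single-photon-counting altimeter data.

    photon_bins: range-bin index of each detected photon in one frame.
    Signal photons concentrate in one bin while solar/detector background
    is roughly uniform, so the histogram maximum marks the surface.
    """
    hist = [0] * n_bins
    for b in photon_bins:
        hist[b] += 1
    return max(range(n_bins), key=hist.__getitem__)
```

Doing this at multi-kilohertz shot rates is exactly the kind of embarrassingly parallel histogramming that maps well onto reconfigurable (FPGA) hardware.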
Rain volume estimation over areas using satellite and radar data
NASA Technical Reports Server (NTRS)
Doneaud, A. A.; Vonderhaar, T. H.
1985-01-01
An investigation of the feasibility of rain volume estimation using satellite data, following a technique recently developed with radar data called the Area Time Integral, was undertaken. Case studies were selected on the basis of existing radar and satellite data sets which match in space and time. Four multicell clusters were analyzed. Routines for navigation, remapping, and smoothing of satellite images were applied. Visible counts were normalized for solar zenith angle. A radar sector of interest was defined to delineate specific radar echo clusters for each radar time throughout the radar echo cluster lifetime. A satellite sector of interest was defined by applying small adjustments to the radar sector using a manual processing technique. The radar echo area, the IR maximum counts, and the IR counts matching radar echo areas were found to evolve similarly, except for the decaying phase of the cluster, where the cirrus debris keeps the IR counts high.
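The Area Time Integral technique estimates rain volume as proportional to the time-integrated echo area, V = k · ATI. A minimal sketch (the climatological rate k = 3.68 mm/h below is only a placeholder, not a value from this study):

```python
def rain_volume_m3(echo_areas_km2, dt_h, rate_mm_per_h=3.68):
    """Rain volume from the Area-Time Integral: V = k * ATI.

    echo_areas_km2: echo (or cold-cloud) area at each observation time
    dt_h:           time between observations, hours
    rate_mm_per_h:  empirical climatological rain rate k (placeholder)
    """
    ati_km2_h = sum(echo_areas_km2) * dt_h     # the Area-Time Integral
    return rate_mm_per_h * ati_km2_h * 1.0e3   # mm * km^2 = 1000 m^3
```

The appeal of the method, and the point of the satellite extension studied here, is that only the area history is needed, not rain rates within the echo.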
Photon Counting - One More Time
NASA Astrophysics Data System (ADS)
Stanton, Richard H.
2012-05-01
Photon counting has been around for more than 60 years, and has been available to amateurs for most of that time. In most cases single photons are detected using photomultiplier tubes, "old technology" that became available after the Second World War. But over the last couple of decades the perfection of CCD devices has given amateurs the ability to perform accurate photometry with modest telescopes. Is there any reason to still count photons? This paper discusses some of the strengths of current photon counting technology, particularly relating to the search for fast optical transients. Technology advances in counters and photomultiplier modules are briefly mentioned. Illustrative data are presented including FFT analysis of bright star photometry and a technique for finding optical pulses in a large file of noisy data. This latter technique is shown to enable the discovery of a possible optical flare on the polar variable AM Her.
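FFT analysis of a photon-counting light curve, as used above for bright-star photometry, reduces to locating the strongest peak in the power spectrum. A minimal sketch for evenly sampled data:

```python
import numpy as np

def dominant_frequency(flux, dt):
    """Strongest periodicity in an evenly sampled photometric series.

    flux: count rates (or fluxes), one sample per dt seconds.  The mean
    is subtracted so the DC bin does not dominate the power spectrum.
    """
    power = np.abs(np.fft.rfft(flux - np.mean(flux)))
    freqs = np.fft.rfftfreq(len(flux), dt)
    return freqs[np.argmax(power)]
```

Finding short optical pulses in long noisy records is a different problem (matched filtering or cumulative-count tests rather than a single FFT peak), which is why the paper treats it separately.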
Uncertainty of quantitative microbiological methods of pharmaceutical analysis.
Gunar, O V; Sakhno, N G
2015-12-30
The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media and plate-count techniques on the estimation of microbial count, while a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the method of quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. These values did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.
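The final combination step can be sketched as a root-sum-of-squares of independent relative-uncertainty components, the usual propagation rule. The component values below are hypothetical, not the study's:

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares combination of independent relative
    uncertainties, each expressed as a fraction."""
    return math.sqrt(sum(u ** 2 for u in components))

# Hypothetical components: microorganism type, product matrix,
# and individual reading/interpretation error.
u = combined_relative_uncertainty([0.20, 0.15, 0.18])  # about 0.31, i.e. 31 %
```

A combined value below 0.35 would be consistent with the 35% bound the abstract reports for plate-count methods.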
Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors
Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.
2016-01-01
SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643
Chavali, Pooja; Uppin, Megha S; Uppin, Shantveer G; Challa, Sundaram
2017-01-01
The most reliable histological correlate of recurrence risk in meningiomas is increased mitotic activity. The proliferative index from Ki-67 immunostaining is a helpful adjunct to manual counting. However, both show considerable inter-observer variability. A new immunohistochemical method for counting mitotic figures, using an antibody against the phosphohistone H3 (PHH3) protein, was introduced. Similarly, computer-based automated counting for the Ki-67 labelling index (LI) is available. We aimed to study the use of these new techniques in the objective assessment of proliferation indices in meningiomas. This was a retrospective study of intracranial meningiomas diagnosed during the year 2013. The hematoxylin and eosin (H and E) sections and immunohistochemistry (IHC) with Ki-67 were reviewed by two pathologists. Photomicrographs of the representative areas were subjected to Ki-67 analysis by ImmunoRatio (IR) software. Mean Ki-67 LI, both manual and by IR, was calculated. IHC with PHH3 was performed; PHH3-positive nuclei were counted and mean values calculated. Data analysis was done using SPSS software. A total of 64 intracranial meningiomas were diagnosed. Evaluation on H and E, PHH3, and Ki-67 LI (both manual and IR) was done in 32 cases (22 grade I and 10 grade II meningiomas). Statistically significant correlation was seen between the mitotic count in each grade and PHH3 values, and also between the grade of the tumor and the values of Ki-67 and PHH3. Both the techniques used in the study offered advantages over, and correlated well with, the existing techniques and hence can be applied in routine use.
NASA Technical Reports Server (NTRS)
Panda, J.; Seasholtz, R. G.
2004-01-01
The flow fields of unheated, supersonic free jets from convergent and convergent-divergent nozzles operating at M = 0.99, 1.4, and 1.6 were measured using a spectrally resolved Rayleigh scattering technique. The axial component of velocity and temperature data, as well as density data obtained from a previous experiment, are presented in a systematic way with the goal of producing a database useful for validating computational fluid dynamics codes. The Rayleigh scattering process from air molecules provides a fundamental means of measuring flow properties in a non-intrusive, particle-free manner. In the spectrally resolved application, laser light scattered by the air molecules is collected and analyzed using a Fabry-Perot interferometer (FPI). The difference between the incident laser frequency and the peak of the Rayleigh spectrum provides a measure of gas velocity. The temperature is measured from the spectral broadening caused by the random thermal motion, and density is measured from the total light intensity. The present point measurement technique uses a CW laser, a scanning FPI and photon counting electronics. The 1 mm long probe volume is moved from point to point to survey the flow fields. Additional arrangements were made to remove particles from the main as well as the entrained flow and to isolate the FPI from the high sound and vibration levels produced by the supersonic jets. In general, velocity is measured within +/- 10 m/s accuracy and temperature within +/- 10 K accuracy.
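Converting the measured shift of the Rayleigh spectral peak into a velocity follows from the scattering geometry: the Doppler shift is proportional to the velocity component along the scattering vector, delta_f = 2 v sin(theta/2) / lambda. The sketch below is illustrative only; the 90-degree geometry, wavelength and shift are assumed numbers, not the experiment's.

```python
import math

def velocity_from_doppler(delta_f_hz, wavelength_m, scattering_angle_rad):
    """Flow velocity component along the scattering vector, from the
    shift between the laser line and the Rayleigh spectral peak:
        delta_f = 2 * v * sin(theta / 2) / wavelength"""
    return delta_f_hz * wavelength_m / (2.0 * math.sin(scattering_angle_rad / 2.0))

# Illustrative: 532 nm laser, 90-degree scattering, 1 GHz peak shift.
v = velocity_from_doppler(1.0e9, 532e-9, math.pi / 2)   # a few hundred m/s
```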
Radiation Discrimination in LiBaF3 Scintillator Using Digital Signal Processing Techniques
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aalseth, Craig E.; Bowyer, Sonya M.; Reeder, Paul L.
2002-11-01
The new scintillator material LiBaF3:Ce offers the possibility of measuring neutron or alpha count rates and energy spectra simultaneously while measuring gamma count rates and spectra using a single detector.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Faby, Sebastian, E-mail: sebastian.faby@dkfz.de; Kuchenbecker, Stefan; Sawall, Stefan
2015-07-15
Purpose: To study the performance of different dual energy computed tomography (DECT) techniques, which are available today, and future multi energy CT (MECT) employing novel photon counting detectors in an image-based material decomposition task. Methods: The material decomposition performance of different energy-resolved CT acquisition techniques is assessed and compared in a simulation study of virtual non-contrast imaging and iodine quantification. The material-specific images are obtained via a statistically optimal image-based material decomposition. A projection-based maximum likelihood approach was used for comparison with the authors’ image-based method. The different dedicated dual energy CT techniques are simulated employing realistic noise models and x-ray spectra. The authors compare dual source DECT with fast kV switching DECT and the dual layer sandwich detector DECT approach. Subsequent scanning and a subtraction method are studied as well. Further, the authors benchmark future MECT with novel photon counting detectors in a dedicated DECT application against the performance of today’s DECT using a realistic model. Additionally, possible dual source concepts employing photon counting detectors are studied. Results: The DECT comparison study shows that dual source DECT has the best performance, followed by the fast kV switching technique and the sandwich detector approach. Comparing DECT with future MECT, the authors found noticeable material image quality improvements for an ideal photon counting detector; however, a realistic detector model with multiple energy bins predicts a performance on the level of dual source DECT at 100 kV/Sn 140 kV. Employing photon counting detectors in dual source concepts can improve the performance again above the level of a single realistic photon counting detector and also above the level of dual source DECT.
Conclusions: Substantial differences in the performance of today’s DECT approaches were found for the application of virtual non-contrast and iodine imaging. Future MECT with realistic photon counting detectors currently can only perform comparably to dual source DECT at 100 kV/Sn 140 kV. Dual source concepts with photon counting detectors could be a solution to this problem, promising a better performance.
Kumar, A Yudhistra; Reddy, M Vikram
2008-01-01
Most Probable Number (MPN) counts of Total Coliforms (TC) and Faecal Coliforms (FC), and the physicochemical variables - temperature, Dissolved Oxygen (D.O.), Biochemical Oxygen Demand (B.O.D.), Chemical Oxygen Demand (C.O.D.), nitrates, phosphates and chlorides - of municipal raw sewage, the aeration tank and the secondary clarifier of the Sewage Treatment Plant (STP) were analyzed in relation to water at the treated sewage out-fall point, down-stream and up-stream of the Buckingham Canal at Kalpakkam. Total Coliform and Faecal Coliform MPN counts were higher in the raw sewage, 170 and 70/100 mL respectively. While TC counts in the aeration tank remained similar, FC counts decreased to 50/100 mL; in the secondary clarifier the counts decreased further to 30 and 44/100 mL respectively, and at the treated sewage out-fall point in the canal they were 110 and 23/100 mL respectively. Total Coliform MPN in the up-stream water was more than 18 times lower than at the treated sewage out-fall point in the canal. Interestingly, the FC MPN in the up-stream water was nil, while it was 8/100 mL at the canal's down-stream point. It is concluded that FC, B.O.D., C.O.D., nitrates, phosphates and chlorides decreased and D.O. increased in the treated sewage as a result of treatment of the raw sewage through the STP.
Payment, P; Franco, E; Richardson, L; Siemiatycki, J
1991-01-01
During a prospective epidemiological study of gastrointestinal health effects associated with the consumption of drinking water produced by reverse-osmosis domestic units, a correlation was demonstrated between the bacterial counts on R2A medium incubated at 35 degrees C and the reported gastrointestinal symptoms in families who used these units. A univariate correlation was found with bacterial counts on R2A medium at 20 degrees C but was confounded by the bacterial counts at 35 degrees C. Other variables, such as family size and amount of water consumed, were not independently explanatory of the rate of illness. These observations raise concerns for the possibility of increased disease associated with certain point-of-use treatment devices for domestic use when high levels of bacterial growth occur. PMID:2059052
Aizawa, Emiko; Tsuji, Hirokazu; Asahara, Takashi; Takahashi, Takuya; Teraishi, Toshiya; Yoshida, Sumiko; Ota, Miho; Koga, Norie; Hattori, Kotaro; Kunugi, Hiroshi
2016-09-15
Bifidobacterium and Lactobacillus in the gut have been suggested to have a beneficial effect on stress response and depressive disorder. We examined whether these bacterial counts are reduced in patients with major depressive disorder (MDD) compared with healthy controls. Bifidobacterium and Lactobacillus counts in fecal samples were estimated in 43 patients and 57 controls using bacterial rRNA-targeted reverse transcription-quantitative polymerase chain reaction. The patients had significantly lower Bifidobacterium counts (P=0.012) and tended to have lower Lactobacillus counts (P=0.067) than the controls. Individuals whose bacterial counts were below the optimal cut-off point (9.53 and 6.49 log10 cells/g for Bifidobacterium and Lactobacillus, respectively) were significantly more common among the patients than among the controls for both bacteria (Bifidobacterium: odds ratio 3.23, 95% confidence interval [CI] 1.38-7.54, P=0.010; Lactobacillus: 2.57, 95% CI 1.14-5.78, P=0.027). Using the same cut-off points, we observed an association between the bacterial counts and irritable bowel syndrome. Frequency of fermented milk consumption was associated with higher Bifidobacterium counts in the patients. The findings should be interpreted with caution since the effects of gender and diet were not fully taken into account in the analysis. Our results provide direct evidence, for the first time, that individuals with lower Bifidobacterium and/or Lactobacillus counts are more common among patients with MDD than among controls. Our findings provide new insight into the pathophysiology of MDD and will enhance future research on the use of pro- and prebiotics in the treatment of MDD. Copyright © 2016 Elsevier B.V. All rights reserved.
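An odds ratio with a 95% confidence interval, of the kind reported above, can be computed from a 2x2 table; one standard way is Woolf's log method. The cell counts below are hypothetical, chosen only to give an odds ratio of a similar magnitude, and are not the study's actual data.

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and Woolf (log-method) 95% CI from a 2x2 table:
    a, b = below/above cut-off among patients;
    c, d = below/above cut-off among controls."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # SE of log odds ratio
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# Hypothetical counts, not the study's table.
or_, lo, hi = odds_ratio_ci(25, 18, 17, 40)   # OR around 3.3
```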
Digital holographic microscopy for detection of Trypanosoma cruzi parasites in fresh blood mounts
NASA Astrophysics Data System (ADS)
Romero, G. G.; Monaldi, A. C.; Alanís, E. E.
2012-03-01
An off-axis holographic microscope, in a transmission mode, calibrated to automatically detect the presence of Trypanosoma cruzi in blood is developed as an alternative diagnosis tool for Chagas disease. Movements of the microorganisms are detected by measuring the phase shift they produce on the transmitted wave front. A thin layer of blood infected by Trypanosoma cruzi parasites is examined in the holographic microscope, the images of the visual field being registered with a CCD camera. Two consecutive holograms of the same visual field are subtracted point by point and a phase contrast image of the resulting hologram is reconstructed by means of the angular spectrum propagation algorithm. This method enables the measurement of phase distributions corresponding to temporal differences between digital holograms in order to detect whether parasites are present or not. Experimental results obtained using this technique show that it is an efficient alternative that can be incorporated successfully as a part of a fully automatic system for detection and counting of this type of microorganisms.
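The angular spectrum propagation algorithm named above admits a compact numerical sketch: decompose the field into plane waves with an FFT, multiply by the free-space transfer function, and transform back. The implementation below is a generic sketch under stated assumptions (square pixels, evanescent components suppressed); the wavelength, pixel pitch and distance in the comment are illustrative, not the instrument's.

```python
import numpy as np

def angular_spectrum_propagate(field, wavelength, dx, z):
    """Propagate a complex optical field a distance z using the
    angular spectrum method. dx is the (square) pixel pitch;
    evanescent spatial frequencies are set to zero."""
    ny, nx = field.shape
    fx = np.fft.fftfreq(nx, d=dx)
    fy = np.fft.fftfreq(ny, d=dx)
    FX, FY = np.meshgrid(fx, fy)
    arg = 1.0 / wavelength**2 - FX**2 - FY**2     # (k/2pi)^2 - fx^2 - fy^2
    kz = 2 * np.pi * np.sqrt(np.maximum(arg, 0.0))
    H = np.exp(1j * kz * z) * (arg > 0)           # transfer function
    return np.fft.ifft2(np.fft.fft2(field) * H)

# Phase-difference imaging between two holograms h1, h2 (illustrative):
# phase = np.angle(angular_spectrum_propagate(h2 - h1, 633e-9, 3.45e-6, 0.01))
```

Subtracting two consecutive holograms before reconstruction, as in the abstract, leaves only the time-varying (moving-parasite) contribution in the reconstructed phase.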
Spatially explicit models for inference about density in unmarked or partially marked populations
Chandler, Richard B.; Royle, J. Andrew
2013-01-01
Recently developed spatial capture–recapture (SCR) models represent a major advance over traditional capture–recapture (CR) models because they yield explicit estimates of animal density instead of population size within an unknown area. Furthermore, unlike nonspatial CR methods, SCR models account for heterogeneity in capture probability arising from the juxtaposition of animal activity centers and sample locations. Although the utility of SCR methods is gaining recognition, the requirement that all individuals can be uniquely identified excludes their use in many contexts. In this paper, we develop models for situations in which individual recognition is not possible, thereby allowing SCR concepts to be applied in studies of unmarked or partially marked populations. The data required for our model are spatially referenced counts made on one or more sample occasions at a collection of closely spaced sample units such that individuals can be encountered at multiple locations. Our approach includes a spatial point process for the animal activity centers and uses the spatial correlation in counts as information about the number and location of the activity centers. Camera-traps, hair snares, track plates, sound recordings, and even point counts can yield spatially correlated count data, and thus our model is widely applicable. A simulation study demonstrated that while the posterior mean exhibits frequentist bias on the order of 5–10% in small samples, the posterior mode is an accurate point estimator as long as adequate spatial correlation is present. Marking a subset of the population substantially increases posterior precision and is recommended whenever possible. We applied our model to avian point count data collected on an unmarked population of the northern parula (Parula americana) and obtained a density estimate (posterior mode) of 0.38 (95% CI: 0.19–1.64) birds/ha. 
Our paper challenges sampling and analytical conventions in ecology by demonstrating that neither spatial independence nor individual recognition is needed to estimate population density—rather, spatial dependence can be informative about individual distribution and density.
Assessment of cell concentration and viability of isolated hepatocytes using flow cytometry.
Wigg, Alan J; Phillips, John W; Wheatland, Loretta; Berry, Michael N
2003-06-01
The assessment of cell concentration and viability of freshly isolated hepatocyte preparations has been traditionally performed using manual counting with a Neubauer counting chamber and staining for trypan blue exclusion. Despite the simple and rapid nature of this assessment, concerns about the accuracy of these methods exist. Simple flow cytometry techniques which determine cell concentration and viability are available yet surprisingly have not been extensively used or validated with isolated hepatocyte preparations. We therefore investigated the use of flow cytometry using TRUCOUNT Tubes and propidium iodide staining to measure cell concentration and viability of isolated rat hepatocytes in suspension. Analysis using TRUCOUNT Tubes provided more accurate and reproducible measurement of cell concentration than manual cell counting. Hepatocyte viability, assessed using propidium iodide, correlated more closely than did trypan blue exclusion with all indicators of hepatocyte integrity and function measured (lactate dehydrogenase leakage, cytochrome p450 content, cellular ATP concentration, ammonia and lactate removal, urea and albumin synthesis). We conclude that flow cytometry techniques can be used to measure cell concentration and viability of isolated hepatocyte preparations. The techniques are simple, rapid, and more accurate than manual cell counting and trypan blue staining and the results are not affected by protein-containing media.
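Bead-based counting tubes of the kind described above yield an absolute concentration from the ratio of cell events to bead events, scaled by the known number of beads per tube and the sample volume. The sketch below shows that generic calculation; the numbers are illustrative, not from the study.

```python
def cell_concentration(cell_events, bead_events, beads_per_tube, sample_volume_ul):
    """Absolute cell concentration from a bead-based counting tube:
    (cell events / bead events) * (beads per tube / sample volume)."""
    return (cell_events / bead_events) * (beads_per_tube / sample_volume_ul)

# Illustrative numbers: 20,000 hepatocyte events, 5,000 bead events,
# 50,000 beads in the tube, 100 uL of sample.
conc = cell_concentration(20000, 5000, 50000, 100.0)   # cells per microlitre
```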
Characterization of the 2012-044C Briz-M Upper Stage Breakup
NASA Technical Reports Server (NTRS)
Hamilton, Joseph A.; Matney, Mark
2013-01-01
The NASA breakup model prediction was close to the observed population for catalog objects. The NASA breakup model predicted a larger population than was observed for objects under 10 cm. The stare technique produces low observation counts, but is readily comparable to model predictions. Customized stare parameters (Az, El, Range) were effective to increase the opportunities for HAX to observe the debris cloud. Other techniques to increase observation count will be considered for future breakup events.
Peterson, Bonnie L.; Kus, Barbara E.; Deutschman, Douglas H.
2004-01-01
We compared three methods to determine nest predators of the Least Bell's Vireo (Vireo bellii pusillus) in San Diego County, California, during spring and summer 2000. Point counts and tracking stations were used to identify potential predators and video photography to document actual nest predators. Parental behavior at depredated nests was compared to that at successful nests to determine whether activity (frequency of trips to and from the nest) and singing vs. non-singing on the nest affected nest predation. Yellow-breasted Chats (Icteria virens) were the most abundant potential avian predator, followed by Western Scrub-Jays (Aphelocoma californica). Coyotes (Canis latrans) were abundant, with smaller mammalian predators occurring in low abundance. Cameras documented a 48% predation rate with scrub-jays as the major nest predators (67%), but Virginia opossums (Didelphis virginiana, 17%), gopher snakes (Pituophis melanoleucus, 8%) and Argentine ants (Linepithema humile, 8%) were also confirmed predators. Identification of potential predators from tracking stations and point counts demonstrated only moderate correspondence with actual nest predators. Parental behavior at the nest prior to depredation was not related to nest outcome.
McFarland, Kent P; Lloyd, John D; Hardy, Spencer P
2017-06-04
We conducted point counts in the alpine zone of the Presidential Range of the White Mountains, New Hampshire, USA, to estimate the distribution and density of the rare endemic White Mountain Fritillary (Boloria chariclea montinus). Incidence of occurrence and density of the endemic White Mountain Fritillary during surveys in 2012 and 2013 were greatest in the herbaceous-snowbank plant community. Densities at points in the heath-shrub-rush plant community were lower, but because this plant community is more widespread in the alpine zone, it likely supports the bulk of adult fritillaries. White Mountain Fritillary used cushion-tussock, the other alpine plant community suspected of providing habitat, only sparingly. Detectability of White Mountain Fritillaries varied as a consequence of weather conditions during the survey and among observers, suggesting that raw counts yield biased estimates of density and abundance. Point counts, commonly used to study and monitor populations of birds, were an effective means of sampling White Mountain Fritillary in the alpine environment where patches of habitat are small, irregularly shaped, and widely spaced, rendering line-transect methods inefficient and difficult to implement.
Funk, Anna L; Boisson, Sophie; Clasen, Thomas; Ensink, Jeroen H J
2013-06-01
The Kato-Katz, conventional ethyl-acetate sedimentation, and Midi Parasep(®) methods for diagnosing infection with soil-transmitted helminths were compared. The Kato-Katz technique gave the best overall diagnostic performance with the highest results in all measures (prevalence, faecal egg count, sensitivity) followed by the conventional ethyl-acetate and then the Midi Parasep(®) technique. The Kato-Katz technique showed a significantly higher faecal egg count and sensitivity for both hookworm and Trichuris as compared to the Midi Parasep(®) technique. The conventional ethyl-acetate technique produced smaller pellets and showed lower pellet mobility as compared to the Midi Parasep(®). Copyright © 2013 Elsevier B.V. All rights reserved.
Hsu, Guo-Liang; Tang, Jung-Chang; Hwang, Wu-Yuin
2014-08-01
The one-more-than technique is an effective strategy for individuals with intellectual disabilities (ID) to use when making purchases. However, the heavy cognitive demands of money counting skills potentially limit how individuals with ID shop. This study employed a multiple-probe design across participants and settings, via the assistance of a mobile purchasing assistance system (MPAS), to assess the effectiveness of the one-more-than technique on independent purchases for items with prices beyond the participants' money counting skills. Results indicated that the techniques with the MPAS could effectively convert participants' initial money counting problems into useful advantages for successfully promoting the independent purchasing skills of three secondary school students with ID. Also noteworthy is the fact that mobile technologies could be a permanent prompt for those with ID to make purchases in their daily lives. The treatment effects could be maintained for eight weeks and generalized across three community settings. Implications for practice and future studies are provided. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Akashi-Ronquest, M.; Amaudruz, P.-A.; Batygov, M.; Beltran, B.; Bodmer, M.; Boulay, M. G.; Broerman, B.; Buck, B.; Butcher, A.; Cai, B.; Caldwell, T.; Chen, M.; Chen, Y.; Cleveland, B.; Coakley, K.; Dering, K.; Duncan, F. A.; Formaggio, J. A.; Gagnon, R.; Gastler, D.; Giuliani, F.; Gold, M.; Golovko, V. V.; Gorel, P.; Graham, K.; Grace, E.; Guerrero, N.; Guiseppe, V.; Hallin, A. L.; Harvey, P.; Hearns, C.; Henning, R.; Hime, A.; Hofgartner, J.; Jaditz, S.; Jillings, C. J.; Kachulis, C.; Kearns, E.; Kelsey, J.; Klein, J. R.; Kuźniak, M.; LaTorre, A.; Lawson, I.; Li, O.; Lidgard, J. J.; Liimatainen, P.; Linden, S.; McFarlane, K.; McKinsey, D. N.; MacMullin, S.; Mastbaum, A.; Mathew, R.; McDonald, A. B.; Mei, D.-M.; Monroe, J.; Muir, A.; Nantais, C.; Nicolics, K.; Nikkel, J. A.; Noble, T.; O'Dwyer, E.; Olsen, K.; Orebi Gann, G. D.; Ouellet, C.; Palladino, K.; Pasuthip, P.; Perumpilly, G.; Pollmann, T.; Rau, P.; Retière, F.; Rielage, K.; Schnee, R.; Seibert, S.; Skensved, P.; Sonley, T.; Vázquez-Jáuregui, E.; Veloce, L.; Walding, J.; Wang, B.; Wang, J.; Ward, M.; Zhang, C.
2015-05-01
Many current and future dark matter and neutrino detectors are designed to measure scintillation light with a large array of photomultiplier tubes (PMTs). The energy resolution and particle identification capabilities of these detectors depend in part on the ability to accurately identify individual photoelectrons in PMT waveforms despite large variability in pulse amplitudes and pulse pileup. We describe a Bayesian technique that can identify the times of individual photoelectrons in a sampled PMT waveform without deconvolution, even when pileup is present. To demonstrate the technique, we apply it to the general problem of particle identification in single-phase liquid argon dark matter detectors. Using the output of the Bayesian photoelectron counting algorithm described in this paper, we construct several test statistics for rejection of backgrounds for dark matter searches in argon. Compared to simpler methods based on either observed charge or peak finding, the photoelectron counting technique improves both energy resolution and particle identification of low energy events in calibration data from the DEAP-1 detector and simulation of the larger MiniCLEAN dark matter detector.
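One common particle-identification statistic that can be built from per-photoelectron times in liquid argon is the prompt fraction: nuclear recoils are singlet-dominated (fast light) and electronic recoils triplet-dominated (slow light). The sketch below is a generic illustration of that idea, not the paper's actual test statistics; the 90 ns window and the example times are assumptions.

```python
def f_prompt(pe_times_ns, window_ns=90.0):
    """Fraction of photoelectrons arriving within a prompt window.
    In liquid argon, singlet-dominated (nuclear-recoil-like) events
    give high values; triplet-dominated events give low values."""
    if not pe_times_ns:
        return 0.0
    prompt = sum(1 for t in pe_times_ns if t <= window_ns)
    return prompt / len(pe_times_ns)

# Photoelectron arrival times (ns) from a counting algorithm (illustrative).
fp = f_prompt([3, 5, 12, 40, 300, 1200, 2500, 4000])
```

The gain from photoelectron counting is that such statistics are computed from discrete arrival times rather than from integrated charge, which is what improves discrimination at low energy.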
Field evaluation of distance-estimation error during wetland-dependent bird surveys
Nadeau, Christopher P.; Conway, Courtney J.
2012-01-01
Context: The most common methods to estimate detection probability during avian point-count surveys involve recording a distance between the survey point and individual birds detected during the survey period. Accurately measuring or estimating distance is an important assumption of these methods; however, this assumption is rarely tested in the context of aural avian point-count surveys. Aims: We expand on recent bird-simulation studies to document the error associated with estimating distance to calling birds in a wetland ecosystem. Methods: We used two approaches to estimate the error associated with five surveyors' distance estimates between the survey point and calling birds, and to determine the factors that affect a surveyor's ability to estimate distance. Key results: We observed biased and imprecise distance estimates when estimating distance to simulated birds in a point-count scenario (x̄error = -9 m, s.d.error = 47 m) and when estimating distances to real birds during field trials (x̄error = 39 m, s.d.error = 79 m). The amount of bias and precision in distance estimates differed among surveyors; surveyors with more training and experience were less biased and more precise when estimating distance to both real and simulated birds. Three environmental factors were important in explaining the error associated with distance estimates: the measured distance from the bird to the surveyor, the volume of the call and the species of bird. Surveyors tended to make large overestimates of distances to birds close to the survey point, which is an especially serious error in distance sampling. Conclusions: Our results suggest that distance-estimation error is prevalent, but surveyor training may be the easiest way to reduce distance-estimation error.
Implications: The present study has demonstrated how relatively simple field trials can be used to estimate the error associated with distance estimates used to estimate detection probability during avian point-count surveys. Evaluating distance-estimation errors will allow investigators to better evaluate the accuracy of avian density and trend estimates. Moreover, investigators who evaluate distance-estimation errors could employ recently developed models to incorporate distance-estimation error into analyses. We encourage further development of such models, including the inclusion of such models into distance-analysis software.
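The bias and precision statistics quoted above (x̄error and s.d.error) are just the mean and standard deviation of the per-trial errors. A minimal sketch of that summary, with invented paired measurements rather than the study's data:

```python
import statistics

def distance_error_summary(true_m, estimated_m):
    """Per-trial error (estimate - truth): its mean is the bias,
    its standard deviation the precision, as in the field trials."""
    errors = [e - t for t, e in zip(true_m, estimated_m)]
    return statistics.mean(errors), statistics.stdev(errors)

# Illustrative paired distances in metres (truth vs. surveyor estimate).
bias, sd = distance_error_summary([50, 100, 150, 200], [45, 120, 160, 230])
```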
42 CFR 493.1276 - Standard: Clinical cytogenetics.
Code of Federal Regulations, 2010 CFR
2010-10-01
... of accessioning, cell preparation, photographing or other image reproduction technique, photographic... records that document the following: (1) The media used, reactions observed, number of cells counted, number of cells karyotyped, number of chromosomes counted for each metaphase spread, and the quality of...
2012-12-01
calibrated using a certified mineral or pure metal standard and counting times are chosen to provide 3-sigma detection limits of between 100-200 ppm ... also submit “blind” duplicates for analyses. The precision of the data generated by the “EMPA point count” will be evaluated by calculating RPD values ... important to consider the variation in results among all samples studied for a particular media, since the overall particle count is very large. Data
Validation of an automated colony counting system for group A Streptococcus.
Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R
2016-02-08
The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. 
This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggests that the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria, enabling more efficient and accurate measurement of bacterial concentration in culture.
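The agreement criterion used above, a Bland-Altman-style average percentage difference with a ±10% cut-off, can be sketched as follows. The colony counts are invented for the example, not the study's data.

```python
import statistics

def mean_percentage_difference(manual, automated):
    """Bland-Altman-style agreement: per-plate percentage difference
    relative to the pairwise mean, averaged over plates."""
    diffs = [100.0 * (a - m) / ((a + m) / 2.0)
             for m, a in zip(manual, automated)]
    return statistics.mean(diffs)

# Illustrative per-plate colony counts; criterion: within +/-10 %.
d = mean_percentage_difference([120, 85, 200, 45], [115, 90, 210, 44])
ok = abs(d) < 10.0
```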
Crack classification in concrete beams using AE parameters
NASA Astrophysics Data System (ADS)
Bahari, N. A. A. S.; Shahidan, S.; Abdullah, S. R.; Ali, N.; Zuki, S. S. Mohd; Ibrahim, M. H. W.; Rahim, M. A.
2017-11-01
The acoustic emission (AE) technique is an effective tool for the evaluation of crack growth. The aim of this study is to evaluate crack classification in reinforced concrete beams using statistical analysis. AE has been applied for the early monitoring of reinforced concrete structures using AE parameters such as average frequency, rise time, amplitude, counts and duration. This experimental study focuses on the utilisation of this method in evaluating reinforced concrete beams. Beam specimens measuring 150 mm × 250 mm × 1200 mm were tested in a three-point flexural test using a universal testing machine (UTM) together with an AE monitoring system. The results indicated that the RA value can be used to determine the relationship between tensile cracking and shear movement in reinforced concrete beams.
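The RA-value classification referred to above compares the RA value (rise time divided by amplitude) against the average frequency (counts divided by duration): high average frequency with low RA indicates tensile cracking, while high RA indicates shear movement. The sketch below is a simplified illustration of that rule; the boundary slope k, the units and the example hit parameters are assumptions, since the actual boundary is case-specific.

```python
def classify_crack(rise_time_us, amplitude_v, counts, duration_us, k=1.0):
    """Simplified AE crack classification from one hit's parameters:
    RA value (rise time / amplitude) versus average frequency
    (counts / duration). The boundary slope k is case-specific."""
    ra = rise_time_us / amplitude_v    # microseconds per volt
    af = counts / duration_us          # counts per microsecond
    return "tensile" if af > k * ra else "shear"

# Illustrative hit: slow rise, low amplitude, few counts -> shear-like.
kind = classify_crack(rise_time_us=5.0, amplitude_v=0.1,
                      counts=80, duration_us=200.0)
```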
Lunar Cratering Chronology: Calibrating Degree of Freshness of Craters to Absolute Ages
NASA Astrophysics Data System (ADS)
Trang, D.; Gillis-Davis, J.; Boyce, J. M.
2013-12-01
The use of impact craters to age-date surfaces and geomorphological features on planetary bodies is a decades-old practice. Various dating techniques use different aspects of impact craters to determine ages. One approach is based on the degree of freshness of primary impact craters. This method examines the degradation state of craters through visual inspection of seven criteria: polygonality, crater rays, continuous ejecta, rim crest sharpness, satellite craters, radial channels, and terraces. These criteria are used to rank craters in order of age from 0.0 (oldest) to 7.0 (youngest). However, the relative decimal scale used in this technique has not been tied to absolute ages. In this work, we calibrate the degree of freshness to absolute ages through crater counting of fifteen craters with diameters ranging from 5 to 22 km and degrees of freshness from 6.3 to 2.5. We use the Terrain Camera data set from Kaguya to count craters on the continuous ejecta of each crater in our sample suite. Specifically, we divide each crater's ejecta blanket into quarters and count craters from the rim of the main crater out to one crater radius from the rim for two of the four sections. From these crater counts, we estimate the absolute model age of each main crater using the Craterstats2 tool in ArcGIS. Next, we compare the degree of freshness with the crater-count-derived age of each main crater to obtain a linear inverse relation linking these two metrics. So far, for craters with degrees of freshness from 6.3 to 5.0, the linear regression has an R² value of 0.7, which corresponds to an uncertainty of ±230 million years. At this point, the tool linking degree of freshness to absolute ages cannot be used for craters <8 km in diameter, because this class of crater degrades more quickly than larger craters.
A graphical solution exists for correcting the degree of freshness of craters <8 km in diameter. We convert this graphical solution into a single function of two independent variables, the observed degree of freshness and the crater diameter. This function, found through a curve-fitting routine, yields a corrected degree of freshness for craters <8 km in diameter. As a result, we are able to derive absolute ages from the degree of freshness of craters with diameters from about 20 km down to 1 km, with a precision of ±230 million years.
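A linear calibration of the kind described above, a least-squares fit of crater-count-derived model age against degree of freshness, can be sketched as follows. The (freshness, age) pairs are invented for illustration, not the paper's data.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y = a + b*x, returning (a, b, r_squared)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    ss_res = sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return a, b, 1.0 - ss_res / ss_tot

# Invented (freshness, model age in Myr) pairs: fresher craters are younger.
freshness = [6.3, 6.0, 5.8, 5.4, 5.0]
age_myr   = [300, 500, 700, 1100, 1400]
a, b, r2 = fit_line(freshness, age_myr)  # b < 0: age falls as freshness rises
```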
7 CFR 210.9 - Agreement with State agency.
Code of Federal Regulations, 2010 CFR
2010-01-01
... who are determined by the local educational agency to be eligible for such meals under 7 CFR part 245... authority official signing the claim shall be responsible for reviewing and analyzing meal counts to ensure... reimbursable meals served to eligible children at the point of service, or through another counting system if...
An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations
Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario
2016-01-01
Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identifying the graphlets that a vertex in a graph touches provides useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As graphlets grow in size, more distinct graphlets emerge and the time needed to find each one also grows. If it is not necessary to find every instance of every graphlet, and knowing the number of graphlets touching each node of the graph suffices, the problem is easier. Previous research shows a way to simplify counting the graphlets: instead of searching for the graphlets themselves, smaller graphlets are counted, along with the numbers of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. Until now, however, such equations existed only for graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first generates the needed equations automatically, eliminating the tedious manual work previously required each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering that is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
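The principle of deriving graphlet counts from smaller structures plus common-neighbor information can be illustrated with the hand-derivable size-3 case. This sketch shows the idea only; it is not the authors' automatically generated 4-6-node equation systems.

```python
from itertools import combinations

def orbit_counts_3(adj):
    """Per-vertex 3-node graphlet orbit counts computed from degrees and
    common-neighbour (triangle) counts -- the size-3 analogue of the
    combinatorial orbit counting equations.

    adj: dict mapping vertex -> set of neighbours (undirected graph).
    Returns dict vertex -> (path_end, path_centre, triangle) counts.
    """
    tri = {v: sum(1 for a, b in combinations(adj[v], 2) if b in adj[a])
           for v in adj}
    out = {}
    for v in adj:
        d = len(adj[v])
        path_centre = d * (d - 1) // 2 - tri[v]          # C(d,2) minus triangles
        path_end = sum(len(adj[u]) - 1 for u in adj[v]) - 2 * tri[v]
        out[v] = (path_end, path_centre, tri[v])
    return out

# Square 0-1-2-3 with diagonal 0-2: two triangles sharing edge 0-2.
adj = {0: {1, 2, 3}, 1: {0, 2}, 2: {0, 1, 3}, 3: {0, 2}}
counts = orbit_counts_3(adj)   # e.g. vertex 1 is the end of two 3-node paths
```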
Generalized estimators of avian abundance from count survey data
Royle, J. Andrew
2004-01-01
I consider modeling avian abundance from spatially referenced bird count data collected under common protocols such as capture-recapture, multiple observer, removal sampling, and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
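The hierarchical structure described above (local abundance as a latent random variable, counts as imperfect observations of it) is commonly written as N_i ~ Poisson(λ), y_i ~ Binomial(N_i, p), with N_i marginalized out of the likelihood. A minimal sketch, with invented counts; note that with a single visit per site λ and p are not separately identifiable, so real applications use repeat visits or covariates.

```python
import math

def nmix_loglik(lam, p, counts, n_max=100):
    """Log-likelihood of a simple N-mixture model: latent abundance
    N_i ~ Poisson(lam), observed count y_i ~ Binomial(N_i, p), with
    N_i marginalised out by a truncated summation up to n_max."""
    ll = 0.0
    for y in counts:
        site = sum(
            math.comb(n, y) * p**y * (1 - p)**(n - y)
            * math.exp(-lam) * lam**n / math.factorial(n)
            for n in range(y, n_max + 1)
        )
        ll += math.log(site)
    return ll

# Illustrative point-count data from 5 sites, one visit each:
counts = [2, 0, 3, 1, 2]
# Profile the likelihood over lambda at a fixed detection probability:
best_ll, best_lam = max((nmix_loglik(lam / 10, 0.5, counts), lam / 10)
                        for lam in range(1, 101))
```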
Ripple, Dean C; Montgomery, Christopher B; Hu, Zhishang
2015-02-01
Accurate counting and sizing of protein particles has been limited by discrepancies between counts obtained by different methods. To understand the bias and repeatability of techniques in common use in the biopharmaceutical community, the National Institute of Standards and Technology conducted an interlaboratory comparison of sizing and counting of subvisible particles from 1 to 25 μm. Twenty-three laboratories from industry, government, and academic institutions participated. The circulated samples consisted of a polydisperse suspension of abraded ethylene tetrafluoroethylene particles, which closely mimic the optical contrast and morphology of protein particles. For restricted data sets, agreement between laboratories was reasonably good: relative standard deviations (RSDs) of approximately 25% for light obscuration counts with lower diameter limits from 1 to 5 μm, and approximately 30% for flow imaging with specified manufacturer and instrument settings. RSDs of the reported counts for unrestricted data sets were approximately 50% for both light obscuration and flow imaging. Differences between instrument manufacturers were not statistically significant for light obscuration but were significant for flow imaging. We also report a method for accounting for differences in the diameters reported by flow imaging and electrical sensing zone techniques; the method worked well for diameters greater than 15 μm. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
Solid state tritium detector for biomedical applications
NASA Astrophysics Data System (ADS)
Gordon, J. S.; Farrell, R.; Daley, K.; Oakes, C. E.
1994-08-01
Radioactive labeling of proteins is a very important technique used in biomedical research to identify, isolate, and investigate the expression and properties of proteins in biological systems. In such procedures, the preferred radiolabel is often tritium. Presently, binding assays involving tritium are carried out using inconvenient and expensive techniques which rely on the use of scintillation fluid counting systems. This traditional method involves both time-consuming laboratory protocols and the generation of substantial quantities of radioactive and chemical waste. We have developed a novel technology to measure the tritium content of biological specimens that does not rely on scintillation fluids. The tritiated samples can be positioned directly under a large area, monolithic array of specially prepared avalanche photodiodes (APDs) which record the tritium activity distribution at each point within the field of view of the array. The 1 mm² sensing elements exhibit an intrinsic tritium beta detection efficiency of 27% with high gain uniformity and very low cross talk.
Floyd A. Johnson
1961-01-01
This report assumes a knowledge of the principles of point sampling as described by Grosenbaugh, Bell and Alexander, and others. Whenever trees are counted at every point in a sample of points (large sample) and measured for volume at a portion (small sample) of these points, the sampling design could be called ratio double sampling. If the large...
STATISTICS OF GAMMA-RAY POINT SOURCES BELOW THE FERMI DETECTION LIMIT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malyshev, Dmitry; Hogg, David W., E-mail: dm137@nyu.edu
2011-09-10
An analytic relation between the statistics of photons in pixels and the number counts of multi-photon point sources is used to constrain the distribution of gamma-ray point sources below the Fermi detection limit at energies above 1 GeV and at latitudes below and above 30 deg. The derived source-count distribution is consistent with the distribution found by the Fermi Collaboration based on the first Fermi point-source catalog. In particular, we find that the contribution of resolved and unresolved active galactic nuclei (AGNs) to the total gamma-ray flux is below 20%-25%. In the best-fit model, the AGN-like point-source fraction is 17% ± 2%. Using the fact that the Galactic emission varies across the sky while the extragalactic diffuse emission is isotropic, we put a lower limit of 51% on Galactic diffuse emission and an upper limit of 32% on the contribution from extragalactic weak sources, such as star-forming galaxies. Possible systematic uncertainties are discussed.
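The connection between the source-count distribution and the photon statistics of pixels that the abstract exploits analytically can be illustrated by Monte Carlo: draw source fluxes from a power law dN/dS, scatter the sources over pixels, and histogram the resulting photon counts. This is an illustrative toy, not the paper's analytic method; the flux cap is an assumption added only to keep the sketch fast.

```python
import math
import random

def _poisson(rng, mean):
    """Knuth's method: adequate for the modest means used here."""
    l, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= l:
            return k
        k += 1

def pixel_photon_histogram(n_sources, n_pixels, s_min, index=2.0, seed=0):
    """Monte-Carlo sketch of photon-count statistics from unresolved point
    sources: fluxes (mean photons per source) follow a power law
    dN/dS ~ S**(-index) above s_min; each source lands in a random pixel
    and emits a Poisson number of photons.
    Returns {photons per pixel: number of pixels}."""
    rng = random.Random(seed)
    pixels = [0] * n_pixels
    for _ in range(n_sources):
        u = 1.0 - rng.random()                     # u in (0, 1]
        s = s_min * u ** (-1.0 / (index - 1.0))    # inverse-transform sample
        s = min(s, 100.0)                          # cap: keeps the sketch fast
        pixels[rng.randrange(n_pixels)] += _poisson(rng, s)
    hist = {}
    for c in pixels:
        hist[c] = hist.get(c, 0) + 1
    return hist

# 200 faint sources spread over 100 pixels, mean flux >= 0.5 photons:
hist = pixel_photon_histogram(200, 100, 0.5)
```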
Radionuclide counting technique for measuring wind velocity and direction
NASA Technical Reports Server (NTRS)
Singh, J. J. (Inventor)
1984-01-01
An anemometer utilizing a radionuclide counting technique to measure both the velocity and the direction of the wind is described. A pendulum, consisting of a wire and a ball with a radiation source on the lower surface of the ball, is deflected by the wind. Detectors are located in a plane perpendicular to the undisturbed (no-wind) pendulum, positioned on the circumference of a circle, equidistant from one another and from the undisturbed position of the source ball.
Star counts and visual extinctions in dark nebulae
NASA Technical Reports Server (NTRS)
Dickman, R. L.
1978-01-01
Application of star count techniques to the determination of visual extinctions in compact, fairly high-extinction dark nebulae is discussed. Particular attention is devoted to the determination of visual extinctions for a cloud having a possibly anomalous ratio of total to selective extinction. The techniques discussed are illustrated in application at two colors to four well-known compact dust clouds or Bok globules: Barnard 92, B 133, B 134, and B 335. Minimum masses and lower limits to the central extinction of these objects are presented.
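The star-count method for extinction that the abstract applies rests on the classical Wolf-diagram relation: if cumulative counts in an unobscured field follow log10 N(m) = a + b·m, a cloud that dims stars by A_V magnitudes reduces the counts so that A_V = (1/b)·log10(N_field/N_cloud). A minimal sketch; the slope and the counts below are illustrative assumptions, not values from the paper.

```python
import math

def visual_extinction(n_field, n_cloud, slope_b=0.35):
    """Classical star-count extinction estimate: with cumulative counts
    log10 N(m) = a + b*m in a reference field, a cloud dimming stars by
    A_V magnitudes gives A_V = (1/b) * log10(n_field / n_cloud).
    slope_b is an assumed luminosity-function slope, not a value from
    the paper."""
    return math.log10(n_field / n_cloud) / slope_b

# Illustrative: 400 stars per unit area off-cloud vs 25 on-cloud
# implies roughly 3.4 mag of visual extinction for b = 0.35.
a_v = visual_extinction(400, 25)
```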
Night Sky Weather Monitoring System Using Fish-Eye CCD
NASA Astrophysics Data System (ADS)
Tomida, Takayuki; Saito, Yasunori; Nakamura, Ryo; Yamazaki, Katsuya
Telescope Array (TA) is an international joint experiment observing ultra-high-energy cosmic rays. TA employs a fluorescence detection technique, in which the presence of cloud significantly affects the quality of the data; cloud monitoring therefore provides important information. We are developing two new methods for evaluating night-sky weather from pictures taken by a charge-coupled device (CCD) camera. One evaluates the amount of cloud from pixel brightness; the other counts the number of stars using a contour detection technique. The results of the two methods show a clear correlation, and we conclude that both analyses are reasonable methods for weather monitoring. We discuss the reliability of the star counting method.
NASA Astrophysics Data System (ADS)
Jiang, Xiao-Pan; Zhang, Zi-Liang; Qin, Xiu-Bo; Yu, Run-Sheng; Wang, Bao-Yi
2010-12-01
Positronium time of flight spectroscopy (Ps-TOF) is an effective technique for porous material research. It has advantages over other techniques for analyzing the porosity and pore tortuosity of materials. This paper describes a design for Ps-TOF apparatus based on the Beijing intense slow positron beam, supplying a new material characterization technique. In order to improve the time resolution and increase the count rate of the apparatus, the detector system is optimized. For 3 eV o-Ps, the time broadening is 7.66 ns and the count rate is 3 cps after correction.
Use of hot water for beef carcass decontamination.
Castillo, A; Lucia, L M; Goodson, K J; Savell, J W; Acuff, G R
1998-01-01
Hot water treatment of beef carcass surfaces for reduction of Escherichia coli O157:H7, Salmonella typhimurium, and various indicator organisms was studied using a model carcass spray cabinet. Paired hot carcass surface regions with different external fat characteristics (inside round, outside round, brisket, flank, and clod) were removed from carcasses immediately after the slaughter and dressing process. All cuts were inoculated with bovine feces containing 10⁶/g each of rifampicin-resistant E. coli O157:H7 and S. typhimurium, or with uninoculated bovine feces. Surfaces were then exposed to a carcass water wash or a water wash followed by a hot water spray (95 °C). Counts of rifampicin-resistant Salmonella and E. coli, or aerobic plate counts (APC) and coliform counts, were conducted before and after each treatment. All treatments significantly reduced pathogen levels from the initial inoculation level of 5.0 log₁₀ CFU/cm². Treatments including hot water sprays provided mean reductions of initial counts for E. coli O157:H7 and S. typhimurium of 3.7 and 3.8 log, APC reductions of 2.9 log, and coliform and thermotolerant coliform count reductions of 3.3 log. The efficacy of hot water treatments was affected by the carcass surface region, but not by delaying the treatment (30 min) after contaminating the surface. Verification of the efficacy of hot water interventions used as critical control points in a hazard analysis critical control point (HACCP) system may be possible using coliform counts.
A Novel In-Beam Delayed Neutron Counting Technique for Characterization of Special Nuclear Materials
NASA Astrophysics Data System (ADS)
Bentoumi, G.; Rogge, R. B.; Andrews, M. T.; Corcoran, E. C.; Dimayuga, I.; Kelly, D. G.; Li, L.; Sur, B.
2016-12-01
A delayed neutron counting (DNC) system, where the sample to be analyzed remains stationary in a thermal neutron beam outside of the reactor, has been developed at the National Research Universal (NRU) reactor of the Canadian Nuclear Laboratories (CNL) at Chalk River. The new in-beam DNC is a novel approach for non-destructive characterization of special nuclear materials (SNM) that could enable identification and quantification of fissile isotopes within a large and shielded sample. Despite the orders of magnitude reduction in neutron flux, the in-beam DNC method can be as informative as the conventional in-core DNC for most cases while offering practical advantages and mitigated risk when dealing with large radioactive samples of unknown origin. This paper addresses (1) the qualification of in-beam DNC using a monochromatic thermal neutron beam in conjunction with a proven counting apparatus designed originally for in-core DNC, and (2) application of in-beam DNC to an examination of large sealed capsules containing unknown radioactive materials. Initial results showed that the in-beam DNC setup permits non-destructive analysis of bulky and gamma shielded samples. The method does not lend itself to trace analysis, and at best could only reveal the presence of a few milligrams of 235U via the assay of in-beam DNC total counts. Through analysis of DNC count rates, the technique could be used in combination with other neutron or gamma techniques to quantify isotopes present within samples.
Using the Chandra Source-Finding Algorithm to Automatically Identify Solar X-ray Bright Points
NASA Technical Reports Server (NTRS)
Adams, Mitzi L.; Tennant, A.; Cirtain, J. M.
2009-01-01
This poster details a technique of bright point identification that is used to find sources in Chandra X-ray data. The algorithm, part of a program called LEXTRCT, searches for regions of a given size that are above a minimum signal to noise ratio. The algorithm allows selected pixels to be excluded from the source-finding, thus allowing exclusion of saturated pixels (from flares and/or active regions). For Chandra data the noise is determined by photon counting statistics, whereas solar telescopes typically integrate a flux. Thus the calculated signal-to-noise ratio is incorrect, but we find we can scale the number to get reasonable results. For example, Nakakubo and Hara (1998) find 297 bright points in a September 11, 1996 Yohkoh image; with judicious selection of signal-to-noise ratio, our algorithm finds 300 sources. To further assess the efficacy of the algorithm, we analyze a SOHO/EIT image (195 Angstroms) and compare results with those published in the literature (McIntosh and Gurman, 2005). Finally, we analyze three sets of data from Hinode, representing different parts of the decline to minimum of the solar cycle.
Lefebvre, W; Hernandez-Maldonado, D; Moyon, F; Cuvilly, F; Vaudolon, C; Shinde, D; Vurpillot, F
2015-12-01
The geometry of atom probe tomography tips differs strongly from that of standard scanning transmission electron microscopy foils. Whereas the latter are rather flat and thin (<20 nm), tips display a curved surface and a significantly larger thickness. Insofar as a correlative approach aims to analyse the same specimen by both techniques, it is mandatory to explore the limits and advantages imposed by the particular geometry of atom probe tomography specimens. Based on simulations (electron probe propagation and image simulations), the possibility of applying quantitative high-angle annular dark-field scanning transmission electron microscopy to atom probe tomography specimens has been tested. The influence of electron probe convergence and the benefit of deconvolving the electron probe point spread function have been established. Atom counting in atom probe tomography specimens is reported for the first time in this work. It is demonstrated that, based on single projections of high-angle annular dark-field imaging, significant quantitative information can be used as additional input for refining the data obtained by correlative analysis of the specimen in APT, opening new perspectives in the field of atomic-scale tomography. Copyright © 2015 Elsevier B.V. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Connolly, Amy Lynn
A search for directly produced supersymmetric Higgs bosons has been performed in the di-tau decay channel in 86.3 ± 3.5 pb⁻¹ of data collected by CDF during Run 1b at the Tevatron. They search for events where one tau decays to an electron and the other tau decays hadronically. They perform a counting experiment and set limits on the cross section for Higgs production in the high tan β region of the m_A-tan β plane. For a benchmark parameter-space point where m_A = 100 and tan β = 50, they set a 95% confidence level upper limit at 891 pb, compared to the theoretically predicted cross section of 122 pb. For events where the tau candidates are not back-to-back, they utilize a di-tau mass reconstruction technique for the first time on hadron collider data. Limits based on a likelihood binned in di-tau mass from non-back-to-back events alone are weaker than the limits obtained from the counting experiment using the full di-tau sample.
NASA Astrophysics Data System (ADS)
Cheng, Miranda C. N.; Verlinde, Erik P.
2007-09-01
The dyonic 1/4-BPS states in 4D string theory with N = 4 spacetime supersymmetry are counted by a Siegel modular form. The pole structure of the modular form leads to a contour dependence in the counting formula, obscuring its duality invariance. We exhibit the relation between this ambiguity and the (dis-)appearance of bound states of 1/2-BPS configurations. Using this insight we propose a precise moduli-dependent contour prescription for the counting formula. We then show that the degeneracies are duality-invariant and are correctly adjusted at the walls of marginal stability to account for the (dis-)appearance of the two-centered bound states. In particular, for large black holes none of these bound states exists at the attractor point and none of these ambiguous poles contributes to the counting formula. Using this fact we also propose a second, moduli-independent contour which counts the "immortal dyons" that are stable everywhere.
NASA Astrophysics Data System (ADS)
Hu, Jianwei; Tobin, Stephen J.; LaFleur, Adrienne M.; Menlove, Howard O.; Swinhoe, Martyn T.
2013-11-01
Self-Interrogation Neutron Resonance Densitometry (SINRD) is one of several nondestructive assay (NDA) techniques being integrated into systems to measure spent fuel as part of the Next Generation Safeguards Initiative (NGSI) Spent Fuel Project. The NGSI Spent Fuel Project is sponsored by the US Department of Energy's National Nuclear Security Administration to measure plutonium in, and detect diversion of fuel pins from, spent nuclear fuel assemblies. SINRD shows promising capability in determining the 239Pu and 235U content in spent fuel. SINRD is a relatively low-cost and lightweight instrument, and it is easy to implement in the field. The technique makes use of the passive neutron source existing in a spent fuel assembly, and it uses ratios between the count rates collected in fission chambers covered with different absorbing materials. These ratios are correlated to key attributes of the spent fuel assembly, such as the total mass of 239Pu and 235U. Using count-rate ratios instead of absolute count rates makes SINRD less vulnerable to systematic uncertainties. Building upon previous research, this work focuses on the underlying physics of the SINRD technique: quantifying the individual impacts of a few important nuclides on the count-rate ratios using the perturbation method; examining new correlations between count-rate ratios and mass quantities based on the results of the perturbation study; characterizing the energy windows of the filtering materials that cover the fission chambers by tallying the neutron spectra before and after the neutrons pass through the filters; and identifying the nuclides chiefly responsible for cooling-time variations in the count-rate ratios. The results of these studies show that 235U content has a major impact on the SINRD signal in addition to the 239Pu content. 241Pu and 241Am are the two main nuclides responsible for the variation in the count-rate ratio with cooling time.
In short, this work provides insights into some of the main factors that affect the performance of SINRD, and it should help improve the hardware design and the algorithm used to interpret the signal for the SINRD technique. In addition, the modeling and simulation techniques used in this work can be easily adopted for analysis of other NDA systems, especially when complex systems like spent nuclear fuel are involved. These studies were conducted at Los Alamos National Laboratory.
Pollutant loads and water quality in streams of heavily populated and industrialised towns
NASA Astrophysics Data System (ADS)
Ntengwe, F. W.
The availability of potable water is often taken for granted, and water is allowed to become polluted. Industries, settlements, farms, markets, leaking sewer lines and poor hygiene practices are all potential sources of pollution, and each pollutant has its own effect on water and the environment. A study was conducted in the Kitwe Stream to establish whether engineering and other human activities affect water quality. Samples were collected at ten points, the first at the source and the tenth at the confluence with the Kafue River, and were analysed for physical, chemical and biological parameters. The results revealed high concentrations and loads of total suspended solids (TSS). The points with high TSS values were P4 (118 mg/l) and P6 (140 mg/l), representing daily loads of 7.74 and 8.71 tonnes, respectively. The highest coliform values were found at points P9 (2099) and P10 (2558), followed by P4 (1149), P5 (1256) and P6 (1370). High nitrite values were found at points P4 (34 mg/l), P5 (32 mg/l), P6 (21 mg/l) and P10 (12.4 mg/l). Chlorides were also high at points P4, P5 and P6, with values of 70 mg/l, 80 mg/l and 87 mg/l, respectively. These parameters exceeded the Zambian maximum contaminant levels (MCLs) of 100 mg/l for TSS, 1 mg/l for nitrites and 500/100 ml for coliform. Conductivity (>500 μS/cm) and coliform counts (>500) were also found to be high. The benthic study revealed a normal diversity of invertebrates, although Chironomidae made up on average 60% of the total species counted. Fish activity was high upstream and low downstream at the mouth of the stream where it joins the Kafue River; there was no fish activity at the middle points. The plankton (phytoplankton and zooplankton) counts were high (15-30 per ml) where fish activity was high and low (1-5 per ml) where there was none. The stream water quality was therefore affected by human activities.
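The daily loads quoted above follow from concentration and stream flow: mg/L multiplied by m³/day gives g/day, since 1 m³ = 1000 L. A minimal sketch; the flow figure is a hypothetical value chosen to reproduce the reported P4 load, as the abstract reports loads rather than flows.

```python
def daily_load_tonnes(conc_mg_per_l, flow_m3_per_day):
    """Daily pollutant load from concentration and stream flow.
    1 m3 = 1000 L, so mg/L * m3/day gives g/day; divide by 1e6 for
    tonnes. The flow below is a hypothetical illustration; the abstract
    reports loads, not flows."""
    return conc_mg_per_l * flow_m3_per_day / 1e6

# An assumed flow of ~65,600 m3/day with a TSS concentration of
# 118 mg/L reproduces roughly the 7.74 t/day load reported for P4:
load = daily_load_tonnes(118, 65_600)
```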
Radioactivities of Long Duration Exposure Facility (LDEF) materials: Baggage and bonanzas
NASA Technical Reports Server (NTRS)
Smith, Alan R.; Hurley, Donna L.
1991-01-01
Radioactivities in materials onboard the returned Long Duration Exposure Facility (LDEF) satellite were studied by a variety of techniques, among the most powerful of which is low-background Ge semiconductor detector gamma-ray spectrometry. The observed radioactivities are of two origins: radionuclides produced by nuclear reactions with the radiation field in orbit, and radionuclides present initially as contaminants in materials used for construction of the spacecraft and experimental assemblies. In the first category are experiment-related monitor foils and tomato seeds, and such spacecraft materials as Al, stainless steel, and Ti. In the second category are Al, Be, Ti, V, and some special glasses. Measured peak-area count rates from both categories range from a high of about 1 count per minute down to less than 0.001 count per minute. Count rates toward the low end of this range can be measured successfully only through low-background techniques, such as those used to obtain the results presented here.
Test of a mosquito eggshell isolation method and subsampling procedure.
Turner, P A; Streever, W J
1997-03-01
Production of Aedes vigilax, the common salt-marsh mosquito, can be assessed by determining eggshell densities found in soil. In this study, 14 field-collected eggshell samples were used to test a subsampling technique and compare eggshell counts obtained with a flotation method to those obtained by direct examination of sediment (DES). Relative precision of the subsampling technique was assessed by determining the minimum number of subsamples required to estimate the true mean and confidence interval of a sample at a predetermined confidence level. A regression line was fitted to cube-root transformed eggshell counts obtained from flotation and DES and found to be significant (P < 0.001, r² = 0.97). The flotation method allowed processing of samples in about one-third of the time required by DES, but recovered an average of 44% of the eggshells present. Eggshells obtained with the flotation method can be used to predict those from DES using the following equation: DES count = [1.386 × (flotation count)^0.33 - 0.01]^3.
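The back-transformed regression reported above, DES count = [1.386 × (flotation count)^0.33 - 0.01]^3, can be applied directly:

```python
def des_from_flotation(flotation_count):
    """Predict the direct-examination-of-sediment (DES) eggshell count
    from a flotation count, using the cube-root regression reported in
    the abstract: DES = [1.386 * (flotation)^0.33 - 0.01]^3."""
    return (1.386 * flotation_count ** 0.33 - 0.01) ** 3

# A flotation count of 100 predicts roughly 253 eggshells by DES,
# consistent with flotation recovering well under half of the shells.
des = des_from_flotation(100)
```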
Conversion from Engineering Units to Telemetry Counts on Dryden Flight Simulators
NASA Technical Reports Server (NTRS)
Fantini, Jay A.
1998-01-01
Dryden real-time flight simulators encompass the simulation of pulse code modulation (PCM) telemetry signals. This paper presents a new method whereby the calibration polynomial (from first to sixth order), representing the conversion from counts to engineering units (EU), is numerically inverted in real time. The result is less than one count of error for valid EU inputs. The Newton-Raphson method is used to numerically invert the polynomial, with a reverse linear interpolation between the EU limits supplying an initial value for the desired telemetry count. The method presented here is not new; what is new is how classical numerical techniques are optimized to take advantage of modern computer power to perform the desired calculations in real time. This makes the method simple to understand and implement, and there are no interpolation tables to store in memory as in traditional methods. The NASA F-15 simulation converts and transmits over 1000 parameters at 80 times/sec. This paper presents the algorithm development, FORTRAN code, and performance results.
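The paper's implementation is in FORTRAN; the scheme it describes (reverse linear interpolation between the EU limits to seed Newton-Raphson iteration on the calibration polynomial) can be sketched in Python as follows. The quadratic calibration used in the example is invented for illustration.

```python
def eu_to_counts(coeffs, eu, count_min, count_max, tol=0.5, max_iter=50):
    """Invert a calibration polynomial EU = p(counts) for the telemetry
    count: a reverse linear interpolation between the EU values at the
    count limits seeds Newton-Raphson, which then converges to well
    under one count of error. coeffs lists polynomial coefficients,
    lowest order first (up to 6th order). A sketch of the approach, not
    Dryden's FORTRAN implementation."""
    def p(x):
        return sum(c * x**i for i, c in enumerate(coeffs))
    def dp(x):
        return sum(i * c * x**(i - 1) for i, c in enumerate(coeffs) if i)
    eu_min, eu_max = p(count_min), p(count_max)
    # Reverse linear interpolation between the EU limits: initial guess.
    x = count_min + (eu - eu_min) * (count_max - count_min) / (eu_max - eu_min)
    for _ in range(max_iter):
        step = (p(x) - eu) / dp(x)
        x -= step
        if abs(step) < tol:
            break
    return round(x)

# Invented quadratic calibration EU = 1 + 0.02*c + 1e-6*c**2, 0..4095 counts:
c = eu_to_counts([1.0, 0.02, 1e-6], 30.0, 0, 4095)
```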
NASA Astrophysics Data System (ADS)
Salançon, Evelyne; Degiovanni, Alain; Lapena, Laurent; Morin, Roger
2018-04-01
An event-counting method using a two-microchannel-plate stack in a low-energy electron point projection microscope is implemented. A detector spatial resolution of 15 μm, i.e., the distance between first-neighbor microchannels, is demonstrated, leading to a sevenfold improvement in microscope resolution. Compared to previous work with neutrons [Tremsin et al., Nucl. Instrum. Methods Phys. Res., Sect. A 592, 374 (2008)], the large number of detection events achieved with electrons shows that the local response of the detector is mainly governed by the angle between the hexagonal structures of the two microchannel plates. Using this method in point projection microscopy offers the prospect of working with a greater source-object distance (350 nm instead of 50 nm), advancing toward atomic resolution.
NASA Astrophysics Data System (ADS)
Wang, Yanqing; Wu, Gang
2017-05-01
In this paper, we are concerned with the upper box-counting dimension of the set of possible singular points in the space-time of suitable weak solutions to the 3D Navier-Stokes equations. By taking full advantage of the pressure Π in terms of …
ERIC Educational Resources Information Center
Bushweller, Kevin C., Ed.
2018-01-01
The 2018 edition of "Technology Counts" is a nationally representative survey of 500 principals, assistant principals, and other school leaders to better understand what principals are thinking and doing about some of the technology issues in their schools today. Contents include: (1) Pressure Points for Principals (Kevin Bushweller);…
Parental Involvement: What Counts, Who Counts It, and Does It Help?
ERIC Educational Resources Information Center
Flessa, Joseph
2008-01-01
When asked to explain why so many urban schools show unsatisfactory results on academic or social measures, principals routinely and quickly turn to descriptions of parents. In other words, when seeking to explain why work within a school is so difficult or why reform initiatives have been unsuccessful, many principals point outside the school.…
Electronic strain-level counter
NASA Technical Reports Server (NTRS)
Pitts, F. L.; Spencer, J. L. (Inventor)
1973-01-01
An electronic strain level counter for obtaining structural strain data on in-flight aircraft is described. The device counts the number of times the strain at a point on an aircraft structural member exceeds each of several preset levels. A dead band is provided at each level to prohibit the counting of small strain variations around a given preset level.
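The counting logic described (per-level exceedance counters with a dead band to suppress jitter) can be sketched in software; this Python version is an illustrative reconstruction of the behavior, not the patented circuit.

```python
def count_level_exceedances(samples, levels, dead_band):
    # One counter per preset strain level. A counter fires when the strain
    # rises above its level and re-arms only after the strain falls below
    # (level - dead_band), so small variations around a level are not counted.
    counts = [0] * len(levels)
    armed = [True] * len(levels)
    for s in samples:
        for i, level in enumerate(levels):
            if armed[i] and s > level:
                counts[i] += 1
                armed[i] = False
            elif not armed[i] and s < level - dead_band:
                armed[i] = True
    return counts
```

A trace that jitters between 4.9 and 5.1 around a 4.5 threshold is counted once, not repeatedly, thanks to the dead band.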
NASA Astrophysics Data System (ADS)
Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher
2018-05-01
Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.
Prototype Test Results for the Single Photon Detection SLR2000 Satellite Laser Ranging System
NASA Technical Reports Server (NTRS)
Zagwodzki, Thomas W.; McGarry, Jan F.; Degnan, John J.; Cheek, Jack W.; Dunn, Peter J.; Patterson, Don; Donovan, Howard
2004-01-01
NASA's aging Satellite Laser Ranging (SLR) network is scheduled to be replaced over the next few years with a fully automated single photon detection system. A prototype of this new system, called SLR2000, is currently undergoing field trials at the Goddard Space Flight Center in Greenbelt, Maryland to evaluate photon counting techniques and determine system hardware, software, and control algorithm performance levels and limitations. Newly developed diode pumped microchip lasers and quadrant microchannel plate-based photomultiplier tubes have enabled the development of this high repetition rate single photon detection SLR system. The SLR2000 receiver threshold is set at the single photoelectron (pe) level but tracks satellites with an average signal level typically much less than 1 pe. The 2 kHz laser fire rate aids in satellite acquisition and tracking and will enable closed loop tracking by accumulating single photon count statistics in a quadrant detector and using this information to correct for pointing errors. Laser transmitter beamwidths of 10 arcseconds (FWHM) or less are currently being used to maintain an adequate signal level for tracking while the receiver field of view (FOV) has been opened to 40 arcseconds to accommodate point ahead/look behind angular offsets. In the near future, the laser transmitter point ahead will be controlled by a pair of Risley prisms. This will allow the telescope to point behind and enable closure of the receiver FOV to roughly match the transmitter beam divergence. Bandpass filters (BPF) are removed for night tracking operations while 0.2 nm or 1 nm filters are used during daylight operation. Both day and night laser tracking of Low Earth Orbit (LEO) satellites has been achieved with a laser transmitter energy of only 65 microjoules per pulse. Satellite tracking is presently limited to LEO satellites until the brassboard laser transmitter can be upgraded or replaced. 
Simultaneous tracks have also been observed with NASA's SLR standard, MOBLAS 7, for the purposes of data comparison and identification of biases. Work continues to optimize the receive optics; upgrade or replace the laser transmitter; calibrate the quadrant detector, the point ahead Risley prisms, and event timer verniers; and test normal point generation with SLR2000 data. This paper will report on the satellite tracking results to date, issues yet to be resolved, and future plans for the SLR2000 system.
Comparison of point-of-care methods for preparation of platelet concentrate (platelet-rich plasma).
Weibrich, Gernot; Kleis, Wilfried K G; Streckbein, Philipp; Moergel, Maximilian; Hitzler, Walter E; Hafner, Gerd
2012-01-01
This study analyzed the concentrations of platelets and growth factors in platelet-rich plasma (PRP), which are likely to depend on the method used for its production. The cellular composition and growth factor content of platelet concentrates (platelet-rich plasma) produced by six different procedures were quantitatively analyzed and compared. Platelet and leukocyte counts were determined on an automatic cell counter, and analysis of growth factors was performed using enzyme-linked immunosorbent assay. The principal differences between the analyzed PRP production methods (the blood bank method of intermittent flow centrifuge system/platelet apheresis and the five point-of-care methods) and the resulting platelet concentrates were evaluated with regard to resulting platelet, leukocyte, and growth factor levels. The platelet counts in both whole blood and PRP were generally higher in women than in men; no differences were observed with regard to age. Statistical analysis of platelet-derived growth factor AB (PDGF-AB) and transforming growth factor β1 (TGF-β1) showed no differences with regard to age or gender. Platelet counts and TGF-β1 concentration correlated closely, as did platelet counts and PDGF-AB levels. There were only rare correlations between leukocyte counts and PDGF-AB levels, but comparison of leukocyte counts and PDGF-AB levels demonstrated certain parallel tendencies. TGF-β1 levels derive in substantial part from platelets and emphasize the role of leukocytes, in addition to that of platelets, as a source of growth factors in PRP. All methods of producing PRP showed high variability in platelet counts and growth factor levels. The highest growth factor levels were found in the PRP prepared using the Platelet Concentrate Collection System manufactured by Biomet 3i.
Thurber, Kristina M; Dierkhising, Ross A; Reiland, Sarah A; Pearson, Kristina K; Smith, Steven A; O'Meara, John G
2016-01-01
Carbohydrate counting may improve glycemic control in hospitalized cardiology patients by providing individualized insulin doses tailored to meal consumption. The purpose of this study was to compare glycemic outcomes with mealtime insulin dosed by carbohydrate counting versus fixed dosing in the inpatient setting. This single-center retrospective cohort study included 225 adult medical cardiology patients who received mealtime, basal, and correction-scale insulin concurrently for at least 72 h and up to 7 days in the interval March 1, 2010-November 7, 2013. Mealtime insulin was dosed by carbohydrate counting or with fixed doses determined prior to meal intake. An inpatient diabetes consult service was responsible for insulin management. Exclusion criteria included receipt of an insulin infusion. The primary end point compared mean daily postprandial glucose values, whereas secondary end points included comparison of preprandial glucose values and mean daily rates of hypoglycemia. Mean postprandial glucose level on Day 7 was 204 and 183 mg/dL in the carbohydrate counting and fixed mealtime dose groups, respectively (unadjusted P=0.04, adjusted P=0.12). There were no statistical differences between groups on Days 2-6. Greater rates of preprandial hypoglycemia were observed in the carbohydrate counting cohort on Day 5 (8.6% vs. 1.5%, P=0.02), Day 6 (1.7% vs. 0%, P=0.01), and Day 7 (7.1% vs. 0%, P=0.008). No differences in postprandial hypoglycemia were seen. Mealtime insulin dosing by carbohydrate counting was associated with similar glycemic outcomes as fixed mealtime insulin dosing, except for a greater incidence of preprandial hypoglycemia. Additional comparative studies that include hospital outcomes are needed.
Ben Chaabane, Salim; Fnaiech, Farhat
2014-01-23
Color image segmentation has been applied in many areas, and many different techniques have been developed for it. In medical imaging, segmentation can assist doctors in following a patient's disease from processed breast cancer images. The main objective of this work is to rebuild and enhance each cell in the three component images provided by an input image. Starting from an initial segmentation obtained using statistical features and histogram threshold techniques, the resulting segmentation accurately represents incomplete and overlapping cells and enhances them, so that the cells become clear and easy to count. A novel method for color edge extraction based on statistical features and automatic thresholding is presented. The traditional edge detector, based on the first- and second-order neighborhood describing the relationship between the current pixel and its neighbors, is extended to the statistical domain. Color edges in an image are thus obtained by combining statistical features with automatic threshold techniques. Finally, on the obtained color edges with specific primitive colors, a combination rule is used to integrate the edge results over the three color components. Breast cancer cell images were used to evaluate the performance of the proposed method both quantitatively and qualitatively. A visual and a numerical assessment based on the probability of correct classification (PC), the probability of false classification (Pf), and the classification accuracy (Sens(%)) are presented and compared with existing techniques. The proposed method shows its superiority in detecting points that really belong to the cells, and in making the processed cells easy to count.
Computer simulations highlight that the proposed method substantially enhances the segmented image, with smaller error rates than other existing algorithms under the same settings (patterns and parameters). Moreover, it provides high classification accuracy, reaching 97.94%. The segmentation method may also be extended to other medical imaging types with similar properties.
A Framework for Validating Traffic Simulation Models at the Vehicle Trajectory Level
DOT National Transportation Integrated Search
2017-03-01
Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...
Better Than Counting: Density Profiles from Force Sampling
NASA Astrophysics Data System (ADS)
de las Heras, Daniel; Schmidt, Matthias
2018-05-01
Calculating one-body density profiles in equilibrium via particle-based simulation methods involves counting events of particle occurrence at (histogram-resolved) space points. Here, we investigate an alternative method based on a histogram of the local force density. Via an exact sum rule, the density profile is obtained with a simple spatial integration. The method circumvents the inherent ideal gas fluctuations. We have tested the method in Monte Carlo, Brownian dynamics, and molecular dynamics simulations. The results carry a statistical uncertainty smaller than that of the standard counting method, therefore reducing the computation time.
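A minimal sketch of the force-sampling idea, using the equilibrium sum rule kT·dρ/dx = f(x) relating the density gradient to the local force density: histogram the forces, then integrate spatially. The harmonic-trap ideal gas below is an illustrative test case of my choosing, not one from the paper.

```python
import random

def force_sampled_density(samples, force, bins, lo, hi, kT=1.0):
    # Histogram the local force density f(x); the equilibrium sum rule
    # kT * drho/dx = f(x) then yields the density by spatial integration,
    # avoiding the ideal-gas fluctuations of a plain counting histogram.
    dx = (hi - lo) / bins
    f = [0.0] * bins
    for x in samples:
        b = int((x - lo) / dx)
        if 0 <= b < bins:
            f[b] += force(x)
    n = len(samples)
    f = [v / (n * dx) for v in f]   # force density per unit length
    rho, acc = [], 0.0
    for v in f:
        acc += v * dx / kT          # rho taken to vanish at the lower edge
        rho.append(acc)
    return rho

random.seed(0)
# Ideal gas in a harmonic trap U = x^2/2 at kT = 1: rho is a unit Gaussian.
xs = [random.gauss(0.0, 1.0) for _ in range(200_000)]
rho = force_sampled_density(xs, lambda x: -x, bins=80, lo=-4.0, hi=4.0)
```

Here rho[39] estimates the density just left of x = 0 and should be close to the Gaussian peak 1/√(2π) ≈ 0.399.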
Evaluation of a clay-based acidic bedding conditioner for dairy cattle bedding.
Proietto, R L; Hinckley, L S; Fox, L K; Andrew, S M
2013-02-01
This study investigated the effects of a clay-based acidic bedding conditioner on sawdust bedding pH, dry matter (DM), environmental pathogen counts, and environmental bacterial counts on teat ends of lactating dairy cows. Sixteen lactating Holstein cows were paired based on parity, days in milk, milk yield, and milk somatic cell count, and were negative for the presence of an intramammary pathogen. Within each pair, cows were randomly assigned to 1 of 2 treatments with 3-wk periods in a crossover design. Treatment groups consisted of 9 freestalls per group bedded with either untreated sawdust or sawdust with a clay-based acidic bedding conditioner, added at 3- to 4-d intervals over each 21-d period. Bedding and teat ends were aseptically sampled on d 0, 1, 2, 7, 14, and 21 for determination of environmental bacterial counts. At the same time points, bedding was sampled for DM and pH determination. The bacteria identified in the bedding material were total gram-negative bacteria, Streptococcus spp., and coliform bacteria. The bacteria identified on the teat ends were Streptococcus spp., coliform bacteria, and Klebsiella spp. Teat end score, milk somatic cell count, and intramammary pathogen presence were measured weekly. Bedding and teat cleanliness, environmental high and low temperatures, and dew point data were collected daily. The bedding conditioner reduced the pH, but not the DM, of the sawdust bedding compared with untreated sawdust. Overall environmental bacterial counts in bedding were lower for treated sawdust. Total bacterial counts in bedding and on teat ends increased with time over both periods. Compared with untreated sawdust, the treated bedding had lower counts of total gram-negative bacteria and streptococci, but not coliform counts. Teat end bacterial counts were lower for cows bedded on treated sawdust for streptococci, coliforms, and Klebsiella spp. compared with cows bedded on untreated sawdust. 
The clay-based acidic bedding conditioner reduced environmental pathogens in sawdust bedding and teat ends without affecting teat end integrity. Copyright © 2013 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tang, Chad; Gomez, Daniel R.; Wang, Hongmei
Purpose: Radiation pneumonitis (RP) is an inflammatory response to radiation therapy (RT). We assessed the association between RP and white blood cell (WBC) count, an established metric of systemic inflammation, after RT for non-small cell lung cancer. Methods and Materials: We retrospectively analyzed 366 patients with non-small cell lung cancer who received ≥60 Gy as definitive therapy. The primary endpoint was whether WBC count after RT (defined as 2 weeks through 3 months after RT completion) was associated with grade ≥3 or grade ≥2 RP. Median lung volume receiving ≥20 Gy (V20) was 31%, and post-RT WBC counts ranged from 1.7 to 21.2 × 10³ WBCs/μL. Odds ratios (ORs) associating clinical variables and post-RT WBC counts with RP were calculated via logistic regression. A recursive-partitioning algorithm was used to define optimal post-RT WBC count cut points. Results: Post-RT WBC counts were significantly higher in patients with grade ≥3 RP than without (P<.05). Optimal cut points for post-RT WBC count were found to be 7.4 and 8.0 × 10³/μL for grade ≥3 and ≥2 RP, respectively. Univariate analysis revealed significant associations between post-RT WBC count and grade ≥3 (n=46, OR=2.6, 95% confidence interval [CI] 1.4‒4.9, P=.003) and grade ≥2 RP (n=164, OR=2.0, 95% CI 1.2‒3.4, P=.01). This association held in a stepwise multivariate regression. Of note, V20 was found to be significantly associated with grade ≥2 RP (OR=2.2, 95% CI 1.2‒3.4, P=.01) and trended toward significance for grade ≥3 RP (OR=1.9, 95% CI 1.0-3.5, P=.06). Conclusions: Post-RT WBC counts were significantly and independently associated with RP and have potential utility as a diagnostic or predictive marker for this toxicity.
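As an illustration of the cut-point idea (not the authors' recursive-partitioning implementation), a single-split threshold scan can be sketched as follows; the score here is a simple difference in event rates, a stand-in for the impurity criteria real partitioning algorithms use.

```python
def best_cut_point(values, outcomes):
    # Scan candidate thresholds over the observed values and keep the one
    # that best separates events (outcome 1) from non-events (outcome 0),
    # here by maximizing the difference in event rates above vs. below.
    best_t, best_score = None, -1.0
    for t in sorted(set(values)):
        hi = [o for v, o in zip(values, outcomes) if v > t]
        lo = [o for v, o in zip(values, outcomes) if v <= t]
        if not hi or not lo:
            continue
        score = abs(sum(hi) / len(hi) - sum(lo) / len(lo))
        if score > best_score:
            best_t, best_score = t, score
    return best_t
```

On toy data where all events occur above a value of 3, the scan recovers 3 as the cut point.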
Calculating the n-point correlation function with general and efficient python code
NASA Astrophysics Data System (ADS)
Genier, Fred; Bellis, Matthew
2018-01-01
There are multiple approaches to understanding the evolution of large-scale structure in our universe and with it the role of baryonic matter, dark matter, and dark energy at different points in history. One approach is to calculate the n-point correlation function estimator for galaxy distributions, sometimes choosing a particular type of galaxy, such as luminous red galaxies. The standard way to calculate these estimators is with pair counts (for the 2-point correlation function) and with triplet counts (for the 3-point correlation function). These are O(n²) and O(n³) problems, respectively, and with the number of galaxies that will be characterized in future surveys, having efficient and general code will be of increasing importance. Here we show a proof-of-principle approach to the 2-point correlation function that relies on pre-calculating galaxy locations in coarse “voxels”, thereby reducing the total number of necessary calculations. The code is written in python, making it easily accessible and extensible, and is open-sourced to the community. Basic results and performance tests using SDSS/BOSS data will be shown and we discuss the application of this approach to the 3-point correlation function.
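The voxel idea can be sketched as follows; this hypothetical Python version (not the project's actual open-source code) buckets 3-D points into cells of side r_max so that only same-cell and adjacent-cell pairs need distance checks.

```python
import math
from collections import defaultdict

def pair_counts(points, r_max, nbins):
    # Bucket points into coarse voxels of side r_max: only pairs in the same
    # or an adjacent voxel can lie closer than r_max, so most of the O(n^2)
    # distance checks are skipped entirely.
    grid = defaultdict(list)
    for p in points:
        grid[tuple(int(c // r_max) for c in p)].append(p)
    hist = [0] * nbins
    dr = r_max / nbins
    for (i, j, k), pts in grid.items():
        for di in (-1, 0, 1):
            for dj in (-1, 0, 1):
                for dk in (-1, 0, 1):
                    nb = grid.get((i + di, j + dj, k + dk))
                    if nb is None:
                        continue
                    for a in pts:
                        for b in nb:
                            if a is b:
                                continue
                            d = math.dist(a, b)
                            if d < r_max:
                                hist[int(d / dr)] += 1
    return [h // 2 for h in hist]   # each unordered pair is seen twice
```

Pairs in non-neighboring voxels are never examined, which is the source of the speedup over naive pair counting.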
Visual counts as an index of White-Tailed Prairie Dog density
Menkens, George E.; Biggins, Dean E.; Anderson, Stanley H.
1990-01-01
Black-footed ferrets (Mustela nigripes) depend on prairie dogs (Cynomys spp.) for food and shelter and were historically restricted to prairie dog towns (Anderson et al. 1986). Because ferrets and prairie dogs are closely associated, successful ferret management and conservation depend on successful prairie dog management. A critical component of any management program for ferrets will be monitoring prairie dog population dynamics on towns containing ferrets or on towns proposed as ferret reintroduction sites. Three techniques for estimating prairie dog population size and density are counts of plugged and reopened burrows (Tietjen and Matschke 1982), mark-recapture (Otis et al. 1978; Seber 1982, 1986; Menkens and Anderson 1989), and visual counts (Fagerstone and Biggins 1986, Knowles 1986). The technique of plugging burrows and counting the number reopened by prairie dogs is too time and labor intensive for population evaluation on a large number of towns or over large areas. Total burrow counts are not correlated with white-tailed prairie dog (C. leucurus) densities and thus cannot be used for population evaluation (Menkens et al. 1988). Mark-recapture requires trapping that is expensive and time and labor intensive. Monitoring a large number of prairie dog populations using mark-recapture would be difficult. Alternatively, a large number of populations could be monitored in short periods of time using the visual count technique (Fagerstone and Biggins 1986, Knowles 1986). However, the accuracy of visual counts has only been evaluated in a few locations. Thus, it is not known whether the relationship between counts and prairie dog density is consistent throughout the prairie dog's range. Our objective was to evaluate the potential of using visual counts as a rapid means of estimating white-tailed prairie dog density in prairie dog towns throughout Wyoming.
We studied 18 white-tailed prairie dog towns in 4 white-tailed prairie dog complexes in Wyoming near Laramie (105°40'W, 41°20'N, 3 grids), Pathfinder reservoir (106°55'W, 42°30'N, 6 grids), Shirley Basin (106°10'W, 42°20'N, 6 grids), and Meeteetse (108°10'W, 44°10'N, 3 grids). All towns were dominated by grasses, forbs, and shrubs (details in Collins and Lichvar 1986). Topography of towns ranged from flat to gently rolling hills.
A radionuclide counting technique for measuring wind velocity. [drag force anemometers
NASA Technical Reports Server (NTRS)
Singh, J. J.; Khandelwal, G. S.; Mall, G. H.
1981-01-01
A technique for measuring wind velocities of meteorological interest is described. It is based on inverse-square-law variation of the counting rates as the radioactive source-to-counter distance is changed by wind drag on the source ball. Results of a feasibility study using a weak bismuth 207 radiation source and three Geiger-Muller radiation counters are reported. The use of the technique is not restricted to Martian or Mars-like environments. A description of the apparatus, typical results, and frequency response characteristics are included. A discussion of a double-pendulum arrangement is presented. Measurements reported herein indicate that the proposed technique may be suitable for measuring wind speeds up to 100 m/sec, which are either steady or whose rates of fluctuation are less than 1 kHz.
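The inverse-square relation underlying the technique is simple enough to state in code; the reference values below are illustrative assumptions, not calibration numbers from the report.

```python
import math

def source_distance(rate, ref_rate, ref_distance):
    # Inverse-square law: rate ∝ 1/d^2, so d = d_ref * sqrt(rate_ref / rate).
    return ref_distance * math.sqrt(ref_rate / rate)

def distance_uncertainty(total_counts, distance):
    # Poisson counting gives a relative rate error of 1/sqrt(N); since
    # d ∝ rate**-0.5, the relative distance error is half of that.
    return distance / (2.0 * math.sqrt(total_counts))
```

For example, a rate drop from 100 to 25 counts/s implies the wind has dragged the source ball to twice the reference distance, and accumulating 10,000 counts pins that distance down to a 0.5% relative uncertainty.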
Effects of the frame acquisition rate on the sensitivity of gastro-oesophageal reflux scintigraphy
Codreanu, I; Chamroonrat, W; Edwards, K
2013-01-01
Objective: To compare the sensitivity of gastro-oesophageal reflux (GOR) scintigraphy at 5-s and 60-s frame acquisition rates. Methods: GOR scintigraphy of 50 subjects (1 month–20 years old, mean 42 months) were analysed concurrently using 5-s and 60-s acquisition frames. Reflux episodes were graded as low if activity was detected in the distal half of the oesophagus and high if activity was detected in its upper half or in the oral cavity. For comparison purposes, detected GOR in any number of 5-s frames corresponding to one 60-s frame was counted as one episode. Results: A total of 679 episodes of GOR to the upper oesophagus were counted using a 5-s acquisition technique. Only 183 of such episodes were detected on 60-s acquisition images. To the lower oesophagus, a total of 1749 GOR episodes were detected using a 5-s acquisition technique and only 1045 episodes using 60-s acquisition frames (these also included the high-level GOR on 5-s frames counted as low level on 60-s acquisition frames). 10 patients had high-level GOR episodes that were detected only using a 5-s acquisition technique, leading to a different diagnosis in these patients. No correlation between the number of reflux episodes and the gastric emptying rates was noted. Conclusion: The 5-s frame acquisition technique is more sensitive than the 60-s frame acquisition technique for detecting both high- and low-level GOR. Advances in knowledge: Brief GOR episodes with a relatively low number of radioactive counts are frequently indistinguishable from intense background activity on 60-s acquisition frames. PMID:23520226
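The comparison rule in the study (any detection within the 5-s frames spanning one 60-s frame counts as a single episode) can be sketched as:

```python
def collapse_to_60s(flags_5s, frames_per_window=12):
    # Twelve 5-s frames span one 60-s frame; any positive 5-s frame inside
    # a window is counted as one episode, mirroring the study's comparison.
    episodes = 0
    for start in range(0, len(flags_5s), frames_per_window):
        if any(flags_5s[start:start + frames_per_window]):
            episodes += 1
    return episodes
```

Three brief refluxes seen at 5-s resolution can collapse to two episodes at 60-s resolution, which is how the finer sampling detects more events.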
Photographic techniques for characterizing streambed particle sizes
Whitman, Matthew S.; Moran, Edward H.; Ourso, Robert T.
2003-01-01
We developed photographic techniques to characterize coarse (>2-mm) and fine (≤2-mm) streambed particle sizes in 12 streams in Anchorage, Alaska. Results were compared with current sampling techniques to assess which provided greater sampling efficiency and accuracy. The streams sampled were wadeable and contained gravel-cobble streambeds. Gradients ranged from about 5% at the upstream sites to about 0.25% at the downstream sites. Mean particle sizes and size-frequency distributions resulting from digitized photographs differed significantly from those resulting from Wolman pebble counts for five sites in the analysis. Wolman counts were biased toward selecting larger particles. Photographic analysis also yielded a greater number of measured particles (mean = 989) than did the Wolman counts (mean = 328). Stream embeddedness ratings assigned from field and photographic observations were significantly different at 5 of the 12 sites, although both types of ratings showed a positive relationship with digitized surface fines. Visual estimates of embeddedness and digitized surface fines may both be useful indicators of benthic conditions, but digitizing surface fines produces quantitative rather than qualitative data. Benefits of the photographic techniques include reduced field time, minimal streambed disturbance, convenience of postfield processing, easy sample archiving, and improved accuracy and replication potential.
Relationship between salivary flow rates and Candida counts in subjects with xerostomia.
Torres, Sandra R; Peixoto, Camila Bernardo; Caldas, Daniele Manhães; Silva, Eline Barboza; Akiti, Tiyomi; Nucci, Márcio; de Uzeda, Milton
2002-02-01
This study evaluated the relationship between salivary flow and Candida colony counts in the saliva of patients with xerostomia. Sialometry and Candida colony-forming unit (CFU) counts were taken from 112 subjects who reported xerostomia in a questionnaire. Chewing-stimulated whole saliva was collected and streaked in Candida plates and counted in 72 hours. Species identification was accomplished under standard methods. There was a significant inverse relationship between salivary flow and Candida CFU counts (P =.007) when subjects with high colony counts were analyzed (cutoff point of 400 or greater CFU/mL). In addition, the median sialometry of men was significantly greater than that of women (P =.003), even after controlling for confounding variables like underlying disease and medications. Sjögren's syndrome was associated with low salivary flow rate (P =.007). There was no relationship between the median Candida CFU counts and gender or age. There was a high frequency (28%) of mixed colonization. Candida albicans was the most frequent species, followed by C parapsilosis, C tropicalis, and C krusei. In subjects with high Candida CFU counts there was an inverse relationship between salivary flow and Candida CFU counts.
NASA Technical Reports Server (NTRS)
Andrews, C. W.
1976-01-01
Volume fraction of a constituent or phase was estimated in six specimens of conventional and DS-eutectic superalloys, using ASTM E562-76, a new standard recommended practice for determining volume fraction by systematic manual point count. Volume fractions determined ranged from 0.086 to 0.36, and with one exception, the 95 percent relative confidence limits were approximately 10 percent of the determined volume fractions. Since the confidence-limit goal of 10 percent, which had been arbitrarily chosen previously, was achieved in all but one case, this application of the new practice was considered successful.
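The practice summarized above reduces to averaging point fractions over repeated grid placements and attaching a Student-t confidence interval; a hedged sketch follows (the t lookup here is abbreviated for illustration, where the standard tabulates it in full).

```python
import math
import statistics

def volume_fraction(point_fractions, t95=None):
    # ASTM E562-style estimate: volume fraction = mean point fraction over
    # n grid placements; 95% CI = t * s / sqrt(n). The t lookup below is an
    # abbreviated illustration (keyed by n, with dof = n - 1).
    n = len(point_fractions)
    mean = statistics.mean(point_fractions)
    s = statistics.stdev(point_fractions)
    t = t95 if t95 is not None else {5: 2.776, 10: 2.262, 30: 2.045}.get(n, 1.96)
    ci = t * s / math.sqrt(n)
    return mean, ci, 100.0 * ci / mean   # relative accuracy, in percent
```

Five fields with point fractions 0.10 ± 0.02 give a volume fraction of 0.10 with a 95% CI of about ±0.020, i.e. roughly 20% relative accuracy; hitting the 10% relative confidence-limit goal mentioned in the abstract requires tighter or more numerous fields.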
Statistical measurement of the gamma-ray source-count distribution as a function of energy
NASA Astrophysics Data System (ADS)
Zechlin, H.-S.; Cuoco, A.; Donato, F.; Fornengo, N.; Regis, M.
2017-01-01
Photon-count statistics have recently been proven to provide a sensitive observable for characterizing gamma-ray source populations and for measuring the composition of the gamma-ray sky. In this work, we generalize the use of the standard 1-point probability distribution function (1pPDF) to decompose the high-latitude gamma-ray emission observed with Fermi-LAT into: (i) point-source contributions, (ii) the Galactic foreground contribution, and (iii) a diffuse isotropic background contribution. We analyze gamma-ray data in five adjacent energy bands between 1 and 171 GeV. We measure the source-count distribution dN/dS as a function of energy, and demonstrate that our results extend current measurements from source catalogs to the regime of so far undetected sources. Our method improves the sensitivity for resolving point-source populations by about one order of magnitude in flux. The dN/dS distribution as a function of flux is found to be compatible with a broken power law. We derive upper limits on further possible breaks as well as the angular power of unresolved sources. We discuss the composition of the gamma-ray sky and capabilities of the 1pPDF method.
Hunt, G B; Luff, J A; Daniel, L; Van den Bergh, R
2013-11-01
The aims of this prospective study were to quantify steatosis in dogs with congenital portosystemic shunts (CPS) using a fat-specific stain, to compare the amount of steatosis in different lobes of the liver, and to evaluate intra- and interobserver variability in lipid point counting. Computer-assisted point counting of lipid droplets was undertaken following Oil Red O staining in 21 dogs with congenital portosystemic shunts and 9 control dogs. Dogs with congenital portosystemic shunts had significantly more small lipid droplets (<6 μm) than control dogs (P = .0013 and .0002, respectively). There was no significant difference in steatosis between liver lobes for either control dogs or CPS dogs. Significant differences were seen between observers for the number of large lipid droplets (>9 μm) and lipogranulomas per tissue point (P = .023 and .01, respectively). In conclusion, computer-assisted counting of lipid droplets following Oil Red O staining of liver biopsy samples allows objective measurement and detection of significant differences between dogs with CPS and normal dogs. This method will allow future evaluation of the relationship between different presentations of CPS (anatomy, age, breed) and lipidosis, as well as the impact of hepatic lipidosis on outcomes following surgical shunt attenuation.
NASA Astrophysics Data System (ADS)
Rajendran, Kishore; Leng, Shuai; Jorgensen, Steven M.; Abdurakhimova, Dilbar; Ritman, Erik L.; McCollough, Cynthia H.
2017-03-01
Changes in arterial wall perfusion are an indicator of early atherosclerosis. This is characterized by an increased spatial density of vasa vasorum (VV), the micro-vessels that supply oxygen and nutrients to the arterial wall. Detection of increased VV during contrast-enhanced computed tomography (CT) imaging is limited due to contamination from the blooming effect of the contrast-enhanced lumen. We report the application of an image deconvolution technique using a measured system point-spread function on CT data obtained from a photon-counting CT system to reduce blooming and to improve the CT number accuracy of the arterial wall, which enhances detection of increased VV. A phantom study was performed to assess the accuracy of the deconvolution technique. A porcine model was created with enhanced VV in one carotid artery; the other carotid artery served as a control. CT images at an energy range of 25-120 keV were reconstructed. CT numbers were measured for multiple locations in the carotid walls and for multiple time points, pre- and post-contrast injection. The mean CT number in the carotid wall was compared between the left (increased VV) and right (control) carotid arteries. Prior to deconvolution, results showed similar mean CT numbers in the left and right carotid walls due to contamination from the blooming effect, limiting the detection of increased VV in the left carotid artery. After deconvolution, the mean CT number difference between the left and right carotid arteries was substantially increased at all time points, enabling detection of the increased VV in the artery wall.
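Deconvolution with a measured PSF can take several forms; as an illustrative stand-in (the abstract does not name the authors' exact algorithm), here is a small Richardson-Lucy iteration in pure Python for a 1-D profile with a symmetric PSF. The signal and PSF below are made-up examples, not data from the study.

```python
def convolve(signal, kernel):
    # Centered, zero-padded convolution; for the symmetric kernel used
    # below, convolution and correlation coincide.
    n, m = len(signal), len(kernel)
    half = m // 2
    out = []
    for i in range(n):
        acc = 0.0
        for j in range(m):
            k = i + j - half
            if 0 <= k < n:
                acc += signal[k] * kernel[j]
        out.append(acc)
    return out

def richardson_lucy(measured, psf, iterations=300):
    # Iterative deconvolution (assumes a normalized, symmetric PSF):
    # estimate <- estimate * ((measured / (estimate * psf)) * psf)
    estimate = [1.0] * len(measured)
    for _ in range(iterations):
        blurred = convolve(estimate, psf)
        ratio = [d / b if b > 1e-12 else 0.0 for d, b in zip(measured, blurred)]
        correction = convolve(ratio, psf)
        estimate = [e * c for e, c in zip(estimate, correction)]
    return estimate

truth = [0, 0, 10, 0, 0, 0, 4, 0, 0]   # bright lumen-like peak plus a faint feature
psf = [0.25, 0.5, 0.25]                 # made-up measured system PSF
measured = convolve(truth, psf)         # blurred profile ("blooming")
restored = richardson_lucy(measured, psf)
```

The blurred peak of 5.0 at the bright feature is restored toward its true value of 10, sharpening the profile in the way the deblooming step in the abstract is meant to.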
Arima, Nobuyuki; Nishimura, Reiki; Osako, Tomofumi; Nishiyama, Yasuyuki; Fujisue, Mamiko; Okumura, Yasuhiro; Nakano, Masahiro; Tashima, Rumiko; Toyozumi, Yasuo
2016-01-01
In this case-control study, we investigated the most suitable cell counting area and the optimal cutoff point of the Ki-67 index. Thirty recurrent cases were selected among hormone receptor (HR)-positive/HER2-negative breast cancer patients. As controls, 90 nonrecurrent cases were randomly selected by allotting 3 controls to each recurrent case based on the following criteria: age, nodal status, tumor size, and adjuvant endocrine therapy alone. Both the hot spot and the average area of the tumor were evaluated on a Ki-67 immunostaining slide. The median Ki-67 index values at the hot spot and the average area were 25.0% and 14.5%, respectively. Irrespective of the area counted, the Ki-67 index value was significantly higher in all of the recurrent cases (p < 0.0001). The multivariate analysis revealed that a Ki-67 index value of 20% at the hot spot was the most suitable cutoff point for predicting recurrence. Moreover, a higher ΔKi-67 index value (the difference between the hot spot and the average area, ≥10%) and lower progesterone receptor expression (<20%) were significantly correlated with recurrence. A higher Ki-67 index value at the hot spot strongly correlated with recurrence, and the optimal cutoff point was found to be 20%. © 2015 S. Karger AG, Basel.
ERIC Educational Resources Information Center
Falter, H. Ellie
2011-01-01
How do teachers teach students to count rhythms? Teachers can choose from various techniques. Younger students may learn themed words (such as "pea," "carrot," or "avocado"), specific rhythm syllables (such as "ta" and "ti-ti"), or some other counting method to learn notation and internalize rhythms. As students grow musically, and especially when…
ERIC Educational Resources Information Center
Magnus, Brooke E.; Thissen, David
2017-01-01
Questionnaires that include items eliciting count responses are becoming increasingly common in psychology. This study proposes methodological techniques to overcome some of the challenges associated with analyzing multivariate item response data that exhibit zero inflation, maximum inflation, and heaping at preferred digits. The modeling…
2015-2016 Palila abundance estimates
Camp, Richard J.; Brinck, Kevin W.; Banko, Paul C.
2016-01-01
The palila (Loxioides bailleui) population was surveyed annually during 1998−2016 on Mauna Kea Volcano to determine abundance, population trend, and spatial distribution. In the latest surveys, the 2015 population was estimated at 852−1,406 birds (point estimate: 1,116) and the 2016 population was estimated at 1,494−2,385 (point estimate: 1,934). Similar numbers of palila were detected during the first and subsequent counts within each year during 2012−2016; the proportion of the total annual detections in each count ranged from 46% to 56%; and there was no difference in the detection probability due to count sequence. Furthermore, conducting repeat counts improved the abundance estimates by reducing the width of the confidence intervals between 9% and 32% annually. This suggests that multiple counts do not affect bird or observer behavior and can be continued in the future to improve the precision of abundance estimates. Five palila were detected on supplemental survey stations in the Ka‘ohe restoration area, outside the core survey area but still within Palila Critical Habitat (one in 2015 and four in 2016), suggesting that palila are present in habitat that is recovering from cattle grazing on the southwest slope. The average rate of decline during 1998−2016 was 150 birds per year. Over the 18-year monitoring period, the estimated rate of change equated to a 58% decline in the population.
Pattern-histogram-based temporal change detection using personal chest radiographs
NASA Astrophysics Data System (ADS)
Ugurlu, Yucel; Obi, Takashi; Hasegawa, Akira; Yamaguchi, Masahiro; Ohyama, Nagaaki
1999-05-01
Accurate and reliable detection of temporal changes from a pair of images is of considerable interest in medical science. Traditional registration and subtraction techniques can be applied to extract temporal differences when the object is rigid or corresponding points are obvious. However, in radiological imaging, loss of depth information, the elasticity of the object, the absence of clearly defined landmarks, and three-dimensional positioning differences constrain the performance of conventional registration techniques. In this paper, we propose a new method to detect interval changes accurately without using an image registration technique. The method is based on the construction of a so-called pattern histogram and a comparison procedure. The pattern histogram is a graphic representation of the frequency counts of all allowable patterns in the multi-dimensional pattern vector space. The K-means algorithm is employed to partition the pattern vector space successively. Any difference in the pattern histograms implies that different patterns are involved in the scenes. In our experiment, a pair of chest radiographs of pneumoconiosis is employed and the changing histogram bins are visualized on both images. We found that the method can be used as an alternative way of detecting temporal change, particularly when precise image registration is not available.
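The pattern-histogram idea can be sketched in a few lines. The following is a minimal illustration, not the authors' implementation: patch vectors from both images are clustered with a small hand-rolled k-means (deterministically initialized), and each image is summarized by the normalized histogram of cluster assignments; bins whose frequencies differ flag candidate interval changes. All function names are invented for the example.

```python
import numpy as np

def extract_patches(image, patch=3):
    """All overlapping patch x patch windows, flattened to pattern vectors."""
    h, w = image.shape
    return np.array([image[i:i + patch, j:j + patch].ravel()
                     for i in range(h - patch + 1)
                     for j in range(w - patch + 1)], dtype=float)

def kmeans(X, k, iters=50):
    """Plain NumPy k-means with deterministic norm-spread initialization."""
    X = np.asarray(X, dtype=float)
    order = np.argsort(np.linalg.norm(X, axis=1))
    centroids = X[order[np.linspace(0, len(X) - 1, k).astype(int)]].copy()
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    return centroids

def pattern_histogram(image, centroids, patch=3):
    """Normalized frequency of each pattern-space partition in one image."""
    vecs = extract_patches(image, patch)
    d = np.linalg.norm(vecs[:, None, :] - centroids[None, :, :], axis=2)
    labels = d.argmin(axis=1)
    hist = np.bincount(labels, minlength=len(centroids)).astype(float)
    return hist / hist.sum()
```

Cluster centers would be learned from the patch vectors of both radiographs together; large per-bin differences between the two resulting histograms then indicate interval changes without any registration step.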
Kumar, Rajnish; Mishra, Bharat Kumar; Lahiri, Tapobrata; Kumar, Gautam; Kumar, Nilesh; Gupta, Rahul; Pal, Manoj Kumar
2017-06-01
Online retrieval of homologous nucleotide sequences through existing alignment techniques is a common practice against a given database of sequences. The salient point of these techniques is their dependence on local alignment techniques and scoring matrices, whose reliability is limited by computational complexity and accuracy. Toward this direction, this work offers a novel way of numerically representing genes which can further help in dividing the data space into smaller partitions, aiding the formation of a search tree. In this context, this paper introduces a 36-dimensional Periodicity Count Value (PCV), which is representative of a particular nucleotide sequence and created through adaptation of the stochastic model of Kolekar et al. (American Institute of Physics 1298:307-312, 2010. doi: 10.1063/1.3516320 ). The PCV construct uses information on the physicochemical properties of nucleotides and their positional distribution pattern within a gene. It is observed that the PCV representation of genes reduces the computational cost of calculating distances between a pair of genes while being consistent with existing methods. The validity of the PCV-based method was further tested through its use in molecular phylogeny constructs, in comparison with existing sequence alignment methods.
Data indexing techniques for the EUVE all-sky survey
NASA Technical Reports Server (NTRS)
Lewis, J.; Saba, V.; Dobson, C.
1992-01-01
This poster describes techniques developed for manipulating large full-sky data sets for the Extreme Ultraviolet Explorer project. The authors have adapted the quadrilateralized cubic sphere indexing algorithm to efficiently store and process several types of large data sets, such as full-sky maps of photon counts, exposure time, and count rates. A variation of this scheme is used to index sparser data, such as individual photon events and viewing times for selected areas of the sky, which are eventually used to create EUVE source catalogs.
Proof of Concept for the Trajectory-Level Validation Framework for Traffic Simulation Models
DOT National Transportation Integrated Search
2017-10-30
Based on current practices, traffic simulation models are calibrated and validated using macroscopic measures such as 15-minute averages of traffic counts or average point-to-point travel times. For an emerging number of applications, including conne...
Pill counts and pill rental: unintended entrepreneurial opportunities.
Viscomi, Christopher M; Covington, Melissa; Christenson, Catherine
2013-07-01
Prescription opioid diversion and abuse are becoming increasingly prevalent in many regions of the world, particularly the United States. One method advocated to assess compliance with opioid prescriptions is occasional "pill counts." Shortly before a scheduled appointment, a patient is notified that they must bring in the unused portion of their opioid prescription. It has been assumed that if a patient has the correct number and strength of pills that should be present at that point in a prescription interval, they are unlikely to be selling or abusing their opioids. Two cases are presented in which patients describe short-term rental of opioids from illicit opioid dealers in order to circumvent pill counts. Pill renting appears to be an established method of circumventing pill counts. Pill counts do not assure non-diversion of opioids, and they provide additional cash flow to illicit opioid dealers.
NASA Astrophysics Data System (ADS)
Treacy, Kaye; Frid, Sandra; Jacob, Lorraine
2015-09-01
This research was designed to investigate the conceptualisations and thinking strategies Indigenous Australian students use in counting tasks. Eighteen Aboriginal students, in years 1 to 11 at a remote community school, were interviewed using standard counting tasks and a `counting' task that involved fetching `maku' (witchetty grubs) to have enough to give a maku to each person in a picture. The tasks were developed with, and the interviews conducted by, an Aboriginal research assistant, to ensure appropriate cultural and language contexts. A main finding was that most of the students did not see the need to use counting to make equivalent sets, even though they were able to demonstrate standard counting skills. The findings highlight a need to further examine the world views, orientations and related mathematical concepts and processes that Indigenous students bring to school.
Managing and Monitoring Birds Using Point Counts: Standards and Applications
C. John Ralph; Sam Droege; John R. Sauer
1995-01-01
The use of population size as a measure of health of a species has been a very common tool of ornithologists for many years (Lack 1954, 1966; Hutchinson 1978). Methods for surveying population size are detailed in Ralph and Scott (1981), the excellent compendium by Cooperrider and others (1986), and the manual by Koskimies and Vaisanen (1991). Many types of counting...
A New Method for Calculating Counts in Cells
NASA Astrophysics Data System (ADS)
Szapudi, István
1998-04-01
In the near future, a new generation of CCD-based galaxy surveys will enable high-precision determination of the N-point correlation functions. The resulting information will help to resolve the ambiguities associated with two-point correlation functions, thus constraining theories of structure formation, biasing, and Gaussianity of initial conditions independently of the value of Ω. As one of the most successful methods of extracting the amplitude of higher order correlations is based on measuring the distribution of counts in cells, this work presents an advanced way of measuring it with unprecedented accuracy. Szapudi & Colombi identified the main sources of theoretical errors in extracting counts in cells from galaxy catalogs. One of these sources, termed as measurement error, stems from the fact that conventional methods use a finite number of sampling cells to estimate counts in cells. This effect can be circumvented by using an infinite number of cells. This paper presents an algorithm, which in practice achieves this goal; that is, it is equivalent to throwing an infinite number of sampling cells in finite time. The errors associated with sampling cells are completely eliminated by this procedure, which will be essential for the accurate analysis of future surveys.
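For context, the conventional finite-sampling estimator that the paper improves on can be sketched as follows. This is a generic illustrative implementation, not Szapudi's infinite-cell algorithm: a finite number of random cubic cells is thrown into a periodic box, and the occupation numbers are histogrammed into the count probability distribution P_N.

```python
import numpy as np

def counts_in_cells(points, box_size, cell_size, n_cells=2000, seed=1):
    """Estimate the count probability distribution P_N by throwing
    n_cells random cubic cells of side cell_size into a periodic box."""
    rng = np.random.default_rng(seed)
    corners = rng.uniform(0, box_size, size=(n_cells, points.shape[1]))
    counts = np.zeros(n_cells, dtype=int)
    for c in range(n_cells):
        # periodic displacement of every point from the cell corner
        d = (points - corners[c]) % box_size
        counts[c] = np.count_nonzero(np.all(d < cell_size, axis=1))
    # normalized histogram of occupation numbers
    P = np.bincount(counts, minlength=counts.max() + 1) / n_cells
    return P
```

The "measurement error" discussed in the abstract is exactly the sampling noise from the finite `n_cells` here, which the paper's algorithm eliminates.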
Measuring Variability in the Presence of Noise
NASA Astrophysics Data System (ADS)
Welsh, W. F.
Quantitative measurement of a variable signal in the presence of noise requires very careful attention to subtle effects which can easily bias the measurements. This is not limited to the low count-rate regime, nor is the bias error necessarily small. In this talk I will mention some of the dangers in applying standard techniques which are appropriate for high signal-to-noise data but fail in cases where the S/N is low. I will discuss methods for correcting the bias in these cases, both for periodic and non-periodic variability, and will introduce the concept of the ``filtered de-biased RMS''. I will also illustrate some common abuses of power spectrum interpretation. All of these points will be illustrated with examples from recent work on CV and AGN variability.
Application of fluorescence spectroscopy for on-line bioprocess monitoring and control
NASA Astrophysics Data System (ADS)
Boehl, Daniela; Solle, D.; Toussaint, Hans J.; Menge, M.; Renemann, G.; Lindemann, Carsten; Hitzmann, Bernd; Scheper, Thomas-Helmut
2001-02-01
Modern bioprocess control requires fast data acquisition and in-time evaluation of bioprocess variables. On-line fluorescence spectroscopy for data acquisition and the use of chemometric methods meet these requirements. The presented investigations were performed with fluorescence spectrometers covering wide ranges of excitation and emission wavelengths. By detecting several biogenic fluorophores (amino acids, coenzymes and vitamins), a large amount of information about the state of the bioprocess is obtained. For the evaluation of the process variables, partial least squares regression is used. This technique was applied to several bioprocesses: the production of ergotamine by Claviceps purpurea, the production of t-PA (tissue plasminogen activator) by animal cells, and brewing processes. The main point of monitoring the brewing processes was to determine the process variables cell count and extract concentration.
Dynamic time-correlated single-photon counting laser ranging
NASA Astrophysics Data System (ADS)
Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang
2018-03-01
We demonstrate a photon-counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10^7 cps, which makes distance measurement possible even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily distill the signal when fast-moving targets are submerged in a strong background. We propose a dynamic TCSPC method for measuring fast-moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected with this method at an echo rate of 20% and background counts of more than 1.2×10^7 cps.
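The static building block of TCSPC is easy to sketch; the following is an illustration only (the paper's contribution is varying the coincidence window dynamically for moving targets). Arrival times are folded by the laser repetition period, the signal appears as a histogram peak above the flat background, and a coincidence window around the peak keeps signal photons while rejecting most of the background.

```python
import numpy as np

def tcspc_peak(arrival_times, period, n_bins=1000):
    """Fold photon arrival times by the laser period and histogram them;
    the signal shows up as a peak above the flat background."""
    phases = np.mod(arrival_times, period)
    hist, edges = np.histogram(phases, bins=n_bins, range=(0.0, period))
    return hist, edges, int(hist.argmax())

def coincidence_filter(arrival_times, period, center, half_window):
    """Keep only photons whose folded time falls inside the window."""
    phases = np.mod(arrival_times, period)
    d = np.abs(phases - center)
    d = np.minimum(d, period - d)        # wrap-around distance
    return arrival_times[d <= half_window]
```

With a 1 MHz repetition rate and a nanosecond-scale window, a strong uniform background is suppressed by roughly the ratio of window to period.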
Source counting in MEG neuroimaging
NASA Astrophysics Data System (ADS)
Lei, Tianhu; Dell, John; Magee, Ralphy; Roberts, Timothy P. L.
2009-02-01
Magnetoencephalography (MEG) is a multi-channel, functional imaging technique. It measures the magnetic field produced by the primary electric currents inside the brain via a sensor array composed of a large number of superconducting quantum interference devices. The measurements are then used to estimate the locations, strengths, and orientations of these electric currents. This magnetic source imaging technique encompasses a great variety of signal processing and modeling techniques, including the inverse problem, MUltiple SIgnal Classification (MUSIC), beamforming (BF), and Independent Component Analysis (ICA). A key problem with the inverse problem, MUSIC, and ICA methods is that the number of sources must be known a priori. Although the BF method scans the source space on a point-by-point basis, the selection of peaks as sources is ultimately made by subjective thresholding; in practice, expert data analysts often select results based on physiological plausibility. This paper presents an eigenstructure approach to source number detection in MEG neuroimaging. By sorting the eigenvalues of the estimated covariance matrix of the acquired MEG data, the measured data space is partitioned into signal and noise subspaces. The partition is implemented using information-theoretic criteria. The order of the signal subspace gives an estimate of the number of sources. The approach does not rely on any model or hypothesis and is therefore an entirely data-led operation. It possesses a clear physical interpretation and an efficient computation procedure. The theoretical derivation of this method and results obtained using real MEG data are included to demonstrate their agreement and the promise of the proposed approach.
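The eigenstructure approach is standard enough to sketch. A common information-theoretic criterion for the signal-subspace order is the Wax–Kailath MDL rule; the abstract does not name its exact criterion, so treating it as MDL is our assumption. The sketch picks the k minimizing MDL(k) over the sorted eigenvalues of the sample covariance.

```python
import numpy as np

def count_sources_mdl(R, n_snapshots):
    """Estimate the number of sources from a sample covariance R using the
    Wax-Kailath MDL criterion on its sorted eigenvalues (an illustrative
    choice of information-theoretic criterion)."""
    lam = np.sort(np.linalg.eigvalsh(R))[::-1]   # descending eigenvalues
    n = len(lam)
    N = n_snapshots
    mdl = np.empty(n)
    for k in range(n):
        tail = lam[k:]                           # candidate noise eigenvalues
        g = np.exp(np.mean(np.log(tail)))        # geometric mean
        a = np.mean(tail)                        # arithmetic mean
        # data term penalizes spread among "noise" eigenvalues,
        # plus an MDL model-complexity penalty
        mdl[k] = N * (n - k) * np.log(a / g) + 0.5 * k * (2 * n - k) * np.log(N)
    return int(np.argmin(mdl))
```

The estimated order of the signal subspace is then used as the source count fed to MUSIC, ICA, or an inverse solver.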
Wedge sampling for computing clustering coefficients and triangle counts on large graphs
Seshadhri, C.; Pinar, Ali; Kolda, Tamara G.
2014-05-08
Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of such graphs. Some of the most useful graph metrics are based on triangles, such as those measuring social cohesion. Despite the importance of these triadic measures, algorithms to compute them can be extremely expensive. We discuss the method of wedge sampling. This versatile technique allows for the fast and accurate approximation of various types of clustering coefficients and triangle counts. Furthermore, these techniques are extensible to counting directed triangles in digraphs. Our methods come with provable and practical time-approximation tradeoffs for all computations. We provide extensive results that show our methods are orders of magnitude faster than the state of the art, while providing nearly the accuracy of full enumeration.
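A minimal version of wedge sampling (a wedge being a path of length two) might look like the sketch below; names are illustrative. Wedge centers are drawn with probability proportional to their wedge count, two distinct neighbors are picked, and the fraction of sampled wedges that are closed estimates the global clustering coefficient; since each triangle closes exactly three wedges, the triangle count follows.

```python
import numpy as np

def wedge_sample(adj, n_samples=10000, seed=0):
    """Estimate the global clustering coefficient and triangle count by
    sampling uniform random wedges. adj is a list of neighbor sets."""
    rng = np.random.default_rng(seed)
    degrees = np.array([len(nb) for nb in adj])
    wedges_per_vertex = degrees * (degrees - 1) // 2
    total_wedges = wedges_per_vertex.sum()
    # pick wedge centers with probability proportional to their wedge count
    p = wedges_per_vertex / total_wedges
    centers = rng.choice(len(adj), size=n_samples, p=p)
    closed = 0
    for v in centers:
        u, w = rng.choice(list(adj[v]), size=2, replace=False)
        closed += w in adj[u]
    c = closed / n_samples                   # global clustering coefficient
    triangles = c * total_wedges / 3         # each triangle closes 3 wedges
    return c, triangles
```

The cost is O(n_samples) adjacency probes regardless of graph size, which is where the orders-of-magnitude speedup over full enumeration comes from.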
Computer measurement of particle sizes in electron microscope images
NASA Technical Reports Server (NTRS)
Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.
1976-01-01
Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.
Analysis of Giardin expression during encystation of Giardia lamblia
USDA-ARS?s Scientific Manuscript database
The present study analyzed giardin transcription in trophozoites and cysts during encystation of Giardia lamblia. Encystment was induced using standard methods, and the number of trophozoites and cysts were counted at various time-points during encystation. At all time points, RNA from both stages...
Minimum Detectable Activity for Tomographic Gamma Scanning System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Venkataraman, Ram; Smith, Susan; Kirkpatrick, J. M.
2015-01-01
For any radiation measurement system, it is useful to explore and establish the detection limits and a minimum detectable activity (MDA) for the radionuclides of interest, even if the system is to be used at far higher values. The MDA serves as an important figure of merit, and often a system is optimized and configured so that it can meet the MDA requirements of a measurement campaign. Non-destructive assay (NDA) systems based on gamma ray analysis are no exception, and well-established conventions, such as the Currie method, exist for estimating the detection limits and the MDA. However, the Tomographic Gamma Scanning (TGS) technique poses some challenges for the estimation of detection limits and MDAs. The TGS combines high-resolution gamma ray spectrometry (HRGS) with low spatial-resolution image reconstruction techniques. In non-imaging gamma-ray-based NDA techniques, measured counts in a full-energy peak can be used to estimate the activity of a radionuclide independently of other counting trials. In the case of the TGS, however, each "view" is a full spectral grab (each a counting trial), and each scan consists of 150 spectral grabs in the transmission and emission scans per vertical layer of the item. The set of views in a complete scan is then used to solve for the radionuclide activities on a voxel-by-voxel basis, over 16 layers of a 10×10 voxel grid. Thus, the raw count data are no longer independent trials, but rather constitute input to a matrix solution for the emission image values at the various locations inside the item volume used in the reconstruction. The validity of the methods used to estimate MDA for an imaging technique such as TGS therefore warrants close scrutiny, because the paired-counting concept of Currie is not directly applicable.
One can also ask whether the TGS, along with other image reconstruction techniques that heavily intertwine data, is a suitable method if one expects to measure samples whose activities are at or just above MDA levels. The paper examines methods used to estimate MDAs for a TGS system and explores possible solutions that can be rigorously defended.
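For reference, the conventional Currie estimate that the paper takes as its starting point reduces to a one-line formula for a single paired counting measurement. The sketch below uses the common approximation L_D = 2.71 + 4.65·sqrt(B) counts and converts it to an MDA given a detection efficiency, count time, and gamma yield; the parameter names are illustrative.

```python
def currie_mda(background_counts, efficiency, count_time, branching=1.0):
    """Currie detection limit and MDA for a single counting measurement.

    L_D = 2.71 + 4.65 * sqrt(B) counts (95% confidence approximation);
    MDA = L_D / (efficiency * count_time * branching), in Bq if
    count_time is in seconds.
    """
    ld = 2.71 + 4.65 * background_counts ** 0.5
    mda = ld / (efficiency * count_time * branching)
    return ld, mda
```

The TGS difficulty described above is precisely that B here assumes independent counting trials, whereas TGS voxel activities come from a joint matrix solution.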
[Validation of a clinical prediction rule to distinguish bacterial from aseptic meningitis].
Agüero, Gonzalo; Davenport, María C; Del Valle, María de la P; Gallegos, Paulina; Kannemann, Ana L; Bokser, Vivian; Ferrero, Fernando
2010-02-01
Although most cases of meningitis are not bacterial, antibiotics are usually administered on admission because bacterial meningitis is difficult to rule out. Distinguishing bacterial from aseptic meningitis on admission could avoid inappropriate antibiotic use and hospitalization. We aimed to validate a clinical prediction rule to distinguish bacterial from aseptic meningitis in children on arrival at the emergency room. This prospective study included patients aged < 19 years with meningitis. Cerebrospinal fluid (CSF) and peripheral blood neutrophil counts were obtained from all patients. The BMS (Bacterial Meningitis Score) described by Nigrovic (Pediatrics 2002; 110: 712) was calculated: positive CSF Gram stain = 2 points; CSF absolute neutrophil count ≥ 1000 cells/mm(3), CSF protein ≥ 80 mg/dl, peripheral blood absolute neutrophil count ≥ 10,000/mm(3), and seizure = 1 point each. Sensitivity (S), specificity (Sp), positive and negative predictive values (PPV and NPV), and positive and negative likelihood ratios (PLR and NLR) of the BMS for predicting bacterial meningitis were calculated. Seventy patients with meningitis were included (14 bacterial meningitis). When the BMS was calculated, 25 patients showed a BMS = 0 points, 11 a BMS = 1 point, and 34 a BMS ≥ 2 points. A BMS = 0 showed S: 100%, Sp: 44%, PPV: 31%, NPV: 100%, PLR: 1.81, NLR: 0. A BMS ≥ 2 predicted bacterial meningitis with S: 100%, Sp: 64%, PPV: 41%, NPV: 100%, PLR: 2.8, NLR: 0. Using the BMS was simple and allowed identification of children at very low risk of bacterial meningitis. It could be a useful tool to assist clinical decision making.
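The scoring rule as stated in the abstract is straightforward to encode; a direct sketch (variable names ours):

```python
def bacterial_meningitis_score(gram_stain_positive, csf_anc, csf_protein,
                               blood_anc, seizure):
    """Nigrovic Bacterial Meningitis Score as described in the abstract:
    positive CSF Gram stain = 2 points; CSF ANC >= 1000 cells/mm3,
    CSF protein >= 80 mg/dl, blood ANC >= 10,000/mm3, and seizure
    = 1 point each."""
    score = 2 if gram_stain_positive else 0
    score += 1 if csf_anc >= 1000 else 0
    score += 1 if csf_protein >= 80 else 0
    score += 1 if blood_anc >= 10000 else 0
    score += 1 if seizure else 0
    return score
```

In the study, a score of 0 identified children at very low risk of bacterial meningitis, while a score of 2 or more predicted it.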
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
99mTc-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared to X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have found only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera. The images were then processed with the histogram equalization technique. The image quality of input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 is very poor and 5 is the best image quality. A statistical test was applied to find the significance of the difference between the mean scores assigned to input and processed images. The technique improves the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test was applied, and a statistically significant difference in input and processed image quality was found at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed to meet the requirements of nuclear medicine physicians. GHE can be used on low-contrast bone scan images; in some cases, histogram equalization is useful in combination with another postprocessing technique.
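GHE itself is the textbook cumulative-histogram remapping. A compact sketch for 8-bit images follows; this is the generic algorithm, not the specific clinical processing chain used in the study.

```python
import numpy as np

def global_histogram_equalization(image, levels=256):
    """Remap gray levels through the normalized cumulative histogram,
    spreading the occupied levels over the full dynamic range."""
    img = np.asarray(image, dtype=np.int64)
    hist = np.bincount(img.ravel(), minlength=levels)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # CDF value of the first occupied level
    denom = img.size - cdf_min
    if denom == 0:                     # constant image: nothing to equalize
        return np.zeros_like(img, dtype=np.uint8)
    lut = np.round((cdf - cdf_min) / denom * (levels - 1))
    lut = np.clip(lut, 0, levels - 1).astype(np.uint8)
    return lut[img]                    # apply the lookup table per pixel
```

The oversaturation the authors report is a known side effect of this global mapping: sparsely occupied bright levels get pushed to the extremes of the range.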
Bird population and habitat surveys in urban areas
DeGraaf, R.M.; Geis, A.D.; Healy, P.A.
1991-01-01
Breeding bird populations in six habitats in Columbia, MD, were studied to develop procedures suitable for measuring bird use of residential areas and to identify habitat characteristics that define the distribution of various common bird species. A procedure to measure bird use based on 4-min transect counts on plots measuring 91 m × 91 m proved better than point counts. Transect counts reduced many of the problems associated with counting birds in urban areas, such as varying noise and visibility. Eighty percent of observations were recorded in the first 4 min. Habitat measurement procedures were examined also. It was found that a subsample of woody tree and shrub crown volumes made on 0.2 ha was highly correlated with similar measures made on 0.8-ha plots.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.
2015-01-19
Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood-ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Results from the localization technique are shown for simulated flight data using monolithic as well as directionally aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement over earlier methods such as full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
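A stripped-down version of the grid-likelihood step is easy to write down. The sketch below assumes an isotropic 1/r² detector response plus a flat background (the paper's detector models are far richer) and scores pre-computed test source positions by Poisson log-likelihood against the measured count series:

```python
import numpy as np

def localize_source(detector_positions, measured_counts, grid, strength,
                    background=1.0):
    """Maximum-likelihood source location on a grid of test positions,
    assuming Poisson counts with a 1/r^2 response plus flat background.
    Illustrative sketch; real detector models include shielding,
    collimation, and terrain."""
    best, best_ll = None, -np.inf
    for g in grid:
        r2 = np.sum((detector_positions - g) ** 2, axis=1) + 1e-6
        expected = strength / r2 + background
        # Poisson log-likelihood up to a constant independent of g
        ll = np.sum(measured_counts * np.log(expected) - expected)
        if ll > best_ll:
            best, best_ll = g, ll
    return best
```

The FOV restriction in the paper amounts to evaluating this likelihood only over grid points near the current flight segment, which is what makes real-time operation feasible.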
A Method of Recording and Predicting the Pollen Count.
ERIC Educational Resources Information Center
Buck, M.
1985-01-01
A hair dryer, plastic funnel, and microscope slide can be used for predicting pollen counts on a day-to-day basis. Materials, methods for assembly, collection technique, meteorological influences, and daily patterns are discussed. Data collected using the apparatus suggest that airborne grass products other than pollen also affect hay fever…
Carbon fiber counting [aircraft structures]
NASA Technical Reports Server (NTRS)
Pride, R. A.
1980-01-01
A method was developed for characterizing the number and lengths of carbon fibers accidentally released by the burning of composite portions of civil aircraft structure in a jet fuel fire after an accident. Representative samplings of carbon fibers collected on transparent sticky film were counted from photographic enlargements with a computer aided technique which also provided fiber lengths.
Surpassing Humans and Computers with JellyBean: Crowd-Vision-Hybrid Counting Algorithms.
Sarma, Akash Das; Jain, Ayush; Nandi, Arnab; Parameswaran, Aditya; Widom, Jennifer
2015-11-01
Counting objects is a fundamental image processing primitive, and has many scientific, health, surveillance, security, and military applications. Existing supervised computer vision techniques typically require large quantities of labeled training data, and even with that, fail to return accurate results in all but the most stylized settings. Using vanilla crowdsourcing, on the other hand, can lead to significant errors, especially on images with many objects. In this paper, we present our JellyBean suite of algorithms, which combines the best of crowds and computer vision to count objects in images, and uses judicious decomposition of images to greatly improve accuracy at low cost. Our algorithms have several desirable properties: (i) they are theoretically optimal or near-optimal, in that they ask as few questions as possible of humans (under certain intuitively reasonable assumptions that we justify experimentally); (ii) they operate in stand-alone or hybrid modes, in that they can either work independently of computer vision algorithms or work in concert with them, depending on whether the computer vision techniques are available or useful for the given setting; (iii) they perform very well in practice, returning accurate counts on images that no individual worker or computer vision algorithm can count correctly, while not incurring a high cost.
Grabow, W O; du Preez, M
1979-01-01
Total coliform counts obtained by means of standard membrane filtration techniques, using MacConkey agar, m-Endo LES agar, Teepol agar, and pads saturated with Teepol broth as growth media, were compared. Various combinations of these media were used in tests on 490 samples of river water and city wastewater after different stages of conventional purification and reclamation processes, including lime treatment, filtration, active carbon treatment, ozonation, and chlorination. Endo agar yielded the highest average counts for all these samples. Teepol agar generally had higher counts than Teepol broth, whereas MacConkey agar had the lowest average counts. Identification of 871 positive isolates showed that Aeromonas hydrophila was the species most commonly detected. Species of Escherichia, Citrobacter, Klebsiella, and Enterobacter represented 55% of isolates which conformed to the definition of total coliforms on Endo agar, 54% on Teepol agar, and 45% on MacConkey agar. Selection for species on the media differed considerably. Evaluation of these data and the literature on alternative tests, including most-probable-number methods, indicated that the technique of choice for routine analysis of total coliform bacteria in drinking water is membrane filtration using m-Endo LES agar as growth medium, without enrichment procedures or a cytochrome oxidase restriction. PMID:394678
A Calibration of NICMOS Camera 2 for Low Count Rates
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Amanullah, R.; Barbary, K.; Dawson, K. S.; Deustua, S.; Faccioli, L.; Fadeyev, V.; Fakhouri, H. K.; Fruchter, A. S.; Gladders, M. D.; de Jong, R. S.; Koekemoer, A.; Krechmer, E.; Lidman, C.; Meyers, J.; Nordin, J.; Perlmutter, S.; Ripoche, P.; Schlegel, D. J.; Spadafora, A.; Suzuki, N.
2015-05-01
NICMOS 2 observations are crucial for constraining distances to most of the existing sample of z > 1 SNe Ia. Unlike conventional calibration programs, these observations involve long exposure times and low count rates. Reciprocity failure is known to exist in HgCdTe devices, and a correction for this effect has already been implemented for high and medium count rates. However, observations at faint count rates rely on extrapolations. Here instead, we provide a new zero-point calibration directly applicable to faint sources. This is obtained via inter-calibration of NIC2 F110W/F160W with the Wide Field Camera 3 (WFC3) in the low count-rate regime using z ∼ 1 elliptical galaxies as tertiary calibrators. These objects have relatively simple near-IR spectral energy distributions and uniform colors, and their extended nature gives a superior signal-to-noise ratio at the same count rate than stars would. The use of extended objects also allows greater tolerances on point-spread function profiles. We find space telescope magnitude zero points (after the installation of the NICMOS cooling system, NCS) of 25.296 ± 0.022 for F110W and 25.803 ± 0.023 for F160W, both in agreement with the calibration extrapolated from count rates ≳1000 times larger (25.262 and 25.799). Before the installation of the NCS, we find 24.843 ± 0.025 for F110W and 25.498 ± 0.021 for F160W, also in agreement with the high-count-rate calibration (24.815 and 25.470). We also check the standard bandpasses of WFC3 and NICMOS 2 using a range of stars and galaxies of different colors and find mild tension for WFC3, limiting the accuracy of the zero points. To avoid human bias, our cross-calibration was "blinded" in that the fitted zero-point differences were hidden until the analysis was finalized.
Based on observations with the NASA/ESA Hubble Space Telescope, obtained at the Space Telescope Science Institute, which is operated by AURA, Inc., under NASA contract NAS 5-26555, under programs SM2/NIC-7049, SM2/NIC-7152, CAL/NIC-7607, CAL/NIC-7691, CAL/NIC-7693, GO-7887, CAL/NIC-7902, CAL/NIC-7904, GO/DD-7941, SM3/NIC-8983, SM3/NIC-8986, GTO/ACS-9290, ENG/NIC-9324, CAL/NIC-9325, GO-9352, GO-9375, SNAP-9485, CAL/NIC-9639, GO-9717, GO-9834, GO-9856, CAL/NIC-9995, CAL/NIC-9997, GO-10189, GO-10258, CAL/NIC-10381, CAL/NIC-10454, GO-10496, CAL/NIC-10725, CAL/NIC-10726, GO-10886, CAL/NIC-11060, CAL/NIC-11061, GO-11135, GO-11143, GO-11202, CAL/NIC-11319, GO/DD-11359, SM4/WFC3-11439, SM4/WFC3-11451, GO-11557, GO-11591, GO-11600, GO/DD-11799, CAL/WFC3-11921, CAL/WFC3-11926, GO/DD-12051, GO-12061, GO-12062, GO-12177, CAL/WFC3-12333, CAL/WFC3-12334, CAL/WFC3-12341, GO-12443, GO-12444, GO-12445, CAL/WFC3-12698, CAL/WFC3-12699, GO-12874, CAL/WFC3-13088, and CAL/WFC3-13089.
Detection of microbial concentration in ice-cream using the impedance technique.
Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B
2008-06-15
The detection of microbial concentration, essential for safe, high-quality food products, is traditionally done with the plate count technique, which is reliable but slow and not easily automated, as required for direct use in industrial machines. To this end, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can easily be realized in automatic form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered, and no nutrient medium was used to dilute the samples. A measurement set-up was realized using benchtop instruments for impedance measurements on samples whose bacterial concentration was independently measured by means of standard plate counts. The results clearly indicate that impedance measurement is a feasible and reliable technique for detecting total microbial concentration in ice-cream, suitable for implementation as an embedded system in industrial machines.
Orthogonal bases of invariants in tensor models
NASA Astrophysics Data System (ADS)
Diaz, Pablo; Rey, Soo-Jong
2018-02-01
Representation theory provides an efficient framework to count and classify invariants in tensor models of (gauge) symmetry G_d = U(N_1) ⊗ ··· ⊗ U(N_d). We show that there are two natural ways of counting invariants, one valid for arbitrary G_d and another valid at large rank of G_d. We construct bases of invariant operators based on this counting, and compute correlators of their elements. The basis associated with finite rank of G_d diagonalizes the two-point function. It is analogous to the restricted Schur basis used in matrix models. We comment on future directions for investigation.
ERIC Educational Resources Information Center
Her Majesty's Inspectorate of Education, 2007
2007-01-01
"Count Us in: Achieving Success for Deaf Pupils" is a timely report. It comes when schools are becoming more confident in dealing with a wide range of additional support for learning needs. Schools are also more aware that they need to personalise experiences in order to meet pupils' learning needs. The report does point to strengths…
ERIC Educational Resources Information Center
Gervasoni, Ann; Peter-Koop, Andrea
2015-01-01
This paper compares the counting and whole number knowledge and skills of primary school children in Australia and Germany at the end of Grade 1 and Grade 2. Children's learning was assessed using the Early Numeracy Interview and associated Growth Point Framework. The findings highlight substantial differences between the two groups that vary for…
The Case for an Open Data Model
1998-08-01
Microsoft Word, PageMaker, and FrameMaker, and the drawing programs MacDraw, Adobe Illustrator, and Microsoft PowerPoint, use their own proprietary… needs a custom word-counting tool, since no utility could work in Word and other word processors. FrameMaker for Windows does not have a word-counting… supplied in. (At least none that I could find in FrameMaker 5.5 for Windows.) Another problem with…
WisKids Count Data Book, 1997: A Portrait of Child Health in Wisconsin. Book 4.
ERIC Educational Resources Information Center
Kaplan, Tom; And Others
This Kids Count data book examines statewide trends in the well-being of Wisconsin's children, focusing specifically on child health. The book provides a baseline on child health at the point of elimination of the Aid to Families with Dependent Children (AFDC) program and the adoption of the Wisconsin Works (W-2) program. The statistical portrait is based on…
A new clocking method for a charge coupled device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Umezu, Rika; Kitamoto, Shunji, E-mail: kitamoto@rikkyo.ac.jp; Murakami, Hiroshi
2014-07-15
We propose and demonstrate a new clocking method for a charge-coupled device (CCD). When a CCD is used as a photon-counting detector of X-rays, its weak point is the limit on its counting rate, because a high counting rate causes non-negligible pile-up of photons. In astronomical usage, this pile-up is especially severe when observing a bright point-like object. One typical way to reduce pile-up is the parallel-sum (P-sum) mode, but this mode completely loses one-dimensional spatial information. Our new clocking method, panning mode, provides properties intermediate between the normal mode and the P-sum mode. We performed a simple simulation to investigate the pile-up probability and compared the simulated result with actually obtained event rates. Using this simulation and the experimental results, we compared the pile-up tolerance of various clocking modes, including our new method, as well as their other characteristics.
Point-by-point compositional analysis for atom probe tomography.
Stephenson, Leigh T; Ceguerra, Anna V; Li, Tong; Rojhirunsakool, Tanaporn; Nag, Soumya; Banerjee, Rajarshi; Cairney, Julie M; Ringer, Simon P
2014-01-01
This alternative approach to data processing, for analyses that traditionally employed grid-based counting methods, is necessary because it removes a user-imposed coordinate system that not only limits the analysis but may also introduce errors. We have modified the widely used "binomial" analysis for APT data by replacing grid-based counting with coordinate-independent nearest-neighbour identification, improving the measurements and the statistics obtained, and allowing quantitative analysis of smaller datasets and of datasets from non-dilute solid solutions. It also allows better visualisation of compositional fluctuations in the data. Our modifications include: • using spherical k-atom blocks identified by each detected atom's first k nearest neighbours; • 3D data visualisation of block composition and nearest-neighbour anisotropy; • using z-statistics to directly compare experimental and expected composition curves. Similar modifications may be made to other grid-based counting analyses (contingency table, Langer-Bar-on-Miller, sinusoidal model) and could be instrumental in developing novel data visualisation options.
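The core replacement described here, grid cells swapped for each detected atom's first k nearest neighbours, can be sketched as follows. This is a minimal pure-Python illustration, not the published APT pipeline; the species labels, k value, and toy point cloud are invented for the example.

```python
# Coordinate-independent composition analysis: for each detected atom,
# form a block from its k nearest neighbours and record the block's
# solute fraction. Species "B" plays the solute here (an assumption).
import math

def knn_block_compositions(atoms, k):
    """atoms: list of (x, y, z, species). Returns one solute fraction
    per atom, computed over that atom's k nearest neighbours."""
    fractions = []
    for i, (xi, yi, zi, _) in enumerate(atoms):
        dists = []                      # distances to every other atom
        for j, (xj, yj, zj, sp) in enumerate(atoms):
            if i == j:
                continue
            dists.append((math.dist((xi, yi, zi), (xj, yj, zj)), sp))
        dists.sort(key=lambda t: t[0])
        block = [sp for _, sp in dists[:k]]
        fractions.append(block.count("B") / k)
    return fractions

# Tiny demo: a solute-rich cluster near the origin inside an "A" matrix
cloud = [(0, 0, 0, "B"), (1, 0, 0, "B"), (0, 1, 0, "B"),
         (5, 5, 5, "A"), (6, 5, 5, "A"), (5, 6, 5, "A")]
fracs = knn_block_compositions(cloud, k=2)
print(fracs)  # cluster atoms see solute-rich blocks, matrix atoms do not
```

A real analysis would use a spatial index (e.g. a k-d tree) rather than the quadratic search above, and compare the resulting distribution of block compositions against the binomial expectation.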
NASA Astrophysics Data System (ADS)
Lu, Weihua; Chen, Xinjian; Zhu, Weifang; Yang, Lei; Cao, Zhaoyuan; Chen, Haoyu
2015-03-01
In this paper, we propose a method based on the Freeman chain code to segment and count rhesus choroid-retinal vascular endothelial cells (RF/6A) automatically in fluorescence microscopy images. The proposed method consists of four main steps. First, a threshold filter and a morphological transform are applied to reduce noise. Second, the boundary information is used to generate the Freeman chain codes. Third, concave points are found based on the relationship between the difference of the chain code and the curvature. Finally, cell segmentation and counting are completed based on the number of concave points and the area and shape of the cells. The proposed method was tested on 100 fluorescence microscopy cell images; the average true positive rate (TPR) was 98.13% and the average false positive rate (FPR) was 4.47%. These preliminary results show the feasibility and efficiency of the proposed method.
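The chain-code step at the heart of such methods is easy to illustrate. The sketch below generates a Freeman 8-direction chain code from an ordered boundary and its first difference, the quantity inspected for concave points; the direction numbering convention (0 = +x, counter-clockwise) and the toy square boundary are assumptions of this example, not taken from the paper.

```python
# Freeman 8-direction chain code from an ordered closed boundary, plus
# the first difference used to flag turning (candidate concave) points.
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(boundary):
    """boundary: ordered list of 8-connected (x, y) pixels (closed)."""
    n = len(boundary)
    code = []
    for i in range(n):
        x0, y0 = boundary[i]
        x1, y1 = boundary[(i + 1) % n]        # wrap around at the end
        code.append(DIRS[(x1 - x0, y1 - y0)])
    return code

def first_difference(code):
    # difference mod 8 between successive directions; sharp turns
    # relative to the traversal orientation mark candidate concavities
    return [(code[(i + 1) % len(code)] - code[i]) % 8
            for i in range(len(code))]

# unit square traversed counter-clockwise
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
code = chain_code(square)
print(code)                   # [0, 2, 4, 6]
print(first_difference(code))  # [2, 2, 2, 2]
```

On a real cell boundary, touching cells produce local direction reversals, so first-difference values outside the range seen on a smooth convex outline mark the concave points used to split merged cells.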
Subnuclear foci quantification using high-throughput 3D image cytometry
NASA Astrophysics Data System (ADS)
Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.
2015-07-01
Ionising radiation causes various types of DNA damage, including double-strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image-analysis techniques. Most studies still use manual counting or classification, and are hence limited to counting a low number of foci per cell (about 5 foci per nucleus), as the quantification process is extremely labour-intensive. We have therefore developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D, with submicron resolution, using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using an algorithm based on the 3D extended maxima transform. Our results suggest that while most 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
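The final counting step can be approximated by labelling connected bright regions in a 3D grid. The sketch below is a simplified stand-in for the 3D extended-maxima transform (a plain threshold plus 6-connected flood fill), meant only to show how foci per nucleus are counted; the toy volume and threshold are invented.

```python
# Count bright foci in a 3D intensity grid by labelling 6-connected
# components above a threshold (simplified stand-in for extended maxima).
def count_foci(vol, thresh):
    """vol: nested list vol[z][y][x] of intensities."""
    nz, ny, nx = len(vol), len(vol[0]), len(vol[0][0])
    seen = set()
    count = 0
    for z in range(nz):
        for y in range(ny):
            for x in range(nx):
                if vol[z][y][x] > thresh and (z, y, x) not in seen:
                    count += 1                 # new focus found
                    stack = [(z, y, x)]        # flood-fill its component
                    seen.add((z, y, x))
                    while stack:
                        cz, cy, cx = stack.pop()
                        for dz, dy, dx in ((1, 0, 0), (-1, 0, 0),
                                           (0, 1, 0), (0, -1, 0),
                                           (0, 0, 1), (0, 0, -1)):
                            w = (cz + dz, cy + dy, cx + dx)
                            if (0 <= w[0] < nz and 0 <= w[1] < ny
                                    and 0 <= w[2] < nx and w not in seen
                                    and vol[w[0]][w[1]][w[2]] > thresh):
                                seen.add(w)
                                stack.append(w)
    return count

# two separated bright spots in a 3x3x3 volume
vol = [[[0] * 3 for _ in range(3)] for _ in range(3)]
vol[0][0][0] = 5
vol[2][2][2] = 7
print(count_foci(vol, thresh=1))  # 2
```

The true extended-maxima transform additionally suppresses maxima shallower than a height h before labelling, which is what separates tightly clustered foci that a single global threshold would merge.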
A New Method for Estimating Bacterial Abundances in Natural Samples using Sublimation
NASA Technical Reports Server (NTRS)
Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.
2004-01-01
We have developed a new method, based on the sublimation of adenine from Escherichia coli, to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples, including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert, were heated to a temperature of 500 °C for several seconds under reduced pressure. The sublimate was collected on a cold finger, and the amount of adenine released from the samples was then determined by high-performance liquid chromatography (HPLC) with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10^5 to 10^9 E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI staining. The simplicity and robustness of the sublimation technique compared to the DAPI staining method make this approach particularly attractive for use in spacecraft instrumentation. NASA is currently planning to send a lander to Mars in 2009 to assess whether organic compounds, especially those that might be associated with life, are present in Martian surface samples. Based on our analyses of the Atacama Desert soil samples, several million bacterial cells per gram of Martian soil should be detectable using this sublimation technique.
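The conversion from recovered adenine to cell equivalents is simple arithmetic. The sketch below illustrates it with a hypothetical per-cell adenine content; the constant and the sample numbers are invented for the example, not values from the paper.

```python
# Back-of-envelope conversion from recovered adenine to E. coli cell
# equivalents. ADENINE_PER_CELL_MOL is an illustrative assumption
# (mol of adenine in DNA + RNA per cell), not a value from the paper.
ADENINE_PER_CELL_MOL = 5.0e-17

def cell_equivalents(adenine_mol, sample_mass_g):
    """Estimated E. coli cell equivalents per gram of sample."""
    return adenine_mol / ADENINE_PER_CELL_MOL / sample_mass_g

# e.g. 5e-11 mol of adenine sublimed from a 1 g soil sample
print(f"{cell_equivalents(5.0e-11, 1.0):.1e} cells/g")
```

With these assumed numbers the estimate lands at about 10^6 cell equivalents per gram, in the middle of the range the abstract reports for natural samples.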
Lubow, Bruce C.; Ransom, Jason I.
2007-01-01
An aerial survey technique combining simultaneous double-count and sightability-bias-correction methodologies was used to estimate the population of wild horses inhabiting the Adobe Town and Salt Wells Creek Herd Management Areas, Wyoming. Based on 5 surveys over 4 years, we conclude that the technique produced estimates consistent with the known number of horses removed between surveys and an annual population growth rate of 16.2 percent per year. Evidence from this series of surveys therefore supports the validity of the survey method. Our results also indicate that the ability of aerial observers to see horse groups depends very strongly on the skill of the individual observer, the size of the horse group, and vegetation cover, and more modestly on the ruggedness of the terrain and the position of the sun relative to the observer. We further conclude that censuses, or uncorrected raw counts, are inadequate estimates of population size for this herd: uncorrected counts were all undercounts in our trials, and varied in magnitude from year to year and observer to observer. As of April 2007, we estimate that the population of the Adobe Town/Salt Wells Creek complex is 906 horses, with a 95 percent confidence interval ranging from 857 to 981 horses.
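The double-count component of such surveys rests on a two-observer mark-resight estimator. The sketch below shows only the classic Chapman form; the actual study combined double counts with sightability covariates (observer skill, group size, vegetation cover), which this example omits, and the counts used are invented.

```python
# Basic two-observer (Chapman) estimator behind double-count surveys:
# n1 groups seen by observer 1, n2 by observer 2, m by both.
def chapman_estimate(n1, n2, m):
    """Bias-corrected Lincoln-Petersen estimate of total groups."""
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1

# e.g. 60 and 55 groups seen, 45 of them by both observers
print(round(chapman_estimate(60, 55, 45)))  # 73
```

The large overlap count m relative to n1 and n2 is what pins down the fraction of groups missed by both observers; sightability models then let that detection probability vary with the covariates listed above.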
Parkinson, Lily A; Alexander, Amy B; Campbell, Terry W
2017-07-01
Elasmobranch hematology continues to reveal new peculiarities within this specialized field. This report compares total hematologic values from the same white-spotted bamboo sharks (Chiloscyllium plagiosum) housed in different environments. We compared the hemograms one year apart, using a standardized Natt-Herrick's technique. The total white blood cell (WBC) counts of the sharks were statistically different between the two time points (initial median total WBC count = 18,920 leukocytes/μl, SD = 8,108; one year later, total WBC count = 1,815 leukocytes/μl, SD = 1,309). The packed cell volumes were also statistically different (19%, SD = 2.9 vs. 22%, SD = 2.0). Analysis revealed that the only differences between the time points were the temperature and stocking densities at which the sharks were housed. This report emphasizes the need for a thorough understanding of the husbandry of an elasmobranch prior to interpretation of a hemogram and suggests that reference intervals should be created for each environment. © 2017 Wiley Periodicals, Inc.
Low cost sensing of vegetation volume and structure with a Microsoft Kinect sensor
NASA Astrophysics Data System (ADS)
Azzari, G.; Goulden, M.
2011-12-01
The market for videogames and digital entertainment has brought the cost of advanced sensing technology down to affordable levels. The Microsoft Kinect sensor for Xbox 360 is an infrared depth camera designed to track body position and movement at the level of individual joints. Using open-source drivers and libraries, we acquired point clouds of vegetation directly from the Kinect sensor. The data were filtered for outliers, co-registered, and cropped to isolate the plant of interest from the surroundings and soil. The volume of single plants was then estimated with several techniques, including fitting with solid shapes (cylinders, spheres, boxes), voxel counts, and 3D convex/concave hulls. Preliminary results are presented here. The volume of a series of wild artichoke plants was measured from nadir using a Kinect on a 3 m tall tower. The calculated volumes were compared with harvested biomass; comparisons and derived allometric relations will be presented, along with examples of the acquired point clouds. The Kinect sensor shows promise for ground-based, automated biomass measurement systems, and possibly for comparison/validation of remotely sensed LIDAR.
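Of the volume estimators mentioned, the voxel count is the simplest to sketch: quantize each point to a grid cell and multiply the number of occupied cells by the cell volume. The voxel size and toy points below are assumptions of this illustration, not values from the study.

```python
# Voxel-count volume estimate from a point cloud: quantize points to a
# grid and count occupied cells. One of several estimators mentioned
# (alongside fitted solids and convex/concave hulls).
def voxel_volume(points, voxel=0.05):
    """points: (x, y, z) tuples in metres; returns occupied volume in m^3."""
    occupied = {(int(x // voxel), int(y // voxel), int(z // voxel))
                for x, y, z in points}
    return len(occupied) * voxel ** 3

# two points fall in the same 5 cm cell, one in a distant cell
pts = [(0.01, 0.01, 0.01), (0.02, 0.03, 0.04), (0.30, 0.30, 0.30)]
print(voxel_volume(pts))  # two occupied cells -> about 2.5e-4 m^3
```

The voxel size sets the bias: too coarse and empty space inside the canopy is counted; too fine and the estimate collapses toward the sensor's sampling density, which is why hull-based estimators are run alongside it.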
Precision wildlife monitoring using unmanned aerial vehicles
NASA Astrophysics Data System (ADS)
Hodgson, Jarrod C.; Baylis, Shane M.; Mott, Rowan; Herrod, Ashley; Clarke, Rohan H.
2016-03-01
Unmanned aerial vehicles (UAVs) represent a new frontier in environmental research. Their use has the potential to revolutionise the field if they prove capable of improving data quality or the ease with which data are collected beyond traditional methods. We apply UAV technology to wildlife monitoring in tropical and polar environments and demonstrate that UAV-derived counts of colony nesting birds are an order of magnitude more precise than traditional ground counts. The increased count precision afforded by UAVs, along with their ability to survey hard-to-reach populations and places, will likely drive many wildlife monitoring projects that rely on population counts to transition from traditional methods to UAV technology. Careful consideration will be required to ensure the coherence of historic data sets with new UAV-derived data and we propose a method for determining the number of duplicated (concurrent UAV and ground counts) sampling points needed to achieve data compatibility.
Design, development and manufacture of a breadboard radio frequency mass gauging system
NASA Technical Reports Server (NTRS)
1975-01-01
The feasibility of the RF mode-counting gauging technique was demonstrated for gauging liquid hydrogen and liquid oxygen under all attitude conditions. With LH2, it was also demonstrated under dynamic fluid conditions, in which the fluid assumes ever-changing positions within the tank, that the RF gauging technique on average provides a very good indication of mass. Significantly, the distribution of the mode-count data at each fill level during dynamic LH2 and LOX orientation testing approaches a normal distribution. Multiple space-diversity probes provide better coupling to the resonant modes than a single probe element. The variable-sweep-rate generator technique provides a more uniform mode-versus-time distribution for processing.
Investigation of Density Fluctuations in Supersonic Free Jets and Correlation with Generated Noise
NASA Technical Reports Server (NTRS)
Panda, J.; Seasholtz, R. G.
2000-01-01
The air density fluctuations in the plumes of fully expanded, unheated free jets were investigated experimentally using a technique based on Rayleigh scattering. The point-measuring technique used a continuous-wave laser, fiber-optic transmission, and photon-counting electronics. The radial and centerline profiles of time-averaged density and root-mean-square density fluctuation provided a comparative description of jet growth. To measure density fluctuation spectra, a two-photomultiplier-tube technique was used; cross-correlation between the two PMT signals significantly reduced the electronic shot-noise contribution. Turbulent density fluctuations occurring up to a Strouhal number (Sr) of 2.5 were resolved. A remarkable feature of the density spectra, obtained from the same locations of jets in the 0.5 < M < 1.5 range, is a constant Strouhal frequency for peak fluctuations. A detailed survey at Mach numbers M = 0.95, 1.4, and 1.8 showed that, in general, the distribution of various Strouhal-frequency fluctuations remained similar for the three jets. In spite of the similarity in the flow fluctuations, the noise characteristics were found to be significantly different. Spark schlieren photographs and near-field microphone measurements confirmed that eddy Mach wave radiation was present in the Mach 1.8 jet and absent in the Mach 0.95 jet. To measure the correlation between the flow and the far-field sound pressure fluctuations, a microphone was kept at a distance of 50 diameters, 30 deg. to the flow direction, and the laser probe volume was moved from point to point in the flow. The density fluctuations in the peripheral shear layer of the Mach 1.8 jet showed significant correlation up to the measurement limit of Sr = 2.5, while for the Mach 0.95 jet no correlation was measured. Along the centerline, measurable correlation was found from the end of the potential core onward, and in the low-frequency range (Sr less than 0.5).
In general, the normalized correlation values increased with jet Mach number. The experimental data point to eddy Mach waves as a strong source of sound generation in supersonic jets, but fail to locate the primary noise mechanism in subsonic jets.
Hallas, Gary; Monis, Paul
2015-01-01
The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time- and labour-intensive, can vary between operators, and requires manual entry of results into laboratory information management systems (LIMS), which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow their incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages: • improved measurement of uncertainty, through a standard and consistent counting methodology with less operator error; • efficiency in labour and time (reduced cost); • elimination of manual entry of data into the LIMS; • faster result reporting to customers.
A rocket telescope spectrometer with high precision pointing control.
Bottema, M; Fastie, W G; Moos, H W
1969-09-01
One second of arc pointing accuracy has been achieved by servocontrolling the secondary mirror of a Dall-Kirkham telescope flown in an Aerobee 150 rocket. The primary mirror is weight-relieved, mounted at its nodal line, and can resolve 2 arc sec. An objective LiF prism mounted near the focal plane provides a low-resolution far-UV spectrum suitable for studying planetary atmospheres. Solar-blind photomultiplier tubes with pulse-counting electronics provide a dark-current background of less than 1 count/sec. Spectra of Venus, Jupiter, and eta Ursae Majoris (η UMa) were obtained in a flight from White Sands, New Mexico, on 5 December 1967. Further flights are planned with the recovered package.
Abundance and reproduction of songbirds in burned and unburned pine forests of the Georgia Piedmont
White, D.H.; Chapman, B.R.; Brunjes, J.H.; Raftovich, R.V.; Seginak, J.T.
1999-01-01
We studied the abundance and productivity of songbirds in prescribed-burned and unburned mature (>60 yr) pine forests at Piedmont National Wildlife Refuge, Georgia, during 1993-1995. We estimated species abundance, richness, and evenness using data from 312 point counts in 18 burned sites and six unburned sites. We measured gross habitat features in 0.04-ha circles centered on each point count station. We calculated productivity estimates at nests of seven of the most common nesting species. Habitat components we measured in 1-, 2-, and 3-yr post-burn sites were similar, but most components differed between burned and unburned sites. Although 98 species were detected during point counts, we report only on the 46 species that nested in the area and were detected on >10% of the counts in either habitat class. Twenty-one species preferred burned sites and six preferred unburned sites. Avian species richness and evenness were similar for burned and unburned sites. Burned sites were preferred for nesting over unburned sites; only nine nests of six species were found in unburned sites. Productivity estimates were low in burned sites. One or more eggs hatched in only 59 of 187 nests monitored, and an average of only 0.82 chicks per nest was estimated to have fledged. Predation was the most common probable cause of nest failure, ranging from 45% in the Northern Cardinal (Cardinalis cardinalis) to 64% in the Summer Tanager (Piranga rubra). Because the sources of predation at the refuge are unknown, future research should address this issue.
Nearest neighbors by neighborhood counting.
Wang, Hui
2006-06-01
Finding nearest neighbors is a general idea that underlies many artificial intelligence tasks, including machine learning, data mining, natural language understanding, and information retrieval. This idea is used explicitly in the k-nearest neighbors algorithm (kNN), a popular classification method. In this paper, the idea is adopted in the development of a general methodology, neighborhood counting, for devising similarity functions. We turn our focus from neighbors to neighborhoods, regions in the data space covering the data point in question. To measure the similarity between two data points, we consider all neighborhoods that cover both points, and propose to use the number of such neighborhoods as a measure of similarity. A neighborhood can be defined for different types of data in different ways. Here, we consider one definition of neighborhood for multivariate data and derive a formula for the resulting similarity, called the neighborhood counting measure, or NCM. NCM was tested experimentally in the framework of kNN. Experiments show that NCM is generally comparable to VDM and its variants, the state-of-the-art distance functions for multivariate data, and at the same time is consistently better for relatively large k values. Additionally, NCM consistently outperforms HEOM (a mixture of Euclidean and Hamming distances), the "standard" and most widely used distance function for multivariate data. NCM has a computational complexity of the same order as the standard Euclidean distance function, is task-independent, and works for numerical and categorical data in a conceptually uniform way. The neighborhood counting methodology is thus shown experimentally to be sound for multivariate data. We hope it will work for other types of data as well.
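For integer-valued attributes, one concrete instantiation of neighborhood counting treats a neighborhood as an axis-aligned box of attribute intervals, so the similarity of two points is the product, over attributes, of the number of intervals covering both values. The sketch below follows that reading; the domains and points are invented, and this is an illustration of the counting idea rather than the paper's exact formula.

```python
# Neighborhood counting for integer attributes: a neighborhood is an
# axis-aligned box of per-attribute intervals [s, e]; NCM(a, b) counts
# the boxes covering both points. Per attribute, the covering intervals
# are those with s <= min(a_i, b_i) and e >= max(a_i, b_i).
def ncm(a, b, domains):
    """domains: per-attribute (lo, hi) inclusive integer bounds."""
    total = 1
    for ai, bi, (lo, hi) in zip(a, b, domains):
        total *= (min(ai, bi) - lo + 1) * (hi - max(ai, bi) + 1)
    return total

doms = [(0, 9), (0, 9)]
print(ncm((3, 3), (3, 3), doms))  # identical points: (4*7)^2 = 784
print(ncm((0, 0), (9, 9), doms))  # opposite corners: only 1 covering box
```

Higher counts mean more similar: identical points are covered by every box containing them, while points at opposite corners of the domain are jointly covered only by the full-domain box. Used inside kNN, the point with the largest NCM to the query is its nearest neighbor.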
van Benthem, B H; Veugelers, P J; Cornelisse, P G; Strathdee, S A; Kaldor, J M; Shafer, K A; Coutinho, R A; van Griensven, G J
1998-06-18
To investigate the significance of the time from seroconversion to AIDS (the incubation time) and other covariates for survival from AIDS to death. In survival analysis, survival from AIDS to death was compared for different categories of incubation-time length, adjusted and unadjusted for other covariates, and significant predictors of survival from AIDS to death were investigated. Survival after AIDS was not affected by the incubation time in either univariate or multivariate analyses. Predictive factors for progression from AIDS to death were age at seroconversion, type of AIDS diagnosis, and CD4 cell count at AIDS. The relative hazard for age at seroconversion increased 1.38-fold over 10 years. Men with a CD4 cell count at AIDS of <130 × 10^6/l had a twofold higher risk of progression to death than men with higher CD4 cell counts. Persons diagnosed with lymphoma had a sixfold higher risk of progression to death than persons with Kaposi's sarcoma or opportunistic infections. The incubation time, as well as other factors before AIDS, did not affect survival after AIDS. Survival from AIDS to death can be predicted from data obtained at the time of AIDS diagnosis, such as type of diagnosis, age, and CD4 cell count. AIDS appears to be a significant point in progression to death, and not just a floating point between infection and death affected by prior factors, for persons who did not receive effective therapy and did not have long incubation times.
Grabowski, Nils Th; Klein, Günter
2017-01-01
To increase the shelf life of edible insects, modern techniques (e.g. freeze-drying) add to the traditional methods (degutting, boiling, sun-drying, or roasting). However, microorganisms become inactivated rather than killed, and when rehydrated, many return to vegetative stages. Crickets (Gryllus bimaculatus) and superworms (Zophobas atratus) were subjected to four different drying techniques (T1 = 10' cooking, 24 h drying at 60℃; T2 = 10' cooking, 24 h drying at 80℃; T3 = 30' cooking, 12 h drying at 80℃ and 12 h drying at 100℃; T4 = boiling T3-treated insects after five days) and analysed for total bacteria counts, Enterobacteriaceae, staphylococci, bacilli, yeast and mould counts, E. coli, salmonellae, and Listeria monocytogenes (the latter three being negative throughout). The microbial counts varied strongly, displaying species- and treatment-specific patterns. T3 was the most effective of the drying treatments tested at decreasing all counts except bacilli, for which T2 was more efficient. Still, total bacteria counts remained high (G. bimaculatus > Z. atratus). Other opportunistically pathogenic microorganisms (Bacillus thuringiensis, B. licheniformis, B. pumilis, Pseudomonas aeruginosa, and Cryptococcus neoformans) were also encountered. The tyndallisation-like T4 reduced all counts to below the detection limit, but nutrient leakage should be considered with regard to food quality. In conclusion, species-specific drying procedures should be devised to ensure food safety. © The Author(s) 2016.
Laureshyn, Aliaksei; Goede, Maartje de; Saunier, Nicolas; Fyhri, Aslak
2017-08-01
Relying on accident records as the main data source for studying cyclists' safety has many drawbacks, such as a high degree of under-reporting, the lack of accident details, and particularly the lack of information about the interaction processes that led to the accident. It also poses an ethical problem, as one has to wait for accidents to happen in order to make a statement about cyclists' (un-)safety. In this perspective, the use of surrogate safety measures based on actual observations in traffic is very promising. In this study we used video data from three intersections in Norway that were all independently analysed using three methods: the Swedish traffic conflict technique (Swedish TCT), the Dutch conflict technique (DOCTOR), and the probabilistic surrogate measures of safety (PSMS) technique developed in Canada. The first two methods are based on manual detection and counting of critical events in traffic (traffic conflicts), while the third considers probabilities of multiple trajectories for each interaction and delivers a density map of potential collision points per site. Due to its extensive use of microscopic data, the PSMS technique relies heavily on automated tracking of road users in video. Across the three sites, the methods show similarities with, or are at least "compatible" with, the accident records. The two conflict techniques agree quite well on the number, type, and location of conflicts, but some differences with no obvious explanation are also found. PSMS reports many more safety-relevant interactions, including less severe events. The location of the potential collision points is compatible with what the conflict techniques suggest, but a possibly significant share of false alarms, due to inaccurate trajectories extracted from video, complicates the comparison. The tested techniques still require enhancement, with respect to better adjustment to the analysis of situations involving cyclists (and vulnerable road users in general), and further validation.
However, we believe this to be a future direction for the road safety analysis as the number of accidents is constantly decreasing and the quality of accident data does not seem to improve. Copyright © 2016 Elsevier Ltd. All rights reserved.
Performance analysis of a dual-tree algorithm for computing spatial distance histograms
Chen, Shaoping; Tu, Yi-Cheng; Xia, Yuni
2011-01-01
Many scientific and engineering fields produce large volumes of spatiotemporal data. The storage, retrieval, and analysis of such data pose great challenges to database system design. Analysis of scientific spatiotemporal data often involves computing functions of all point-to-point interactions. One such analytic, the Spatial Distance Histogram (SDH), is of vital importance to scientific discovery. Recently, algorithms for efficient SDH processing in large-scale scientific databases have been proposed. These algorithms adopt a recursive tree-traversing strategy to process point-to-point distances in the visited tree nodes in batches, and thus require less time than the brute-force approach, in which all pairwise distances must be computed. Despite the promising experimental results, the complexity of such algorithms has not been thoroughly studied. In this paper, we present an analysis of such algorithms based on a geometric modeling approach. The main technique is to transform the analysis of point counts into a problem of quantifying the area of regions where pairwise distances can be processed in batches by the algorithm. From the analysis, we conclude that the number of pairwise distances left to be processed decreases exponentially with the number of tree levels visited. This leads to the proof of a time complexity lower than the quadratic time needed for a brute-force algorithm, and builds the foundation for a constant-time approximate algorithm. Our model is also general, in that it works for a wide range of point spatial distributions, histogram types, and space-partitioning options in building the tree. PMID:21804753
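The quantity being computed is easy to pin down via the quadratic baseline: a histogram of all pairwise distances with a fixed bucket width. The dual-tree algorithms analysed here produce the same counts while resolving many pairs at the tree-node level; the sketch below shows only the O(n^2) brute-force reference, with invented points.

```python
# Reference brute-force SDH: histogram of all pairwise distances with a
# fixed bucket width. Tree-based SDH algorithms return identical counts
# by assigning whole node pairs to a bucket when the min and max
# inter-node distances fall in the same bucket.
import math

def sdh_brute(points, bucket_width, n_buckets):
    hist = [0] * n_buckets
    for i in range(len(points)):
        for j in range(i + 1, len(points)):
            d = math.dist(points[i], points[j])
            k = min(int(d // bucket_width), n_buckets - 1)  # clamp tail
            hist[k] += 1
    return hist

pts = [(0, 0), (3, 0), (0, 4)]  # pairwise distances 3, 4, 5
print(sdh_brute(pts, bucket_width=2.0, n_buckets=3))  # [0, 1, 2]
```

The paper's geometric analysis quantifies how many point pairs escape the node-level shortcut at each tree level, which is what yields the sub-quadratic bound.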
Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.
Klumpp, John; Brandl, Alexander
2015-03-01
A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources than analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility passing beneath an airborne detector. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection system are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
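The time-interval idea can be illustrated with a minimal sketch (hypothetical Python, not the proposed Bayesian system): inter-arrival times of a Poisson counting process are exponentially distributed, so a log-likelihood ratio over observed intervals separates a background-only rate from background plus source:

```python
import math, random

def interval_llr(intervals, bg_rate, src_rate):
    """Log-likelihood ratio of exponentially distributed inter-arrival
    times: (background + source) rate vs background alone.  Positive
    values favour the presence of a source."""
    r0, r1 = bg_rate, bg_rate + src_rate
    return sum((math.log(r1) - r1 * t) - (math.log(r0) - r0 * t)
               for t in intervals)

rng = random.Random(1)
hot = [rng.expovariate(5.0) for _ in range(200)]   # source present
cold = [rng.expovariate(2.0) for _ in range(200)]  # background only
```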
Peng, Nie; Bang-Fa, Ni; Wei-Zhi, Tian
2013-02-01
Application of effective interaction depth (EID) principle for parametric normalization of full energy peak efficiencies at different counting positions, originally for quasi-point sources, has been extended to bulky sources (within ∅30 mm×40 mm) with arbitrary matrices. It is also proved that the EID function for quasi-point source can be directly used for cylindrical bulky sources (within ∅30 mm×40 mm) with the geometric center as effective point source for low atomic number (Z) and low density (D) media and high energy γ-rays. It is also found that in general EID for bulky sources is dependent upon Z and D of the medium and the energy of the γ-rays in question. In addition, the EID principle was theoretically verified by MCNP calculations. Copyright © 2012 Elsevier Ltd. All rights reserved.
Chandra's Darkest Bright Star: not so Dark after All?
NASA Astrophysics Data System (ADS)
Ayres, Thomas R.
2008-11-01
The Chandra High Resolution camera (HRC) has obtained numerous short exposures of the ultraviolet (UV)-bright star Vega (α Lyrae; HD 172167: A0 V), to calibrate the response of the detector to out-of-band (non-X-ray) radiation. A new analysis uncovered a stronger "blue leak" in the imaging section (HRC-I) than reported in an earlier study of Vega based on a subset of the pointings. The higher count rate—a factor of nearly 2 above prelaunch estimates—raised the possibility that genuine coronal X-rays might lurk among the out-of-band events. Exploiting the broader point-spread function of the UV leak compared with soft X-rays identified an excess of counts centered on the target, technically at 3σ significance. A number of uncertainties, however, prevent a clear declaration of a Vegan corona. A more secure result would be within reach of a deep uninterrupted HRC-I pointing.
A photographic technique for estimating egg density of the white pine weevil, Pissodes strobi (Peck)
Roger T. Zerillo
1975-01-01
Compares a photographic technique with visual and dissection techniques for estimating egg density of the white pine weevil, Pissodes strobi (Peck). The relatively high correlations (0.67 and 0.79) between counts from photographs and those obtained by dissection indicate that the non-destructive photographic technique could be a useful tool for...
Cosmology Constraints from the Weak Lensing Peak Counts and the Power Spectrum in CFHTLenS
Liu, Jia; May, Morgan; Petri, Andrea; ...
2015-03-04
Lensing peaks have been proposed as a useful statistic, containing cosmological information from non-Gaussianities that is inaccessible from traditional two-point statistics such as the power spectrum or two-point correlation functions. Here we examine constraints on cosmological parameters from weak lensing peak counts, using the publicly available data from the 154 deg2 CFHTLenS survey. We utilize a new suite of ray-tracing N-body simulations on a grid of 91 cosmological models, covering broad ranges of the three parameters Ω m, σ 8, and w, and replicating the galaxy sky positions, redshifts, and shape noise in the CFHTLenS observations. We then build an emulator that interpolates the power spectrum and the peak counts to an accuracy of ≤ 5%, and compute the likelihood in the three-dimensional parameter space (Ω m, σ 8, w) from both observables. We find that constraints from peak counts are comparable to those from the power spectrum, and somewhat tighter when different smoothing scales are combined. Neither observable can constrain w without external data. When the power spectrum and peak counts are combined, the area of the error “banana” in the (Ω m, σ 8) plane reduces by a factor of ≈ two, compared to using the power spectrum alone. For a flat Λ cold dark matter model, combining both statistics, we obtain the constraint σ 8(Ω m/0.27)0.63 = 0.85 +0.03 -0.03.
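The final likelihood step can be sketched under a simplifying diagonal-covariance assumption (the paper's emulator and covariance treatment are more involved); `gaussian_loglike` is a hypothetical helper comparing an observed statistic to an emulated model prediction:

```python
import math

def gaussian_loglike(obs, model, sigma):
    """Diagonal-covariance Gaussian log-likelihood of an observed
    statistic (power spectrum bins or peak counts) given an emulated
    model prediction."""
    return -0.5 * sum(((o - m) / s) ** 2 + math.log(2 * math.pi * s * s)
                      for o, m, s in zip(obs, model, sigma))
```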
Cuatianquiz Lima, Cecilia; Macías Garcia, Constantino
2016-01-01
Secondary cavity nesting (SCN) birds breed in holes that they do not excavate themselves. This is possible where there are large trees whose size and age permit the digging of holes by primary excavators and only rarely happens in forest plantations, where we expected a deficit of both breeding holes and SCN species. We assessed whether the availability of tree cavities influenced the number of SCNs in two temperate forest types, and evaluated the change in number of SCNs after adding nest boxes. First, we counted all cavities within each of our 25-m radius sampling points in mature and young forest plots during 2009. We then added nest boxes at standardised locations during 2010 and 2011 and conducted fortnightly bird counts (January-October 2009-2011). In 2011 we added two extra plots of each forest type, where we also conducted bird counts. Prior to adding nest boxes, counts revealed more SCNs in mature than in young forest. Following the addition of nest boxes, the number of SCNs increased significantly in the points with nest boxes in both types of forest. Counts in 2011 confirmed the increase in number of birds due to the addition of nest boxes. Given the likely benefits associated with a richer bird community we propose that, as is routinely done in some countries, forest management programs preserve old tree stumps and add nest boxes to forest plantations in order to increase bird numbers and bird community diversity.
Herbeck, Joshua T.; Müller, Viktor; Maust, Brandon S.; Ledergerber, Bruno; Torti, Carlo; Di Giambenedetto, Simona; Gras, Luuk; Günthard, Huldrych F.; Jacobson, Lisa P.; Mullins, James I.; Gottlieb, Geoffrey S.
2013-01-01
Objective The potential for changing HIV-1 virulence has significant implications for the AIDS epidemic, including changing HIV transmission rates, rapidity of disease progression, and timing of ART. Published data to date have provided conflicting results. Design We conducted a meta-analysis of changes in baseline CD4+ T-cell counts and set point plasma viral RNA load over time in order to establish whether summary trends are consistent with changing HIV-1 virulence. Methods We searched PubMed for studies of trends in HIV-1 prognostic markers of disease progression and supplemented findings with publications referenced in epidemiological or virulence studies. We identified 12 studies of trends in baseline CD4+ T-cell counts (21 052 total individuals), and eight studies of trends in set point viral loads (10 785 total individuals), spanning the years 1984–2010. Using random-effects meta-analysis, we estimated summary effect sizes for trends in HIV-1 plasma viral loads and CD4+ T-cell counts. Results Baseline CD4+ T-cell counts showed a summary trend of decreasing cell counts [effect=−4.93 cells/µl per year, 95% confidence interval (CI) −6.53 to −3.3]. Set point viral loads showed a summary trend of increasing plasma viral RNA loads (effect=0.013 log10 copies/ml per year, 95% CI −0.001 to 0.03). The trend rates decelerated in recent years for both prognostic markers. Conclusion Our results are consistent with increased virulence of HIV-1 over the course of the epidemic. Extrapolating over the 30 years since the first description of AIDS, this represents a CD4+ T-cell loss of approximately 148 cells/µl and a gain of 0.39 log10 copies/ml of viral RNA measured during early infection. These effect sizes would predict increasing rates of disease progression and need for ART, as well as increasing transmission risk. PMID:22089381
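The random-effects summary used here is typically computed with the DerSimonian-Laird estimator; a minimal sketch (hypothetical Python, not the authors' code):

```python
def dersimonian_laird(effects, variances):
    """DerSimonian-Laird random-effects pooling: returns the summary
    effect and the between-study variance tau^2."""
    w = [1.0 / v for v in variances]
    fixed = sum(wi * y for wi, y in zip(w, effects)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, effects))
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)
    # Re-weight each study by total (within + between) variance.
    w_star = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * y for wi, y in zip(w_star, effects)) / sum(w_star)
    return pooled, tau2
```

With perfectly homogeneous studies the heterogeneity statistic Q is zero, so tau² collapses to 0 and the pooled effect reduces to the fixed-effect weighted mean.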
Bacteriological evaluation of Allium sativum oil as a new medicament for pulpotomy of primary teeth
Mohammad, Shukry Gamal; Baroudi, Kusai
2015-01-01
Objective: To compare the effects of Allium sativum oil and formocresol on the pulp tissue of the pulpotomized teeth. Materials and Methods: Twenty children were selected for this study. All children had a pair of non-vital primary molars. A sterile paper point was dipped in the root canals prior to the mortal pulpotomy. These paper points were collected in transfer media and immediately transported to the microbiological lab to be investigated microbiologically (for Streptococcus mutans and Lactobacillus acidophilus). Then the procedure of mortal pulpotomy was performed. After 2 weeks, the cotton pellets were removed and sterile paper points were dipped in the root canals for microbiological examination. Then comparison between the count of bacteria before and after treatment was conducted. Statistical analysis was performed using independent t-test and paired t-test at the significance level of α = 0.05. Results: After application of both medicaments, there was a marked decrease in S. mutans and L. acidophilus counts. The difference between the mean of log values of the count before and after the application was highly significant for both medicaments (P < 0.05); however, better results were obtained when A. sativum oil was used. Conclusion: A. sativum oil had more powerful antimicrobial effects than formocresol on the bacteria of the infected root canals. PMID:25992338
Hunt, GB; Luff, J; Daniel, L; Van den Bergh, R.
2015-01-01
The aims of this prospective study were to quantify steatosis in dogs with congenital portosystemic shunts (CPS) using a fat-specific stain, to compare the amount of steatosis in different lobes of the liver, and to evaluate intra- and inter-observer variability in lipid point counting. Computer-assisted point counting of lipid droplets was undertaken following Oil Red O staining in 21 dogs with congenital portosystemic shunts and 9 control dogs. Dogs with congenital portosystemic shunts had significantly more small lipid droplets (<6 μm) than control dogs (p = 0.0013 and 0.0002, respectively). There was no significant difference in steatosis between liver lobes for either control or CPS dogs. Significant differences were seen between observers for the number of large lipid droplets (>9 μm) and lipogranulomas per tissue point (p = 0.023 and 0.01, respectively). In conclusion, computer-assisted counting of lipid droplets following Oil Red O staining of liver biopsy samples allows objective measurement and detection of significant differences between dogs with CPS and normal dogs. This method will allow future evaluation of the relationship between different presentations of CPS (anatomy, age, breed) and lipidosis, as well as the impact of hepatic lipidosis on outcomes following surgical shunt attenuation. PMID:23528942
Laser-induced photo emission detection: data acquisition based on light intensity counting
NASA Astrophysics Data System (ADS)
Yulianto, N.; Yudasari, N.; Putri, K. Y.
2017-04-01
Laser Induced Breakdown Detection (LIBD) is a quantification technique for colloids. There are two modes of detection in LIBD: optical and acoustic. LIBD is based on the detection of plasma emission due to the interaction between a particle and the laser beam. In this research, the change of light intensity during plasma formation was detected by a photodiode sensor. A photo-emission data acquisition system was built to collect the signals and transform them into digital counts. The real-time system used a National Instruments DAQ 6009 data acquisition device and LabVIEW software. The system was tested on distilled water and tap water samples. The result showed 99.8% accuracy for the counting technique in comparison to acoustic detection at a sample rate of 10 Hz, so the acquisition system can be applied as an alternative to the existing LIBD acquisition system.
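Counting plasma flashes in a digitized photodiode trace reduces to counting rising-edge threshold crossings; a minimal sketch (hypothetical Python, with an assumed threshold in arbitrary ADC units):

```python
def count_pulses(samples, threshold):
    """Count rising-edge threshold crossings in a digitized photodiode
    trace: each plasma flash registers as one count."""
    count, above = 0, False
    for s in samples:
        if s >= threshold and not above:
            count += 1
        above = s >= threshold
    return count

# Three distinct flashes in this toy trace.
n = count_pulses([0, 1, 5, 6, 2, 0, 0, 7, 3, 0, 8, 8, 1], threshold=4)
```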
NASA Technical Reports Server (NTRS)
Jenniskens, Peter; Crawford, Chris; Butow, Steven J.; Nugent, David; Koop, Mike; Holman, David; Houston, Jane; Jobse, Klaas; Kronk, Gary
2000-01-01
A new hybrid technique of visual and video meteor observations was developed to provide high-precision near real-time flux measurements for satellite operators from airborne platforms. A total of 33,000 Leonids, recorded on video during the 1999 Leonid storm, were watched by a team of visual observers using a video head display and an automatic counting tool. The counts reveal that the activity profile of the Leonid storm is a Lorentz profile. By assuming a radial profile for the dust trail that is also a Lorentzian, we make predictions for future encounters. If that assumption is correct, we passed 0.0003 AU deeper into the 1899 trailet than expected during the storm of 1999, and future encounters with the 1866 trailet will be less intense than predicted elsewhere.
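A Lorentz (Cauchy) activity profile has a simple closed form; a minimal sketch (hypothetical Python, with illustrative parameter values) shows the defining property that the flux falls to half its peak at ±FWHM/2 from the maximum:

```python
def lorentz_profile(t, peak, t_max, fwhm):
    """Lorentzian activity profile: peak flux at t_max, falling to half
    the peak at t_max +/- fwhm/2."""
    half = fwhm / 2.0
    return peak * half ** 2 / ((t - t_max) ** 2 + half ** 2)
```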
Statistical properties of several models of fractional random point processes
NASA Astrophysics Data System (ADS)
Bendjaballah, C.
2011-08-01
Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
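The reduced-variance criterion mentioned here is the Fano factor of the counting distribution; a minimal sketch (hypothetical Python):

```python
def fano_factor(counts):
    """Reduced variance (Fano factor) Var/Mean of counts per window:
    1 for a Poisson process, < 1 for sub-Poissonian ('nonclassical')
    statistics."""
    n = len(counts)
    mean = sum(counts) / n
    var = sum((c - mean) ** 2 for c in counts) / n
    return var / mean
```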
DOE Office of Scientific and Technical Information (OSTI.GOV)
Windhorst, Rogier A.; Cohen, Seth H.; Mechtley, Matt
2011-04-01
We describe the Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) Early Release Science (ERS) observations in the Great Observatories Origins Deep Survey (GOODS) South field. The new WFC3 ERS data provide calibrated, drizzled mosaics in the UV filters F225W, F275W, and F336W, as well as in the near-IR filters F098M (Y_s), F125W (J), and F160W (H) with 1-2 HST orbits per filter. Together with the existing HST Advanced Camera for Surveys (ACS) GOODS-South mosaics in the BViz filters, these panchromatic 10-band ERS data cover 40-50 arcmin² at 0.2-1.7 μm in wavelength at 0.07″-0.15″ FWHM resolution and 0.090″ Multidrizzled pixels to depths of AB ≈ 26.0-27.0 mag (5σ) for point sources, and AB ≈ 25.5-26.5 mag for compact galaxies. In this paper, we describe (1) the scientific rationale, and the data taking plus reduction procedures of the panchromatic 10-band ERS mosaics, (2) the procedure of generating object catalogs across the 10 different ERS filters, and the specific star-galaxy separation techniques used, and (3) the reliability and completeness of the object catalogs from the WFC3 ERS mosaics. The excellent 0.07″-0.15″ FWHM resolution of HST/WFC3 and ACS makes star-galaxy separation straightforward over a factor of 10 in wavelength to AB ≈ 25-26 mag from the UV to the near-IR, respectively.
Our main results are: (1) proper motion of faint ERS stars is detected over 6 years at 3.06 ± 0.66 mas yr⁻¹ (4.6σ), consistent with Galactic structure models; (2) both the Galactic star counts and the galaxy counts show mild but significant trends of decreasing count slopes from the mid-UV to the near-IR over a factor of 10 in wavelength; (3) combining the 10-band ERS counts with the panchromatic Galaxy and Mass Assembly survey counts at the bright end (10 mag ≲ AB ≲ 20 mag) and the Hubble Ultra Deep Field counts in the BVizY_sJH filters at the faint end (24 mag ≲ AB ≲ 30 mag) yields galaxy counts that are well measured over the entire flux range 10 mag ≲ AB ≲ 30 mag for 0.2-2 μm in wavelength; (4) simple luminosity+density evolution models can fit the galaxy counts over this entire flux range. However, no single model can explain the counts over this entire flux range in all 10 filters simultaneously. More sophisticated models of galaxy assembly are needed to reproduce the overall constraints provided by the current panchromatic galaxy counts for 10 mag ≲ AB ≲ 30 mag over a factor of 10 in wavelength.
Microbial and nutritional aspects on the production of live feeds in a fish farming industry.
De Donno, A; Lugoli, F; Bagordo, F; Vilella, S; Campa, A; Grassi, T; Guido, M
2010-03-01
Aquaculture is an enterprise in constant development, particularly with regard to its effect on the environment and the quality of its products. It represents a valid alternative to traditional fishing, meeting the increasing demand for fish products. To guarantee the consumer a product of high nutritional, organoleptic and hygienic quality, it is fundamental to monitor every phase of the fish farming industry, isolating the potential risk points. For this reason there has been a rapid evolution of production techniques, particularly in the technology, artificial reproduction and feed sectors. The aim of this research was to monitor the evolution of certain microbial and nutritional quality indexes (total microbial counts and lipid analysis on suspensions of Rotifers and Artemia, used as live feed) during the larval phase of the production cycle of farm-raised fish in an intensive system. The study showed an increase in total microbial counts during the production of Rotifers and Artemia, more evident in the Rotifer suspensions. In addition, the study demonstrated that the maintenance phase of the enrichment protocol can reduce the EPA and DHA content. The results confirm the importance of microbial and nutritional control of live feeds before they are supplied to fish larvae.
Yim, Yun-Kyoung; Lee, Hyun; Hong, Kwon-Eui; Kim, Young-Il; Lee, Byung-Ryul; Kim, Tae-Han; Yi, Ji-Young
2006-01-01
AIM: To investigate the hepatoprotective effect of manual acupuncture at Yanglingquan (GB34) on CCl4-induced chronic liver damage in rats. METHODS: Rats were injected intraperitoneally with CCl4 (1 mL/kg) and treated with manual acupuncture using reinforcing manipulation techniques at left GB34 (Yanglingquan) 3 times a week for 10 wk. A non-acupoint in left gluteal area was selected as a sham point. To estimate the hepatoprotective effect of manual acupuncture at GB34, measurement of liver index, biochemical assays including serum ALT, AST, ALP and total cholesterol, histological analysis and blood cell counts were conducted. RESULTS: Manual acupuncture at GB34 reduced the liver index, serum ALT, AST, ALP and total cholesterol levels as compared with the control group and the sham acupuncture group. It also increased and normalized the populations of WBC and lymphocytes. CONCLUSION: Manual acupuncture with reinforcing manipulation techniques at left GB34 reduces liver toxicity, protects liver function and liver tissue, and normalizes immune activity in CCl4-intoxicated rats. PMID:16610030
PRESAGE: Protecting Structured Address Generation against Soft Errors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, Vishal C.; Gopalakrishnan, Ganesh; Krishnamoorthy, Sriram
Modern computer scaling trends in pursuit of larger component counts and power efficiency have, unfortunately, led to less reliable hardware and consequently soft errors escaping into application data ("silent data corruptions"). Techniques to enhance system resilience hinge on the availability of efficient error detectors that have high detection rates, low false positive rates, and low computational overhead. Unfortunately, efficient detectors for faults during address generation (to index large arrays) have not been widely researched. We present a novel lightweight compiler-driven technique called PRESAGE for detecting bit-flips affecting structured address computations. A key insight underlying PRESAGE is that any address computation scheme that flows an already incurred error is better than a scheme that corrupts one particular array access but otherwise (falsely) appears to compute perfectly. Enabling the flow of errors allows one to situate detectors at loop exit points, and helps turn silent corruptions into easily detectable error situations. Our experiments using the PolyBench benchmark suite indicate that PRESAGE-based error detectors have a high error-detection rate while incurring low overheads.
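The loop-exit detector idea can be sketched as follows (hypothetical Python, not PRESAGE itself, which is a compiler transformation over real address arithmetic): the strided index is allowed to flow incrementally, so that any corruption propagates into later iterations and into the final index value, which is then checked once against a closed-form reference when the loop exits:

```python
def sum_with_address_check(arr, stride):
    """Sketch of the loop-exit detector: the strided index flows
    incrementally (a bit-flip would propagate to the final value),
    and is verified once, at loop exit, against a closed form."""
    total, idx, n_iters = 0, 0, 0
    while idx < len(arr):
        total += arr[idx]
        idx += stride
        n_iters += 1
    if idx != n_iters * stride:   # detector fires only under corruption
        raise RuntimeError("address corruption detected")
    return total
```

A single check at loop exit is far cheaper than validating every address inside the loop body, which is the efficiency argument the abstract makes.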
Selective photon counter for digital x-ray mammography tomosynthesis
NASA Astrophysics Data System (ADS)
Goldan, Amir H.; Karim, Karim S.; Rowlands, J. A.
2006-03-01
Photon counting is an emerging detection technique that is promising for mammography tomosynthesis imagers. In photon counting systems, the value of each image pixel equals the number of photons that interact with the detector. In this research, we introduce the design and implementation of a novel, low-noise selective photon counting pixel for digital mammography tomosynthesis in crystalline silicon CMOS (complementary metal oxide semiconductor) 0.18 micron technology. The design comprises a low-noise charge amplifier (CA), two low-offset-voltage comparators, a decision-making unit (DMU), a mode selector, and a pseudo-random counter. Theoretical calculations and simulation results for the linearity, gain, and noise of the photon counting pixel are presented.
Screening of probiotic goat milk tablets using Plackett-Burman design.
Chen, He; Zhang, Jianhua; Shu, Guowei
2014-01-01
Probiotics, defined as supplemental live microorganisms, were added to goat milk powder; they not only improve intestinal flora balance but also promote human and animal health. The objectives of this study were to guarantee a high viable probiotic count together with consumer acceptance. Plates with between 30 and 300 colonies were selected for reading, and the viable count per gram of goat milk tablet (cfu/g) was calculated. The items of sensory evaluation included appearance, flavour, colour, texture and taste. A panel of 5 trained assessors scored combinations of different formulations (out of a full mark of 100 points) and recorded the results. Analysis of the results showed that sucrose, inulin and mannitol were the main parameters affecting both viable count and sensory evaluation. Optimization of the formulation of the probiotic goat milk tablets maximised the viable probiotic count at 9.5×10⁸ cfu/g, with a sensory evaluation score of 94 points. Future probiotic products will combine a variety of probiotics, which can display their respective advantages and characteristics. Thus the products will not only be in accordance with the requirements of human health and the trend of social development, but will also quickly become a favorite among consumers.
Accuracy or precision: Implications of sample design and methodology on abundance estimation
Kowalewski, Lucas K.; Chizinski, Christopher J.; Powell, Larkin A.; Pope, Kevin L.; Pegg, Mark A.
2015-01-01
Sampling by spatially replicated counts (point counts) is an increasingly popular method of estimating the population size of organisms. Challenges exist when sampling by the point-count method: it is often impractical to sample the entire area of interest and impossible to detect every individual present. Ecologists encounter logistical limitations that force them to sample either a few large sample units or many small sample units, introducing biases to sample counts. We generated a computer environment and simulated sampling scenarios to test the role of the number of samples, sample unit area, number of organisms, and distribution of organisms in the estimation of population sizes using N-mixture models. Many sample units of small area provided estimates that were consistently closer to true abundance than scenarios with few sample units of large area. However, scenarios with few sample units of large area provided more precise abundance estimates than those with many sample units of small area. It is important to consider the accuracy and precision of abundance estimates during the sample design process, with study goals and objectives fully recognized; in practice, however, accuracy and precision are often an afterthought addressed only during data analysis.
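A stripped-down version of such a simulation (hypothetical Python; plain density scaling rather than a full N-mixture model, and sample units may overlap) shows how unit count and unit area enter the estimate:

```python
import random

def simulate_estimates(n_organisms, n_units, unit_area, reps, seed=0):
    """Scatter organisms uniformly in a unit square, count those falling
    inside randomly placed square sample units (which may overlap), and
    scale the pooled count up to a population estimate."""
    rng = random.Random(seed)
    side = unit_area ** 0.5
    estimates = []
    for _ in range(reps):
        pts = [(rng.random(), rng.random()) for _ in range(n_organisms)]
        total = 0
        for _ in range(n_units):
            x0 = rng.random() * (1 - side)
            y0 = rng.random() * (1 - side)
            total += sum(x0 <= x < x0 + side and y0 <= y < y0 + side
                         for x, y in pts)
        # pooled density over sampled area, times total area (= 1)
        estimates.append(total / (n_units * unit_area))
    return estimates

# Many small units vs few large units, same total sampled area (25%).
many_small = simulate_estimates(1000, n_units=25, unit_area=0.01, reps=30)
few_large = simulate_estimates(1000, n_units=4, unit_area=0.0625, reps=30)
```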
Non-destructive lichen biomass estimation in northwestern Alaska: a comparison of methods.
Rosso, Abbey; Neitlich, Peter; Smith, Robert J
2014-01-01
Terrestrial lichen biomass is an important indicator of forage availability for caribou in northern regions, and can indicate vegetation shifts due to climate change, air pollution or changes in vascular plant community structure. Techniques for estimating lichen biomass have traditionally required destructive harvesting that is painstaking and impractical, so we developed models to estimate biomass from relatively simple cover and height measurements. We measured cover and height of forage lichens (including single-taxon and multi-taxa "community" samples, n = 144) at 73 sites on the Seward Peninsula of northwestern Alaska, and harvested lichen biomass from the same plots. We assessed biomass-to-volume relationships using zero-intercept regressions, and compared differences among two non-destructive cover estimation methods (ocular vs. point count), among four landcover types in two ecoregions, and among single-taxon vs. multi-taxa samples. Additionally, we explored the feasibility of using lichen height (instead of volume) as a predictor of stand-level biomass. Although lichen taxa exhibited unique biomass and bulk density responses that varied significantly by growth form, we found that single-taxon sampling consistently under-estimated true biomass and was constrained by the need for taxonomic experts. We also found that the point count method provided little to no improvement over ocular methods, despite increased effort. Estimated biomass of lichen-dominated communities (mean lichen cover: 84.9±1.4%) using multi-taxa, ocular methods differed only nominally among landcover types within ecoregions (range: 822 to 1418 g m⁻²). Height alone was a poor predictor of lichen biomass and should always be weighted by cover abundance. We conclude that the multi-taxa (whole-community) approach, when paired with ocular estimates, is the most reasonable and practical method for estimating lichen biomass at landscape scales in northwest Alaska.
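A zero-intercept (through-the-origin) regression has a one-line closed form; a minimal sketch (hypothetical Python) of the biomass-to-volume fit:

```python
def zero_intercept_slope(x, y):
    """Least-squares slope b for y = b*x (regression through the
    origin), e.g. harvested biomass as a function of cover-by-height
    volume."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)

slope = zero_intercept_slope([1.0, 2.0, 3.0], [2.0, 4.0, 6.0])  # 2.0
```

Forcing the intercept through zero encodes the physical constraint that zero lichen volume implies zero biomass.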
NASA Technical Reports Server (NTRS)
Wang, Alian; Kuebler, Karla E.; Jolliff, Bradley L.; Haskin, Larry A.
2003-01-01
Fe-Ti-Cr-oxide minerals contain much information about rock petrogenesis and alteration. Among the most important in the petrology of common intrusive and extrusive rocks are those of the FeO-TiO2-Cr2O3 compositional system: chromite, ulvöspinel-magnetite, and ilmenite-hematite. These minerals retain memories of oxygen fugacity. Their exsolution into companion mineral pairs gives constraints on formation temperature and cooling rate. Laser Raman spectroscopy is anticipated to be a powerful technique for characterization of materials on the surface of Mars. A Mars Microbeam Raman Spectrometer (MMRS) is under development. It combines a micro-sized laser beam and an automatic point-counting mechanism, and so can detect minor minerals or weak Raman-scattering phases such as Fe-Ti-Cr-oxides in mixtures (rocks & soils), and provide information on grain size and mineral mode. Most Fe-Ti-Cr-oxides produce weaker Raman signals than those from oxyanionic minerals, e.g. carbonates, sulfates, phosphates, and silicates, partly because most of them are intrinsically weaker Raman scatterers, and partly because their dark colors limit the penetration depth of the excitation laser beam (visible wavelength) and of the Raman radiation produced. The purpose of this study is to show how well the Fe-Ti-Cr-oxides can be characterized by on-surface planetary exploration using Raman spectroscopy. We studied the basic Raman features of common examples of these minerals using well-characterized individual mineral grains. The knowledge gained was then used to study the Fe-Ti-Cr-oxides in Martian meteorite EETA79001, especially effects of compositional and structural variations on their Raman features.
Nelms, C.O.; Twedt, D.J.; Smith, Winston Paul
1993-01-01
In 1992, the Vicksburg Field Research Station of the National Wetlands Research Center initiated research on the ecology of migratory birds within forests of the Mississippi Alluvial Valley (MAV). The MAV was historically a nearly contiguous bottomland hardwood forest; however, only remnants remain. These remnants are fragmented and often influenced by drainage projects, silviculture, agriculture, and urban development. Our objectives are to assess species richness and relative abundance, and to relate these to the size, quality, and composition of forest stands. Species richness and relative abundance were estimated for 53 randomly selected forest sites using 1 to 8 point counts per site, depending on the size of the forest fragment. However, statistical comparisons among sites will be restricted to an equal number of point counts within the sites being compared. Point counts, lasting five minutes, were conducted from 11 May to 29 June 1992, following Ralph, Sauer, and Droege (Point Count Standards; memo dated 9 March 1992). Vegetation was measured at the first three points on each site using a modification of the methods employed by Martin and Roper (Condor 90:51-57; 1988). During 252 counts, 71 species were encountered, but only 62 species were encountered within a 50-m radius of point center. The mean number of species encountered within 50 m of a point was 7.3 (s.d. = 2.7) and the mean number of individuals was 11.2 (s.d. = 4.2). The mean number of species detected at any distance was 9.6 (s.d. = 2.8) and the mean number of individuals was 15.6 (s.d. = 7.9). The most frequently encountered warblers in the MAV were Prothonotary Warbler and Northern Parula. Rarely encountered warblers were American Redstart and Worm-eating Warbler. The genera Quercus, Ulmus, Carya, and Celtis were each encountered at 80 or more of the 152 points at which vegetation was sampled.
Species most frequently encountered were sugarberry (Celtis laevigata), water hickory (Carya aquatica), American elm (Ulmus americana), sweetgum (Liquidambar styraciflua), and willow oak (Quercus phellos). The mean basal area of all trees ≥10 cm diameter-at-breast height (dbh) was 28 m2/ha (range 7-70). The mean canopy cover was 87 percent, mean canopy height was 20 m, ground cover was 60 percent, and vegetation density (2-7 m) was 47 percent. The most frequently encountered understory species were sugarberry, ash (Fraxinus spp.), maple (Acer spp.), and elm (Ulmus spp.). A cooperative GIS effort among the U.S. Fish and Wildlife Service, the Nature Conservancy, and the University of Arkansas is currently classifying forested habitats within the MAV. This effort will provide information on stand size and topology which will be used in concert with our current data, and data from visits to additional forest stands in 1993, to assess the relationship between the size, quality, and composition of forests within the MAV and their breeding bird community.
Dermatoglyphics: A Diagnostic Aid?
Fuller, I. C.
1973-01-01
Dermatoglyphics of patients suffering from diabetes, schizophrenia, duodenal ulcer, asthma, and various cancers have been contrasted and significant differences in the digital ridge counts, maximum atd angles, and distal palmar loop ridge counts have been found. A discriminant analysis of the digital ridge counts was performed and the function was used to attempt differential diagnosis between these conditions on dermatoglyphic evidence alone. This diagnostic trial failed, and possible reasons for its failure are discussed. Attention is drawn to the possibility that prognostic implications of dermatoglyphics might be relevant to screening techniques. PMID:4714584
SU-E-I-79: Source Geometry Dependence of Gamma Well-Counter Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, M; Belanger, A; Kijewski, M
Purpose: To determine the effect of liquid sample volume and geometry on counting efficiency in a gamma well-counter, and to assess the relative contributions of sample geometry and self-attenuation. Gamma well-counters are standard equipment in clinical and preclinical studies, for measuring patient blood radioactivity and quantifying animal tissue uptake for tracer development and other purposes. Accurate measurements are crucial. Methods: Count rates were measured for aqueous solutions of 99m-Tc at four liquid volume values in a 1-cm-diam tube and at six volume values in a 2.2-cm-diam vial. Total activity was constant for all volumes, and data were corrected for decay. Count rates from a point source in air, supported by a filter paper, were measured at seven heights between 1.3 and 5.7 cm from the bottom of a tube. Results: Sample volume effects were larger for the tube than for the vial. For the tube, count efficiency relative to a 1-cc volume ranged from 1.05 at 0.05 cc to 0.84 at 3 cc. For the vial, relative count efficiency ranged from 1.02 at 0.05 cc to 0.87 at 15 cc. For the point source, count efficiency relative to 1.3 cm from the tube bottom ranged from 0.98 at 1.8 cm to 0.34 at 5.7 cm. The relative efficiency of a 3-cc liquid sample in a tube compared to a 1-cc sample is 0.84; the average relative efficiency for the solid sample in air between heights in the tube corresponding to the surfaces of those volumes (1.3 and 4.8 cm) is 0.81, implying that the major contribution to efficiency loss is geometry, rather than attenuation. Conclusion: Volume-dependent correction factors should be used for accurate quantitation of radioactive liquid samples. Solid samples should be positioned at the bottom of the tube for maximum count efficiency.
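A volume-dependent correction of the kind the conclusion recommends can be sketched by interpolating measured relative efficiencies; the endpoint values below are the tube figures quoted in the abstract, and linear interpolation between them is an assumption:

```python
import numpy as np

# Relative count efficiencies for liquid samples in the 1-cm tube, from
# the abstract (1.05 at 0.05 cc, 1.00 at the 1-cc reference, 0.84 at
# 3 cc); linear interpolation between these points is an assumption.
volumes_cc = np.array([0.05, 1.0, 3.0])
rel_efficiency = np.array([1.05, 1.00, 0.84])

def volume_correction(v_cc):
    """Multiplicative correction factor = 1 / interpolated efficiency."""
    return 1.0 / np.interp(v_cc, volumes_cc, rel_efficiency)

# e.g. correct a 3-cc sample's observed counts back to 1-cc geometry
corrected = 1000.0 * volume_correction(3.0)
```

In practice the efficiency curve would be calibrated per counter and per tube geometry rather than taken from published values.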
Automated Video-Based Traffic Count Analysis.
DOT National Transportation Integrated Search
2016-01-01
The goal of this effort has been to develop techniques that could be applied to the : detection and tracking of vehicles in overhead footage of intersections. To that end we : have developed and published techniques for vehicle tracking based on dete...
Recursive algorithms for phylogenetic tree counting.
Gavryushkina, Alexandra; Welch, David; Drummond, Alexei J
2013-10-28
In Bayesian phylogenetic inference we are interested in distributions over a space of trees. The number of trees in a tree space is an important characteristic of the space and is useful for specifying prior distributions. When all samples come from the same time point and no prior information is available on divergence times, the tree counting problem is easy. However, when fossil evidence is used in the inference to constrain the tree, or data are sampled serially, new tree spaces arise and counting the number of trees is more difficult. We describe an algorithm, polynomial in the number of sampled individuals, for counting resolutions of a constraint tree, assuming that the number of constraints is fixed. We generalise this algorithm to counting resolutions of a fully ranked constraint tree. We describe a quadratic algorithm for counting the number of possible fully ranked trees on n sampled individuals. We introduce a new type of tree, called a fully ranked tree with sampled ancestors, and describe a cubic time algorithm for counting the number of such trees on n sampled individuals. These algorithms should be employed for Bayesian Markov chain Monte Carlo inference when fossil data are included or data are serially sampled.
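The paper's constrained and serially sampled algorithms are not given in the abstract; as background, the easy contemporaneous case it mentions has a classical closed form, which the sketch below computes (presented as context, not as the paper's method):

```python
from math import comb

def num_ranked_trees(n):
    """Count fully ranked, labelled binary trees on n contemporaneous tips.

    Classical result: the product of C(k, 2) for k = 2..n, equivalently
    n! * (n-1)! / 2**(n-1), since at each coalescence going back in time
    any two of the k extant lineages may join.
    """
    count = 1
    for k in range(2, n + 1):
        count *= comb(k, 2)  # ways to join two of the k extant lineages
    return count
```

For example, `num_ranked_trees(4)` returns 18, and the count grows super-exponentially, which is why explicit counting algorithms matter for prior specification.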
Yuta, Atsushi; Ukai, Kotaro; Sakakura, Yasuo; Tani, Hideshi; Matsuda, Fukiko; Yang, Tian-qun; Majima, Yuichi
2002-07-01
We made a prediction of the Japanese cedar (Cryptomeria japonica) pollen counts at Tsu city based on male flower-setting conditions of standard trees. The 69 standard trees from 23 kinds of clones, planted at Mie Prefecture Science and Technology Promotion Center (Hakusan, Mie) in 1964, were selected. Male flower-setting conditions for 276 faces (69 trees x 4 points of the compass) were scored from 0 to 3. The average of scores and total pollen counts from 1988 to 2000 were analyzed. As a result, the average scores from standard trees and total pollen counts, excepting the two mass pollen-scattering years of 1995 and 2000, had a positive correlation (r = 0.914) by linear function. In the mass pollen-scattering years, pollen counts were influenced by the previous year. Therefore, the score of the present year minus that of the previous year was used for analysis. The average scores from male flower-setting conditions and pollen counts had a strong positive correlation (r = 0.994) when positive scores adjusted for the previous year were analyzed. We conclude that prediction of pollen counts is possible based on the male flower-setting conditions of standard trees.
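The "linear function" fit relating average flower-setting scores to pollen counts is ordinary least squares; a minimal sketch with hypothetical score/count pairs (the actual Tsu-city series is not given in the abstract):

```python
import numpy as np

# Hypothetical (score, pollen-count) pairs on the paper's 0-3 scoring
# scale; the actual Tsu-city data are not reproduced here.
scores = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
counts = np.array([900.0, 1800.0, 2700.0, 3600.0, 4500.0])

# Ordinary least-squares line through the data
slope, intercept = np.polyfit(scores, counts, 1)

# Predict the pollen count for a new season's average score
predicted = slope * 1.8 + intercept
```

For mass-scattering years, the paper's adjustment would replace `scores` with present-year minus previous-year scores before fitting.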
Medland, Sarah E; Loesch, Danuta Z; Mdzewski, Bogdan; Zhu, Gu; Montgomery, Grant W; Martin, Nicholas G
2007-01-01
The finger ridge count (a measure of pattern size) is one of the most heritable complex traits studied in humans and has been considered a model human polygenic trait in quantitative genetic analysis. Here, we report the results of the first genome-wide linkage scan for finger ridge count in a sample of 2,114 offspring from 922 nuclear families. Both univariate linkage to the absolute ridge count (a sum of all the ridge counts on all ten fingers), and multivariate linkage analyses of the counts on individual fingers, were conducted. The multivariate analyses yielded significant linkage to 5q14.1 (Logarithm of odds [LOD] = 3.34, pointwise-empirical p-value = 0.00025) that was predominantly driven by linkage to the ring, index, and middle fingers. The strongest univariate linkage was to 1q42.2 (LOD = 2.04, point-wise p-value = 0.002, genome-wide p-value = 0.29). In summary, the combination of univariate and multivariate results was more informative than simple univariate analyses alone. Patterns of quantitative trait loci factor loadings consistent with developmental fields were observed, and the simple pleiotropic model underlying the absolute ridge count was not sufficient to characterize the interrelationships between the ridge counts of individual fingers. PMID:17907812
2012-01-01
Background Most clinical guidelines recommend that AIDS-free, HIV-infected persons with CD4 cell counts below 0.350 × 109 cells/L initiate combined antiretroviral therapy (cART), but the optimal CD4 cell count at which cART should be initiated remains a matter of debate. Objective To identify the optimal CD4 cell count at which cART should be initiated. Design Prospective observational data from the HIV-CAUSAL Collaboration and dynamic marginal structural models were used to compare cART initiation strategies for CD4 thresholds between 0.200 and 0.500 × 109 cells/L. Setting HIV clinics in Europe and the Veterans Health Administration system in the United States. Patients 20 971 HIV-infected, therapy-naive persons with baseline CD4 cell counts at or above 0.500 × 109 cells/L and no previous AIDS-defining illnesses, of whom 8392 had a CD4 cell count that decreased into the range of 0.200 to 0.499 × 109 cells/L and were included in the analysis. Measurements Hazard ratios and survival proportions for all-cause mortality and a combined end point of AIDS-defining illness or death. Results Compared with initiating cART at the CD4 cell count threshold of 0.500 × 109 cells/L, the mortality hazard ratio was 1.01 (95% CI, 0.84 to 1.22) for the 0.350 threshold and 1.20 (CI, 0.97 to 1.48) for the 0.200 threshold. The corresponding hazard ratios were 1.38 (CI, 1.23 to 1.56) and 1.90 (CI, 1.67 to 2.15), respectively, for the combined end point of AIDS-defining illness or death. Limitations CD4 cell count at cART initiation was not randomized. Residual confounding may exist. Conclusion Initiation of cART at a threshold CD4 count of 0.500 × 109 cells/L increases AIDS-free survival. However, mortality did not vary substantially with the use of CD4 thresholds between 0.300 and 0.500 ×109 cells/L. Primary Funding Source National Institutes of Health. PMID:21502648
Cain, Lauren E; Logan, Roger; Robins, James M; Sterne, Jonathan A C; Sabin, Caroline; Bansi, Loveleen; Justice, Amy; Goulet, Joseph; van Sighem, Ard; de Wolf, Frank; Bucher, Heiner C; von Wyl, Viktor; Esteve, Anna; Casabona, Jordi; del Amo, Julia; Moreno, Santiago; Seng, Remonie; Meyer, Laurence; Perez-Hoyos, Santiago; Muga, Roberto; Lodi, Sara; Lanoy, Emilie; Costagliola, Dominique; Hernan, Miguel A
2011-04-19
Most clinical guidelines recommend that AIDS-free, HIV-infected persons with CD4 cell counts below 0.350 × 10(9) cells/L initiate combined antiretroviral therapy (cART), but the optimal CD4 cell count at which cART should be initiated remains a matter of debate. To identify the optimal CD4 cell count at which cART should be initiated. Prospective observational data from the HIV-CAUSAL Collaboration and dynamic marginal structural models were used to compare cART initiation strategies for CD4 thresholds between 0.200 and 0.500 × 10(9) cells/L. HIV clinics in Europe and the Veterans Health Administration system in the United States. 20 971 HIV-infected, therapy-naive persons with baseline CD4 cell counts at or above 0.500 × 10(9) cells/L and no previous AIDS-defining illnesses, of whom 8392 had a CD4 cell count that decreased into the range of 0.200 to 0.499 × 10(9) cells/L and were included in the analysis. Hazard ratios and survival proportions for all-cause mortality and a combined end point of AIDS-defining illness or death. Compared with initiating cART at the CD4 cell count threshold of 0.500 × 10(9) cells/L, the mortality hazard ratio was 1.01 (95% CI, 0.84 to 1.22) for the 0.350 threshold and 1.20 (CI, 0.97 to 1.48) for the 0.200 threshold. The corresponding hazard ratios were 1.38 (CI, 1.23 to 1.56) and 1.90 (CI, 1.67 to 2.15), respectively, for the combined end point of AIDS-defining illness or death. CD4 cell count at cART initiation was not randomized. Residual confounding may exist. Initiation of cART at a threshold CD4 count of 0.500 × 10(9) cells/L increases AIDS-free survival. However, mortality did not vary substantially with the use of CD4 thresholds between 0.300 and 0.500 × 10(9) cells/L.
Intraosseous repair of the inferior alveolar nerve in rats: an experimental model.
Curtis, N J; Trickett, R I; Owen, E; Lanzetta, M
1998-08-01
A reliable method of exposure of the inferior alveolar nerve in Wistar rats has been developed, to allow intraosseous repair with two microsurgical techniques under halothane inhalational anaesthesia. The microsuturing technique involves anastomosis with 10-0 nylon sutures; a laser-weld technique uses an albumin-based solder containing indocyanine green, plus an infrared (810 nm wavelength) diode laser. Seven animals had left inferior alveolar nerve repairs performed with the microsuture and laser-weld techniques. Controls were provided by unoperated nerves in the repaired cases. Histochemical analysis was performed utilizing neuron counts and horseradish peroxidase (HRP) tracer uptake in the mandibular division of the trigeminal ganglion, following sacrifice and staining of frozen sections with cresyl violet and diaminobenzidine. The results of this analysis showed similar mean neuron counts and mean HRP uptake by neurons for the unoperated controls and both microsuture and laser-weld groups. This new technique of intraosseous exposure of the inferior alveolar nerve in rats is described. It allows reliable and reproducible microsurgical repairs using both microsuture and laser-weld techniques.
Powerful model for the point source sky: Far-ultraviolet and enhanced midinfrared performance
NASA Technical Reports Server (NTRS)
Cohen, Martin
1994-01-01
I report further developments of the Wainscoat et al. (1992) model originally created for the point source infrared sky. The already detailed and realistic representation of the Galaxy (disk, spiral arms and local spur, molecular ring, bulge, spheroid) has been improved, guided by CO surveys of local molecular clouds, and by the inclusion of a component to represent Gould's Belt. The newest version of the model is very well validated by Infrared Astronomy Satellite (IRAS) source counts. A major new aspect is the extension of the same model down to the far ultraviolet. I compare predicted and observed far-ultraviolet source counts from the Apollo 16 'S201' experiment (1400 Å) and the TD1 satellite (for the 1565 Å band).
NASA Technical Reports Server (NTRS)
Herzfeld, Ute C.; McDonald, Brian W.; Wallins, Bruce F.; Markus, Thorsten; Neumann, Thomas A.; Brenner, Anita
2012-01-01
The Ice, Cloud and Land Elevation Satellite-II (ICESat-2) mission has been selected by NASA as a Decadal Survey mission, to be launched in 2016. Mission objectives are to measure land ice elevation, sea ice freeboard/ thickness and changes in these variables and to collect measurements over vegetation that will facilitate determination of canopy height, with an accuracy that will allow prediction of future environmental changes and estimation of sea-level rise. The importance of the ICESat-2 project in estimation of biomass and carbon levels has increased substantially, following the recent cancellation of all other planned NASA missions with vegetation-surveying lidars. Two innovative components will characterize the ICESat-2 lidar: (1) Collection of elevation data by a multi-beam system and (2) application of micropulse lidar (photon counting) technology. A micropulse photon-counting altimeter yields clouds of discrete points, which result from returns of individual photons, and hence new data analysis techniques are required for elevation determination and association of returned points to reflectors of interest including canopy and ground in forested areas. The objective of this paper is to derive and validate an algorithm that allows detection of ground under dense canopy and identification of ground and canopy levels in simulated ICESat-2-type data. Data are based on airborne observations with a Sigma Space micropulse lidar and vary with respect to signal strength, noise levels, photon sampling options and other properties. A mathematical algorithm is developed, using spatial statistical and discrete mathematical concepts, including radial basis functions, density measures, geometrical anisotropy, eigenvectors and geostatistical classification parameters and hyperparameters. 
Validation shows that the algorithm works very well and that ground and canopy elevation, and hence canopy height, can be expected to be observable with a high accuracy during the ICESat-2 mission. A result relevant for instrument design is that even the two weaker beam classes considered can be expected to yield useful results for vegetation measurements (93.01-99.57% correctly selected points for a beam with expected return of 0.93 mean signals per shot (msp9) and 72.85% - 98.68% for 0.48 msp (msp4)). Resampling options affect results more than noise levels. The algorithm derived here is generally applicable for analysis of micropulse lidar altimeter data collected over forested areas as well as other surfaces, including land ice, sea ice and land surfaces.
A technique for automatically extracting useful field of view and central field of view images.
Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar
2016-01-01
It is essential to ensure the uniform response of the single photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. However, a significant difficulty is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the Cobalt-57 flood source and on the counts prespecified in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
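The extraction step can be sketched as thresholding out background pixels and cropping to the bounding box of the active area; the 10% threshold and the implementation below are illustrative assumptions, not the authors' MATLAB code:

```python
import numpy as np

def extract_ufov(flood, frac=0.1):
    """Crop a flood image to its useful field of view (UFOV).

    Pixels below `frac` of the image maximum are treated as background
    (the 10% threshold is an illustrative choice); the UFOV is taken as
    the bounding box of the remaining pixels.
    """
    mask = flood > frac * flood.max()
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    r0, r1 = np.where(rows)[0][[0, -1]]
    c0, c1 = np.where(cols)[0][[0, -1]]
    return flood[r0:r1 + 1, c0:c1 + 1]

# Toy flood image: background zeros with a uniform 4x4 active region
img = np.zeros((8, 8))
img[2:6, 2:6] = 100.0
ufov = extract_ufov(img)  # 4x4 crop of the active area
```

A central field of view could then be obtained by cropping the UFOV further (e.g. to 75% of its dimensions, following the usual NEMA convention).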
Kleine, Tilmann O; Nebe, C Thomas; Löwer, Christa; Lehmitz, Reinhard; Kruse, Rolf; Geilenkeuser, Wolf-Jochen; Dorn-Beineke, Alexandra
2009-08-01
Flow cytometry (FCM) is used with haematology analyzers (HAs) to count cells and differentiate leukocytes in cerebrospinal fluid (CSF). To evaluate the FCM techniques of HAs, 10 external DGKL trials with CSF controls were carried out from 2004 to 2008. Eight single-platform HAs with and without CSF equipment were evaluated with living blood leukocytes and erythrocytes in CSF-like DGKL controls: Coulter (LH750, 755), Abbott CD3200, CD3500, CD3700, CD4000, Sapphire, ADVIA 120(R) CSF assay, and Sysmex XE-2100(R). Results were compared with visual counting of native, unstained cells in a Fuchs-Rosenthal chamber, and with absolute values of leukocyte differentiation assayed by dual-platform analysis with immune-FCM (FACSCalibur, CD45, CD14) and the chamber counts. Reference values X were compared with HA values Y by statistical evaluation with Passing/Bablok (P/B) linear regression analysis to reveal conformity of both methods. The HAs studied produced no valid results with DGKL CSF controls, because P/B regression revealed no conformity with the reference values due to: blank problems with impedance analysis; leukocyte loss with preanalytical erythrocyte lysis procedures, especially of monocytes; and inaccurate results with ADVIA cell sphering and cell differentiation with algorithms and enzyme activities (e.g., peroxidase). HA techniques have to be improved, e.g., by using no erythrocyte lysis and CSF-adequate techniques, to examine CSF samples precisely and accurately. Copyright 2009 International Society for Advancement of Cytometry.
Poisson mixture model for measurements using counting.
Miller, Guthrie; Justus, Alan; Vostrotin, Vadim; Dry, Donald; Bertelli, Luiz
2010-03-01
Starting with the basic Poisson statistical model of a counting measurement process, 'extra-Poisson' variance or 'overdispersion' is included by assuming that the Poisson parameter representing the mean number of counts itself comes from another distribution. The Poisson parameter is assumed to be given by the quantity of interest in the inference process multiplied by a lognormally distributed normalising coefficient, plus an additional lognormal background that might be correlated with the normalising coefficient (shared uncertainty). The example of lognormal environmental background in uranium urine data is discussed. An additional uncorrelated background is also included. The uncorrelated background is estimated from a background count measurement using Bayesian arguments. The rather complex formulas are validated using Monte Carlo simulation. An analytical expression is obtained for the probability distribution of gross counts coming from the uncorrelated background, which allows straightforward calculation of a classical decision level in the form of a gross-count alarm point with a desired false-positive rate. The main purpose of this paper is to derive formulas for exact likelihood calculations in the case of various kinds of backgrounds.
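The mixing construction described above can be sketched by Monte Carlo: draw a lognormal normalising coefficient and a lognormal background, form the Poisson mean, and sample counts. Parameter values below are illustrative, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Poisson mean = quantity of interest * lognormal normalising coefficient
#                + lognormal background (illustrative parameters)
q = 50.0
coef = rng.lognormal(mean=0.0, sigma=0.2, size=n)
bkg = rng.lognormal(mean=1.0, sigma=0.3, size=n)
counts = rng.poisson(q * coef + bkg)

# For a mixed Poisson the variance exceeds the mean: the overdispersion
# that motivates going beyond the pure Poisson model
m, v = counts.mean(), counts.var()
```

Correlated (shared-uncertainty) backgrounds would be simulated by drawing `coef` and `bkg` from a joint lognormal rather than independently, as in the paper's model.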
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fliermans, C.B.; Dougherty, J.M.; Franck, M.M.
Effective in situ bioremediation strategies require an understanding of the effects pollutants and remediation techniques have on subsurface microbial communities. Therefore, detailed characterization of a site's microbial communities is important. Subsurface sediment borings and water samples were collected from a trichloroethylene (TCE) contaminated site, before and after horizontal-well in situ air stripping and bioventing, as well as during methane injection for stimulation of methane-utilizing microorganisms. Subsamples were processed for heterotrophic plate counts, acridine orange direct counts (AODC), community diversity, direct fluorescent antibody (DFA) enumeration of several nitrogen-transforming bacteria, and Biolog® evaluation of enzyme activity in collected water samples. Plate counts were higher in near-surface depths than in the vadose zone sediment samples. During the in situ air stripping and bioventing, counts increased at or near the saturated zone, remained elevated throughout the aquifer, but did not change significantly after the air stripping. Sporadic increases in plate counts at different depths as well as increased diversity appeared to be linked to differing lithologies. AODCs were orders of magnitude higher than plate counts and remained relatively constant with depth except for slight increases near the surface depths and the capillary fringe. Nitrogen-transforming bacteria, as measured by serospecific DFA, were greatly affected both by the in situ air stripping and the methane injection. Biolog® activity appeared to increase with subsurface stimulation both by air and methane. The complexity of subsurface systems makes the use of selective monitoring tools imperative.
A review of costing methodologies in critical care studies.
Pines, Jesse M; Fager, Samuel S; Milzman, David P
2002-09-01
Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.
Beta/alpha continuous air monitor
Becker, Gregory K.; Martz, Dowell E.
1989-01-01
A single deep layer silicon detector in combination with a microcomputer, recording both alpha and beta activity and the energy of each pulse, distinguishing energy peaks using a novel curve fitting technique to reduce the natural alpha counts in the energy region where plutonium and other transuranic alpha emitters are present, and using a novel algorithm to strip out radon daughter contribution to actual beta counts.
Estimation of U content in coffee samples by fission-track counting
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharma, P.K.; Lal, N.; Nagpaul, K.K.
1985-06-01
Because coffee is consumed in large quantities by humans, the authors undertook the study of the uranium content of coffee as a continuation of earlier work to estimate the U content of foodstuffs. Since literature on this subject is scarce, they decided to use the well-established fission-track-counting technique to determine the U content of coffee.
Atila-Pektaş, B; Yurdakul, P; Gülmez, D; Görduysus, O
2013-05-01
To compare the antimicrobial activities of Activ Point (Roeko, Langenau, Germany), Calcium Hydroxide Plus Point (Roeko, Langenau, Germany), calcium hydroxide, 1% chlorhexidine gel and bioactive glass (S53P4) against Enterococcus faecalis and Streptococcus mutans. One hundred and twenty extracted single-rooted human teeth were used. After removing the crowns, root canals were prepared using the ProTaper rotary system. Following autoclave sterilization, root canals were incubated at 37 °C with E. faecalis ATCC 29212 and S. mutans RSHM 676 for 1 week. The specimens, which were divided into five treatment groups for each microorganism according to the intracanal medicament used, were tested in 10 experimental runs. In each experimental run, 10 roots were included as treatment, one root as positive control and one root as sterility control. Sterile paper points were used to take samples from root canals after the incubation of teeth in thioglycollate medium at 37 °C for 1 week. Samples taken from teeth by sterile paper points were inoculated onto sheep blood agar, and following an overnight incubation, the colonies grown on sheep blood agar were counted and interpreted as colony-forming units. Results were tested statistically using Kruskal-Wallis and Conover's nonparametric multiple comparison tests. CHX gel (P < 0.001 and P < 0.001), Activ Point (P = 0.003 and P = 0.002) and Ca(OH)₂ (P = 0.010 and P = 0.005) were significantly more effective against E. faecalis than Ca(OH)₂ Plus Point and bioactive glass, respectively. On the other hand, compared with Ca(OH)₂, CHX gel (P < 0.001) and Activ Point (P < 0.001), bioactive glass (P = 0.014) produced significantly lower colony counts of S. mutans. When compared with the positive control, treatment with Ca(OH)₂ Plus Point (P = 0.085 and P = 0.066) did not produce significantly lower colony counts of E. faecalis and S. mutans, respectively.
In contrast to the medicaments whose antimicrobial effect depends on their alkaline pH, the chlorhexidine-containing medicaments were effective against both E. faecalis and S. mutans. © 2012 International Endodontic Journal.
NASA Astrophysics Data System (ADS)
Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.
2009-07-01
Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend shelf life and to inactivate food-borne pathogens; in combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique to detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, as the dose increased, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually, so the difference between the two counts grew with dose in all samples. This suggests that a DEFT/APC difference of more than 2.0 log could serve as a criterion for judging whether an MPV has been treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
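The screening criterion described above reduces to a simple log-difference test. The following is only a minimal sketch of that idea; the function name and the CFU/g inputs are illustrative assumptions, not the authors' code.

```python
import math

# Hypothetical sketch of the DEFT/APC screening criterion: a sample is
# flagged as likely irradiated when the DEFT count exceeds the aerobic
# plate count by more than 2.0 log10 units.
def deft_apc_flag(deft_cfu_per_g, apc_cfu_per_g, threshold_log=2.0):
    """Return True if the DEFT/APC log difference suggests irradiation."""
    diff = math.log10(deft_cfu_per_g) - math.log10(apc_cfu_per_g)
    return diff > threshold_log

# Example: DEFT stays near 10^6 while APC drops to 10^3 after irradiation.
print(deft_apc_flag(1e6, 1e3))  # log difference = 3.0 -> True
print(deft_apc_flag(1e6, 1e5))  # log difference = 1.0 -> False
```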
Pandey, Anil Kumar; Sharma, Param Dev; Dheer, Pankaj; Parida, Girish Kumar; Goyal, Harish; Patel, Chetan; Bal, Chandrashekhar; Kumar, Rakesh
2017-01-01
Purpose of the Study: 99mTechnetium-methylene diphosphonate (99mTc-MDP) bone scan images have a limited number of counts per pixel and hence inferior image quality compared with X-rays. Theoretically, the global histogram equalization (GHE) technique can improve the contrast of a given image, though the practical benefits of doing so have gained only limited acceptance. In this study, we investigated the effect of the GHE technique on 99mTc-MDP bone scan images. Materials and Methods: A set of 89 low-contrast 99mTc-MDP whole-body bone scan images was included in this study. These images were acquired with parallel-hole collimation on a Symbia E gamma camera and then processed with the histogram equalization technique. The image quality of the input and processed images was reviewed by two nuclear medicine physicians on a 5-point scale, where a score of 1 denotes very poor and 5 the best image quality. A statistical test was applied to assess the significance of the difference between the mean scores assigned to input and processed images. Results: The technique improved the contrast of the images; however, oversaturation was noticed in the processed images. Student's t-test showed a statistically significant difference between input and processed image quality at P < 0.001 (with α = 0.05). However, further improvement in image quality is needed to meet the requirements of nuclear medicine physicians. Conclusion: GHE can be used on low-contrast bone scan images. In some cases, histogram equalization combined with another postprocessing technique is useful. PMID:29142344
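Global histogram equalization is the textbook contrast technique the study evaluates. A minimal sketch for an 8-bit image follows; this is the generic algorithm, not the authors' implementation, and the test image is invented.

```python
import numpy as np

# Minimal sketch of global histogram equalization (GHE) for an 8-bit image:
# remap each gray level through the normalized cumulative histogram so that
# the output levels are spread over the full dynamic range.
def global_histogram_equalization(img):
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0][0]          # first nonzero value of the CDF
    # Classic mapping: stretch the CDF over the full 0..255 output range.
    lut = np.clip(np.round((cdf - cdf_min) / (img.size - cdf_min) * 255), 0, 255)
    return lut.astype(np.uint8)[img]

# A low-contrast image confined to gray levels 100..131 stretches to 0..255.
img = (np.arange(64 * 64).reshape(64, 64) % 32 + 100).astype(np.uint8)
out = global_histogram_equalization(img)
print(img.min(), img.max(), "->", out.min(), out.max())  # 100 131 -> 0 255
```

The oversaturation the authors report is a known side effect of this global mapping, which is why they suggest combining it with other postprocessing.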
Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J.; Li, Ming
2013-01-01
Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables. PMID:22552787
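The abstract names a "vote counting" model for combining search engines but does not define it, so the following is only a guessed-at illustration of the general idea: each engine casts a directional vote on a peptide group's spectral-count change, and votes are summed. The function and data are hypothetical.

```python
# Hypothetical illustration of vote counting across search engines (the
# paper's exact model is not given in the abstract): each engine votes
# +1/-1 on the direction of a peptide group's spectral-count change
# between two conditions, and the votes are summed. This scales trivially
# as more engines are added.
def vote_count(changes_by_engine):
    """changes_by_engine: spectral-count differences (condition B minus
    condition A), one value per search engine, for one peptide group."""
    return sum(1 if d > 0 else -1 if d < 0 else 0 for d in changes_by_engine)

# Three of four engines see an increase, one sees no change:
print(vote_count([5, 3, 0, 7]))    # 3
# Engines disagree -> weak evidence either way:
print(vote_count([2, -4, 1, -1]))  # 0
```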
CALCIUM ABSORPTION IN MAN: BASED ON LARGE VOLUME LIQUID SCINTILLATION COUNTER STUDIES.
LUTWAK, L; SHAPIRO, J R
1964-05-29
A technique has been developed for the in vivo measurement of absorption of calcium in man after oral administration of 1 to 5 microcuries of calcium-47 and continuous counting of the radiation in the subject's arm with a large volume liquid scintillation counter. The maximum value for the arm counting technique is proportional to the absorption of tracer as measured by direct stool analysis. The rate of uptake by the arm is lower in subjects with either the malabsorption syndrome or hypoparathyroidism. The administration of vitamin D increases both the absorption rate and the maximum amount of calcium absorbed.
Sanchez-Cabeza, J A; Pujol, L
1995-05-01
The radiological examination of water requires a rapid screening technique that permits the determination of the gross alpha and beta activities of each sample in order to decide whether further radiological analyses are necessary. In this work, the use of a low-background liquid scintillation system (Quantulus 1220) is proposed to determine both gross activities in water samples simultaneously. Liquid scintillation is compared with the more conventional techniques used in most monitoring laboratories. In order to determine the best counting configuration of the system, pulse shape discrimination was optimized for 6 scintillant/vial combinations; the best configuration was obtained with the scintillation cocktail Optiphase Hisafe 3 in Zinsser low-diffusion vials. The detection limits achieved were 0.012 Bq L⁻¹ and 0.14 Bq L⁻¹ for gross alpha and beta activity, respectively, after a 1:10 concentration by simple evaporation and for a counting time of only 360 min. The proposed technique is rapid, gives spectral information, and is adequate for determining gross activities according to the World Health Organization (WHO) guideline values.
Relativistic Transformations of Light Power.
ERIC Educational Resources Information Center
McKinley, John M.
1979-01-01
Using a photon-counting technique, finds the angular distribution of emitted and detected power and the total radiated power of an arbitrary moving source, and uses the technique to verify the predicted effect of the earth's motion through the cosmic blackbody radiation. (Author/GA)
Initiation of Antiretroviral Therapy in Early Asymptomatic HIV Infection.
Lundgren, Jens D; Babiker, Abdel G; Gordin, Fred; Emery, Sean; Grund, Birgit; Sharma, Shweta; Avihingsanon, Anchalee; Cooper, David A; Fätkenheuer, Gerd; Llibre, Josep M; Molina, Jean-Michel; Munderi, Paula; Schechter, Mauro; Wood, Robin; Klingman, Karin L; Collins, Simon; Lane, H Clifford; Phillips, Andrew N; Neaton, James D
2015-08-27
Data from randomized trials are lacking on the benefits and risks of initiating antiretroviral therapy in patients with asymptomatic human immunodeficiency virus (HIV) infection who have a CD4+ count of more than 350 cells per cubic millimeter. We randomly assigned HIV-positive adults who had a CD4+ count of more than 500 cells per cubic millimeter to start antiretroviral therapy immediately (immediate-initiation group) or to defer it until the CD4+ count decreased to 350 cells per cubic millimeter or until the development of the acquired immunodeficiency syndrome (AIDS) or another condition that dictated the use of antiretroviral therapy (deferred-initiation group). The primary composite end point was any serious AIDS-related event, serious non-AIDS-related event, or death from any cause. A total of 4685 patients were followed for a mean of 3.0 years. At study entry, the median HIV viral load was 12,759 copies per milliliter, and the median CD4+ count was 651 cells per cubic millimeter. On May 15, 2015, on the basis of an interim analysis, the data and safety monitoring board determined that the study question had been answered and recommended that patients in the deferred-initiation group be offered antiretroviral therapy. The primary end point occurred in 42 patients in the immediate-initiation group (1.8%; 0.60 events per 100 person-years), as compared with 96 patients in the deferred-initiation group (4.1%; 1.38 events per 100 person-years), for a hazard ratio of 0.43 (95% confidence interval [CI], 0.30 to 0.62; P<0.001). Hazard ratios for serious AIDS-related and serious non-AIDS-related events were 0.28 (95% CI, 0.15 to 0.50; P<0.001) and 0.61 (95% CI, 0.38 to 0.97; P=0.04), respectively. More than two thirds of the primary end points (68%) occurred in patients with a CD4+ count of more than 500 cells per cubic millimeter. The risks of a grade 4 event were similar in the two groups, as were the risks of unscheduled hospital admissions. 
The initiation of antiretroviral therapy in HIV-positive adults with a CD4+ count of more than 500 cells per cubic millimeter provided net benefits over starting such therapy in patients after the CD4+ count had declined to 350 cells per cubic millimeter. (Funded by the National Institute of Allergy and Infectious Diseases and others; START ClinicalTrials.gov number, NCT00867048.).
Cook, Richard J; Wei, Wei
2003-07-01
The design of clinical trials is typically based on marginal comparisons of a primary response under two or more treatments. The considerable gains in efficiency afforded by models conditional on one or more baseline responses have been extensively studied for Gaussian models. The purpose of this article is to present methods for the design and analysis of clinical trials in which the response is a count or a point process and a corresponding baseline count is available prior to randomization. The methods are based on a conditional negative binomial model for the response given the baseline count and can be used to examine the effect of introducing selection criteria on power and sample size requirements. We show that designs based on this approach are more efficient than those proposed by McMahon et al. (1994).
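The intuition behind conditioning on a baseline count can be sketched under an assumed Poisson-gamma setup (an illustration, not the authors' model): each subject has a gamma-distributed rate, baseline and on-treatment counts are Poisson given that rate, so the counts are marginally negative binomial and the baseline is informative about the response.

```python
import numpy as np

# Illustrative Poisson-gamma simulation (not the paper's estimator):
# subject rates are gamma(shape=2, scale=1.5), so counts have marginal
# mean 3 and variance 3 + 4.5 = 7.5 (negative binomial overdispersion),
# and baseline/response counts are correlated through the shared rate.
rng = np.random.default_rng(0)
n = 20000
rate = rng.gamma(shape=2.0, scale=1.5, size=n)  # subject-specific rates
baseline = rng.poisson(rate)                    # pre-randomization count
response = rng.poisson(rate)                    # post-randomization count

# Overdispersion: variance clearly exceeds the mean (Poisson would be equal).
print(response.mean(), response.var())
# Positive baseline-response correlation is what a conditional
# (baseline-adjusted) analysis exploits to gain efficiency.
print(np.corrcoef(baseline, response)[0, 1])
```

The theoretical correlation here is var(rate)/(var(rate) + mean) = 4.5/7.5 = 0.6, which is why excluding low-baseline subjects or adjusting for baseline changes power and sample size.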
Cuatianquiz Lima, Cecilia
2016-01-01
Secondary cavity nesting (SCN) birds breed in holes that they do not excavate themselves. This is possible where there are large trees whose size and age permit the digging of holes by primary excavators and only rarely happens in forest plantations, where we expected a deficit of both breeding holes and SCN species. We assessed whether the availability of tree cavities influenced the number of SCNs in two temperate forest types, and evaluated the change in number of SCNs after adding nest boxes. First, we counted all cavities within each of our 25-m radius sampling points in mature and young forest plots during 2009. We then added nest boxes at standardised locations during 2010 and 2011 and conducted fortnightly bird counts (January–October 2009–2011). In 2011 we added two extra plots of each forest type, where we also conducted bird counts. Prior to adding nest boxes, counts revealed more SCNs in mature than in young forest. Following the addition of nest boxes, the number of SCNs increased significantly in the points with nest boxes in both types of forest. Counts in 2011 confirmed the increase in number of birds due to the addition of nest boxes. Given the likely benefits associated with a richer bird community we propose that, as is routinely done in some countries, forest management programs preserve old tree stumps and add nest boxes to forest plantations in order to increase bird numbers and bird community diversity. PMID:26998410
SU-F-SPS-09: Parallel MC Kernel Calculations for VMAT Plan Improvement
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chamberlain, S; Roswell Park Cancer Institute, Buffalo, NY; French, S
Purpose: Adding kernels (small perturbations in leaf positions) to the existing apertures of VMAT control points may improve plan quality. We investigate the calculation of kernel doses using a parallelized Monte Carlo (MC) method. Methods: A clinical prostate VMAT DICOM plan was exported from Eclipse. An arbitrary control point and leaf were chosen, and a modified MLC file was created, corresponding to the leaf position offset by 0.5 cm. The additional dose produced by this 0.5 cm × 0.5 cm kernel was calculated using the DOSXYZnrc component module of BEAMnrc. A range of particle history counts were run (varying from 3 × 10⁶ to 3 × 10⁷); each job was split among 1, 10, or 100 parallel processes. A particle count of 3 × 10⁶ was established as the lower bound because it provided the minimal accuracy level. Results: As expected, an increase in particle counts linearly increases run time. For the lowest particle count, the time varied from 30 hours for the single-processor run to 0.30 hours for the 100-processor run. Conclusion: Parallel processing of MC calculations in the EGS framework significantly decreases the time necessary for each kernel dose calculation. Particle counts lower than 1 × 10⁶ have too large an error to output accurate dose for a Monte Carlo kernel calculation. Future work will investigate increasing the number of parallel processes and optimizing run times for multiple kernel calculations.
Estimating population trends with a linear model
Bart, Jonathan; Collins, Brian D.; Morrison, R.I.G.
2003-01-01
We describe a simple and robust method for estimating trends in population size. The method may be used with Breeding Bird Survey data, aerial surveys, point counts, or any other program of repeated surveys at permanent locations. Surveys need not be made at each location during each survey period. The method differs from most existing methods in being design based, rather than model based. The only assumptions are that the nominal sampling plan is followed and that sample size is large enough for use of the t-distribution. Simulations based on two bird data sets from natural populations showed that the point estimate produced by the linear model was essentially unbiased even when counts varied substantially and 25% of the complete data set was missing. The estimating-equation approach, often used to analyze Breeding Bird Survey data, performed similarly on one data set but had substantial bias on the second data set, in which counts were highly variable. The advantages of the linear model are its simplicity, flexibility, and that it is self-weighting. A user-friendly computer program to carry out the calculations is available from the senior author.
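A design-based trend estimate in the spirit described above can be sketched as follows. This is a hedged illustration under assumed conventions (per-site least-squares slopes averaged across sites, t-based interval); the authors' estimator may differ in detail, and the survey data are invented.

```python
import numpy as np

# Sketch of a design-based trend estimate: fit a least-squares slope to
# each site's counts over years, then average the slopes across sites and
# use the t-distribution for a confidence interval on the mean slope.
def site_slopes(years, counts_by_site):
    x = np.asarray(years, dtype=float)
    return np.array([np.polyfit(x, c, 1)[0] for c in counts_by_site])

years = [0, 1, 2, 3, 4]
counts_by_site = [
    [10, 12, 11, 14, 15],
    [20, 19, 22, 24, 23],
    [5, 6, 8, 7, 9],
]
slopes = site_slopes(years, counts_by_site)
mean_slope = slopes.mean()
se = slopes.std(ddof=1) / np.sqrt(len(slopes))
t_crit = 4.303  # t critical value for df = 2 at 95% (hardcoded, no scipy)
print(mean_slope, (mean_slope - t_crit * se, mean_slope + t_crit * se))
```

Missing surveys at a site simply drop out of that site's fit, which matches the note above that surveys need not occur at every location in every period.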
NASA Technical Reports Server (NTRS)
Unger, Eric R.; Hager, James O.; Agrawal, Shreekant
1999-01-01
This paper is a discussion of the supersonic nonlinear point design optimization efforts at McDonnell Douglas Aerospace under the High-Speed Research (HSR) program. The baseline for these optimization efforts has been the M2.4-7A configuration which represents an arrow-wing technology for the High-Speed Civil Transport (HSCT). Optimization work on this configuration began in early 1994 and continued into 1996. Initial work focused on optimization of the wing camber and twist on a wing/body configuration and reductions of 3.5 drag counts (Euler) were realized. The next phase of the optimization effort included fuselage camber along with the wing and a drag reduction of 5.0 counts was achieved. Including the effects of the nacelles and diverters into the optimization problem became the next focus where a reduction of 6.6 counts (Euler W/B/N/D) was eventually realized. The final two phases of the effort included a large set of constraints designed to make the final optimized configuration more realistic and they were successful albeit with a loss of performance.
Using Pinochle to motivate the restricted combinations with repetitions problem
NASA Astrophysics Data System (ADS)
Gorman, Patrick S.; Kunkel, Jeffrey D.; Vasko, Francis J.
2011-07-01
A standard example used in introductory combinatoric courses is to count the number of five-card poker hands possible from a straight deck of 52 distinct cards. A more interesting problem is to count the number of distinct hands possible from a Pinochle deck in which there are multiple, but obviously limited, copies of each type of card (two copies for single-deck, four for double deck). This problem is more interesting because our only concern is to count the number of distinguishable hands that can be dealt. In this note, under various scenarios, we will discuss two combinatoric techniques for counting these hands; namely, the inclusion-exclusion principle and generating functions. We will then show that these Pinochle examples motivate a general counting formula for what are called 'regular' combinations by Riordan. Finally, we prove the correctness of this formula using generating functions.
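The two techniques named above can be sketched on the single-deck Pinochle case: 24 distinct card types, 2 copies of each, 12-card hands where only the multiset matters. Both counts below follow standard formulas; the function names are mine.

```python
from math import comb

# Count 12-card Pinochle hands (24 types, at most 2 copies of each) two ways.

def count_by_generating_function(types=24, copies=2, hand=12):
    """Coefficient of x^hand in (1 + x + ... + x^copies)^types,
    computed by repeated polynomial multiplication."""
    poly = [1]
    factor = [1] * (copies + 1)
    for _ in range(types):
        new = [0] * (len(poly) + copies)
        for i, a in enumerate(poly):
            for j, b in enumerate(factor):
                new[i + j] += a * b
        poly = new
    return poly[hand]

def count_by_inclusion_exclusion(types=24, copies=2, hand=12):
    """Unrestricted multiset count minus selections that overuse a type."""
    total = 0
    for j in range(types + 1):
        k = hand - j * (copies + 1)
        if k < 0:
            break
        total += (-1) ** j * comb(types, j) * comb(k + types - 1, types - 1)
    return total

print(count_by_generating_function() == count_by_inclusion_exclusion())  # True
```

Setting copies=1 with 52 types and 5-card hands recovers the familiar poker count C(52,5) = 2,598,960, a handy sanity check on both formulas.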
Data acquisition and analysis of the UNCOSS underwater explosive neutron sensor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carasco, C.; Eleon, C.; Perot, B.
2011-07-01
The purpose of the FP7 UNCOSS project (Underwater Coastal Sea Surveyor, http://www.uncoss-project.org) is to develop a neutron-based underwater explosive sensor to detect unexploded ordnance lying on the sea bottom. The Associated Particle Technique is used to focus the inspection on a suspicious object located by optical and electromagnetic sensors and to determine whether there is an explosive charge inside. This paper presents the data acquisition electronics and data analysis software that have been developed for this project. The electronics digitize and process the signal in real time, based on a field-programmable gate array structure, to perform precise time-of-flight and gamma-ray energy measurements. The UNCOSS software offers the basic tools to analyze the time-of-flight and energy spectra of the interrogated object. It allows the gamma-ray spectrum to be unfolded into pure elemental count proportions, mainly C, N, O, Fe, Al, Si, and Ca. The C, N, and O count fractions are converted into chemical proportions by taking into account the gamma-ray production cross sections, as well as neutron and photon attenuation in the different shields between the ROV (Remotely Operated Vehicle) and the explosive, such as the explosive's iron shell, seawater, and the ROV envelope. These chemical ratios are plotted in a two-dimensional (2D) barycentric representation to position the measured point with respect to common explosives. The systematic uncertainty due to the above attenuation effects and counting statistical fluctuations are combined with a Monte Carlo method to provide a 3D uncertainty area in a barycentric plot, which allows the most probable detected materials to be determined in order to decide about the presence of explosive. (authors)
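One common convention for placing a three-component composition in a 2D barycentric (ternary) plot is sketched below. The vertex placement is an assumption for illustration; the UNCOSS software's exact mapping is not given in the abstract.

```python
import math

# Illustrative barycentric-to-Cartesian mapping for a C/N/O composition:
# the three normalized count fractions weight the vertices of an
# equilateral triangle, here C at (0, 0), N at (1, 0), O at (0.5, √3/2).
def barycentric_to_xy(c, n, o):
    s = c + n + o
    c, n, o = c / s, n / s, o / s   # normalize the count fractions
    x = n + 0.5 * o
    y = (math.sqrt(3) / 2) * o
    return x, y

print(barycentric_to_xy(1, 0, 0))  # (0.0, 0.0) - pure carbon vertex
print(barycentric_to_xy(0, 0, 1))  # (0.5, 0.8660254037844386) - pure oxygen
```

Measured points then cluster near reference compositions of common explosives, which is how the 2D plot supports a detection decision.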
Pharmacokinetics and efficacy of intraocular flurbiprofen.
Blazaki, S; Tsika, C; Tzatzarakis, M; Naoumidi, E; Tsatsakis, A; Tsatsanis, C; Tsilimbaris, Miltiadis K
2017-12-01
Intravitreal delivery of non-steroidal anti-inflammatory drugs could be an effective way to treat macular edema caused by posterior segment inflammation. In this study, we evaluated the intravitreal bioavailability and anti-inflammatory efficacy of flurbiprofen in rabbit eyes. For pharmacokinetics, 0.1 ml of a 7.66 mg/ml flurbiprofen solution was injected intravitreally and vitreous drug levels were analyzed at specific time points using an LC-MS technique. For efficacy, 100 ng of E. coli lipopolysaccharide was injected intravitreally in rabbits to induce inflammation. The animals were separated into three groups and received intraocular flurbiprofen, dexamethasone or PBS as control. Complete ocular examination and total cell counts in aqueous fluid were used to evaluate the extent of inflammation. Eyes were then enucleated for histopathology analysis. Efficacy in the uveitis model was determined by clinical signs of inflammation, total leukocyte count and histology findings. No adverse events were observed during pharmacokinetic assessment, and no signs of inflammation, hemorrhage or retinal detachment were detected. The recovery of flurbiprofen from vitreous samples was 92.6%. The half-life of flurbiprofen was estimated to be 1.92 h, with an elimination rate constant (K) of 0.36. Treatment with intraocular injections of flurbiprofen and dexamethasone significantly reduced the total leukocyte count [reductions of 96.84% (p < 0.05) and 97.44% (p < 0.05), respectively]. Histologic studies demonstrated significantly fewer signs of ocular inflammation after flurbiprofen injection compared with control eyes. Flurbiprofen is effective in suppressing inflammation in this experimental uveitis model and, in our experimental setting, intravitreal flurbiprofen seems to have a therapeutic effect comparable to dexamethasone.
However, the half-life of the drug remains short, necessitating further research to prolong its presence in the vitreous cavity.
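The reported half-life and elimination rate constant are internally consistent under first-order (single-compartment) elimination, where half-life = ln(2)/K; a one-line check:

```python
import math

# First-order elimination: half-life = ln(2) / K.
K = 0.36                    # elimination rate constant (1/h), from the text
half_life = math.log(2) / K
print(round(half_life, 2))  # 1.93, matching the reported 1.92 h to rounding
```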
Ogungbenro, Kayode; Aarons, Leon
2011-08-01
In recent years, interest in the application of experimental design theory to population pharmacokinetic (PK) and pharmacodynamic (PD) experiments has increased. The aim is to improve the efficiency and the precision with which parameters are estimated during data analysis and, sometimes, to increase the power and reduce the sample size required for hypothesis testing. The population Fisher information matrix (PFIM) has been described for uniresponse and multiresponse population PK experiments for design evaluation and optimisation. Despite these developments and the availability of tools for optimal design of population PK and PD experiments, much of the effort has focused on repeated continuous variable measurements, with less work on repeated discrete-type measurements. Discrete data arise mainly in PD, e.g. ordinal, nominal, dichotomous or count measurements. This paper implements expressions for the PFIM for repeated ordinal, dichotomous and count measurements based on analysis by a mixed-effects modelling technique. Three simulation studies were used to investigate the performance of the expressions: Example 1 is based on repeated dichotomous measurements, Example 2 on repeated count measurements and Example 3 on repeated ordinal measurements. Data simulated in MATLAB were analysed using NONMEM (Laplace method) and the glmmML package in R (Laplace and adaptive Gauss-Hermite quadrature methods). The results obtained for Examples 1 and 2 showed good agreement between the relative standard errors obtained using the PFIM and simulations. The results obtained for Example 3 showed the importance of sampling at the most informative time points. Implementation of these expressions will provide the opportunity for efficient design of population PD experiments that involve discrete-type data through design evaluation and optimisation.
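The PFIM for mixed-effects discrete models is involved, but its fixed-effects building block is easy to sketch. Below is a greatly simplified illustration (random effects omitted; the design, parameter values and function name are invented): for a logistic model of repeated dichotomous responses, the expected Fisher information is X^T W X with W = diag(p(1-p)), and predicted standard errors come from the inverse information matrix.

```python
import numpy as np

# Expected Fisher information for a fixed-effects logistic model of
# repeated dichotomous responses: I(theta) = X^T W X, W = diag(p(1-p)).
# Design evaluation compares predicted SEs (from inv(I)) across designs.
def logistic_fim(X, theta):
    eta = X @ theta
    p = 1.0 / (1.0 + np.exp(-eta))
    W = p * (1.0 - p)
    return (X * W[:, None]).T @ X

# Hypothetical design: intercept + slope over sampling times 0..4,
# replicated across 50 subjects.
times = np.tile(np.arange(5.0), 50)
X = np.column_stack([np.ones_like(times), times])
theta = np.array([-1.0, 0.5])
fim = logistic_fim(X, theta)
se = np.sqrt(np.diag(np.linalg.inv(fim)))
print(se)  # predicted standard errors for intercept and slope
```

Moving sampling times toward where p(1-p) is largest increases the information, which is the "most informative time points" effect seen in Example 3.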
The clinical significance of platelet counts in the first 24 hours after severe injury.
Stansbury, Lynn G; Hess, Aaron S; Thompson, Kwaku; Kramer, Betsy; Scalea, Thomas M; Hess, John R
2013-04-01
Admission platelet (PLT) counts are known to be associated with all-cause mortality for seriously injured patients admitted to a trauma center. The course of subsequent PLT counts, their implications, and the effects of PLT therapy are less well known. Trauma center patients who were directly admitted from the scene of injury, received 1 or more units of uncrossmatched red blood cells in the first hour of care, survived for at least 15 minutes, and had a PLT count measured in the first hour were analyzed for the association of their admission and subsequent PLT counts in the first 24 hours with injury severity and hemorrhagic and central nervous system (CNS) causes of in-hospital mortality. Over an 8.25-year period, 1292 of 45,849 direct trauma admissions met entry criteria. Admission PLT counts averaged 228 × 10⁹ ± 90 × 10⁹/L and decreased by 104 × 10⁹/L by the second hour and 1 × 10⁹/L each hour thereafter. The admission count was not related to time to admission. Each 1-point increase in the injury severity score was associated with a 1 × 10⁹/L decrease in the PLT count at all times in the first 24 hours of care. Admission PLT counts were strongly associated with hemorrhagic and CNS injury mortality and subsequent PLT counts. Effects of PLT therapy could not be ascertained. Admission PLT counts in critically injured trauma patients are usually normal, decreasing after admission. Low PLT counts at admission and during the course of trauma care are strongly associated with mortality. © 2012 American Association of Blood Banks.
Evaluation of a Multicolor, Single-Tube Technique To Enumerate Lymphocyte Subpopulations▿
Colombo, F.; Cattaneo, A.; Lopa, R.; Portararo, P.; Rebulla, P.; Porretti, L.
2008-01-01
To evaluate the fully automated FACSCanto software, we compared lymphocyte subpopulation counts obtained using three-color FACSCalibur-CELLQuest and six-color FACSCanto-FACSCanto software techniques. High correlations were observed between data obtained with these techniques. Our study indicated that FACSCanto clinical software is accurate and sensitive in single-platform lymphocyte immunophenotyping. PMID:18448621
Mallet, Delphine; Wantiez, Laurent; Lemouellic, Soazig; Vigliola, Laurent; Pelletier, Dominique
2014-01-01
Estimating diversity and abundance of fish species is fundamental for understanding community structure and dynamics of coral reefs. When designing a sampling protocol, one crucial step is the choice of the most suitable sampling technique which is a compromise between the questions addressed, the available means and the precision required. The objective of this study is to compare the ability to sample reef fish communities at the same locations using two techniques based on the same stationary point count method: one using Underwater Visual Census (UVC) and the other rotating video (STAVIRO). UVC and STAVIRO observations were carried out on the exact same 26 points on the reef slope of an intermediate reef and the associated inner barrier reefs. STAVIRO systems were always deployed 30 min to 1 hour after UVC and set exactly at the same place. Our study shows that; (i) fish community observations by UVC and STAVIRO differed significantly; (ii) species richness and density of large species were not significantly different between techniques; (iii) species richness and density of small species were higher for UVC; (iv) density of fished species was higher for STAVIRO and (v) only UVC detected significant differences in fish assemblage structure across reef type at the spatial scale studied. We recommend that the two techniques should be used in a complementary way to survey a large area within a short period of time. UVC may census reef fish within complex habitats or in very shallow areas such as reef flat whereas STAVIRO would enable carrying out a large number of stations focused on large and diver-averse species, particularly in the areas not covered by UVC due to time and depth constraints. This methodology would considerably increase the spatial coverage and replication level of fish monitoring surveys. PMID:24392126
USDA-ARS?s Scientific Manuscript database
Different measurement techniques for aerosol characterization and quantification either directly or indirectly measure different aerosol properties (i.e. count, mass, speciation, etc.). Comparisons and combinations of multiple measurement techniques sampling the same aerosol can provide insight into...
... such as a complete blood count. In some cases, you may need surgery right away. This may involve an exploratory laparotomy or an emergency appendectomy.
Beta/alpha continuous air monitor
Becker, G.K.; Martz, D.E.
1988-06-27
A single deep-layer silicon detector in combination with a microcomputer, recording both alpha and beta activity and the energy of each pulse, distinguishing energy peaks using a novel curve-fitting technique to reduce the natural alpha counts in the energy region where plutonium and other transuranic alpha emitters are present, and using a novel algorithm to strip out the radon-daughter contribution to the actual beta counts. 7 figs.
Habash, Marc; Johns, Robert
2009-10-01
This study compared an automated Escherichia coli and coliform detection system with the membrane filtration direct count technique for water testing. The automated instrument performed equal to or better than the membrane filtration test in analyzing E. coli-spiked samples and blind samples with interference from Proteus vulgaris or Aeromonas hydrophila.
Benítez, Francisco Moreno; Camacho, Antonio Letrán; del Cuvillo Bernal, Alfonso; de Medina, Pedro Lobatón Sánchez; García Cózar, Francisco J; Romeu, Marisa Espinazo
2014-01-01
There is an increase in the incidence of pollen-related allergy, so information on pollen schedules would be a great asset for physicians seeking to improve the clinical care of patients. Cypress pollen sensitization shows a high prevalence among the causes of allergic rhinitis and is therefore of interest as a study model, distinguishing cypress pollen, pollen count, and allergenic load level. In this work, we used a flow cytometry based technique to obtain both the Cupressus arizonica pollen count and the allergenic load, using a specific rabbit polyclonal antibody against Cup a1, and compared it with measurement by optical microscopy. Airborne samples were collected with a Burkard Spore-Trap and a Burkard Cyclone. Cupressus arizonica pollen was studied using the specific rabbit polyclonal antibody against Cup a1, labeled with AlexaFluor® 488 or 750, and analysed by flow cytometry on both EPICS XL and Cyan ADP cytometers (Beckman Coulter®). The optical microscopy study was performed with a Leica optical microscope. Bland-Altman analysis was used to determine agreement between the two techniques. We identified three different populations based on Cup a1 antibody staining. The main region (44.5%) had 97.3% recognition, a second region (25%) 28%, and a third region (30.5%) 68%, respectively. Immunofluorescence and confocal microscopy showed that the main region corresponds to whole pollen grains, the second region to pollen without exine, and the third region to smaller particles with allergenic properties. The pollen schedule showed a high correlation between optical microscopy and flow cytometry for the pollen count (P-value: 0.0008E-2) and for the smaller particles (P-value: 0.0002), and Bland-Altman analysis showed good agreement between them (P-value: 0.0003). Determination of pollen count and allergenic load by flow cytometry represents an important tool in the determination of airborne respiratory allergens.
We showed that not only whole pollen but also smaller particles could induce allergic sensitization. This is the first study where flow cytometry is used for calculating pollen counts and allergenic load. Copyright © 2013 Clinical Cytometry Society.
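Bland-Altman agreement of the kind applied here reduces to the bias (mean difference) and the 95% limits of agreement of paired measurements; a minimal sketch in Python with invented paired counts (the technique names follow the abstract, the numbers do not):

```python
# Invented paired pollen counts from the two techniques being compared.
microscopy = [120, 85, 200, 150, 95, 60, 310, 180]
cytometry = [115, 90, 195, 160, 100, 55, 300, 185]

diffs = [a - b for a, b in zip(microscopy, cytometry)]
n = len(diffs)
bias = sum(diffs) / n                                    # mean difference
sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5

# 95% limits of agreement: bias +/- 1.96 * SD of the differences.
loa_low, loa_high = bias - 1.96 * sd, bias + 1.96 * sd
print(f"bias = {bias:.2f}, limits of agreement = ({loa_low:.2f}, {loa_high:.2f})")
```

Good agreement corresponds to a bias near zero and limits of agreement narrow enough to be clinically acceptable.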
The impact of varicocelectomy on sperm parameters: a meta-analysis.
Schauer, Ingrid; Madersbacher, Stephan; Jost, Romy; Hübner, Wilhelm Alexander; Imhof, Martin
2012-05-01
We determined the impact of 3 surgical techniques for varicocelectomy (high ligation, inguinal varicocelectomy and the subinguinal approach) on sperm parameters (count and motility) and pregnancy rates. A literature search of MEDLINE and the Cochrane Library (last search performed in February 2011), focusing on the last 20 years, identified a total of 94 articles published between 1975 and 2011 reporting on sperm parameters before and after varicocelectomy. Inclusion criteria for this meta-analysis were at least 2 semen analyses (before and 3 or more months after the procedure), patient age older than 19 years, clinical subfertility and/or abnormal semen parameters, and a clinically palpable varicocele. To rule out skewing factors a bias analysis was performed, and statistical analysis was done with RevMan5(®) and SPSS 15.0(®). A total of 14 articles were included in the statistical analysis. All 3 surgical approaches led to significant or highly significant postoperative improvement of both parameters, with only slight numerical differences among the techniques; these differences did not reach statistical significance for sperm count (p = 0.973) or sperm motility (p = 0.372). After high ligation surgery sperm count increased by 10.85 million per ml (p = 0.006) and motility by 6.80% (p <0.00001) on average. Inguinal varicocelectomy led to an improvement in sperm count of 7.17 million per ml (p <0.0001) while motility changed by 9.44% (p = 0.001). Subinguinal varicocelectomy provided an increase in sperm count of 9.75 million per ml (p = 0.002) and in sperm motility of 12.25% (p = 0.001). Inguinal varicocelectomy showed the highest pregnancy rate of 41.48%, compared to 26.90% and 26.56% after high ligation and subinguinal varicocelectomy, respectively, and the difference was statistically significant (p = 0.035).
This meta-analysis suggests that varicocelectomy leads to significant improvements in sperm count and motility regardless of surgical technique, with the inguinal approach offering the highest pregnancy rate. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
The SWIFT AGN and Cluster Survey. I. Number Counts of AGNs and Galaxy Clusters
NASA Astrophysics Data System (ADS)
Dai, Xinyu; Griffin, Rhiannon D.; Kochanek, Christopher S.; Nugent, Jenna M.; Bregman, Joel N.
2015-05-01
The Swift active galactic nucleus (AGN) and Cluster Survey (SACS) uses 125 deg² of Swift X-ray Telescope serendipitous fields with variable depths surrounding γ-ray bursts to provide a medium depth (4 × 10⁻¹⁵ erg cm⁻² s⁻¹) and area survey filling the gap between deep, narrow Chandra/XMM-Newton surveys and wide, shallow ROSAT surveys. Here, we present a catalog of 22,563 point sources and 442 extended sources and examine the number counts of the AGN and galaxy cluster populations. SACS provides excellent constraints on the AGN number counts at the bright end with negligible uncertainties due to cosmic variance, and these constraints are consistent with previous measurements. We use Wide-field Infrared Survey Explorer mid-infrared (MIR) colors to classify the sources. For AGNs we can roughly separate the point sources into MIR-red and MIR-blue AGNs, finding roughly equal numbers of each type in the soft X-ray band (0.5-2 keV), but fewer MIR-blue sources in the hard X-ray band (2-8 keV). The cluster number counts, with 5% uncertainties from cosmic variance, are also consistent with previous surveys but span a much larger continuous flux range. Deep optical or IR follow-up observations of this cluster sample will significantly increase the number of higher-redshift (z > 0.5) X-ray-selected clusters.
Statistical Measurement of the Gamma-Ray Source-count Distribution as a Function of Energy
NASA Astrophysics Data System (ADS)
Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Regis, Marco
2016-08-01
Statistical properties of photon count maps have recently proven to be a new tool for studying the composition of the gamma-ray sky with high precision. We employ the 1-point probability distribution function of six years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of ~50 GeV. The index below the break is between 1.95 and 2.0. For higher energies, a simple power law fits the data, with an index of 2.2 (+0.7/−0.3) in the energy band between 50 and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point-source populations probed by this method can explain 83 (+7/−13)% (81 (+52/−19)%) of the extragalactic gamma-ray background between 1.04 and 1.99 GeV (50 and 171 GeV). The method has excellent capabilities for constraining the gamma-ray luminosity function and the spectra of unresolved blazars.
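The broken power law form of dN/dS can be written down directly; a hedged sketch in Python, with the break flux and the index above the break chosen for illustration only (only the below-break index 1.95 comes from the abstract):

```python
def dnds(s, s_break, idx_below, idx_above, k=1.0):
    """Broken power law source-count distribution dN/dS,
    continuous at the break flux s_break."""
    index = idx_above if s >= s_break else idx_below
    return k * (s / s_break) ** (-index)

S_B = 1e-8  # hypothetical break flux (photons cm^-2 s^-1)
print(dnds(S_B, S_B, 1.95, 2.6))      # 1.0 at the break, by construction
print(dnds(S_B / 2, S_B, 1.95, 2.6))  # rises below the break
print(dnds(2 * S_B, S_B, 1.95, 2.6))  # falls more steeply above it
```

In a real fit the normalization k, break flux, and both indices would be estimated from the photon-count likelihood rather than fixed by hand.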
20 CFR 416.1111 - How we count earned income.
Code of Federal Regulations, 2010 CFR
2010-04-01
... wages at the earliest of the following points: when you receive them or when they are credited to your... item that is not fully paid for and are responsible for the unpaid balance, only the paid-up value is... services rendered, at the earliest of the following points: when you receive them, when they are credited...
Predicting Bird Response to Alternative Management Scenarios on a Ranch in Campeche, México
Paul A. Wood; Deanna K. Dawson; John R. Sauer; Marcia H. Wilson
2005-01-01
We developed models to predict the potential response of wintering Neotropical migrant and resident bird species to alternative management scenarios, using data from point counts of birds along with habitat variables measured or estimated from remotely sensed data in a Geographic Information System. Expected numbers of occurrences at points were calculated for 100...
Code of Federal Regulations, 2011 CFR
2011-01-01
[Flattened table excerpt: Table I—Shipping Point gives numbers of 33-count samples by factor and grade, with absolute limits (AL), for 1 through 20 and 21 through 40 samples; footnotes define the shipping point (for imports, the port of entry into the United States) and AL as the absolute limit.]
Code of Federal Regulations, 2012 CFR
2012-01-01
[Flattened table excerpt: Table I—Shipping Point gives numbers of 33-count samples by factor and grade, with absolute limits (AL), for 1 through 20 and 21 through 40 samples; footnotes define the shipping point (for imports, the port of entry into the United States) and AL as the absolute limit.]
Code of Federal Regulations, 2010 CFR
2010-01-01
[Flattened table excerpt: Table I—Shipping Point gives numbers of 33-count samples by factor and grade, with absolute limits (AL), for 1 through 20 and 21 through 40 samples; footnotes define the shipping point (for imports, the port of entry into the United States) and AL as the absolute limit.]
Optical-communication systems for deep-space applications
NASA Technical Reports Server (NTRS)
Vilnrotter, V. A.; Gagliardi, R. M.
1980-01-01
The feasibility of using optical communication systems for data telemetry from deep space vehicles to Earth-based receivers is evaluated. Performance analysis shows that practical, photon-counting optical systems can transmit data reliably at rates 30 to 40 dB higher than existing RF systems, or can be used to extend the communication range by 15 to 20 dB. The advantages of pulse-position modulation (PPM) formats are discussed, and photon-counting receiver structures designed for PPM decoding are described. The effects of background interference and weather on receiver performance are evaluated. Some consideration is given to tracking and beam-pointing operations, since system performance ultimately depends on the accuracy to which these operations can be carried out. An example of a tracking and pointing system utilizing an optical uplink beacon is presented, and it is shown that microradian beam pointing is within the capabilities of state-of-the-art technology. Recommendations for future theoretical studies and component development programs are presented.
A Review of Statistical Disclosure Control Techniques Employed by Web-Based Data Query Systems.
Matthews, Gregory J; Harel, Ofer; Aseltine, Robert H
We systematically reviewed the statistical disclosure control techniques employed for releasing aggregate data in Web-based data query systems listed in the National Association for Public Health Statistics and Information Systems (NAPHSIS). Each Web-based data query system was examined to see whether (1) it employed any type of cell suppression, (2) it used secondary cell suppression, and (3) suppressed cell counts could be calculated. No more than 30 minutes was spent on each system. Of the 35 systems reviewed, no suppression was observed in more than half (n = 18); counts below the suppression threshold were displayed in 2 sites; and suppressed values were recoverable in 9 sites. Six sites effectively suppressed small counts. This inquiry has revealed substantial weaknesses in the protective measures used in data query systems containing sensitive public health data. Many systems utilized no disclosure control whatsoever, and the vast majority of those that did deployed it inconsistently or inadequately.
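The recoverability weakness has a simple arithmetic core: if a small cell is suppressed but its row total is published and no complementary (secondary) cell is suppressed, subtraction reveals it. A toy illustration (all counts invented):

```python
# One row of a published aggregate table: the small cell is "suppressed",
# but the other cell in the row and the row total remain visible.
published_cell = 47
row_total = 50
suppressed_value = row_total - published_cell  # recovered by subtraction

threshold = 5  # a typical small-cell suppression threshold (illustrative)
print(suppressed_value, suppressed_value < threshold)  # 3 True
```

Secondary suppression closes this channel by also hiding at least one other cell in the same row and column, so no single subtraction pins down the protected count.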
Compact SPAD-Based Pixel Architectures for Time-Resolved Image Sensors
Perenzoni, Matteo; Pancheri, Lucio; Stoppa, David
2016-01-01
This paper reviews the state of the art of single-photon avalanche diode (SPAD) image sensors for time-resolved imaging. The focus of the paper is on pixel architectures featuring small pixel size (<25 μm) and high fill factor (>20%) as a key enabling technology for the successful implementation of high spatial resolution SPAD-based image sensors. A summary of the main CMOS SPAD implementations, their characteristics and integration challenges, is provided from the perspective of targeting large pixel arrays, where one of the key drivers is the spatial uniformity. The main analog techniques aimed at time-gated photon counting and photon timestamping suitable for compact and low-power pixels are critically discussed. The main features of these solutions are the adoption of analog counting techniques and time-to-analog conversion, in NMOS-only pixels. Reliable quantum-limited single-photon counting, self-referenced analog-to-digital conversion, time gating down to 0.75 ns and timestamping with 368 ps jitter are achieved. PMID:27223284
Detection of bremsstrahlung radiation of 90Sr-90Y for emergency lung counting.
Ho, A; Hakmana Witharana, S S; Jonkmans, G; Li, L; Surette, R A; Dubeau, J; Dai, X
2012-09-01
This study explores the possibility of developing a field-deployable (90)Sr detector for rapid lung counting in emergency situations. The detection of beta-emitters (90)Sr and its daughter (90)Y inside the human lung via bremsstrahlung radiation was performed using a 3″ × 3″ NaI(Tl) crystal detector and a polyethylene-encapsulated source to emulate human lung tissue. The simulation results show that this method is a viable technique for detecting (90)Sr with a minimum detectable activity (MDA) of 1.07 × 10(4) Bq, using a realistic dual-shielded detector system in a 0.25-µGy h(-1) background field for a 100-s scan. The MDA is sufficiently sensitive to meet the requirement for emergency lung counting of Type S (90)Sr intake. The experimental data were verified using Monte Carlo calculations, including an estimate for internal bremsstrahlung, and an optimisation of the detector geometry was performed. Optimisations in background reduction techniques and in the electronic acquisition systems are suggested.
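MDA figures of this kind are conventionally tied to counting statistics through Currie's formula, L_D = 2.71 + 4.65√B counts; a sketch with invented background and efficiency values (not those of the study):

```python
import math

def mda_bq(background_counts, count_time_s, efficiency):
    """Currie-style minimum detectable activity in Bq:
    detection limit L_D = 2.71 + 4.65*sqrt(B) counts, divided by
    (detection efficiency x counting time)."""
    l_d = 2.71 + 4.65 * math.sqrt(background_counts)
    return l_d / (efficiency * count_time_s)

# Hypothetical: 400 background counts in a 100 s scan, 0.1% effective
# bremsstrahlung counting efficiency (illustrative values only).
print(f"MDA ~ {mda_bq(400, 100, 1e-3):.0f} Bq")
```

The very low bremsstrahlung detection efficiency is what drives the MDA into the 10^4 Bq range for a field measurement of this kind.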
Curtis, Jacqueline W
2017-01-01
Census tracts are often used to investigate area-based correlates of a variety of health outcomes. This approach has been shown to be valuable in understanding the ways that health is shaped by place and in designing appropriate interventions that account for community-level processes. Following this line of inquiry, it is common in the study of pedestrian injuries to aggregate the point-level locations of these injuries to the census tracts in which they occur. Such aggregation enables investigation of the relationships between a range of socioeconomic variables and areas of notably high or low incidence. This study reports on the spatial distribution of child pedestrian injuries in a mid-sized U.S. city over a three-year period. Utilizing a combination of geospatial approaches (Near Analysis, Kernel Density Estimation, and Local Moran's I) enables identification, visualization, and quantification of close proximity between incidents and tract boundaries. Specifically, results reveal that nearly half of the 100 incidents occur on roads that are also census tract boundaries. Results also uncover incidents that occur on tract boundaries, not merely near them. This geographic pattern raises the question of the utility of associating area-based census data from any one tract with the injuries occurring in these border zones. Furthermore, using a standard spatial join technique in a Geographic Information System (GIS), points located on a boundary are counted as falling into the census tracts on both sides of it, which introduces uncertainty into any subsequent analysis. Therefore, two additional approaches to aggregating points to polygons were tested in this study. Results differ with each approach, but without any alert of such differences to the GIS user.
This finding raises a fundamental concern about techniques through which points are aggregated to polygons in any study using point level incidents and their surrounding census tract socioeconomic data to understand health and place. This study concludes with a suggested protocol to test for this source of uncertainty in analysis and an approach that may remove it.
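The double-counting behavior described above can be reproduced with a minimal containment test (axis-aligned rectangles stand in for census tracts; coordinates are invented): an inclusive join assigns a boundary point to both tracts, while a half-open convention assigns it to exactly one.

```python
def contains_closed(rect, pt):
    """Boundary-inclusive containment, as in a naive spatial join."""
    (x0, y0, x1, y1), (x, y) = rect, pt
    return x0 <= x <= x1 and y0 <= y <= y1

def contains_half_open(rect, pt):
    """Left/bottom-closed, right/top-open convention: no double counting."""
    (x0, y0, x1, y1), (x, y) = rect, pt
    return x0 <= x < x1 and y0 <= y < y1

tracts = [(0, 0, 1, 1), (1, 0, 2, 1)]  # two tracts sharing the edge x = 1
incident = (1.0, 0.5)                  # an incident exactly on the boundary

print(sum(contains_closed(t, incident) for t in tracts))    # 2 (double-counted)
print(sum(contains_half_open(t, incident) for t in tracts)) # 1 (unambiguous)
```

The half-open rule removes the ambiguity but is itself a modeling choice (it drops points on the outermost right/top edge of the study area), which is exactly why the protocol suggested above should report which convention the GIS applied.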
Evaluation of petrifilm series 2000 as a possible rapid method to count coliforms in foods.
Priego, R; Medina, L M; Jordano, R
2000-08-01
This research note is a preliminary comparison between the Petrifilm 2000 method and a widely used traditional enumeration method (on violet red bile agar); six batches of different foods (egg, frozen green beans, fresh sausage, a bakery product, raw minced meat, and raw milk) were studied. The reliability of the presumptive counts taken at 10, 12, and 14 h of incubation using this method was also verified by comparing the counts with the total confirmed counts at 24 h. In all the batches studied, results obtained with Petrifilm 2000 presented a close correlation to those obtained using violet red bile agar (r = 0.860) and greater sensitivity (93.33% of the samples displayed higher counts on Petrifilm 2000), showing that this method is a reliable and efficient alternative. The count taken at 10 h of incubation is of clear interest as an early indicator of results in microbiological food control, since it accounted for 90% of the final count in all the batches analyzed. Counts taken at 12 and 14 h bore a greater similarity to those taken at 24 h. The Petrifilm 2000 method provides results in less than 12 h of incubation, making it a possible rapid method that adapts well to hazard analysis critical control point (HACCP) systems by enabling microbiological quality control of the processing.
Impact of donor- and collection-related variables on product quality in ex utero cord blood banking.
Askari, Sabeen; Miller, John; Chrysler, Gayl; McCullough, Jeffrey
2005-02-01
Optimizing product quality is a current focus in cord blood banking. This study evaluates the role of selected donor- and collection-related variables. Retrospective review was performed of cord blood units (CBUs) collected ex utero between February 1, 2000, and February 28, 2002. Preprocessing volume and total nucleated cell (TNC) counts and postprocessing CD34 cell counts were used as product quality indicators. Of 2084 CBUs, volume determinations and TNC counts were performed on 1628 and CD34+ counts on 1124 CBUs. Mean volume and TNC and CD34+ counts were 85.2 mL, 118.9 x 10(7), and 5.2 x 10(6), respectively. In univariate analysis, placental weight of greater than 500 g and meconium in amniotic fluid correlated with better volume and TNC and CD34+ counts. Greater than 40 weeks' gestation predicted enhanced volume and TNC count. Cesarean section, two- versus one-person collection, and not greater than 5 minutes between placental delivery and collection produced superior volume. Increased TNC count was also seen in Caucasian women, primigravidae, female newborns, and collection duration of more than 5 minutes. A time between delivery of newborn and placenta of not greater than 10 minutes predicted better volume and CD34+ count. By regression analysis, collection within not greater than 5 minutes of placental delivery produced superior volume and TNC count. Donor selection and collection technique modifications may improve product quality. TNC count appears to be more affected by different variables than CD34+ count.
Differential white cell count by centrifugal microfluidics.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sommer, Gregory Jon; Tentori, Augusto M.; Schaff, Ulrich Y.
We present a method for counting white blood cells that is uniquely compatible with centrifugation-based microfluidics. Blood is deposited on top of one or more layers of density media within a microfluidic disk. Spinning the disk causes the cell populations within whole blood to settle through the media, reaching an equilibrium based on the density of each cell type. Separation and fluorescence measurement of cell types stained with a DNA dye is demonstrated using this technique. The integrated signal from bands of fluorescent microspheres is shown to be proportional to their initial concentration in suspension. Among the current generation of medical diagnostics are devices based on the principle of centrifuging a CD-sized disk functionalized with microfluidics. These portable 'lab on a disk' devices are capable of conducting multiple assays directly from a blood sample, embodied by platforms developed by Gyros, Samsung, and Abaxis. [1,2] However, no centrifugal platform to date includes a differential white blood cell count, which is an important metric complementary to diagnostic assays. Measuring the differential white blood cell count (the relative fraction of granulocytes, lymphocytes, and monocytes) is a standard medical diagnostic technique useful for identifying sepsis, leukemia, AIDS, radiation exposure, and a host of other conditions that affect the immune system. Several methods exist for measuring the relative white blood cell count, including flow cytometry, electrical impedance, and visual identification from a stained drop of blood under a microscope. However, none of these methods is easily incorporated into a centrifugal microfluidic diagnostic platform.
Allen, Robert C; John, Mallory G; Rutan, Sarah C; Filgueira, Marcelo R; Carr, Peter W
2012-09-07
A singular value decomposition-based background correction (SVD-BC) technique is proposed for the reduction of background contributions in online comprehensive two-dimensional liquid chromatography (LC×LC) data. The SVD-BC technique was compared to simply subtracting a blank chromatogram from a sample chromatogram and to a previously reported background correction technique for one dimensional chromatography, which uses an asymmetric weighted least squares (AWLS) approach. AWLS was the only background correction technique to completely remove the background artifacts from the samples as evaluated by visual inspection. However, the SVD-BC technique greatly reduced or eliminated the background artifacts as well and preserved the peak intensity better than AWLS. The loss in peak intensity by AWLS resulted in lower peak counts at the detection thresholds established using standard samples. However, the SVD-BC technique was found to introduce noise which led to detection of false peaks at the lower detection thresholds. As a result, the AWLS technique gave more precise peak counts than the SVD-BC technique, particularly at the lower detection thresholds. While the AWLS technique resulted in more consistent percent residual standard deviation values, a statistical improvement in peak quantification after background correction was not found regardless of the background correction technique used. Copyright © 2012 Elsevier B.V. All rights reserved.
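A minimal illustration of the idea behind SVD-based background correction, not the authors' implementation: estimate the leading singular component of a toy data matrix (here via power iteration in pure Python) and subtract it. The toy matrix is a smooth rank-1 drift plus two peaks; as the abstract notes for real data, the subtraction also shaves some peak intensity.

```python
def matvec(A, v):
    return [sum(a * x for a, x in zip(row, v)) for row in A]

def norm(v):
    return sum(x * x for x in v) ** 0.5

def top_singular_triplet(A, iters=200):
    """Leading singular triplet of A via power iteration on A^T A."""
    At = [list(col) for col in zip(*A)]
    v = [1.0] * len(A[0])
    for _ in range(iters):
        w = matvec(At, matvec(A, v))
        v = [x / norm(w) for x in w]
    Av = matvec(A, v)
    s = norm(Av)
    return [x / s for x in Av], s, v

# Toy LCxLC slab: a smooth rank-1 background drift plus two sharp peaks
# (all numbers invented for illustration).
data = [[2.0 * (i + 1) for _ in range(6)] for i in range(4)]
data[1][2] += 5.0
data[2][4] += 3.0

u, s, v = top_singular_triplet(data)
corrected = [[data[i][j] - s * u[i] * v[j] for j in range(6)]
             for i in range(4)]
# The drift is largely removed while the peaks survive, slightly attenuated.
```

In practice the background components would be estimated from blank regions of the chromatogram and more than one component may be subtracted; this sketch only shows the rank-1 case.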
Time-Resolved Rayleigh Scattering Measurements in Hot Gas Flows
NASA Technical Reports Server (NTRS)
Mielke, Amy F.; Elam, Kristie A.; Sung, Chih-Jen
2008-01-01
A molecular Rayleigh scattering technique is developed to measure time-resolved gas velocity, temperature, and density in unseeded gas flows at sampling rates up to 32 kHz. A high power continuous-wave laser beam is focused at a point in an air flow field and Rayleigh scattered light is collected and fiber-optically transmitted to the spectral analysis and detection equipment. The spectrum of the light, which contains information about the temperature and velocity of the flow, is analyzed using a Fabry-Perot interferometer. Photomultiplier tubes operated in the photon counting mode allow high frequency sampling of the circular interference pattern to provide time-resolved flow property measurements. Mean and rms velocity and temperature fluctuation measurements in both an electrically-heated jet facility with a 10-mm diameter nozzle and also in a hydrogen-combustor heated jet facility with a 50.8-mm diameter nozzle at NASA Glenn Research Center are presented.
Grainsize evolution and differential comminution in an experimental regolith
NASA Technical Reports Server (NTRS)
Horz, F.; Cintala, M.; See, T.
1984-01-01
The comminution of planetary surfaces by exposure to continuous meteorite bombardment was simulated by impacting the same fragmental gabbro target 200 times. The role of comminution and in situ gardening of planetary regoliths was addressed. Mean grain size continuously decreased with increasing shot number. Initially it decreased linearly with accumulated energy, but at some stage comminution efficiency started to decrease gradually. Point counting techniques, aided by the electron microprobe for mineral identification, were performed on a number of comminution products. Bulk chemical analyses of specific grain size fractions were also carried out. The finest sizes (<10 microns) display generally the strongest enrichment/depletion factors. Similar, if not exactly identical, trends are reported from lunar soils. It is, therefore, not necessarily correct to explain the chemical characteristics of various grain sizes via different admixtures of materials from distant source terrains. Differential comminution of local source rocks may be the dominating factor.
An Ensemble Method for Spelling Correction in Consumer Health Questions
Kilicoglu, Halil; Fiszman, Marcelo; Roberts, Kirk; Demner-Fushman, Dina
2015-01-01
Orthographic and grammatical errors are a common feature of informal texts written by lay people. Health-related questions asked by consumers are a case in point. Automatic interpretation of consumer health questions is hampered by such errors. In this paper, we propose a method that combines techniques based on edit distance and frequency counts with a contextual similarity-based method for detecting and correcting orthographic errors, including misspellings, word breaks, and punctuation errors. We evaluate our method on a set of spell-corrected questions extracted from the NLM collection of consumer health questions. Our method achieves an F1 score of 0.61, compared to an informed baseline of 0.29 achieved using ESpell, a spelling correction system developed for biomedical queries. Our results show that orthographic similarity is most relevant in spelling error correction in consumer health questions and that frequency and contextual information are complementary to orthographic features. PMID:26958208
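A hedged sketch of the edit-distance-plus-frequency component of such an ensemble (the contextual-similarity model is omitted, and the toy lexicon with frequency counts is invented):

```python
def edit_distance(a, b):
    """Classic Levenshtein distance via dynamic programming."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,          # deletion
                           cur[j - 1] + 1,       # insertion
                           prev[j - 1] + (ca != cb)))  # substitution
        prev = cur
    return prev[-1]

# Toy lexicon with invented frequency counts; candidates within edit
# distance 2 are ranked by frequency.
lexicon = {"diabetes": 900, "diabetic": 400, "dia": 50}

def correct(word):
    cands = [(w, f) for w, f in lexicon.items() if edit_distance(word, w) <= 2]
    return max(cands, key=lambda wf: wf[1])[0] if cands else word

print(correct("diabtes"))  # -> diabetes
```

The full method additionally handles word breaks and punctuation and weighs contextual similarity, which this distance-and-frequency core cannot capture on its own.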
Subpixel based defocused points removal in photon-limited volumetric dataset
NASA Astrophysics Data System (ADS)
Muniraj, Inbarasan; Guo, Changliang; Malallah, Ra'ed; Maraka, Harsha Vardhan R.; Ryle, James P.; Sheridan, John T.
2017-03-01
The asymptotic property of the maximum likelihood estimator (MLE) has been utilized to reconstruct three-dimensional (3D) sectional images in the photon counting imaging (PCI) regime. At first, multiple 2D intensity images, known as elemental images (EI), are captured. Then the geometric ray-tracing method is employed to reconstruct the 3D sectional images at various depth cues. We note that a 3D sectional image consists of both focused and defocused regions, depending on the reconstructed depth position. The defocused portion is redundant and should be removed in order to facilitate image analysis, e.g., 3D object tracking, recognition, classification and navigation. In this paper, we present a subpixel-level three-step technique (involving adaptive thresholding, boundary detection and entropy-based segmentation) to discard the defocused sparse samples from the reconstructed photon-limited 3D sectional images. Simulation results are presented demonstrating the feasibility and efficiency of the proposed method.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liebermeister, Lars, E-mail: lars.liebermeister@physik.uni-muenchen.de; Petersen, Fabian; Münchow, Asmus v.
2014-01-20
A diamond nano-crystal hosting a single nitrogen vacancy (NV) center is optically selected with a confocal scanning microscope and positioned deterministically onto the subwavelength-diameter waist of a tapered optical fiber (TOF) with the help of an atomic force microscope. Based on this nano-manipulation technique, we experimentally demonstrate the evanescent coupling of single fluorescence photons emitted by a single NV-center to the guided mode of the TOF. By comparing photon count rates of the fiber-guided and the free-space modes and with the help of numerical finite-difference time domain simulations, we determine a lower and upper bound for the coupling efficiency of (9.5 ± 0.6)% and (10.4 ± 0.7)%, respectively. Our results are a promising starting point for future integration of single photon sources into photonic quantum networks and applications in quantum information science.
Wild turkey poult survival in southcentral Iowa
Hubbard, M.W.; Garner, D.L.; Klaas, E.E.
1999-01-01
Poult survival is key to understanding annual change in wild turkey (Meleagris gallopavo) populations. Survival of eastern wild turkey poults (M. g. silvestris) 0-4 weeks posthatch was studied in southcentral Iowa during 1994-97. Survival estimates of poults were calculated based on biweekly flush counts and daily locations acquired via radiotelemetry. Poult survival averaged 0.52 ± 0.14 (mean ± SE) for telemetry counts and 0.40 ± 0.15 for flush counts. No within-year or across-year differences were detected between estimation techniques. More than 72% (n = 32) of documented poult mortality occurred ≤14 days posthatch, and mammalian predation accounted for 92.9% of documented mortality. If mortality agents are not of concern, we suggest biologists conduct 4-week flush counts to obtain poult survival estimates for use in population models and development of harvest recommendations.
Stochastic hybrid systems for studying biochemical processes.
Singh, Abhyudai; Hespanha, João P
2010-11-13
Many protein and mRNA species occur at low molecular counts within cells, and hence are subject to large stochastic fluctuations in copy numbers over time. Development of computationally tractable frameworks for modelling stochastic fluctuations in population counts is essential to understand how noise at the cellular level affects biological function and phenotype. We show that stochastic hybrid systems (SHSs) provide a convenient framework for modelling the time evolution of population counts of different chemical species involved in a set of biochemical reactions. We illustrate recently developed techniques that allow fast computations of the statistical moments of the population count, without having to run computationally expensive Monte Carlo simulations of the biochemical reactions. Finally, we review different examples from the literature that illustrate the benefits of using SHSs for modelling biochemical processes.
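For the simplest such system, a birth-death process (production at constant rate k, degradation at per-molecule rate g), the moment equations close exactly: dE[n]/dt = k − gE[n] and dE[n²]/dt = k + (2k + g)E[n] − 2gE[n²]. A sketch integrating these directly (rates invented), which recovers the Poisson stationary behaviour mean = variance = k/g without Monte Carlo simulation:

```python
def moments(k, g, t_end, dt=1e-3):
    """Forward-Euler integration of the exact (closed) moment ODEs of a
    birth-death process: birth at rate k, death at per-molecule rate g."""
    m1 = m2 = 0.0  # start with zero molecules
    for _ in range(round(t_end / dt)):
        dm1 = k - g * m1
        dm2 = k + (2 * k + g) * m1 - 2 * g * m2
        m1 += dt * dm1
        m2 += dt * dm2
    return m1, m2 - m1 * m1  # mean and variance

mean, var = moments(k=10.0, g=1.0, t_end=20.0)
print(f"mean = {mean:.2f}, variance = {var:.2f}")  # both approach k/g = 10
```

For nonlinear propensities the moment hierarchy does not close, which is where the SHS-based closure techniques reviewed in the abstract come in.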
Point-of-entry treatment of petroleum contaminated water supplies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malley, J.P. Jr.; Eliason, P.A.; Wagler, J.L.
1993-03-01
Contamination of individual wells in rural areas from leaking petroleum storage tanks poses unique problems for regulatory agencies, utilities, and potentially responsible parties. A potential solution is the use of point-of-entry (POE) treatment techniques. Results indicate POE systems using aeration followed by granular activated carbon (GAC) are a viable, cost-effective, short-term solution while ground water remediation is performed or an alternate drinking water supply is secured. Selection and design of POE systems should consider variations in water usage and contaminant concentrations. Iron and manganese did not affect POE system performance at the ten sites studied. However, iron precipitation was observed and may pose problems in some POE applications. Increased concentrations of nonpurgeable dissolved organic carbon, consisting primarily of methyl-t-butyl ether (MTBE) and hydrophilic petroleum hydrocarbons, were found in the raw waters but did not affect volatile organic chemical (VOC) removals by aeration or GAC. Microbial activity as measured by heterotrophic plate count significantly increased through four of the ten POE systems studied. Reliability of the POE systems will best be achieved by specifying top-quality system components, educating POE users, and providing routine maintenance and VOC monitoring. 20 refs., 9 figs., 4 tabs.