Sample records for simple counting technique

  1. Counting Tree Growth Rings Moderately Difficult to Distinguish

    Treesearch

    C. B. Briscoe; M. Chudnoff

    1964-01-01

There is an extensive literature dealing with techniques and gadgets to facilitate counting tree growth rings. A relatively simple method, described below, is satisfactory for species whose rings are too difficult to count in the field but not so difficult as to require the preparation of microscope slides or staining techniques.

  2. Simple to complex modeling of breathing volume using a motion sensor.

    PubMed

    John, Dinesh; Staudenmayer, John; Freedson, Patty

    2013-06-01

To compare simple and complex modeling techniques to estimate categories of low, medium, and high ventilation (VE) from ActiGraph™ activity counts. Vertical axis ActiGraph™ GT1M activity counts, oxygen consumption and VE were measured during treadmill walking and running, sports, household chores and labor-intensive employment activities. Categories of low (<19.3 l/min), medium (19.3 to 35.4 l/min) and high (>35.4 l/min) VEs were derived from activity intensity classifications (light <2.9 METs, moderate 3.0 to 5.9 METs and vigorous >6.0 METs). We examined the accuracy of two simple modeling techniques (multiple regression and activity count cut-point analyses) and one complex modeling technique (random forest) in predicting VE from activity counts. Prediction accuracy of the complex random forest technique was marginally better than the simple multiple regression method. Both techniques accurately predicted VE categories almost 80% of the time. The multiple regression and random forest techniques were more accurate (85 to 88%) in predicting medium VE. Both techniques predicted the high VE (70 to 73%) with greater accuracy than low VE (57 to 60%). ActiGraph™ cut-points for low, medium and high VEs were <1381, 1381 to 3660 and >3660 cpm. There were minor differences in prediction accuracy between the multiple regression and the random forest technique. This study provides methods to objectively estimate VE categories using activity monitors that can easily be deployed in the field. Objective estimates of VE should provide a better understanding of the dose-response relationship between internal exposure to pollutants and disease. Copyright © 2013 Elsevier B.V. All rights reserved.
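The activity-count cut-points reported above translate directly into a classifier. A minimal sketch (the thresholds are taken from the abstract; the function name and structure are illustrative, not the authors' code):

```python
def classify_ve(cpm):
    """Classify ventilation category from ActiGraph counts per minute
    using the cut-points reported in the abstract:
    <1381 cpm -> low, 1381-3660 cpm -> medium, >3660 cpm -> high."""
    if cpm < 1381:
        return "low"
    if cpm <= 3660:
        return "medium"
    return "high"
```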

  3. Assessment of cell concentration and viability of isolated hepatocytes using flow cytometry.

    PubMed

    Wigg, Alan J; Phillips, John W; Wheatland, Loretta; Berry, Michael N

    2003-06-01

    The assessment of cell concentration and viability of freshly isolated hepatocyte preparations has been traditionally performed using manual counting with a Neubauer counting chamber and staining for trypan blue exclusion. Despite the simple and rapid nature of this assessment, concerns about the accuracy of these methods exist. Simple flow cytometry techniques which determine cell concentration and viability are available yet surprisingly have not been extensively used or validated with isolated hepatocyte preparations. We therefore investigated the use of flow cytometry using TRUCOUNT Tubes and propidium iodide staining to measure cell concentration and viability of isolated rat hepatocytes in suspension. Analysis using TRUCOUNT Tubes provided more accurate and reproducible measurement of cell concentration than manual cell counting. Hepatocyte viability, assessed using propidium iodide, correlated more closely than did trypan blue exclusion with all indicators of hepatocyte integrity and function measured (lactate dehydrogenase leakage, cytochrome p450 content, cellular ATP concentration, ammonia and lactate removal, urea and albumin synthesis). We conclude that flow cytometry techniques can be used to measure cell concentration and viability of isolated hepatocyte preparations. The techniques are simple, rapid, and more accurate than manual cell counting and trypan blue staining and the results are not affected by protein-containing media.
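For comparison with the flow cytometry figures, the traditional manual calculation the authors benchmark against is straightforward. A minimal sketch of the standard Neubauer arithmetic (assuming the usual 0.1 µL corner-square volume; names are illustrative):

```python
def hemocytometer_concentration(square_counts, dilution_factor=1.0):
    """Cells/mL from Neubauer corner-square counts: each 1 mm^2 corner
    square under the coverslip holds 0.1 uL, so the mean count is
    scaled by 1e4 (and by any dilution applied before loading)."""
    mean_count = sum(square_counts) / len(square_counts)
    return mean_count * dilution_factor * 1e4

def trypan_blue_viability(unstained, stained):
    """Fraction of viable (dye-excluding) cells."""
    return unstained / (unstained + stained)
```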

  4. The contribution of simple random sampling to observed variations in faecal egg counts.

    PubMed

    Torgerson, Paul R; Paul, Michaela; Lewis, Fraser I

    2012-09-10

It has been over 100 years since the classical paper published by Gosset in 1907, under the pseudonym "Student", demonstrated that yeast cells suspended in a fluid and measured by a haemocytometer conformed to a Poisson process. Similarly, parasite eggs in a faecal suspension also conform to a Poisson process. Despite this, there are common misconceptions about how to analyse or interpret observations from the McMaster or similar quantitative parasitic diagnostic techniques, widely used for evaluating parasite eggs in faeces. The McMaster technique can easily be shown from a theoretical perspective to give variable results that inevitably arise from the random distribution of parasite eggs in a well mixed faecal sample. The Poisson processes that lead to this variability are described, and illustrative examples are given of the potentially large confidence intervals that can arise from faecal egg counts calculated from the observations on a McMaster slide. Attempts to modify the McMaster technique, or indeed other quantitative techniques, to ensure uniform egg counts are doomed to failure and belie ignorance of Poisson processes. A simple method to immediately identify excess variation/poor sampling from replicate counts is provided. Copyright © 2012 Elsevier B.V. All rights reserved.
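The Poisson variability described above is easy to demonstrate numerically. A minimal sketch (normal approximation to the Poisson interval; the ×50 multiplier is the common McMaster scaling and is an assumption here, as is the dispersion-index check for excess variation):

```python
import math

def epg_interval(slide_count, multiplier=50, z=1.96):
    """Approximate 95% CI for eggs per gram from a single McMaster
    slide count, using count +/- z*sqrt(count) (normal approximation
    to the Poisson), scaled by the technique multiplier."""
    half_width = z * math.sqrt(slide_count)
    return (max(0.0, (slide_count - half_width) * multiplier),
            (slide_count + half_width) * multiplier)

def dispersion_index(replicate_counts):
    """Sample variance / mean. For a Poisson process this is near 1;
    values well above 1 flag excess variation, i.e. poor mixing or
    sampling of the faecal suspension."""
    n = len(replicate_counts)
    mean = sum(replicate_counts) / n
    var = sum((x - mean) ** 2 for x in replicate_counts) / (n - 1)
    return var / mean
```

Counting 4 eggs on a slide, for example, yields a nominal 200 EPG but an interval of roughly 4 to 396 EPG, illustrating how wide the uncertainty from sampling alone can be.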

  5. Counting conformal correlators

    NASA Astrophysics Data System (ADS)

    Kravchuk, Petr; Simmons-Duffin, David

    2018-02-01

We introduce simple group-theoretic techniques for classifying conformally invariant tensor structures. With them, we classify tensor structures of general n-point functions of non-conserved operators, and n ≥ 4-point functions of general conserved currents, with or without permutation symmetries, and in any spacetime dimension d. Our techniques are useful for bootstrap applications. The rules we derive simultaneously count tensor structures for flat-space scattering amplitudes in d + 1 dimensions.

  6. Measurement of total-body cobalt-57 vitamin B12 absorption with a gamma camera.

    PubMed

    Cardarelli, J A; Slingerland, D W; Burrows, B A; Miller, A

    1985-08-01

Previously described techniques for the measurement of the absorption of [57Co]vitamin B12 by total-body counting have required an iron room equipped with scanning or multiple detectors. The present study uses simplifying modifications which make the technique more widely available and include the use of static geometry, the measurement of body thickness to correct for attenuation, a simple formula to convert the capsule-in-air count to a 100% absorption count, and finally the use of an adequately shielded gamma camera obviating the need for an iron room.

  7. A comprehensive comparison of simple step counting techniques using wrist- and ankle-mounted accelerometer and gyroscope signals.

    PubMed

    Rhudy, Matthew B; Mahoney, Joseph M

    2018-04-01

The goal of this work is to compare the differences between various step counting algorithms using both accelerometer and gyroscope measurements from wrist- and ankle-mounted sensors. Participants completed four different conditions on a treadmill while wearing an accelerometer and gyroscope on the wrist and the ankle. Three different step counting techniques were applied to the data from each sensor type and mounting location. It was determined that using gyroscope measurements allowed for better performance than the typically used accelerometers, and that ankle-mounted sensors provided better performance than those mounted on the wrist.
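The paper compares several published algorithms; as an illustration of the simplest family in this genre, a threshold-crossing step counter can be sketched in a few lines (this is an assumed, generic example, not one of the paper's three techniques; the threshold and refractory gap are arbitrary):

```python
def count_steps(signal, threshold=0.5, min_gap=10):
    """Count upward crossings of `threshold` in a motion signal,
    enforcing a refractory gap (in samples) so that one step is not
    counted twice."""
    steps = 0
    last_step = -min_gap
    for i in range(1, len(signal)):
        if signal[i - 1] < threshold <= signal[i] and i - last_step >= min_gap:
            steps += 1
            last_step = i
    return steps
```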

  8. Modeling the frequency-dependent detective quantum efficiency of photon-counting x-ray detectors.

    PubMed

    Stierstorfer, Karl

    2018-01-01

    To find a simple model for the frequency-dependent detective quantum efficiency (DQE) of photon-counting detectors in the low flux limit. Formula for the spatial cross-talk, the noise power spectrum and the DQE of a photon-counting detector working at a given threshold are derived. Parameters are probabilities for types of events like single counts in the central pixel, double counts in the central pixel and a neighboring pixel or single count in a neighboring pixel only. These probabilities can be derived in a simple model by extensive use of Monte Carlo techniques: The Monte Carlo x-ray propagation program MOCASSIM is used to simulate the energy deposition from the x-rays in the detector material. A simple charge cloud model using Gaussian clouds of fixed width is used for the propagation of the electric charge generated by the primary interactions. Both stages are combined in a Monte Carlo simulation randomizing the location of impact which finally produces the required probabilities. The parameters of the charge cloud model are fitted to the spectral response to a polychromatic spectrum measured with our prototype detector. Based on the Monte Carlo model, the DQE of photon-counting detectors as a function of spatial frequency is calculated for various pixel sizes, photon energies, and thresholds. The frequency-dependent DQE of a photon-counting detector in the low flux limit can be described with an equation containing only a small set of probabilities as input. Estimates for the probabilities can be derived from a simple model of the detector physics. © 2017 American Association of Physicists in Medicine.

  9. Deriving simple empirical relationships between aerodynamic and optical aerosol measurements and their application

    USDA-ARS?s Scientific Manuscript database

    Different measurement techniques for aerosol characterization and quantification either directly or indirectly measure different aerosol properties (i.e. count, mass, speciation, etc.). Comparisons and combinations of multiple measurement techniques sampling the same aerosol can provide insight into...

  10. Conversion from Engineering Units to Telemetry Counts on Dryden Flight Simulators

    NASA Technical Reports Server (NTRS)

    Fantini, Jay A.

    1998-01-01

Dryden real-time flight simulators encompass the simulation of pulse code modulation (PCM) telemetry signals. This paper presents a new method whereby the calibration polynomial (from first to sixth order), representing the conversion from counts to engineering units (EU), is numerically inverted in real time. The result is less than one-count error for valid EU inputs. The Newton-Raphson method is used to numerically invert the polynomial. A reverse linear interpolation between the EU limits is used to obtain an initial value for the desired telemetry count. The method presented here is not new. What is new is how classical numerical techniques are optimized to take advantage of modern computer power to perform the desired calculations in real time. This technique makes the method simple to understand and implement. There are no interpolation tables to store in memory as in traditional methods. The NASA F-15 simulation converts and transmits over 1000 parameters at 80 times/sec. This paper presents algorithm development, FORTRAN code, and performance results.
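The inversion scheme described (reverse linear interpolation for the seed, then Newton-Raphson refinement) is compact enough to sketch. The paper's code is FORTRAN; this Python rendering and the sample calibration coefficients are illustrative:

```python
def poly_eval(coeffs, x):
    """Horner evaluation of c0 + c1*x + ... + cN*x**N."""
    y = 0.0
    for c in reversed(coeffs):
        y = y * x + c
    return y

def poly_deriv(coeffs):
    """Coefficients of the derivative polynomial."""
    return [i * c for i, c in enumerate(coeffs)][1:]

def counts_from_eu(coeffs, eu, count_min, count_max, tol=0.5, max_iter=25):
    """Numerically invert the counts->EU calibration polynomial.
    Seed with reverse linear interpolation between the EU values at the
    count limits, then refine by Newton-Raphson until the step is below
    tol counts (targeting the paper's <1-count error for monotonic
    calibrations)."""
    eu_min = poly_eval(coeffs, count_min)
    eu_max = poly_eval(coeffs, count_max)
    x = count_min + (eu - eu_min) * (count_max - count_min) / (eu_max - eu_min)
    deriv = poly_deriv(coeffs)
    for _ in range(max_iter):
        step = (poly_eval(coeffs, x) - eu) / poly_eval(deriv, x)
        x -= step
        if abs(step) < tol:
            break
    return x
```

With an illustrative quadratic calibration EU = 0.1x + 1e-5·x² over a 12-bit count range, requesting EU = 240 recovers x = 2000 counts to well under one count.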

  11. Comparison of inguinal approach, scrotal sclerotherapy and subinguinal antegrade sclerotherapy in varicocele treatment: a randomized prospective study.

    PubMed

    Fayez, A; El Shantaly, K M; Abbas, M; Hauser, S; Müller, S C; Fathy, A

    2010-01-01

    We compared outcome and complications of three simple varicocelectomy techniques. Groups were divided according to whether they would receive the Ivanissevich technique (n = 55), Tauber's technique (n = 51) or subinguinal sclerotherapy (n = 49). Selection criteria were: infertility >1 year, subnormal semen, sonographic diameter of veins >3 mm and time of regurge >2 s. Patients were randomly assigned to the groups of treatment, with follow-up every 3 months for 1 year. Improvement was only in sperm count and total motility for all groups. Pregnancy rates were 20, 13.73 and 12.24%, respectively, with no significant difference between groups. Hydrocele occurred only in the group which received the Ivanissevich technique (5.5%). Tauber's technique is simple; however, it has the disadvantage of multiple branching of small veins. Copyright © 2010 S. Karger AG, Basel.

  12. Forecasting in foodservice: model development, testing, and evaluation.

    PubMed

    Miller, J L; Thompson, P A; Orabella, M M

    1991-05-01

This study was designed to develop, test, and evaluate mathematical models appropriate for forecasting menu-item production demand in foodservice. Data were collected from residence and dining hall foodservices at Ohio State University. Objectives of the study were to collect, code, and analyze the data; develop and test models using actual operation data; and compare forecasting results with current methods in use. Customer count was forecast using deseasonalized simple exponential smoothing. Menu-item demand was forecast by multiplying the count forecast by a predicted preference statistic. Forecasting models were evaluated using mean squared error, mean absolute deviation, and mean absolute percentage error techniques. All models were more accurate than current methods. A broad spectrum of forecasting techniques could be used by foodservice managers with access to a personal computer and spreadsheet and database-management software. The findings indicate that mathematical forecasting techniques may be effective in foodservice operations to control costs, increase productivity, and maximize profits.
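The count-forecasting pipeline described above can be sketched compactly. A minimal illustration (plain simple exponential smoothing plus the preference multiplication; the deseasonalization step and the study's actual data are omitted, and the smoothing constant is arbitrary):

```python
def ses_forecast(series, alpha=0.3):
    """Simple exponential smoothing: the final smoothed level is the
    forecast for the next period."""
    level = series[0]
    for value in series[1:]:
        level = alpha * value + (1 - alpha) * level
    return level

def menu_item_demand(count_forecast, preference):
    """Menu-item demand = customer-count forecast * predicted
    preference statistic (fraction of customers choosing the item)."""
    return count_forecast * preference

def mape(actual, forecast):
    """Mean absolute percentage error, one of the paper's three
    evaluation measures."""
    return 100.0 * sum(abs(a - f) / a for a, f in zip(actual, forecast)) / len(actual)
```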

  13. An analysis of I/O efficient order-statistic-based techniques for noise power estimation in the HRMS sky survey's operational system

    NASA Technical Reports Server (NTRS)

    Zimmerman, G. A.; Olsen, E. T.

    1992-01-01

    Noise power estimation in the High-Resolution Microwave Survey (HRMS) sky survey element is considered as an example of a constant false alarm rate (CFAR) signal detection problem. Order-statistic-based noise power estimators for CFAR detection are considered in terms of required estimator accuracy and estimator dynamic range. By limiting the dynamic range of the value to be estimated, the performance of an order-statistic estimator can be achieved by simpler techniques requiring only a single pass of the data. Simple threshold-and-count techniques are examined, and it is shown how several parallel threshold-and-count estimation devices can be used to expand the dynamic range to meet HRMS system requirements with minimal hardware complexity. An input/output (I/O) efficient limited-precision order-statistic estimator with wide but limited dynamic range is also examined.
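A threshold-and-count estimator of the kind described can be sketched under a simple model. Assuming the spectral power samples of Gaussian noise are exponentially distributed (so the fraction exceeding a threshold T is exp(-T/mean)), the exceedance count recovers the mean noise power in a single pass; this model and the names here are illustrative, not the HRMS design:

```python
import math

def threshold_and_count_power(samples, threshold):
    """Estimate mean noise power from one pass over the data: count
    samples exceeding the threshold, then invert the exponential
    exceedance law  P(X > T) = exp(-T / mean)  to get
    mean = T / ln(N / count)."""
    n = len(samples)
    exceed = sum(1 for s in samples if s > threshold)
    if exceed == 0 or exceed == n:
        raise ValueError("threshold outside the useful dynamic range")
    return threshold / math.log(n / exceed)
```

The guard clause reflects the dynamic-range limitation discussed above: a threshold far from the true power level leaves too few (or too many) exceedances for a stable estimate, which is why several parallel threshold-and-count devices are needed to cover a wide range.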

  14. The Flushtration Count Illusion: Attribute substitution tricks our interpretation of a simple visual event sequence.

    PubMed

    Thomas, Cyril; Didierjean, André; Kuhn, Gustav

    2018-04-17

When faced with a difficult question, people sometimes work out an answer to a related, easier question without realizing that a substitution has taken place (e.g., Kahneman, 2011, Thinking, fast and slow. New York, Farrar, Strauss, Giroux). In two experiments, we investigated whether this attribute substitution effect can also affect the interpretation of a simple visual event sequence. We used a magic trick called the 'Flushtration Count Illusion', which involves a technique used by magicians to give the illusion of having seen multiple cards with identical backs, when in fact only the back of one card (the bottom card) is repeatedly shown. In Experiment 1, we demonstrated that most participants are susceptible to the illusion, even if they have the visual and analytical reasoning capacity to correctly process the sequence. In Experiment 2, we demonstrated that participants construct a biased and simplified representation of the Flushtration Count by substituting some attributes of the event sequence. We discuss the psychological processes underlying this attribute substitution effect. © 2018 The British Psychological Society.

  15. Calibration and diagnostic accuracy of simple flotation, McMaster and FLOTAC for parasite egg counts in sheep.

    PubMed

    Rinaldi, L; Coles, G C; Maurelli, M P; Musella, V; Cringoli, G

    2011-05-11

The present study was aimed at carrying out a calibration and a comparison of diagnostic accuracy of three faecal egg count (FEC) techniques, simple flotation, McMaster and FLOTAC, in order to find the best flotation solution (FS) for Dicrocoelium dendriticum, Moniezia expansa and gastrointestinal (GI) strongyle eggs, and to evaluate the influence of faecal preservation methods combined with FS on egg counts. Simple flotation failed to give satisfactory results with any samples. Overall, FLOTAC resulted in similar or higher eggs per gram of faeces (EPG) and lower coefficient of variation (CV) than McMaster. The "gold standard" for D. dendriticum was obtained with FLOTAC when using FS7 (EPG=219, CV=3.9%) and FS8 (EPG=226, CV=5.2%) on fresh faeces. The "gold standard" for M. expansa was obtained with FLOTAC, using FS3 (EPG=122, CV=4.1%) on fresh faeces. The "gold standard" for GI strongyles was obtained with FLOTAC when using FS5 (EPG=320, CV=4%) and FS2 (EPG=298, CV=5%). As regards faecal preservation methods, formalin 5% and 10% or freezing showed performance similar to fresh faeces for eggs of D. dendriticum and M. expansa. However, these methods of preservation were not as successful with GI strongyle eggs. Vacuum packing with storage at +4°C permitted storage of GI strongyle eggs for up to 21 days prior to counting. Where accurate egg counts are required in ovine samples, the optimum method of counting is the use of FLOTAC. In addition, we suggest the use of two solutions that are easy and cheap to purchase and prepare, saturated sodium chloride (FS2) for nematode and cestode eggs and saturated zinc sulphate (FS7) for trematode eggs and nematode larvae. Copyright © 2010 Elsevier B.V. All rights reserved.

  16. A rapid method for the simultaneous determination of gross alpha and beta activities in water samples using a low background liquid scintillation counter.

    PubMed

    Sanchez-Cabeza, J A; Pujol, L

    1995-05-01

    The radiological examination of water requires a rapid screening technique that permits the determination of the gross alpha and beta activities of each sample in order to decide if further radiological analyses are necessary. In this work, the use of a low background liquid scintillation system (Quantulus 1220) is proposed to simultaneously determine the gross activities in water samples. Liquid scintillation is compared to more conventional techniques used in most monitoring laboratories. In order to determine the best counting configuration of the system, pulse shape discrimination was optimized for 6 scintillant/vial combinations. It was concluded that the best counting configuration was obtained with the scintillation cocktail Optiphase Hisafe 3 in Zinsser low diffusion vials. The detection limits achieved were 0.012 Bq L-1 and 0.14 Bq L-1 for gross alpha and beta activity respectively, after a 1:10 concentration process by simple evaporation and for a counting time of only 360 min. The proposed technique is rapid, gives spectral information, and is adequate to determine gross activities according to the World Health Organization (WHO) guideline values.

  17. Photon Counting System for High-Sensitivity Detection of Bioluminescence at Optical Fiber End.

    PubMed

    Iinuma, Masataka; Kadoya, Yutaka; Kuroda, Akio

    2016-01-01

The technique of photon counting is widely used in various fields and is also applicable to high-sensitivity detection of luminescence. Thanks to the recent development of single-photon detectors based on avalanche photodiodes (APDs), a photon counting system with an optical fiber has become a powerful tool for detecting bioluminescence at the fiber end, because it fully exploits the compactness, simple operation, and high quantum efficiency of APD detectors. This optical fiber-based system also offers the possibility of improving the sensitivity of local detection of adenosine triphosphate (ATP) through high-sensitivity detection of the bioluminescence. In this chapter, we introduce the basic concept of the optical fiber-based system and explain how to construct and use it.

  18. Effects of a new mild shampoo for preventing hair loss in Asian by a simple hand-held phototrichogram technique.

    PubMed

    Baek, J H; Lee, S Y; Yoo, M; Park, W-S; Lee, S J; Boo, Y C; Koh, J-S

    2011-12-01

This study was conducted to evaluate the effects of a commercially available shampoo in Korean subjects with alopecia using a simple hand-held phototrichogram technique. Forty-four subjects with alopecia were enrolled and forty subjects continued for 16 weeks. In the test group, total hair counts increased significantly at weeks 8 and 16, and the number of shedding hairs significantly decreased at week 16. Terminal hair counts significantly increased at week 8. In the control group, hair thickness and the number of vellus hairs significantly decreased at week 16. The number of total hairs increased significantly more in the test group than in the control group at weeks 8 and 16. The number of shedding hairs decreased significantly more in the test group than in the control group at week 16. Visual assessment using clinical digital images showed that the number of total hairs appeared to increase, although there was no statistical significance. In this study, it was found that the test shampoo could prevent hair loss. © 2011 DERMAPRO Co Ltd. ICS © 2011 Society of Cosmetic Scientists and the Société Française de Cosmétologie.

  19. Black hole entropy in massive Type IIA

    NASA Astrophysics Data System (ADS)

    Benini, Francesco; Khachatryan, Hrachya; Milan, Paolo

    2018-02-01

We study the entropy of static dyonic BPS black holes in AdS4 in 4d N=2 gauged supergravities with vector and hyper multiplets, and how the entropy can be reproduced with a microscopic counting of states in the AdS/CFT dual field theory. We focus on the particular example of BPS black holes in AdS4 × S6 in massive Type IIA, whose dual three-dimensional boundary description is known and simple. To count the states in field theory we employ a supersymmetric topologically twisted index, which can be computed exactly with localization techniques. We find a perfect match at leading order.

  20. Measurement of 224Ra and 226Ra activities in natural waters using a radon-in-air monitor

    USGS Publications Warehouse

    Kim, G.; Burnett, W.C.; Dulaiova, H.; Swarzenski, P.W.; Moore, W.S.

    2001-01-01

    We report a simple new technique for measuring low-level radium isotopes (224Ra and 226Ra) in natural waters. The radium present in natural waters is first preconcentrated onto MnO2-coated acrylic fiber (Mn fiber) in a column mode. The radon produced from the adsorbed radium is then circulated through a closed air-loop connected to a commercial radon-in-air monitor. The monitor counts alpha decays of radon daughters (polonium isotopes) which are electrostatically collected onto a silicon semiconductor detector. Count data are collected in energy-specific windows, which eliminate interference and maintain very low backgrounds. Radium-224 is measured immediately after sampling via 220Rn (216Po), and 226Ra is measured via 222Rn (218Po) after a few days of ingrowth of 222Rn. This technique is rapid, simple, and accurate for measurements of low-level 224Ra and 226Ra activities without requiring any wet chemistry. Rapid measurements of short-lived 222Rn and 224Ra, along with long-lived 226Ra, may thus be made in natural waters using a single portable system for environmental monitoring of radioactivity as well as tracing of various geochemical and geophysical processes. The technique could be especially useful for the on-site rapid determination of 224Ra which has recently been found to occur at elevated activities in some groundwater wells.
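The "few days of ingrowth" before the 226Ra measurement follows directly from the 222Rn half-life (about 3.82 days). A small illustrative helper for the ingrowth correction (this arithmetic is standard decay physics, not code from the paper; names are assumptions):

```python
import math

RN222_HALF_LIFE_DAYS = 3.8235

def rn222_ingrowth_fraction(days):
    """Fraction of secular-equilibrium 222Rn activity reached after
    `days` of ingrowth on the Mn fiber: 1 - exp(-lambda * t)."""
    lam = math.log(2) / RN222_HALF_LIFE_DAYS
    return 1.0 - math.exp(-lam * days)

def ra226_activity(measured_rn222, days):
    """Correct a 222Rn activity measured before full equilibrium up to
    the supported 226Ra activity."""
    return measured_rn222 / rn222_ingrowth_fraction(days)
```

After one half-life the ingrowth fraction is 0.5; after about three weeks it exceeds 0.97, which is why a few days of ingrowth suffices for a useful 226Ra determination.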

  1. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    NASA Astrophysics Data System (ADS)

    Stephenson, W. Kirk

    2009-08-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes.
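The two boxing rules translate directly into string processing. A hedged programmatic analogue (this follows standard significant-figure conventions and is my rendering, not code from the author's materials):

```python
def count_sig_figs(numeral):
    """Count significant figures in a decimal numeral string.
    Box-and-dot logic: the box starts at the first nonzero digit; if a
    decimal point ("dot") is present the box runs through the last
    digit, otherwise trailing zeros fall outside the box."""
    s = numeral.lstrip('+-')
    has_dot = '.' in s
    digits = s.replace('.', '').lstrip('0')   # leading zeros never count
    if not has_dot:
        digits = digits.rstrip('0')           # bare trailing zeros don't count
    return len(digits)
```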

  2. Accelerating the two-point and three-point galaxy correlation functions using Fourier transforms

    NASA Astrophysics Data System (ADS)

    Slepian, Zachary; Eisenstein, Daniel J.

    2016-01-01

Though Fourier transforms (FTs) are a common technique for finding correlation functions, they are not typically used in computations of the anisotropy of the two-point correlation function (2PCF) about the line of sight in wide-angle surveys because the line-of-sight direction is not constant on the Cartesian grid. Here we show how FTs can be used to compute the multipole moments of the anisotropic 2PCF. We also show how FTs can be used to accelerate the 3PCF algorithm of Slepian & Eisenstein. In both cases, these FT methods allow one to avoid the computational cost of pair counting, which scales as the square of the number density of objects in the survey. With the upcoming large data sets of the Dark Energy Spectroscopic Instrument, Euclid, and the Large Synoptic Survey Telescope, FT techniques will therefore offer an important complement to simple pair or triplet counts.

  3. Fluorometric determination of the DNA concentration in municipal drinking water.

    PubMed Central

    McCoy, W F; Olson, B H

    1985-01-01

    DNA concentrations in municipal drinking water samples were measured by fluorometry, using Hoechst 33258 fluorochrome. The concentration, extraction, and detection methods used were adapted from existing techniques. The method is reproducible, fast, accurate, and simple. The amounts of DNA per cell for five different bacterial isolates obtained from drinking water samples were determined by measuring DNA concentration and total cell concentration (acridine orange epifluorescence direct cell counting) in stationary pure cultures. The relationship between DNA concentration and epifluorescence total direct cell concentration in 11 different drinking water samples was linear and positive; the amounts of DNA per cell in these samples did not differ significantly from the amounts in pure culture isolates. We found significant linear correlations between DNA concentration and colony-forming unit concentration, as well as between epifluorescence direct cell counts and colony-forming unit concentration. DNA concentration measurements of municipal drinking water samples appear to monitor changes in bacteriological quality at least as well as total heterotrophic plate counting and epifluorescence direct cell counting. PMID:3890737

  4. The Box-and-Dot Method: A Simple Strategy for Counting Significant Figures

    ERIC Educational Resources Information Center

    Stephenson, W. Kirk

    2009-01-01

    A visual method for counting significant digits is presented. This easy-to-learn (and easy-to-teach) method, designated the box-and-dot method, uses the device of "boxing" significant figures based on two simple rules, then counting the number of digits in the boxes. (Contains 4 notes.)

  5. Using Mouse Mammary Tumor Cells to Teach Core Biology Concepts: A Simple Lab Module.

    PubMed

    McIlrath, Victoria; Trye, Alice; Aguanno, Ann

    2015-06-18

Undergraduate biology students are required to learn, understand and apply a variety of cellular and molecular biology concepts and techniques in preparation for biomedical, graduate and professional programs or careers in science. To address this, a simple laboratory module was devised to teach the concepts of cell division, cellular communication and cancer through the application of animal cell culture techniques. Here the mouse mammary tumor (MMT) cell line is used to model breast cancer. Students learn to grow and characterize these animal cells in culture and test the effects of traditional and non-traditional chemotherapy agents on cell proliferation. Specifically, students determine the optimal cell concentration for plating and growing cells, learn how to prepare and dilute drug solutions, identify the best dosage and treatment time course of the antiproliferative agents, and ascertain the rate of cell death in response to various treatments. The module employs both a standard cell counting technique using a hemocytometer and a novel cell counting method using microscopy software. The experimental procedure lends itself to open-ended inquiry, as students can modify critical steps of the protocol, including testing homeopathic agents and over-the-counter drugs. In short, this lab module requires students to use the scientific process to apply their knowledge of the cell cycle, cellular signaling pathways, cancer and modes of treatment, all while developing an array of laboratory skills including cell culture and analysis of experimental data not routinely taught in the undergraduate classroom.
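Two stock calculations underpin the module's growth and drug-dilution steps. A minimal sketch of the standard formulas (population doubling time from two counts, and the C1V1 = C2V2 dilution rule; names are illustrative, not from the module's materials):

```python
import math

def doubling_time(n0, n1, elapsed_hours):
    """Population doubling time from cell counts at two times:
    DT = t * ln(2) / ln(N1 / N0)."""
    return elapsed_hours * math.log(2) / math.log(n1 / n0)

def stock_volume_for_dilution(c_stock, c_final, v_final):
    """Volume of stock drug solution needed for a dilution:
    C1*V1 = C2*V2  ->  V1 = C2*V2 / C1 (same units throughout)."""
    return c_final * v_final / c_stock
```

For example, a culture growing from 1x10^5 to 4x10^5 cells in 48 hours has a 24-hour doubling time.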

  6. Diffusion processes in tumors: A nuclear medicine approach

    NASA Astrophysics Data System (ADS)

    Amaya, Helman

    2016-07-01

The number of counts used in nuclear medicine imaging techniques only provides physical information about the disintegration of the nuclei present in the radiotracer molecules taken up in a particular anatomical region, but that information is not true metabolic information. For this reason a mathematical method was used to find a correlation between number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from OsiriX-viewer software was processed. PET-CT gradient magnitude and Laplacian images can show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, it is necessary to include pharmacokinetic models to make a correct interpretation of the gradient magnitude and Laplacian of counts images.

  7. Maximum Entropy Discrimination Poisson Regression for Software Reliability Modeling.

    PubMed

    Chatzis, Sotirios P; Andreou, Andreas S

    2015-11-01

Reliably predicting software defects is one of the most significant tasks in software engineering. Two of the major components of modern software reliability modeling approaches are: 1) extraction of salient features for software system representation, based on appropriately designed software metrics and 2) development of intricate regression models for count data, to allow effective software reliability data modeling and prediction. Surprisingly, research in the latter frontier of count data regression modeling has been rather limited. More specifically, a lack of simple and efficient algorithms for posterior computation has made the Bayesian approaches appear unattractive, and thus underdeveloped in the context of software reliability modeling. In this paper, we try to address these issues by introducing a novel Bayesian regression model for count data, based on the concept of max-margin data modeling, effected in the context of a fully Bayesian model treatment with simple and efficient posterior distribution updates. Our novel approach yields a more discriminative learning technique, making more effective use of our training data during model inference. In addition, it allows better handling of uncertainty in the modeled data, which can be a significant problem when the training data are limited. We derive elegant inference algorithms for our model under the mean-field paradigm and exhibit its effectiveness using the publicly available benchmark data sets.

  8. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy.

    PubMed

    Cheng, Cynthia; Lee, Chadd W; Daskalakis, Constantine

    2015-10-27

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.(1) This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, which promotes user fatigue and subjectivity in the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique.
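
    The published algorithm is not reproduced here, but the core counting step (threshold the image, then count connected bright regions) can be sketched in a few lines. This is a toy flood-fill counter on a synthetic image, not the authors' method:

```python
def count_blobs(image, threshold):
    """Count connected bright regions (4-connectivity) in a 2-D
    grayscale image given as a list of lists: a toy version of
    the 'count capillaries after segmentation' step."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] >= threshold and not seen[r][c]:
                count += 1                      # new blob found
                stack = [(r, c)]                # flood-fill it
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and image[y][x] >= threshold and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y+1, x), (y-1, x), (y, x+1), (y, x-1)]
    return count

# Three bright "capillaries" on a dark background
img = [[0, 9, 0, 0, 8],
       [0, 9, 0, 0, 8],
       [0, 0, 0, 0, 0],
       [7, 7, 0, 0, 0]]
print(count_blobs(img, threshold=5))  # -> 3
```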

  9. A Reproducible Computerized Method for Quantitation of Capillary Density using Nailfold Capillaroscopy

    PubMed Central

    Daskalakis, Constantine

    2015-01-01

    Capillaroscopy is a non-invasive, efficient, relatively inexpensive and easy-to-learn methodology for directly visualizing the microcirculation. The capillaroscopy technique can provide insight into a patient's microvascular health, leading to a variety of potentially valuable dermatologic, ophthalmologic, rheumatologic and cardiovascular clinical applications. In addition, tumor growth may be dependent on angiogenesis, which can be quantitated by measuring microvessel density within the tumor. However, there is currently little to no standardization of techniques, and only one publication to date reports the reliability of a currently available, complex computer-based algorithm for quantitating capillaroscopy data.1 This paper describes a new, simpler, reliable, standardized capillary counting algorithm for quantitating nailfold capillaroscopy data. A simple, reproducible computerized capillaroscopy algorithm such as this would facilitate more widespread use of the technique among researchers and clinicians. Many researchers currently analyze capillaroscopy images by hand, which promotes user fatigue and subjectivity in the results. This paper describes a novel, easy-to-use automated image processing algorithm in addition to a reproducible, semi-automated counting algorithm. This algorithm enables analysis of images in minutes while reducing subjectivity; only a minimal amount of training time (in our experience, less than 1 hr) is needed to learn the technique. PMID:26554744

  10. Simple Interval Timers for Microcomputers.

    ERIC Educational Resources Information Center

    McInerney, M.; Burgess, G.

    1985-01-01

    Discusses simple interval timers for microcomputers, including (1) the Jiffy clock; (2) CPU count timers; (3) screen count timers; (4) light pen timers; and (5) chip timers. Also examines some of the general characteristics of all types of timers. (JN)

  11. A simple method for comparing immunogold distributions in two or more experimental groups illustrated using GLUT1 labelling of isolated trophoblast cells.

    PubMed

    Mayhew, T M; Desoye, G

    2004-07-01

    Colloidal gold-labelling, combined with transmission electron microscopy, is a valuable technique for high-resolution immunolocalization of identified antigens in different subcellular compartments. Whilst the technique has been applied to placental tissues, few quantitative studies have been made. Subcellular compartments exist in three main categories (viz. organelles, membranes, filaments/tubules) and this affects the possibilities for quantification. Generally, gold particles are counted in order to compare either (a) compartments within an experimental group or (b) compartmental labelling distributions between groups. For the former, recent developments make it possible to test whether or not there is differential (nonrandom) labelling of compartments. The methods (relative labelling index and labelling density) are ideally suited to analysing label in one category of compartment (organelle or membrane or filament) but may be adapted to deal with a mixture of categories. They also require information about compartment size (e.g. profile area or trace length). Here, a simple and efficient method for drawing between-group comparisons of labelling distributions is presented. The method does not require information about compartment size or specimen magnification. It relies on multistage random sampling of specimens and unbiased counting of gold particles associated with different compartments. Distributions of observed gold counts in different experimental groups are compared by contingency table analysis, with degrees of freedom for chi-squared (χ2) values being determined by the numbers of compartments and experimental groups. Compartmental values of χ2 which contribute substantially to the total χ2 identify the principal subcellular sites of between-group differences. The method is illustrated using datasets from immunolabelling studies on the localization of GLUT1 glucose transporters in cultured human trophoblast cells exposed to different treatments.
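
    The contingency-table analysis described above is straightforward to compute. The sketch below uses hypothetical gold-particle counts for two groups across three compartments; the per-cell χ2 contributions are what localise the between-group difference:

```python
def chi_square_contingency(table):
    """Chi-squared statistic for an r x c contingency table of gold-particle
    counts (rows = experimental groups, columns = compartments), plus each
    cell's contribution, used to localise between-group differences."""
    row_tot = [sum(row) for row in table]
    col_tot = [sum(col) for col in zip(*table)]
    grand = sum(row_tot)
    contrib = [[(obs - row_tot[i] * col_tot[j] / grand) ** 2
                / (row_tot[i] * col_tot[j] / grand)
                for j, obs in enumerate(row)]
               for i, row in enumerate(table)]
    chi2 = sum(sum(row) for row in contrib)
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return chi2, dof, contrib

# Hypothetical counts: 2 groups x 3 compartments (membrane, cytosol, nucleus)
counts = [[120, 60, 20],
          [80, 90, 30]]
chi2, dof, contrib = chi_square_contingency(counts)
# The largest per-cell contributions flag the membrane compartment as the
# principal site of the between-group difference
```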

  12. Real-time passenger counting by active linear cameras

    NASA Astrophysics Data System (ADS)

    Khoudour, Louahdi; Duvieubourg, Luc; Deparis, Jean-Pierre

    1996-03-01

    The companies operating subways are very much concerned with counting the passengers traveling through their transport systems. One of the most widely used systems for counting passengers consists of a mechanical gate equipped with a counter. However, such simple systems are not able to count passengers jumping above the gates. Moreover, passengers carrying large luggage or bags may meet some difficulties when going through such gates. The ideal solution is a contact-free counting system that would bring more comfort of use for the passengers. For these reasons, we propose to use a video processing system instead of these mechanical gates. The optical sensors discussed in this paper offer several advantages including well defined detection areas, fast response time and reliable counting capability. A new technology has been developed and tested, based on linear cameras. Preliminary results show that this system is very efficient when the passengers crossing the optical gate are well separated. In other cases, such as in compact crowd conditions, reasonable accuracy has been demonstrated. These results are illustrated by means of a number of sequences shot in field conditions. It is our belief that more precise measurements could be achieved, in the case of compact crowd, by other algorithms and acquisition techniques of the line images that we are presently developing.

  13. Quantitative evaluation method of the threshold adjustment and the flat field correction performances of hybrid photon counting pixel detectors

    NASA Astrophysics Data System (ADS)

    Medjoubi, K.; Dawiec, A.

    2017-12-01

    A simple method is proposed in this work for quantitative evaluation of the quality of the threshold adjustment and the flat-field correction of Hybrid Photon Counting (HPC) pixel detectors. The approach is based on the Photon Transfer Curve (PTC), i.e. the measurement of the standard deviation of the signal in flat-field images. Fixed pattern noise (FPN), easily identifiable in the curve, is linked to the residual threshold dispersion, sensor inhomogeneity, and the errors remaining after flat fielding. The analytical expression of the signal-to-noise-ratio curve is developed for HPC detectors and successfully used as a fit function applied to experimental data obtained with the XPAD detector. The FPN, quantitatively described by the photon response non-uniformity (PRNU), is measured for different configurations (threshold-adjustment method and flat-fielding technique), and the approach is shown to be useful for selecting the settings that give the best image quality from a commercial or R&D detector.
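
    As an illustration of the PTC idea (not the paper's fit procedure), FPN can be separated from shot noise by averaging many flat-field frames: temporal averaging suppresses the shot noise, and the residual spatial spread of the per-pixel means estimates the PRNU. All numbers below are simulated assumptions:

```python
import random

def prnu_from_flats(frames):
    """Estimate photon response non-uniformity (PRNU) from a stack of
    flat-field frames: temporal averaging removes shot noise, and the
    remaining spatial spread of the per-pixel means is the fixed pattern
    noise, expressed relative to the mean signal."""
    n_pix = len(frames[0])
    pixel_mean = [sum(f[i] for f in frames) / len(frames) for i in range(n_pix)]
    overall = sum(pixel_mean) / n_pix
    spatial_var = sum((m - overall) ** 2 for m in pixel_mean) / n_pix
    return spatial_var ** 0.5 / overall

# Simulate 200 flat fields on 500 pixels: Gaussian-approximated shot noise
# around a per-pixel gain with 2% dispersion (the FPN we want to recover)
random.seed(1)
gain = [1.0 + random.gauss(0, 0.02) for _ in range(500)]
mean_counts = 10000.0
frames = [[g * mean_counts + random.gauss(0, mean_counts ** 0.5) for g in gain]
          for _ in range(200)]
print(round(prnu_from_flats(frames), 3))  # close to the injected 2% PRNU
```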

  14. Toward CMOS image sensor based glucose monitoring.

    PubMed

    Devadhasan, Jasmine Pramila; Kim, Sanghyo

    2012-09-07

    The complementary metal oxide semiconductor (CMOS) image sensor is a powerful tool for biosensing applications. In the present study, a CMOS image sensor has been exploited for detecting glucose levels through simple photon-count variation, with high sensitivity. Various concentrations of glucose (100 mg dL(-1) to 1000 mg dL(-1)) were added onto a simple poly-dimethylsiloxane (PDMS) chip and the oxidation of glucose was catalyzed by an enzymatic reaction. Oxidized glucose produces a brown color with the help of a chromogen during the enzymatic reaction, and the color density varies with the glucose concentration. Photons pass through the PDMS chip with varying color density and hit the sensor surface. The photon count was registered by the CMOS image sensor depending on the color density, with respect to the glucose concentration, and converted into digital form. By correlating the obtained digital results with glucose concentration, it is possible to measure a wide range of blood glucose levels with great linearity based on the CMOS image sensor, and this technique will therefore promote convenient point-of-care diagnosis.
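
    The final calibration step amounts to a linear fit of sensor response against concentration, inverted to read off an unknown sample. A minimal sketch with made-up calibration points (the actual counts and linearity range are specific to the authors' device):

```python
def fit_line(x, y):
    """Ordinary least-squares slope/intercept: a minimal calibration
    model mapping glucose concentration to a photon-count response."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) \
            / sum((xi - mx) ** 2 for xi in x)
    return slope, my - slope * mx

# Hypothetical calibration points: darker color -> fewer photons counted
conc = [100, 250, 500, 750, 1000]          # mg/dL
photons = [9800, 9200, 8200, 7200, 6200]   # arbitrary sensor counts
slope, intercept = fit_line(conc, photons)

def estimate_concentration(count):
    """Invert the calibration line to estimate an unknown sample."""
    return (count - intercept) / slope

print(round(estimate_concentration(8700)))  # -> 375
```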

  15. Diffusion processes in tumors: A nuclear medicine approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amaya, Helman, E-mail: haamayae@unal.edu.co

    The number of counts used in nuclear medicine imaging techniques provides only physical information about the disintegration of the nuclei in the radiotracer molecules taken up in a particular anatomical region; it is not truly metabolic information. For this reason, a mathematical method was used to find a correlation between the number of counts and 18F-FDG mass concentration. This correlation allows a better interpretation of the results obtained in the study of diffusive processes in an agar phantom, and based on it, an image from the PETCETIX DICOM sample image set from the OsiriX-viewer software was processed. PET-CT gradient-magnitude and Laplacian images could show direct information on diffusive processes for radiopharmaceuticals that enter cells by simple diffusion. In the case of the radiopharmaceutical 18F-FDG, it is necessary to include pharmacokinetic models to interpret the gradient-magnitude and Laplacian-of-counts images correctly.

  16. Detection and Estimation of an Optical Image by Photon-Counting Techniques. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Wang, Lily Lee

    1973-01-01

    A statistical description of a photoelectric detector is given. The photosensitive surface of the detector is divided into many small areas, and the moment generating function of the photon-counting statistic is derived for large time-bandwidth product. The detection of a specified optical image in the presence of background light using a hypothesis test is discussed. The ideal detector, based on the likelihood ratio from the numbers of photoelectrons ejected from many small areas of the photosensitive surface, is studied and compared with the threshold detector and with a simple detector based on the likelihood ratio for the total number of photoelectrons counted over a finite area of the surface. The intensity of the image is assumed to be Gaussian distributed spatially against the uniformly distributed background light. Numerical approximation by the method of steepest descent is used, and the calculations of the reliabilities of the detectors are carried out by a digital computer.
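
    For the simple detector based on the total photon count, the Poisson log-likelihood ratio is monotone in the count, so the test reduces to a count threshold. A small sketch with assumed background and signal rates (not values from the thesis):

```python
import math

def log_likelihood_ratio(n, rate0, rate1, t=1.0):
    """Log-likelihood ratio for a total photon count n under Poisson H1
    (signal + background, mean rate1*t) versus H0 (background only, mean
    rate0*t); it is monotone in n, so the test is a count threshold."""
    return n * math.log(rate1 / rate0) - (rate1 - rate0) * t

# Background 50 counts/s, signal adds 20 counts/s; decide H1 when log-LR > 0,
# which here corresponds to a threshold of about 59.4 counts
for n in (45, 60, 75):
    print(n, log_likelihood_ratio(n, 50.0, 70.0) > 0)
```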

  17. Image charge multi-role and function detectors

    NASA Astrophysics Data System (ADS)

    Milnes, James; Lapington, Jon S.; Jagutzki, Ottmar; Howorth, Jon

    2009-06-01

    The image charge technique used with microchannel plate imaging tubes provides several operational and practical benefits by serving to isolate the electronic image readout from the detector. The simple dielectric interface between detector and readout provides vacuum isolation and no vacuum electrical feed-throughs are required. Since the readout is mechanically separate from the detector, an image tube of generic design can be simply optimised for various applications by attaching it to different readout devices and electronics. We present imaging performance results using a single image tube with a variety of readout devices suited to differing applications: (a) A four electrode charge division tetra wedge anode, optimised for best spatial resolution in photon counting mode. (b) A cross delay line anode, enabling higher count rate, and the possibility of discriminating near co-incident events, and an event timing resolution of better than 1 ns. (c) A multi-anode readout connected, either to a multi-channel oscilloscope for analogue measurements of fast optical pulses, or alternately, to a multi-channel time correlated single photon counting (TCSPC) card.

  18. Statistical Aspects of Point Count Sampling

    Treesearch

    Richard J. Barker; John R. Sauer

    1995-01-01

    The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the...
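
    The bias is easy to see with expected values alone: a point count is an incomplete (binomial) sample of the birds present, so habitats with equal true abundance but different detectability yield different counts. A minimal numeric sketch with assumed detection probabilities:

```python
def expected_count(n_present, p_detect):
    """Point counts are incomplete binomial samples: E[count] = N * p."""
    return n_present * p_detect

# Two habitats with identical true abundance but different detectability
n_a = n_b = 100
count_a = expected_count(n_a, 0.8)   # open habitat, birds easy to detect
count_b = expected_count(n_b, 0.5)   # dense habitat, many birds missed
# The raw counts suggest habitat A holds 60% more birds: a pure artefact
# of detectability, since true abundance is identical
assert abs(count_a / count_b - 1.6) < 1e-12
```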

  19. Exclusion-Based Capture and Enumeration of CD4+ T Cells from Whole Blood for Low-Resource Settings.

    PubMed

    Howard, Alexander L; Pezzi, Hannah M; Beebe, David J; Berry, Scott M

    2014-06-01

    In developing countries, demand exists for a cost-effective method to evaluate human immunodeficiency virus patients' CD4(+) T-helper cell count. The TH (CD4) cell count is the current marker used to identify when an HIV patient has progressed to acquired immunodeficiency syndrome, which results when the immune system can no longer prevent certain opportunistic infections. A system to perform TH count that obviates the use of costly flow cytometry will enable physicians to more closely follow patients' disease progression and response to therapy in areas where such advanced equipment is unavailable. Our system of two serially-operated immiscible phase exclusion-based cell isolations coupled with a rapid fluorescent readout enables exclusion-based isolation and accurate counting of T-helper cells at lower cost and from a smaller volume of blood than previous methods. TH cell isolation via immiscible filtration assisted by surface tension (IFAST) compares well against the established Dynal T4 Quant Kit and is sensitive at CD4 counts representative of immunocompromised patients (less than 200 TH cells per microliter of blood). Our technique retains use of open, simple-to-operate devices that enable IFAST as a high-throughput, automatable sample preparation method, improving throughput over previous low-resource methods. © 2013 Society for Laboratory Automation and Screening.

  20. A rapid and universal bacteria-counting approach using CdSe/ZnS/SiO2 composite nanoparticles as fluorescence probe.

    PubMed

    Fu, Xin; Huang, Kelong; Liu, Suqin

    2010-02-01

    In this paper, a rapid, simple, and sensitive method is described for detection of the total bacterial count using SiO2-coated CdSe/ZnS quantum dots (QDs) as a fluorescence marker, covalently coupled to bacteria using glutaraldehyde as the crosslinker. Highly luminescent CdSe/ZnS QDs were prepared using cadmium oxide and zinc stearate as precursors instead of pyrophoric organometallic precursors. A reverse-microemulsion technique was used to synthesize CdSe/ZnS/SiO2 composite nanoparticles with a SiO2 surface coating. Our results showed that CdSe/ZnS/SiO2 composite nanoparticles prepared with this method were highly luminescent, biologically functional, and monodisperse, and could successfully be covalently conjugated with bacteria. As a demonstration, the method showed higher sensitivity and could count bacteria at 3 × 10^2 CFU/mL, lower than the conventional plate counting and organic-dye-based methods. A linear relationship between the fluorescence peak intensity (Y) and the total bacterial count (X) was established in the range of 3 × 10^2 to 10^7 CFU/mL using the equation Y = 374.82X − 938.27 (R = 0.99574). The results of the determination of the total bacterial count in seven real samples were identical to those of the conventional plate count method, and the standard deviation was satisfactory.

  1. Simple Identification of Complex ADHD Subtypes Using Current Symptom Counts

    ERIC Educational Resources Information Center

    Volk, Heather E.; Todorov, Alexandre A.; Hay, David A.; Todd, Richard D.

    2009-01-01

    The results of the assessment of the accuracy of simple rules based on symptom counts for assigning youths to attention deficit hyperactivity disorder subtypes show that having six or more total symptoms and fewer than three hyperactive-impulsive symptoms is an accurate predictor of the latent class severe inattentive subtype.

  2. Sample to answer visualization pipeline for low-cost point-of-care blood cell counting

    NASA Astrophysics Data System (ADS)

    Smith, Suzanne; Naidoo, Thegaran; Davies, Emlyn; Fourie, Louis; Nxumalo, Zandile; Swart, Hein; Marais, Philip; Land, Kevin; Roux, Pieter

    2015-03-01

    We present a visualization pipeline from sample to answer for point-of-care blood cell counting applications. Effective and low-cost point-of-care medical diagnostic tests provide developing countries and rural communities with accessible healthcare solutions [1], and can be particularly beneficial for blood cell count tests, which are often the starting point in the process of diagnosing a patient [2]. The initial focus of this work is on total white and red blood cell counts, using a microfluidic cartridge [3] for sample processing. Analysis of the processed samples has been implemented by means of two main optical visualization systems developed in-house: 1) a fluidic operation analysis system using high speed video data to determine volumes, mixing efficiency and flow rates, and 2) a microscopy analysis system to investigate homogeneity and concentration of blood cells. Fluidic parameters were derived from the optical flow [4] as well as color-based segmentation of the different fluids using a hue-saturation-value (HSV) color space. Cell count estimates were obtained using automated microscopy analysis and were compared to a widely accepted manual method for cell counting using a hemocytometer [5]. The results using the first iteration microfluidic device [3] showed that the most simple - and thus low-cost - approach for microfluidic component implementation was not adequate as compared to techniques based on manual cell counting principles. An improved microfluidic design has been developed to incorporate enhanced mixing and metering components, which together with this work provides the foundation on which to successfully implement automated, rapid and low-cost blood cell counting tests.
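
    The hue-based fluid segmentation step can be sketched with the standard library's colorsys module: convert each RGB pixel to HSV and keep pixels whose hue falls in a target band with sufficient saturation. The pixel values and hue band below are illustrative, not the authors' parameters:

```python
import colorsys

def segment_by_hue(pixels, hue_lo, hue_hi, min_sat=0.3):
    """Classify RGB pixels (floats in 0-1) as 'fluid' when their hue falls
    in [hue_lo, hue_hi] with enough saturation: the HSV color-based
    segmentation idea used to track fluids in the cartridge."""
    mask = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        mask.append(hue_lo <= h <= hue_hi and s >= min_sat)
    return mask

# Two reddish "blood" pixels and one grey background pixel
pixels = [(0.8, 0.1, 0.1), (0.5, 0.5, 0.5), (0.7, 0.2, 0.15)]
print(segment_by_hue(pixels, 0.0, 0.1))  # -> [True, False, True]
```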

  3. Content analysis of 150 years of British periodicals.

    PubMed

    Lansdall-Welfare, Thomas; Sudhahar, Saatviga; Thompson, James; Lewis, Justin; Cristianini, Nello

    2017-01-24

    Previous studies have shown that it is possible to detect macroscopic patterns of cultural change over periods of centuries by analyzing large textual time series, specifically digitized books. This method promises to empower scholars with a quantitative and data-driven tool to study culture and society, but its power has been limited by the use of data from books and simple analytics based essentially on word counts. This study addresses these problems by assembling a vast corpus of regional newspapers from the United Kingdom, incorporating very fine-grained geographical and temporal information that is not available for books. The corpus spans 150 years and is formed by millions of articles, representing 14% of all British regional outlets of the period. Simple content analysis of this corpus allowed us to detect specific events, like wars, epidemics, coronations, or conclaves, with high accuracy, whereas the use of more refined techniques from artificial intelligence enabled us to move beyond counting words by detecting references to named entities. These techniques allowed us to observe both a systematic underrepresentation and a steady increase of women in the news during the 20th century and the change of geographic focus for various concepts. We also estimate the dates when electricity overtook steam and trains overtook horses as a means of transportation, both around the year 1900, along with observing other cultural transitions. We believe that these data-driven approaches can complement the traditional method of close reading in detecting trends of continuity and change in historical corpora.

  4. Content analysis of 150 years of British periodicals

    PubMed Central

    Lansdall-Welfare, Thomas; Sudhahar, Saatviga; Thompson, James; Lewis, Justin; Cristianini, Nello

    2017-01-01

    Previous studies have shown that it is possible to detect macroscopic patterns of cultural change over periods of centuries by analyzing large textual time series, specifically digitized books. This method promises to empower scholars with a quantitative and data-driven tool to study culture and society, but its power has been limited by the use of data from books and simple analytics based essentially on word counts. This study addresses these problems by assembling a vast corpus of regional newspapers from the United Kingdom, incorporating very fine-grained geographical and temporal information that is not available for books. The corpus spans 150 years and is formed by millions of articles, representing 14% of all British regional outlets of the period. Simple content analysis of this corpus allowed us to detect specific events, like wars, epidemics, coronations, or conclaves, with high accuracy, whereas the use of more refined techniques from artificial intelligence enabled us to move beyond counting words by detecting references to named entities. These techniques allowed us to observe both a systematic underrepresentation and a steady increase of women in the news during the 20th century and the change of geographic focus for various concepts. We also estimate the dates when electricity overtook steam and trains overtook horses as a means of transportation, both around the year 1900, along with observing other cultural transitions. We believe that these data-driven approaches can complement the traditional method of close reading in detecting trends of continuity and change in historical corpora. PMID:28069962

  5. Field test comparison of an autocorrelation technique for determining grain size using a digital 'beachball' camera versus traditional methods

    USGS Publications Warehouse

    Barnard, P.L.; Rubin, D.M.; Harney, J.; Mustain, N.

    2007-01-01

    This extensive field test of an autocorrelation technique for determining grain size from digital images was conducted using a digital bed-sediment camera, or 'beachball' camera. Using 205 sediment samples and >1200 images from a variety of beaches on the west coast of the US, grain size ranging from sand to granules was measured from field samples using both the autocorrelation technique developed by Rubin [Rubin, D.M., 2004. A simple autocorrelation algorithm for determining grain size from digital images of sediment. Journal of Sedimentary Research, 74(1): 160-165.] and traditional methods (i.e. settling tube analysis, sieving, and point counts). To test the accuracy of the digital-image grain size algorithm, we compared results with manual point counts of an extensive image data set in the Santa Barbara littoral cell. Grain sizes calculated using the autocorrelation algorithm were highly correlated with the point counts of the same images (r2 = 0.93; n = 79) and had an error of only 1%. Comparisons of calculated grain sizes and grain sizes measured from grab samples demonstrated that the autocorrelation technique works well on high-energy dissipative beaches with well-sorted sediment such as in the Pacific Northwest (r2 ≈ 0.92; n = 115). On less dissipative, more poorly sorted beaches such as Ocean Beach in San Francisco, results were not as good (r2 ≈ 0.70; n = 67; within 3% accuracy). Because the algorithm works well compared with point counts of the same image, the poorer correlation with grab samples must be a result of actual spatial and vertical variability of sediment in the field; closer agreement between grain size in the images and grain size of grab samples can be achieved by increasing the sampling volume of the images (taking more images, distributed over a volume comparable to that of a grab sample). In all field tests the autocorrelation method was able to predict the mean and median grain size with ≈96% accuracy, which is more than adequate for the majority of sedimentological applications, especially considering that the autocorrelation technique is estimated to be at least 100 times faster than traditional methods.
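
    The idea behind Rubin's (2004) algorithm, that coarser sediment stays spatially correlated over larger pixel offsets, can be demonstrated on a one-dimensional toy profile (a simplified sketch, not the published implementation):

```python
import random

def autocorrelation(signal, lag):
    """Normalised autocorrelation of a 1-D intensity profile at a given
    pixel lag (2-D images work the same way, row by row)."""
    n = len(signal)
    mean = sum(signal) / n
    var = sum((x - mean) ** 2 for x in signal) / n
    cov = sum((signal[i] - mean) * (signal[i + lag] - mean)
              for i in range(n - lag)) / (n - lag)
    return cov / var

# Synthetic "sediment" profiles: intensity is constant across each grain, so
# coarse grains stay correlated over larger pixel offsets than fine grains
random.seed(2)
fine = [v for _ in range(300) for v in [random.random()] * 2]    # 2-px grains
coarse = [v for _ in range(75) for v in [random.random()] * 8]   # 8-px grains
# At a 3-pixel lag the coarse profile is still strongly correlated while the
# fine profile has already decorrelated
print(autocorrelation(coarse, 3) > autocorrelation(fine, 3))
```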

  6. [Prognostic value on recovery rates for the application of sperm preparation techniques and their evaluation in sperm function].

    PubMed

    Barroso, Gerardo; Chaya, Miguel; Bolaños, Rubén; Rosado, Yadira; García León, Fernando; Ibarrola, Eduardo

    2005-05-01

    To evaluate sperm recovery and total sperm motility with three different sperm preparation techniques (density gradient, simple washing and swim-up), a total of 290 subjects were randomly evaluated from November 2001 to March 2003. The density gradient method required Isolate (upper and lower layers); centrifugation was performed at 400 g for 10 minutes and evaluation was done using the Makler counting chamber. The simple washing method included the use of HTF-M supplemented with 7.5% SSS, with centrifugation at 250 g, obtaining at the end 0.5 mL of the sperm sample. The swim-up method required HTF-M supplemented with 7.5% SSS, with an incubation period of 60 minutes at 37 degrees C. The demographic characteristics, evaluated through their standard errors, 95% ICC, and 50th percentiles, were similar. The application of multiple comparison tests and analysis of variance showed significant differences between the sperm preparations before and after capacitation. A superior recovery rate was observed with the density gradient and swim-up methods; nevertheless, the samples used for the simple washing method showed a diminished sperm recovery from the original sample. Sperm preparation techniques have become very useful in male infertility treatments, allowing higher sperm recovery and motility rates. The seminal parameters evaluated from the original sperm sample will determine the best sperm preparation technique for those patients who require it.

  7. Counting statistics for genetic switches based on effective interaction approximation

    NASA Astrophysics Data System (ADS)

    Ohkubo, Jun

    2012-09-01

    The applicability of counting statistics to a system with an infinite number of states is investigated. Counting statistics has been studied extensively for systems with a finite number of states. While it is possible in principle to use the scheme to count specific transitions in a system with an infinite number of states, the resulting equations are in general not closed. A simple genetic switch can be described by a master equation with an infinite number of states, and we use counting statistics to count the number of transitions from the inactive to the active state of the gene. To avoid the non-closed equations, an effective interaction approximation is employed. As a result, it is shown that the switching problem can be treated approximately as a simple two-state model, which immediately indicates that the switching obeys non-Poisson statistics.
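
    The two-state approximation the author arrives at can be simulated directly: alternate exponential dwell times between inactive and active states and count activation events. Because each inter-event interval is the sum of two exponential dwells, the event stream is more regular than Poisson, i.e. the Fano factor falls below 1. A sketch with arbitrary rates:

```python
import random

def count_activations(k_on, k_off, t_total, rng):
    """Simulate a two-state (inactive/active) switch with exponential
    dwell times and count inactive -> active transitions in [0, t_total]."""
    t, state, n = 0.0, 0, 0
    while True:
        rate = k_on if state == 0 else k_off
        t += rng.expovariate(rate)
        if t > t_total:
            return n
        if state == 0:
            n += 1          # one inactive -> active switching event
        state = 1 - state

rng = random.Random(3)
counts = [count_activations(k_on=1.0, k_off=1.0, t_total=100.0, rng=rng)
          for _ in range(400)]
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
# Fano factor var/mean below 1: the alternating dwell times make the event
# stream more regular than a Poisson process (which would give exactly 1)
print(round(var / mean, 2))
```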

  8. When Practice Doesn't Lead to Retrieval: An Analysis of Children's Errors with Simple Addition

    ERIC Educational Resources Information Center

    de Villiers, Celéste; Hopkins, Sarah

    2013-01-01

    Counting strategies initially used by young children to perform simple addition are often replaced by more efficient counting strategies, decomposition strategies and rule-based strategies until most answers are encoded in memory and can be directly retrieved. Practice is thought to be the key to developing fluent retrieval of addition facts. This…

  9. Laboratory blood analysis in Strigiformes-Part I: hematologic reference intervals and agreement between manual blood cell counting techniques.

    PubMed

    Ammersbach, Mélanie; Beaufrère, Hugues; Gionet Rollick, Annick; Tully, Thomas

    2015-03-01

    While hematologic reference intervals (RI) are available for multiple raptorial species of the orders Accipitriformes and Falconiformes, there is a lack of valuable hematologic information in Strigiformes that can be used for diagnostic and health monitoring purposes. The objective was to report RI for hematologic variables in Strigiformes and to assess agreement between manual cell counting techniques. A multi-center prospective study was designed to assess hematologic RI and blood cell morphology in owl species. Samples were collected from individuals representing 13 Strigiformes species, including Great Horned Owl, Snowy Owl, Eurasian Eagle Owl, Barred Owl, Great Gray Owl, Ural Owl, Northern Saw-Whet Owl, Northern Hawk Owl, Spectacled Owl, Barn Owl, Eastern Screech Owl, Long-Eared Owl, and Short-Eared Owl. Red blood cell counts were determined manually using a hemocytometer. White blood cell counts were determined using 3 manual counting techniques: (1) the phloxine B technique, (2) the Natt and Herrick technique, and (3) estimation from the smear. Differential counts and blood cell morphology were determined on smears. Reference intervals were determined and agreement between methods was calculated. Important species-specific differences were observed in blood cell counts and granulocyte morphology. Differences in WBC count between species did not appear to be predictable based on phylogenetic relationships. Overall, most boreal owl species exhibited a lower WBC count than other species. Important disagreements were found between the different manual WBC counting techniques, suggesting that technique-specific RI should be used in Strigiformes. © 2015 American Society for Veterinary Clinical Pathology.

  10. Poisson and negative binomial item count techniques for surveys with sensitive question.

    PubMed

    Tian, Guo-Liang; Tang, Man-Lai; Wu, Qin; Liu, Yin

    2017-04-01

    Although the item count technique is useful in surveys with sensitive questions, the privacy of those respondents who possess the sensitive characteristic of interest may not be well protected, due to a defect in its original design. In this article, we propose two new survey designs (namely, the Poisson item count technique and the negative binomial item count technique) which replace the several independent Bernoulli random variables required by the original item count technique with a single Poisson or negative binomial random variable, respectively. The proposed models not only provide a closed-form variance estimate and a confidence interval within [0, 1] for the sensitive proportion, but also simplify the survey design of the original item count technique. Most importantly, the new designs do not leak respondents' privacy. Empirical results show that the proposed techniques perform satisfactorily in the sense that they yield accurate parameter estimates and confidence intervals.
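
    A simulation sketch of the Poisson item count design with hypothetical parameters: controls report a Poisson count of innocuous items, the treatment group adds one when the sensitive trait is present, and the sensitive proportion is estimated from the difference of group means. The standard library has no Poisson sampler, so Knuth's method is used:

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's multiplication method for a Poisson(lam) draw."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

rng = random.Random(4)
lam, true_pi, n = 3.0, 0.30, 5000     # hypothetical design parameters
# Control group: Poisson count of innocuous items only
control = [poisson_sample(lam, rng) for _ in range(n)]
# Treatment group: same count, plus 1 if the sensitive trait is present
treated = [poisson_sample(lam, rng) + (rng.random() < true_pi)
           for _ in range(n)]
# Difference of group means estimates the sensitive proportion
pi_hat = sum(treated) / n - sum(control) / n
print(round(pi_hat, 2))  # close to the true proportion 0.30
```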

  11. A quartz nanopillar hemocytometer for high-yield separation and counting of CD4+ T lymphocytes

    NASA Astrophysics Data System (ADS)

    Kim, Dong-Joo; Seol, Jin-Kyeong; Wu, Yu; Ji, Seungmuk; Kim, Gil-Sung; Hyung, Jung-Hwan; Lee, Seung-Yong; Lim, Hyuneui; Fan, Rong; Lee, Sang-Kwon

    2012-03-01

    We report the development of a novel quartz nanopillar (QNP) array cell separation system capable of selectively capturing and isolating a single cell population including primary CD4+ T lymphocytes from the whole pool of splenocytes. Integrated with a photolithographically patterned hemocytometer structure, the streptavidin (STR)-functionalized-QNP (STR-QNP) arrays allow for direct quantitation of captured cells using high content imaging. This technology exhibits an excellent separation yield (efficiency) of ~95.3 +/- 1.1% for the CD4+ T lymphocytes from the mouse splenocyte suspensions and good linear response for quantitating captured CD4+ T-lymphoblasts, which is comparable to flow cytometry and outperforms any non-nanostructured surface capture techniques, i.e. cell panning. This nanopillar hemocytometer represents a simple, yet efficient cell capture and counting technology and may find immediate applications for diagnosis and immune monitoring in the point-of-care setting.
Electronic supplementary information (ESI) available. See DOI: 10.1039/c2nr11338d
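    For context, the conventional hemocytometer arithmetic that such on-chip counting builds on is straightforward; a minimal sketch, assuming a standard Neubauer chamber (large squares of 1 mm² area at 0.1 mm depth, i.e. 10⁻⁴ mL per square) rather than the paper's nanopillar-integrated chamber:

```python
def hemocytometer_concentration(counts_per_square, dilution_factor=1.0):
    """Cell concentration (cells/mL) from a standard Neubauer chamber.

    Each large square covers 1 mm^2 at 0.1 mm depth = 1e-4 mL,
    hence the 1e4 volume factor.
    """
    mean_count = sum(counts_per_square) / len(counts_per_square)
    return mean_count * dilution_factor * 1e4

# Illustrative counts from four large squares, sample diluted 1:2:
conc = hemocytometer_concentration([52, 48, 50, 50], dilution_factor=2)
```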

  12. Whales from space: counting southern right whales by satellite.

    PubMed

    Fretwell, Peter T; Staniland, Iain J; Forcada, Jaume

    2014-01-01

    We describe a method of identifying and counting whales using very high resolution satellite imagery through the example of southern right whales breeding in part of the Golfo Nuevo, Península Valdés in Argentina. Southern right whales have been extensively hunted over the last 300 years and although numbers have recovered from near extinction in the early 20th century, current populations are fragmented and are estimated at only a small fraction of the pre-hunting total. Recent extreme right whale calf mortality events at Península Valdés, which constitutes the largest single population, have raised fresh concern for the future of the species. The WorldView2 satellite has a maximum 50 cm resolution and a water penetrating coastal band in the far-blue part of the spectrum that allows it to see deeper into the water column. Using an image covering 113 km², we identified 55 probable whales and 23 other features that are possibly whales, with a further 13 objects that are only detected by the coastal band. Comparison of a number of classification techniques for automatically detecting whale-like objects showed that a simple thresholding technique on the panchromatic and coastal bands delivered the best results. This is the first successful study using satellite imagery to count whales; a pragmatic, transferable method using this rapidly advancing technology that has major implications for future surveys of cetacean populations.
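    The thresholding approach that performed best can be illustrated on a toy raster: flag pixels brighter than a cutoff and count connected components. This stdlib-only sketch uses 4-connectivity and invented pixel values; real WorldView2 analysis would also filter candidates by size and review them manually.

```python
from collections import deque

def count_bright_objects(image, threshold):
    """Count connected groups of pixels above `threshold` (4-connectivity),
    a toy analogue of thresholding a band to flag whale-sized bright objects."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    objects = 0
    for r in range(rows):
        for c in range(cols):
            if image[r][c] > threshold and not seen[r][c]:
                objects += 1                      # new object found
                q = deque([(r, c)])
                seen[r][c] = True
                while q:                          # flood-fill its pixels
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and image[ny][nx] > threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return objects

# Invented brightness values: three separate bright blobs in dark sea.
sea = [[10, 10, 90, 10],
       [10, 10, 90, 10],
       [80, 10, 10, 10],
       [80, 10, 10, 85]]
n_objects = count_bright_objects(sea, threshold=50)
```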

  13. Multichannel microfluidic chip for rapid and reliable trapping and imaging plant-parasitic nematodes

    NASA Astrophysics Data System (ADS)

    Amrit, Ratthasart; Sripumkhai, Witsaroot; Porntheeraphat, Supanit; Jeamsaksiri, Wutthinan; Tangchitsomkid, Nuchanart; Sutapun, Boonsong

    2013-05-01

    A faster, reliable testing technique to count and identify nematode species residing in plant roots is essential for export control and certification. This work proposes a multichannel microfluidic chip with an integrated flow-through microfilter that retains the nematodes in a trapping chamber. Once trapped, the nematodes can be conveniently imaged and their species later identified by a trained technician. Multiple samples can be tested in parallel using the proposed microfluidic chip, thereby increasing the number of samples tested per day.

  14. Radiative neutron capture as a counting technique at pulsed spallation neutron sources: a review of current progress

    NASA Astrophysics Data System (ADS)

    Schooneveld, E. M.; Pietropaolo, A.; Andreani, C.; Perelli Cippo, E.; Rhodes, N. J.; Senesi, R.; Tardocchi, M.; Gorini, G.

    2016-09-01

    Neutron scattering techniques are attracting increasing interest from scientists in various research fields, ranging from physics and chemistry to biology and archaeometry. The success of these neutron scattering applications is stimulated by the development of higher-performance instrumentation. The development of new techniques and concepts, including radiative capture based neutron detection, is therefore a key issue to be addressed. Radiative capture based neutron detectors utilize the emission of prompt gamma rays after neutron absorption in a suitable isotope and the detection of those gammas by a photon counter. They can be used as simple counters in the thermal region and (simultaneously) as energy selectors and counters for neutrons in the eV energy region. Several years of extensive development have made eV neutron spectrometers operating in the so-called resonance detector spectrometer (RDS) configuration outperform their conventional counterparts. In fact, the VESUVIO spectrometer, a flagship instrument at ISIS serving a continuous user programme for eV inelastic neutron spectroscopy measurements, has been operating in the RDS configuration since 2007. In this review, we discuss the physical mechanism underlying the RDS configuration and the development of associated instrumentation. A few successful neutron scattering experiments that utilize radiative capture counting techniques are presented, together with the potential of this technique for thermal neutron diffraction measurements. We also outline possible improvements and future perspectives for radiative capture based neutron detectors in neutron scattering applications at pulsed neutron sources.

  15. Novel flat datacenter network architecture based on scalable and flow-controlled optical switch system.

    PubMed

    Miao, Wang; Luo, Jun; Di Lucente, Stefano; Dorren, Harm; Calabretta, Nicola

    2014-02-10

    We propose and demonstrate an optical flat datacenter network based on a scalable optical switch system with optical flow control. A modular structure with distributed control results in a port-count-independent optical switch reconfiguration time. An RF tone in-band labeling technique allowing parallel processing of the label bits ensures low-latency operation regardless of the switch port count. Hardware flow control is conducted at the optical level by re-using the label wavelength without occupying extra bandwidth, space, or network resources, which further improves latency within a simple structure. Dynamic switching including multicasting operation is validated for a 4 x 4 system. Error-free operation of 40 Gb/s data packets has been achieved with only 1 dB penalty. The system can handle an input load up to 0.5, providing a packet loss lower than 10⁻⁵ and an average latency less than 500 ns when a buffer size of 16 packets is employed. Investigation of scalability also indicates that the proposed system could potentially scale up to large port counts with limited power penalty.

  16. Five-Factor Model personality disorder prototypes: a review of their development, validity, and comparison to alternative approaches.

    PubMed

    Miller, Joshua D

    2012-12-01

    In this article, the development of Five-Factor Model (FFM) personality disorder (PD) prototypes for the assessment of DSM-IV PDs is reviewed, as well as subsequent procedures for scoring individuals' FFM data against these PD prototypes, including similarity scores and simple additive counts based on a quantitative prototype matching methodology. Both techniques, which yield very strongly correlated scores, demonstrate convergent and discriminant validity, and provide clinically useful information with regard to various forms of functioning. The techniques described here for use with FFM data are quite different from the prototype matching methods used elsewhere. © 2012 The Author. Journal of Personality © 2012, Wiley Periodicals, Inc.

  17. Betti numbers of holomorphic symplectic quotients via arithmetic Fourier transform.

    PubMed

    Hausel, Tamás

    2006-04-18

    A Fourier transform technique is introduced for counting the number of solutions of holomorphic moment map equations over a finite field. This technique in turn gives information on Betti numbers of holomorphic symplectic quotients. As a consequence, simple unified proofs are obtained for formulas of Poincaré polynomials of toric hyperkähler varieties (recovering results of Bielawski-Dancer and Hausel-Sturmfels), Poincaré polynomials of Hilbert schemes of points and twisted Atiyah-Drinfeld-Hitchin-Manin (ADHM) spaces of instantons on C² (recovering results of Nakajima-Yoshioka), and Poincaré polynomials of all Nakajima quiver varieties. As an application, a proof of a conjecture of Kac on the number of absolutely indecomposable representations of a quiver is announced.
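    The finite-field counting underlying the technique can be illustrated with a toy variety: brute-force counting the points of the circle x² + y² = 1 over F_p. The moment map equations in the paper are far richer; this only shows the kind of count involved, checked against the classical point-count formula.

```python
def count_circle_points(p):
    """Brute-force count of solutions of x^2 + y^2 = 1 over F_p."""
    return sum((x * x + y * y) % p == 1
               for x in range(p) for y in range(p))

# Classical result: the circle over F_p has p - (-1)^((p-1)/2) points,
# so p + 1 points when p = 3 (mod 4) and p - 1 when p = 1 (mod 4).
n7 = count_circle_points(7)    # expect 8
```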

  18. White blood cell counts and neutrophil to lymphocyte ratio in the diagnosis of testicular cancer: a simple secondary serum tumor marker.

    PubMed

    Yuksel, Ozgur Haki; Verit, Ayhan; Sahin, Aytac; Urkmez, Ahmet; Uruc, Fatih

    2016-01-01

    The aim of the study was to investigate white blood cell counts and the neutrophil to lymphocyte ratio (NLR) as markers of systemic inflammation in the diagnosis of localized testicular cancer, a malignancy with initially low volume. Thirty-six patients with localized testicular cancer with a mean age of 34.22±14.89 years and 36 healthy controls with a mean age of 26.67±2.89 years were enrolled in the study. White blood cell counts and NLR were calculated from complete blood cell counts. White blood cell counts and NLR were statistically significantly higher in patients with testicular cancer compared with the control group (p<0.0001 for all). Both white blood cell counts and NLR can be used as simple tests in the diagnosis of testicular cancer, alongside the well-established serum tumor markers AFP (alpha fetoprotein), hCG (human chorionic gonadotropin) and LDH (lactate dehydrogenase).
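    The NLR itself is a one-line calculation from the differential count; a minimal sketch, with illustrative counts (not patient data from the study):

```python
def nlr(neutrophils, lymphocytes):
    """Neutrophil-to-lymphocyte ratio from absolute counts
    (units cancel, e.g. both in 10^3 cells/uL)."""
    if lymphocytes <= 0:
        raise ValueError("lymphocyte count must be positive")
    return neutrophils / lymphocytes

# Illustrative differential:
ratio = nlr(neutrophils=6.0, lymphocytes=2.0)
```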

  19. Influence of Reading Material Characteristics on Study Time for Pre-Class Quizzes in a Flipped Classroom

    PubMed Central

    Hogg, Abigail

    2017-01-01

    Objective. To examine how instructor-developed reading material relates to pre-class time spent preparing for the readiness assurance process (RAP) in a team-based learning (TBL) course. Methods. Students within pharmacokinetics and physiology were asked to self-report the amount of time spent studying for the RAP. Correlation analysis and multilevel linear regression techniques were used to identify factors within the pre-class reading material that contribute to self-reported study time. Results. On average students spent 3.2 hours preparing for a section of material in the TBL format. The ratio of predicted reading time, based on reading speed and word count, and self-reported study time was greater than 1:3. Self-reported study time was positively correlated with word count, number of tables and figures, and overall page length. For predictors of self-reported study time, topic difficulty and number of figures were negative predictors whereas word count and number of self-assessments were positive predictors. Conclusion. Factors related to reading material are moderate predictors of self-reported student study time for an accountability assessment. A more significant finding is student self-reported study time is much greater than the time predicted by simple word count. PMID:28970604

  20. Estimation method for serial dilution experiments.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2014-12-01

    Titration of microorganisms in infectious or environmental samples is a cornerstone of quantitative microbiology. A simple method is presented to estimate the microbial counts obtained with the serial dilution technique for microorganisms that can grow on bacteriological media and develop into a colony. The number (concentration) of viable microbial organisms is estimated from a single dilution plate (assay) without the need for replicate plates. Our method selects the best agar plate with which to estimate the microbial counts, and takes into account the colony size and plate area, both of which contribute to the likelihood of miscounting the number of colonies on a plate. The estimate of the optimal count given by our method can be used to narrow the search for the best (optimal) dilution plate and saves time. The required inputs are the plate size, the microbial colony size, and the serial dilution factors. The proposed approach shows relative accuracy well within ±0.1 log10 on data produced by computer simulations. The method maintains this accuracy even in the presence of dilution errors of up to 10% (for both the aliquot and diluent volumes), microbial counts between 10⁴ and 10¹² colony-forming units, dilution ratios from 2 to 100, and plate-size to colony-size ratios between 6.25 and 200. Published by Elsevier B.V.
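    The basic single-plate estimator that the paper refines can be sketched directly; this ignores the paper's colony-size and plate-area weighting and simply scales the colony count by the plated volume and cumulative dilution (values below are illustrative).

```python
def cfu_per_ml(colonies, dilution_factor, plated_volume_ml):
    """Viable count from one plate of a serial dilution:
    CFU/mL = colonies / (plated volume x cumulative dilution)."""
    return colonies / (plated_volume_ml * dilution_factor)

# e.g. 50 colonies on the 1e-6 dilution plate, 0.1 mL plated:
estimate = cfu_per_ml(colonies=50, dilution_factor=1e-6,
                      plated_volume_ml=0.1)
```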

  1. Influence of Reading Material Characteristics on Study Time for Pre-Class Quizzes in a Flipped Classroom.

    PubMed

    Persky, Adam M; Hogg, Abigail

    2017-08-01

    Objective. To examine how instructor-developed reading material relates to pre-class time spent preparing for the readiness assurance process (RAP) in a team-based learning (TBL) course. Methods. Students within pharmacokinetics and physiology were asked to self-report the amount of time spent studying for the RAP. Correlation analysis and multilevel linear regression techniques were used to identify factors within the pre-class reading material that contribute to self-reported study time. Results. On average students spent 3.2 hours preparing for a section of material in the TBL format. The ratio of predicted reading time, based on reading speed and word count, and self-reported study time was greater than 1:3. Self-reported study time was positively correlated with word count, number of tables and figures, and overall page length. For predictors of self-reported study time, topic difficulty and number of figures were negative predictors whereas word count and number of self-assessments were positive predictors. Conclusion. Factors related to reading material are moderate predictors of self-reported student study time for an accountability assessment. A more significant finding is student self-reported study time is much greater than the time predicted by simple word count.

  2. Direct measurement of carbon-14 in carbon dioxide by liquid scintillation counting

    NASA Technical Reports Server (NTRS)

    Horrocks, D. L.

    1969-01-01

    Liquid scintillation counting technique is applied to the direct measurement of carbon-14 in carbon dioxide. This method has high counting efficiency and eliminates many of the basic problems encountered with previous techniques. The technique can be used to achieve a percent substitution reaction and is of interest as an analytical technique.

  3. On estimating scale invariance in stratocumulus cloud fields

    NASA Technical Reports Server (NTRS)

    Seze, Genevieve; Smith, Leonard A.

    1990-01-01

    Examination of cloud radiance fields derived from satellite observations sometimes indicates the existence of a range of scales over which the statistics of the field are scale invariant. Many methods have been developed to quantify this scaling behavior in geophysics. The usefulness of such techniques depends both on the physics of the process being robust over a wide range of scales and on the availability of high-resolution, low-noise observations over these scales. These techniques (area-perimeter relation, distribution of areas, estimation of the capacity d0 through box counting, correlation exponent) are applied to the high-resolution satellite data taken during the FIRE experiment and provide initial estimates of the quality of data required, obtained by analyzing simple sets. The results for the observed fields are contrasted with those for images of objects with known characteristics (e.g., dimension), where the details of the constructed image simulate current observational limits. Throughout, where cloud elements and cloud boundaries are mentioned, these refer to structures in the radiance field; all boundaries considered are defined by simple threshold arguments.
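    Of the listed techniques, box counting is the easiest to sketch: cover the set with boxes of side s, count the occupied boxes N(s), and estimate the capacity d0 as the slope of log N(s) against log(1/s). The point set below is synthetic; a uniformly filled square should recover d0 ≈ 2.

```python
import math

def box_count_dimension(points, sizes):
    """Estimate the box-counting (capacity) dimension d0 of a planar
    point set: slope of log N(s) versus log(1/s), where N(s) is the
    number of boxes of side s containing at least one point."""
    xs, ys = [], []
    for s in sizes:
        boxes = {(int(x / s), int(y / s)) for x, y in points}
        xs.append(math.log(1.0 / s))
        ys.append(math.log(len(boxes)))
    # least-squares slope of ys against xs
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    return (sum((a - mx) * (b - my) for a, b in zip(xs, ys))
            / sum((a - mx) ** 2 for a in xs))

# Synthetic test set: a uniformly filled unit square (dimension 2).
pts = [(i / 100.0, j / 100.0) for i in range(100) for j in range(100)]
d0 = box_count_dimension(pts, sizes=[0.5, 0.25, 0.125, 0.0625])
```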

  4. A miniaturized counting technique for anaerobic bacteria.

    PubMed

    Sharpe, A N; Pettipher, G L; Lloyd, G R

    1976-12-01

    A miniaturized counting technique gave results as good as the pour-plate and most probable number (MPN) techniques for enumeration of Clostridium spp. and anaerobic isolates from the gut. The highest counts were obtained when ascorbic acid (1%) and dithiothreitol (0.015%) were added to the reinforced clostridial medium used for counting; this minimized the effect of exposure to air before incubation. The miniature technique allowed up to 40 samples to be plated and incubated in one McIntosh-Fildes-type anaerobic jar, compared with 3 or 4 by the normal pour plate.

  5. PPO-ethanol system as wavelength shifter for the Cherenkov counting technique using a liquid scintillation counter

    NASA Astrophysics Data System (ADS)

    Takiue, Makoto; Fujii, Haruo; Ishikawa, Hiroaki

    1984-12-01

    2,5-Diphenyloxazole (PPO) has been proposed as a wavelength shifter for Cherenkov counting. Since PPO is not soluble in water, we have introduced the fluor into water in the form of micelles using a PPO-ethanol system. This technique makes it possible to obtain a high Cherenkov counting efficiency under stable sample conditions, attributable to the favorable spectrometric features of PPO. The 32P Cherenkov counting efficiency (68.4%) obtained with this technique is as large as that measured with a conventional Cherenkov technique.

  6. Passive hand movements disrupt adults' counting strategies.

    PubMed

    Imbo, Ineke; Vandierendonck, André; Fias, Wim

    2011-01-01

    In the present study, we experimentally tested the role of hand motor circuits in simple-arithmetic strategies. Educated adults solved simple additions (e.g., 8 + 3) or simple subtractions (e.g., 11 - 3) while they were required to retrieve the answer from long-term memory (e.g., knowing that 8 + 3 = 11), to transform the problem by making an intermediate step (e.g., 8 + 3 = 8 + 2 + 1 = 10 + 1 = 11) or to count one-by-one (e.g., 8 + 3 = 8…9…10…11). During the process of solving the arithmetic problems, the experimenter did or did not move the participants' hand on a four-point matrix. The results show that passive hand movements disrupted the counting strategy while leaving the other strategies unaffected. This pattern of results is in agreement with a procedural account, showing that the involvement of hand motor circuits in adults' mathematical abilities is reminiscent of finger counting during childhood.

  7. Determination of microbial contamination of plastic cups for dairy products and utilization of electron beam treatment for sterilization.

    PubMed

    Tacker, M; Hametner, C; Wepner, B

    2002-01-01

    Packaging materials are often considered a critical control point in HACCP systems of food companies. Methods for the determination of the microbial contamination rate of plastic cups, especially for dairy products, must reliably detect single moulds, yeasts or coliforms. In this study, a comparison of a specially adapted coating method, impedance method, direct inoculation and membrane filter technique was carried out to determine contamination with yeasts, moulds, coliforms and total bacterial counts using the appropriate agar in each case. The coating method is recommended for determining yeasts, moulds and coliforms as it allows the localization of the microorganisms as well as the determination of single microorganisms. For total bacterial count, a direct inoculation technique is proposed. The employing of simple measures in the production and during transport of packaging materials, such as dust-prevention or tight sealing in polyethylene bags, heavily reduces microbial contamination rates of packaging material. To reduce contamination rates further, electron beam irradiation was applied: plastic cups sealed in polyethylene bags were treated with 4-5 kGy, a dose that already leads to sterile polystyrene and polypropylene cups without influencing mechanical characteristics of the packaging material.

  8. A powerful and flexible approach to the analysis of RNA sequence count data.

    PubMed

    Zhou, Yi-Hui; Xia, Kai; Wright, Fred A

    2011-10-01

    A number of penalization and shrinkage approaches have been proposed for the analysis of microarray gene expression data. Similar techniques are now routinely applied to RNA sequence transcriptional count data, although the value of such shrinkage has not been conclusively established. If penalization is desired, the explicit modeling of mean-variance relationships provides a flexible testing regimen that 'borrows' information across genes, while easily incorporating design effects and additional covariates. We describe BBSeq, which incorporates two approaches: (i) a simple beta-binomial generalized linear model, which has not been extensively tested for RNA-Seq data and (ii) an extension of an expression mean-variance modeling approach to RNA-Seq data, involving modeling of the overdispersion as a function of the mean. Our approaches are flexible, allowing for general handling of discrete experimental factors and continuous covariates. We report comparisons with other alternate methods to handle RNA-Seq data. Although penalized methods have advantages for very small sample sizes, the beta-binomial generalized linear model, combined with simple outlier detection and testing approaches, appears to have favorable characteristics in power and flexibility. An R package containing examples and sample datasets is available at http://www.bios.unc.edu/research/genomic_software/BBSeq yzhou@bios.unc.edu; fwright@bios.unc.edu Supplementary data are available at Bioinformatics online.
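    The beta-binomial at the heart of approach (i) can be written down directly; below is a stdlib-only sketch of its log-pmf. BBSeq itself fits a full generalized linear model with mean-overdispersion modeling, which this sketch does not attempt.

```python
import math

def log_betabinom_pmf(k, n, alpha, beta):
    """log P(k | n, alpha, beta) for the beta-binomial distribution,
    the simple overdispersed alternative to the binomial used for
    sequence read counts."""
    def lbeta(a, b):  # log of the Beta function via log-gamma
        return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)
    return (math.log(math.comb(n, k))
            + lbeta(k + alpha, n - k + beta) - lbeta(alpha, beta))

# Sanity check: with alpha = beta = 1 the beta-binomial is uniform
# on {0, ..., n}, so each outcome has probability 1/(n+1).
p = math.exp(log_betabinom_pmf(3, 10, 1.0, 1.0))
```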

  9. Generalized estimators of avian abundance from count survey data

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

    I consider modeling avian abundance from spatially referenced bird count data collected according to common protocols such as capture-recapture, multiple observer, removal sampling and simple point counts. Small sample sizes and large numbers of parameters have motivated many analyses that disregard the spatial indexing of the data, and thus do not provide an adequate treatment of spatial structure. I describe a general framework for modeling spatially replicated data that regards local abundance as a random process, motivated by the view that the set of spatially referenced local populations (at the sample locations) constitutes a metapopulation. Under this view, attention can be focused on developing a model for the variation in local abundance independent of the sampling protocol being considered. The metapopulation model structure, when combined with the data-generating model, defines a simple hierarchical model that can be analyzed using conventional methods. The proposed modeling framework is completely general in the sense that broad classes of metapopulation models may be considered, site-level covariates on detection and abundance may be included, and estimates of abundance and related quantities may be obtained for sample locations, groups of locations, and unsampled locations. Two brief examples are given, the first involving simple point counts and the second based on temporary removal counts. Extension of these models to open systems is briefly discussed.
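    For the simple point-count protocol, the hierarchical structure can be sketched as an N-mixture likelihood: local abundance N is Poisson(λ), each repeated count is Binomial(N, p), and marginalizing over N gives the site likelihood. The truncation point and parameter values below are illustrative, and a real analysis would maximize this over λ and p across sites.

```python
import math

def site_likelihood(counts, lam, p, n_max=60):
    """Marginal likelihood of repeated point counts at one site under
    an N-mixture model: N ~ Poisson(lam), counts ~ Binomial(N, p),
    with the infinite sum over N truncated at n_max."""
    total = 0.0
    for n in range(max(counts), n_max + 1):
        # Poisson prior on abundance, computed in log space for safety
        log_prior = -lam + n * math.log(lam) - math.lgamma(n + 1)
        like = math.exp(log_prior)
        for y in counts:  # counts are conditionally independent given N
            like *= math.comb(n, y) * p ** y * (1 - p) ** (n - y)
        total += like
    return total

# Three repeat visits at one site, illustrative parameters:
L_site = site_likelihood([3, 2, 4], lam=5.0, p=0.5)
```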

  10. 65Zn and 133Ba standardizing by photon-photon coincidence counting

    NASA Astrophysics Data System (ADS)

    Loureiro, Jamir S.; da Cruz, Paulo A. L.; Iwahara, Akira; Delgado, José U.; Lopes, Ricardo T.

    2018-03-01

    The LNMRI/Brazil has deployed a system using the X-gamma coincidence technique for standardizing radionuclides that present simple and complex decay schemes with X-rays of energy below 100 keV. The work was carried out in the radionuclide metrology laboratory using a sodium iodide detector for gamma photons in combination with a high-purity germanium detector for X-rays. Samples of 65Zn and 133Ba were standardized, and the results for both radionuclides showed good precision and accuracy when compared with reference values. The standardization differences were 0.72% for the 65Zn and 0.48% for the 133Ba samples.
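    The idealized relation behind coincidence standardization is that the detection efficiencies cancel: with singles rates R_x = Aε_x and R_γ = Aε_γ and coincidence rate R_c = Aε_xε_γ, the activity is A = R_x·R_γ/R_c. The sketch below forward-models rates from an assumed activity and efficiencies and recovers the activity; it ignores background, dead time, and angular-correlation corrections that a real standardization must handle.

```python
def activity_from_coincidence(rate_x, rate_gamma, rate_coinc):
    """Idealized coincidence-counting activity: A = R_x * R_gamma / R_c,
    valid because the detector efficiencies cancel."""
    return rate_x * rate_gamma / rate_coinc

# Assumed activity (Bq) and detector efficiencies:
A_true, eff_x, eff_g = 5.0e4, 0.02, 0.15
A_est = activity_from_coincidence(A_true * eff_x,
                                  A_true * eff_g,
                                  A_true * eff_x * eff_g)
```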

  11. Extension of the Dytlewski-style dead time correction formalism for neutron multiplicity counting to any order

    NASA Astrophysics Data System (ADS)

    Croft, Stephen; Favalli, Andrea

    2017-10-01

    Neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates, singles, doubles and triples, are measured and, in combination with a simple analytical point model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, but even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection, based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.
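    As context for why dead-time corrections matter, the textbook correction for an ideal non-paralyzable counter is sketched below. The Dytlewski formalism extends this idea to the correlated singles/doubles/triples (and now quads) rates, which this simple singles-rate sketch does not reproduce.

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Recover the true event rate from the measured rate for an ideal
    non-paralyzable counter: n = m / (1 - m * tau)."""
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("measured rate saturates the counter")
    return measured_rate / (1.0 - loss)

# 9e4 counts/s measured with a 1 microsecond dead time (assumed values):
n_true = true_rate_nonparalyzable(9.0e4, 1.0e-6)
```

    At these rates the counter misses about 9% of events, which is far from negligible for quantitative assay.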

  12. Extension of the Dytlewski-style dead time correction formalism for neutron multiplicity counting to any order

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, Stephen; Favalli, Andrea

    Here, neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates, singles, doubles and triples, are measured and, in combination with a simple analytical point model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, but even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection, based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.

  13. Extension of the Dytlewski-style dead time correction formalism for neutron multiplicity counting to any order

    DOE PAGES

    Croft, Stephen; Favalli, Andrea

    2017-07-16

    Here, neutron multiplicity counting using shift-register calculus is an established technique in the science of international nuclear safeguards for the identification, verification, and assay of special nuclear materials. Typically, passive counting is used for Pu and mixed Pu-U items and active methods are used for U materials. Three counting rates, singles, doubles and triples, are measured and, in combination with a simple analytical point model, are used to calculate characteristics of the measurement item in terms of known detector and nuclear parameters. However, the measurement problem usually involves more than three quantities of interest, but even in cases where the next higher order count rate, quads, is statistically viable, it is not quantitatively applied because corrections for dead time losses are currently not available in the predominant analysis paradigm. In this work we overcome this limitation by extending the commonly used dead time correction method, developed by Dytlewski, to quads. We also give results for pents, which may be of interest for certain special investigations. Extension to still higher orders may be accomplished by inspection, based on the sequence presented. We discuss the foundations of the Dytlewski method, give limiting cases, and highlight the opportunities and implications that these new results expose. In particular, there exist a number of ways in which the new results may be combined with other approaches to extract the correlated rates, and this leads to various practical implementations.

  14. LOW LEVEL COUNTING TECHNIQUES WITH SPECIAL REFERENCE TO BIOMEDICAL TRACER PROBLEMS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hosain, F.; Nag, B.D.

    1959-12-01

    Low-level counting techniques in tracer experiments are discussed with emphasis on the measurement of beta and gamma radiations with Geiger and scintillation counting methods. The basic principles of low-level counting are outlined. Screen-wall counters, internal gas counters, low-level beta counters, scintillation spectrometers, liquid scintillators, and big scintillation installations are described. Biomedical tracer investigations are discussed. Applications of low-level techniques in archaeological dating, biology, and other problems are listed. (M.C.G.)

  15. Application of the backward extrapolation method to pulsed neutron sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talamo, Alberto; Gohar, Yousry

    Particle detectors operated in pulse mode are subject to the dead-time effect. When the average of the detector counts is constant over time, correcting for the dead-time effect is simple and can be accomplished by analytical formulas. However, when the average of the detector counts changes over time, it is more difficult to take the dead-time effect into account. When a subcritical nuclear assembly is driven by a pulsed neutron source, simple analytical formulas cannot be applied to the measured detector counts to correct for the dead-time effect because of the sharp change of the detector counts over time. This work addresses this issue by using the backward extrapolation method. The latter can be applied not only to a continuous (e.g., californium) external neutron source but also to a pulsed external neutron source (e.g., from a particle accelerator) driving a subcritical nuclear assembly. Finally, the backward extrapolation method allows both the dead-time value and the real detector counts to be obtained from the measured detector counts.
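
    For the constant-rate case the abstract mentions, one standard analytical correction is the textbook non-paralyzable dead-time formula. The sketch below is offered only as background on that simple case; it is not the backward extrapolation method itself.

```python
def true_rate_nonparalyzable(measured_rate, dead_time):
    """Invert m = n / (1 + n*tau)  =>  n = m / (1 - m*tau) for a
    non-paralyzable detector with dead time tau (seconds).
    Valid only while measured_rate * dead_time < 1."""
    loss = measured_rate * dead_time
    if loss >= 1.0:
        raise ValueError("measured rate saturates this detector model")
    return measured_rate / (1.0 - loss)

# e.g. 90,000 counts/s measured with a 1 microsecond dead time
corrected = true_rate_nonparalyzable(9.0e4, 1.0e-6)
```

    With these numbers the detector is busy 9% of the time, so the corrected rate is about 10% higher than the measured one; when the rate varies sharply during a pulse, no single such formula applies, which is the problem the paper's method addresses.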

  16. A matrix-inversion method for gamma-source mapping from gamma-count data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adsley, Ian; Burgess, Claire; Bull, Richard K

    In a previous paper it was proposed that a simple matrix-inversion method could be used to extract source distributions from gamma-count maps, using simple models to calculate the response matrix. The method was tested using numerically generated count maps. In the present work a 100 kBq Co-60 source was placed on a gridded surface and the count rate measured using a NaI scintillation detector. The resulting map of gamma counts was used as input to the matrix-inversion procedure and the source position recovered. A multi-source array was simulated by superposition of several single-source count maps, and the source distribution was again recovered using matrix inversion. The measurements were performed for several detector heights. The effects of uncertainties in source-detector distances on the matrix-inversion method are also examined. The results from this work give confidence in the application of the method to practical applications, such as the segregation of highly active objects amongst fuel-element debris. (authors)
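
    The core of the approach can be sketched in a few lines: build a response matrix from a simple geometric model, then solve the linear system to recover source strengths from counts. The bare inverse-square response, the geometry, and the source strengths below are invented for illustration; the paper's own response models may differ.

```python
def solve_linear(A, b):
    """Gaussian elimination with partial pivoting, for small systems."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]                 # pivot for stability
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):              # back substitution
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# hypothetical geometry: three detector positions over three grid cells
detectors = [(0.0, 0.5), (1.0, 0.5), (2.0, 0.5)]   # (x, height) in m
cells = [0.0, 1.0, 2.0]                            # grid-cell x positions
# response matrix from a bare inverse-square model (unit efficiency)
R = [[1.0 / ((dx - sx) ** 2 + dh ** 2) for sx in cells] for dx, dh in detectors]

true_strengths = [100.0, 0.0, 40.0]                # invented source map
counts = [sum(R[i][j] * true_strengths[j] for j in range(3)) for i in range(3)]
recovered = solve_linear(R, counts)                # ~ [100.0, 0.0, 40.0]
```

    In practice the counts are Poisson-noisy and the system may be ill-conditioned, which is why the paper examines the sensitivity to source-detector distance uncertainties.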

  17. Application of the backward extrapolation method to pulsed neutron sources

    DOE PAGES

    Talamo, Alberto; Gohar, Yousry

    2017-09-23

    Particle detectors operated in pulse mode are subject to the dead-time effect. When the average of the detector counts is constant over time, correcting for the dead-time effect is simple and can be accomplished by analytical formulas. However, when the average of the detector counts changes over time, it is more difficult to take the dead-time effect into account. When a subcritical nuclear assembly is driven by a pulsed neutron source, simple analytical formulas cannot be applied to the measured detector counts to correct for the dead-time effect because of the sharp change of the detector counts over time. This work addresses this issue by using the backward extrapolation method. The latter can be applied not only to a continuous (e.g., californium) external neutron source but also to a pulsed external neutron source (e.g., from a particle accelerator) driving a subcritical nuclear assembly. Finally, the backward extrapolation method allows both the dead-time value and the real detector counts to be obtained from the measured detector counts.

  18. Linkage Analysis of a Model Quantitative Trait in Humans: Finger Ridge Count Shows Significant Multivariate Linkage to 5q14.1

    PubMed Central

    Medland, Sarah E; Loesch, Danuta Z; Mdzewski, Bogdan; Zhu, Gu; Montgomery, Grant W; Martin, Nicholas G

    2007-01-01

    The finger ridge count (a measure of pattern size) is one of the most heritable complex traits studied in humans and has been considered a model human polygenic trait in quantitative genetic analysis. Here, we report the results of the first genome-wide linkage scan for finger ridge count in a sample of 2,114 offspring from 922 nuclear families. Both univariate linkage analysis of the absolute ridge count (the sum of the ridge counts on all ten fingers) and multivariate linkage analyses of the counts on individual fingers were conducted. The multivariate analyses yielded significant linkage to 5q14.1 (logarithm of odds [LOD] = 3.34, pointwise empirical p-value = 0.00025) that was predominantly driven by linkage to the ring, index, and middle fingers. The strongest univariate linkage was to 1q42.2 (LOD = 2.04, pointwise p-value = 0.002, genome-wide p-value = 0.29). In summary, the combination of univariate and multivariate results was more informative than simple univariate analyses alone. Patterns of quantitative trait loci factor loadings consistent with developmental fields were observed, and the simple pleiotropic model underlying the absolute ridge count was not sufficient to characterize the interrelationships between the ridge counts of individual fingers. PMID:17907812

  19. SURVIVAL OF SALMONELLA SPECIES IN RIVER WATER.

    EPA Science Inventory

    The survival of four Salmonella strains in river water microcosms was monitored using culturing techniques, direct counts, whole cell hybridization, scanning electron microscopy, and resuscitation techniques via the direct viable count method and flow cytometry. Plate counts of...

  20. SURVIVAL OF SALMONELLA SPECIES IN RIVER WATER

    EPA Science Inventory

    The survival of four Salmonella strains in river water microcosms was monitored by culturing techniques, direct counts, whole-cell hybridization, scanning electron microscopy, and resuscitation techniques via the direct viable count method and flow cytometry. Plate counts of bact...

  1. Circulating Tumor Cell Count Correlates with Colorectal Neoplasm Progression and Is a Prognostic Marker for Distant Metastasis in Non-Metastatic Patients

    NASA Astrophysics Data System (ADS)

    Tsai, Wen-Sy; Chen, Jinn-Shiun; Shao, Hung-Jen; Wu, Jen-Chia; Lai-Ming, Jr.; Lu, Si-Hong; Hung, Tsung-Fu; Chiu, Yen-Chi; You, Jeng-Fu; Hsieh, Pao-Shiu; Yeh, Chien-Yuh; Hung, Hsin-Yuan; Chiang, Sum-Fu; Lin, Geng-Ping; Tang, Reiping; Chang, Ying-Chih

    2016-04-01

    Enumeration of circulating tumor cells (CTCs) has been proven as a prognostic marker for metastatic colorectal cancer (m-CRC) patients. However, the currently available techniques for capturing and enumerating CTCs lack the sensitivity required to be applicable as a prognostic marker for non-metastatic patients, in whom CTCs are even rarer. We have developed a microfluidic device utilizing an antibody-conjugated non-fouling coating to eliminate nonspecific binding and to promote the multivalent binding of target cells. We then established the correlation of CTC counts and neoplasm progression by applying this platform to capture and enumerate CTCs in 2 mL of peripheral blood from healthy (n = 27), benign (n = 21), non-metastatic (n = 95), and m-CRC (n = 15) patients. The results showed that the CTC counts progressed from 0, 1, and 5 to 36 across these four groups. Importantly, after 2-year follow-up of the non-metastatic CRC patients, we found that those who had ≥5 CTCs were 8 times more likely to develop distant metastasis within one year after curative surgery than those who had <5. In conclusion, by employing a sensitive device, CTC counts show good correlation with colorectal neoplasm progression; thus the CTC count may serve as a simple, independent prognostic marker for non-metastatic CRC patients who are at high risk of early recurrence.

  2. Can reliable sage-grouse lek counts be obtained using aerial infrared technology

    USGS Publications Warehouse

    Gillette, Gifford L.; Coates, Peter S.; Petersen, Steven; Romero, John P.

    2013-01-01

    More effective methods for counting greater sage-grouse (Centrocercus urophasianus) are needed to better assess population trends through enumeration or location of new leks. We describe an aerial infrared technique for conducting sage-grouse lek counts and compare this method with conventional ground-based lek count methods. During the breeding period in 2010 and 2011, we surveyed leks from fixed-wing aircraft using cryogenically cooled mid-wave infrared cameras and surveyed the same leks on the same day from the ground following a standard lek count protocol. We did not detect significant differences in lek counts between surveying techniques. These findings suggest that using a cryogenically cooled mid-wave infrared camera from an aerial platform to conduct lek surveys is an effective alternative to conventional ground-based methods, but further research is needed. We discuss multiple advantages of aerial infrared surveys, including counting in remote areas, representing greater spatial variation, and increasing the number of counted leks per season. Aerial infrared lek counts may be a valuable wildlife management tool that frees time and resources for other conservation efforts. Opportunities exist for wildlife professionals to refine and apply aerial infrared techniques to wildlife monitoring programs because of the increasing reliability and affordability of this technology.

  3. Parallel image logical operations using cross correlation

    NASA Technical Reports Server (NTRS)

    Strong, J. P., III

    1972-01-01

    Methods are presented for counting areas in an image in a parallel manner using noncoherent optical techniques. The techniques presented include the Levialdi algorithm for counting, optical techniques for binary operations, and cross-correlation.

  4. Effects of eliminating tension by means of epineural stitches: a comparative electrophysiological and histomorphometrical study using different suture techniques in an animal model.

    PubMed

    Bustamante, Jorge; Socolovsky, Mariano; Martins, Roberto S; Emmerich, Juan; Pennini, Maria Gabriela; Lausada, Natalia; Domitrovic, Luis

    2011-01-01

    Epineural stitches are a means to avoid tension in a nerve suture. We evaluated this technique, relative to interposed grafts and simple neurorrhaphy, in a rat model. Twenty rats were allocated to four groups. In Group 1, the sciatic nerve was sectioned, a 4 mm long segment was discarded, and an epineural suture with distal anchoring stitches was placed, resulting in a slight-tension neurorrhaphy. In Group 2, a simple neurorrhaphy was performed. In Group 3, a 4 mm long graft was employed, and Group 4 served as control. Ninety days later the animals were reoperated, motor action potential latencies were recorded, and axonal counts were performed. Inter-group comparison was done by means of ANOVA and the non-parametric Kruskal-Wallis test. The mean motor latency for the simple suture (2.27±0.77 ms) was lower than for the other two surgical groups, but higher than among controls (1.69±0.56 ms). Similar values were found in both Group 1 (2.66±0.71 ms) and Group 3 (2.64±0.6 ms). When fiber diameters were compared, a significant difference was identified between Groups 2 and 3 (p=0.048). Good results can be obtained when suturing a nerve employing epineural anchoring stitches. However, more studies are needed before extrapolating these results to human nerve sutures.

  5. Validation of FFM PD counts for screening personality pathology and psychopathy in adolescence.

    PubMed

    Decuyper, Mieke; De Clercq, Barbara; De Bolle, Marleen; De Fruyt, Filip

    2009-12-01

    Miller and colleagues (Miller, Bagby, Pilkonis, Reynolds, & Lynam, 2005) recently developed a Five-Factor Model (FFM) personality disorder (PD) count technique for describing and diagnosing PDs and psychopathy in adulthood. This technique conceptualizes PDs relying on general trait models and uses facets from the expert-generated PD prototypes to score the FFM PDs. The present study builds on the study of Miller and colleagues (2005) and investigates in Study 1 whether the PD count technique shows discriminant validity for describing PDs in adolescence. Study 2 extends this objective to psychopathy. Results suggest that the FFM PD count technique is equally successful in adolescence as in adulthood at describing PD symptoms, supporting the use of this descriptive method in adolescence. The normative data and accompanying PD count benchmarks enable the use of FFM scores for PD screening purposes in adolescence.

  6. The MIT/OSO 7 catalog of X-ray sources - Intensities, spectra, and long-term variability

    NASA Technical Reports Server (NTRS)

    Markert, T. H.; Laird, F. N.; Clark, G. W.; Hearn, D. R.; Sprott, G. F.; Li, F. K.; Bradt, H. V.; Lewin, W. H. G.; Schnopper, H. W.; Winkler, P. F.

    1979-01-01

    This paper is a summary of the observations of the cosmic X-ray sky performed by the MIT 1-40-keV X-ray detectors on OSO 7 between October 1971 and May 1973. Specifically, mean intensities or upper limits of all third Uhuru or OSO 7 cataloged sources (185 sources) in the 3-10-keV range are computed. For those sources for which a statistically significant (greater than 20) intensity was found in the 3-10-keV band (138 sources), further intensity determinations were made in the 1-15-keV, 1-6-keV, and 15-40-keV energy bands. Graphs and other simple techniques are provided to aid the user in converting the observed counting rates to convenient units and in determining spectral parameters. Long-term light curves (counting rates in one or more energy bands as a function of time) are plotted for 86 of the brighter sources.

  7. An Automated Statistical Technique for Counting Distinct Multiple Sclerosis Lesions.

    PubMed

    Dworkin, J D; Linn, K A; Oguz, I; Fleishman, G M; Bakshi, R; Nair, G; Calabresi, P A; Henry, R G; Oh, J; Papinutto, N; Pelletier, D; Rooney, W; Stern, W; Sicotte, N L; Reich, D S; Shinohara, R T

    2018-04-01

    Lesion load is a common biomarker in multiple sclerosis, yet it has historically shown modest association with clinical outcome. Lesion count, which encapsulates the natural history of lesion formation and is thought to provide complementary information, is difficult to assess in patients with confluent (ie, spatially overlapping) lesions. We introduce a statistical technique for cross-sectionally counting pathologically distinct lesions. MR imaging was used to assess the probability of a lesion at each location. The texture of this map was quantified using a novel technique, and clusters resembling the center of a lesion were counted. Validity compared with a criterion standard count was demonstrated in 60 subjects observed longitudinally, and reliability was determined using 14 scans of a clinically stable subject acquired at 7 sites. The proposed count and the criterion standard count were highly correlated (r = 0.97, P < .001) and not significantly different (t59 = -0.83, P = .41), and the variability of the proposed count across repeat scans was equivalent to that of lesion load. After accounting for lesion load and age, lesion count was negatively associated (t58 = -2.73, P < .01) with the Expanded Disability Status Scale. Average lesion size had a higher association with the Expanded Disability Status Scale (r = 0.35, P < .01) than lesion load (r = 0.10, P = .44) or lesion count (r = -0.12, P = .36) alone. This study introduces a novel technique for counting pathologically distinct lesions using cross-sectional data and demonstrates its ability to recover obscured longitudinal information. The proposed count allows more accurate estimation of lesion size, which correlated more closely with disability scores than either lesion load or lesion count alone. © 2018 by American Journal of Neuroradiology.

  8. Cell counting in whole mount tissue volumes using expansion OCT (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Liu, Yehe; Gu, Shi; Watanabe, Michiko; Rollins, Andrew M.; Jenkins, Michael W.

    2017-02-01

    Abnormal cell proliferation and migration during heart development can lead to severe congenital heart defects (CHDs). Studying the spatial distribution of cells during embryonic development helps our understanding of how the heart develops and the etiology of certain CHDs. However, imaging large groups of single cells in intact tissue volumes is challenging. No current technique can accomplish this task in both a time-efficient and cost-effective manner. OCT has potential with its large field of view and micron-scale resolution, but even the highest resolution OCT systems have poor contrast for counting cells and have a small field of view compared to conventional OCT. We propose using a conventional OCT system and processing the sample to enhance cellular contrast. Inspired by the recently developed Expansion Microscopy, we permeated whole-mount embryonic tissue with a superabsorbent monomer solution and polymerized it into a hydrogel. When hydrated in DI water, the tissue-hydrogel complex was uniformly enlarged (~5X in all dimensions) without distorting the microscopic structure. This had a twofold effect: it increased the resolution by a factor of 5 and decreased scattering, which allowed us to resolve cellular-level features deep in the tissue with high contrast using conventional OCT. We noted that cell nuclei caused significantly more backscattering than the other subcellular structures after expansion. Based on this property, we were able to distinguish individual cell nuclei, and thus count cells, in expanded OCT images with simple intensity thresholding. We demonstrate the technique with embryonic quail hearts at various developmental stages.
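
    The final counting step described here, intensity thresholding of the expanded-tissue images, can be sketched as thresholding followed by connected-component labeling, where each bright blob is taken as one nucleus. The toy image, threshold, and 4-connectivity choice below are illustrative assumptions, not the authors' processing pipeline.

```python
def count_nuclei(img, threshold):
    """Count connected bright blobs (4-connectivity) in a 2-D
    intensity image; each blob is treated as one cell nucleus."""
    h, w = len(img), len(img[0])
    seen = [[False] * w for _ in range(h)]
    blobs = 0
    for y in range(h):
        for x in range(w):
            if img[y][x] <= threshold or seen[y][x]:
                continue
            blobs += 1
            stack = [(y, x)]               # flood-fill this blob
            while stack:
                cy, cx = stack.pop()
                if not (0 <= cy < h and 0 <= cx < w):
                    continue
                if seen[cy][cx] or img[cy][cx] <= threshold:
                    continue
                seen[cy][cx] = True
                stack += [(cy + 1, cx), (cy - 1, cx), (cy, cx + 1), (cy, cx - 1)]
    return blobs

# tiny mock image with three bright nuclei (invented intensities)
image = [[0, 0, 9, 0, 0],
         [0, 0, 9, 0, 8],
         [0, 0, 0, 0, 8],
         [7, 7, 0, 0, 0]]
n_cells = count_nuclei(image, threshold=5)
```

    A real OCT volume would use a 3-D version of the same idea, with 6-connectivity and a threshold chosen from the backscattering histogram.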

  9. A powerful and flexible approach to the analysis of RNA sequence count data

    PubMed Central

    Zhou, Yi-Hui; Xia, Kai; Wright, Fred A.

    2011-01-01

    Motivation: A number of penalization and shrinkage approaches have been proposed for the analysis of microarray gene expression data. Similar techniques are now routinely applied to RNA sequence transcriptional count data, although the value of such shrinkage has not been conclusively established. If penalization is desired, the explicit modeling of mean–variance relationships provides a flexible testing regimen that ‘borrows’ information across genes, while easily incorporating design effects and additional covariates. Results: We describe BBSeq, which incorporates two approaches: (i) a simple beta-binomial generalized linear model, which has not been extensively tested for RNA-Seq data and (ii) an extension of an expression mean–variance modeling approach to RNA-Seq data, involving modeling of the overdispersion as a function of the mean. Our approaches are flexible, allowing for general handling of discrete experimental factors and continuous covariates. We report comparisons with other alternate methods to handle RNA-Seq data. Although penalized methods have advantages for very small sample sizes, the beta-binomial generalized linear model, combined with simple outlier detection and testing approaches, appears to have favorable characteristics in power and flexibility. Availability: An R package containing examples and sample datasets is available at http://www.bios.unc.edu/research/genomic_software/BBSeq Contact: yzhou@bios.unc.edu; fwright@bios.unc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:21810900
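
    The first of the two approaches, a beta-binomial generalized linear model for read counts, rests on a probability mass function that is easy to write down. The sketch below (using log-gamma for numerical stability) shows the distribution itself, not the BBSeq fitting machinery; the function names are mine.

```python
from math import lgamma, exp, comb

def log_beta(x, y):
    """log of the Beta function via log-gamma."""
    return lgamma(x) + lgamma(y) - lgamma(x + y)

def betabinom_pmf(k, n, a, b):
    """P(K = k) for K ~ BetaBinomial(n, a, b): a binomial whose
    success probability is itself Beta(a, b) distributed, giving the
    extra-binomial (overdispersed) variance used for count data."""
    return comb(n, k) * exp(log_beta(k + a, n - k + b) - log_beta(a, b))
```

    With a = b = 1 the mixing distribution is uniform and every count 0..n is equally likely, which is a quick sanity check on the formula.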

  10. Problem Solvers: Problem--Counting Is for the Birds and Solutions--Multiplication along the Silk Road

    ERIC Educational Resources Information Center

    Jones, Carolyn M.

    2010-01-01

    Connecting mathematical thinking to the natural world can be as simple as looking up to the sky. Volunteer bird watchers around the world help scientists gather data about bird populations. Counting flying birds might inspire new estimation methods, such as counting the number of birds per unit of time and then timing the whole flock's flight. In…

  11. Cryogenic, high-resolution x-ray detector with high count rate capability

    DOEpatents

    Frank, Matthias; Mears, Carl A.; Labov, Simon E.; Hiller, Larry J.; Barfknecht, Andrew T.

    2003-03-04

    A cryogenic, high-resolution X-ray detector with high count rate capability has been invented. The new X-ray detector is based on superconducting tunnel junctions (STJs), and operates without thermal stabilization at or below 500 mK. The X-ray detector exhibits good resolution (~5-20 eV FWHM) for soft X-rays in the keV region, and is capable of counting at count rates of more than 20,000 counts per second (cps). Simple, FET-based charge amplifiers, current amplifiers, or conventional spectroscopy shaping amplifiers can provide the electronic readout of this X-ray detector.

  12. Laser Transmitter Design and Performance for the Slope Imaging Multi-Polarization Photon-Counting Lidar (SIMPL) Instrument

    NASA Technical Reports Server (NTRS)

    Yu, Anthony W.; Harding, David J.; Dabney, Philip W.

    2016-01-01

    The Slope Imaging Multi-polarization Photon-counting Lidar (SIMPL) instrument is a polarimetric, two-color, multibeam push broom laser altimeter developed through the NASA Earth Science Technology Office Instrument Incubator Program and has been flown successfully on multiple airborne platforms since 2008. In this talk we will discuss the laser transmitter performance and present recent science data collected over the Greenland ice sheet and sea ice in support of the NASA Ice Cloud and land Elevation Satellite 2 (ICESat-2) mission to be launched in 2017.

  13. Improved confidence intervals when the sample is counted an integer times longer than the blank.

    PubMed

    Potter, William Edward; Strzelczyk, Jadwiga Jodi

    2011-05-01

    Past computer solutions for confidence intervals in paired counting are extended to the case where the ratio of the sample count time to the blank count time is taken to be an integer, IRR. Previously, confidence intervals have been named Neyman-Pearson confidence intervals; more correctly they should have been named Neyman confidence intervals or simply confidence intervals. The technique utilized mimics a technique used by Pearson and Hartley to tabulate confidence intervals for the expected value of the discrete Poisson and Binomial distributions. The blank count and the contribution of the sample to the gross count are assumed to be Poisson distributed. The expected value of the blank count, in the sample count time, is assumed known. The net count, OC, is taken to be the gross count minus the product of IRR with the blank count. The probability density function (PDF) for the net count can be determined in a straightforward manner.
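
    The construction described, a net count formed as the gross count minus IRR times the blank count, leads to a PDF obtained by summing over the unobserved blank count. A minimal sketch, assuming independent Poisson gross and blank counts with known means (the truncation limit bmax is my own device for the illustration):

```python
from math import exp

def poisson_pmf(k, mu):
    """Poisson pmf computed iteratively to avoid huge factorials."""
    p = exp(-mu)
    for i in range(1, k + 1):
        p *= mu / i
    return p

def net_count_pmf(oc, mu_gross, mu_blank, irr, bmax=200):
    """P(gross - irr*blank == oc) for independent Poisson gross and
    blank counts, marginalizing over the unobserved blank count b."""
    total = 0.0
    for b in range(bmax + 1):
        g = oc + irr * b          # gross count consistent with (oc, b)
        if g >= 0:
            total += poisson_pmf(b, mu_blank) * poisson_pmf(g, mu_gross)
    return total
```

    The net count can be negative (a blank fluctuation exceeding the gross count), which is why confidence intervals built from this PDF differ from naive square-root-of-counts intervals at low activity.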

  14. Operator Priming and Generalization of Practice in Adults' Simple Arithmetic

    ERIC Educational Resources Information Center

    Chen, Yalin; Campbell, Jamie I. D.

    2016-01-01

    There is a renewed debate about whether educated adults solve simple addition problems (e.g., 2 + 3) by direct fact retrieval or by fast, automatic counting-based procedures. Recent research testing adults' simple addition and multiplication showed that a 150-ms preview of the operator (+ or ×) facilitated addition, but not multiplication,…

  15. No Generalization of Practice for Nonzero Simple Addition

    ERIC Educational Resources Information Center

    Campbell, Jamie I. D.; Beech, Leah C.

    2014-01-01

    Several types of converging evidence have suggested recently that skilled adults solve very simple addition problems (e.g., 2 + 1, 4 + 2) using a fast, unconscious counting algorithm. These results stand in opposition to the long-held assumption in the cognitive arithmetic literature that such simple addition problems normally are solved by fact…

  16. Performance of a Discrete Wavelet Transform for Compressing Plasma Count Data and its Application to the Fast Plasma Investigation on NASA's Magnetospheric Multiscale Mission

    NASA Technical Reports Server (NTRS)

    Barrie, Alexander C.; Yeh, Penshu; Dorelli, John C.; Clark, George B.; Paterson, William R.; Adrian, Mark L.; Holland, Matthew P.; Lobell, James V.; Simpson, David G.; Pollock, Craig J.

    2015-01-01

    Plasma measurements in space are becoming increasingly faster, higher resolution, and distributed over multiple instruments. As raw data generation rates can exceed the available data transfer bandwidth, data compression is becoming a critical design component. Data compression has been a staple of imaging instruments for years, but only recently have plasma measurement designers become interested in high-performance data compression. Missions will often use a simple lossless compression technique yielding compression ratios of approximately 2:1; however, future missions may require compression ratios upwards of 10:1. This study aims to explore how a Discrete Wavelet Transform combined with a Bit Plane Encoder (DWT/BPE), implemented via a CCSDS standard, can be used effectively to compress the count information common to plasma measurements to high compression ratios while maintaining little or no compression error. The compression ASIC used for the Fast Plasma Investigation (FPI) on board the Magnetospheric Multiscale mission (MMS) is used for this study. Plasma count data from multiple sources is examined: resampled data from previous missions, randomly generated data from distribution functions, and simulations of expected regimes. These are run through the compression routines with various parameters to yield the greatest possible compression ratio while maintaining little or no error; the latter case indicates that fully lossless compression is obtained. Finally, recommendations are made for future missions as to what can be achieved when compressing plasma count data and how best to do so.
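
    The reversible integer wavelet step at the heart of such a scheme can be sketched with a one-level Haar (S) transform; losslessness follows because the integer transform is exactly invertible. This is a simplified stand-in for the CCSDS DWT/BPE pipeline, not the FPI ASIC's actual transform, and the count data below are invented.

```python
def haar_forward(x):
    """One level of the reversible integer Haar (S) transform:
    pairwise differences d and floor-averaged coefficients a.
    len(x) must be even."""
    d = [x[i] - x[i + 1] for i in range(0, len(x), 2)]
    a = [x[i + 1] + (x[i] - x[i + 1]) // 2 for i in range(0, len(x), 2)]
    return a, d

def haar_inverse(a, d):
    """Exact inverse: recover the original integer samples."""
    out = []
    for ai, di in zip(a, d):
        x1 = ai - di // 2
        out += [x1 + di, x1]
    return out

spectrum = [10, 12, 11, 9, 40, 38, 2, 0]   # invented count data
a, d = haar_forward(spectrum)
assert haar_inverse(a, d) == spectrum       # exactly reversible
```

    For smooth count spectra the detail coefficients d are small, so a bit-plane encoder can represent them with few planes; truncating planes is what yields the higher, lossy compression ratios the study targets.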

  17. Standard Transistor Array (STAR). Volume 1: Placement technique

    NASA Technical Reports Server (NTRS)

    Cox, G. W.; Caroll, B. D.

    1979-01-01

    A large scale integration (LSI) technology, the standard transistor array uses a prefabricated understructure of transistors and a comprehensive library of digital logic cells to allow efficient fabrication of semicustom digital LSI circuits. The cell placement technique for this technology involves formation of a one-dimensional cell layout and "folding" of the one-dimensional placement onto the chip. It was found that, by use of various folding methods, high quality chip layouts can be achieved. Methods developed to measure the "goodness" of the generated placements include efficient means for estimating channel usage requirements and for via counting. The placement and rating techniques were incorporated into a placement program (CAPSTAR). By means of repetitive use of the folding methods and simple placement improvement strategies, this program provides near-optimum placements in a reasonable amount of time. The program was tested on several typical LSI circuits to provide performance comparisons both with respect to input parameters and with respect to the performance of other placement techniques. The results of this testing indicate that near-optimum placements can be achieved by use of the procedures without incurring severe time penalties.
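
    The folding step can be pictured with a toy serpentine fold, one plausible way to map a one-dimensional placement onto chip rows so that cells adjacent in the ordering remain adjacent on the chip. The actual STAR folding methods are not specified here; this is an invented illustration only.

```python
def fold_serpentine(cells, width):
    """Fold a 1-D cell placement onto chip rows of the given width,
    reversing every other row so that neighbors in the 1-D ordering
    stay physically adjacent across row boundaries."""
    rows = [cells[i:i + width] for i in range(0, len(cells), width)]
    return [row[::-1] if r % 2 else row for r, row in enumerate(rows)]

layout = fold_serpentine(list(range(6)), width=3)
```

    In this fold, cell 2 ends a row and cell 3 sits directly below it, so a net between them stays short; rating functions like channel-usage estimates then compare alternative folds.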

  18. Estimating animal populations and body sizes from burrows: Marine ecologists have their heads buried in the sand

    NASA Astrophysics Data System (ADS)

    Schlacher, Thomas A.; Lucrezi, Serena; Peterson, Charles H.; Connolly, Rod M.; Olds, Andrew D.; Althaus, Franziska; Hyndes, Glenn A.; Maslo, Brooke; Gilby, Ben L.; Leon, Javier X.; Weston, Michael A.; Lastra, Mariano; Williams, Alan; Schoeman, David S.

    2016-06-01

    Most ecological studies require knowledge of animal abundance, but it can be challenging and destructive of habitat to obtain accurate density estimates for cryptic species, such as crustaceans that tunnel deeply into the seafloor, beaches, or mudflats. Such fossorial species are, however, widely used in environmental impact assessments, requiring sampling techniques that are reliable, efficient, and environmentally benign for these species and environments. Counting and measuring the entrances of burrows made by cryptic species is commonly employed to index population and body sizes of individuals. The fundamental premise is that burrow metrics consistently predict density and size. Here we review the evidence for this premise. We also review criteria for selecting among sampling methods: burrow counts, visual censuses, and physical collections. A simple 1:1 correspondence between the number of holes and population size cannot be assumed. Occupancy rates, indexed by the slope of regression models, vary widely between species and among sites for the same species. Thus, 'average' or 'typical' occupancy rates should not be extrapolated from site- or species-specific field validations and then be used as conversion factors in other situations. Predictions of organism density made from burrow counts often have large uncertainty, spanning from half to double the predicted mean value. Whether such prediction uncertainty is 'acceptable' depends on investigators' judgements regarding the desired detectable effect sizes. Regression models predicting body size from burrow entrance dimensions are more precise, but parameter estimates of most models are specific to species and subject to site-to-site variation within species. These results emphasise the need to undertake thorough field validations of indirect census techniques that include tests of how sensitive predictive models are to changes in habitat conditions or human impacts. In addition, new technologies (e.g. drones, thermal, acoustic or chemical sensors) should be used to enhance visual census techniques of burrows and surface-active animals.
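
    The regression models discussed, predicting occupant body size from burrow entrance dimensions, amount to site-specific least-squares calibrations. A minimal sketch with invented calibration data; as the review stresses, such fits should not be transplanted between sites or species.

```python
def fit_line(x, y):
    """Ordinary least-squares fit y ~ intercept + slope * x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return my - slope * mx, slope

# hypothetical site calibration: burrow entrance width (mm) vs
# carapace width (mm) of excavated occupants
entrance = [8.0, 10.0, 12.0, 15.0, 20.0]
body     = [6.1, 7.9, 9.8, 12.2, 16.0]
intercept, slope = fit_line(entrance, body)
```

    A new burrow's entrance width then yields a predicted body size via intercept + slope * width; quoting the prediction interval alongside the point estimate is what the review argues is usually missing.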

  19. The Slope Imaging Multi-Polarization Photon-Counting Lidar: Development and Performance Results

    NASA Technical Reports Server (NTRS)

    Dabney, Phillip

    2010-01-01

    The Slope Imaging Multi-polarization Photon-counting Lidar is an airborne instrument developed to demonstrate laser altimetry measurement methods that will enable more efficient observations of topography and surface properties from space. The instrument was developed through the NASA Earth Science Technology Office Instrument Incubator Program with a focus on cryosphere remote sensing. The SIMPL transmitter is an 11 kHz, 1064 nm, plane-polarized micropulse laser transmitter that is frequency doubled to 532 nm and split into four push-broom beams. The receiver employs single-photon, polarimetric ranging at 532 and 1064 nm using Single Photon Counting Modules in order to achieve simultaneous sampling of surface elevation, slope, roughness and depolarizing scattering properties, the latter used to differentiate surface types. Data acquired over ice-covered Lake Erie in February 2009 document SIMPL's measurement performance and capabilities, demonstrating differentiation of open water and several ice cover types. ICESat-2 will employ several of the technologies advanced by SIMPL, including micropulse, single photon ranging in a multi-beam, push-broom configuration operating at 532 nm.

  20. Setting a limit on anthropogenic sources of atmospheric 81Kr through Atom Trap Trace Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zappala, J. C.; Bailey, K.; Jiang, W.

    In this study, we place a 2.5% limit on the anthropogenic contribution to the modern abundance of 81Kr/Kr in the atmosphere at the 90% confidence level. Due to its simple production and transport in the terrestrial environment, 81Kr (half-life = 230,000 years) is an ideal tracer for old water and ice with mean residence times in the range of 10^5-10^6 years. In recent years, 81Kr-dating has been made available to the earth science community thanks to the development of Atom Trap Trace Analysis (ATTA), a laser-based atom counting technique. Further upgrades and improvements to the ATTA technique now allow us to demonstrate 81Kr/Kr measurements with relative uncertainties of 1% and place this new limit on anthropogenic 81Kr. As a result of this limit, we have removed a potential systematic constraint for 81Kr-dating.

  1. Setting a limit on anthropogenic sources of atmospheric 81Kr through Atom Trap Trace Analysis

    DOE PAGES

    Zappala, J. C.; Bailey, K.; Jiang, W.; ...

    2017-02-09

    In this study, we place a 2.5% limit on the anthropogenic contribution to the modern abundance of 81Kr/Kr in the atmosphere at the 90% confidence level. Due to its simple production and transport in the terrestrial environment, 81Kr (half-life = 230,000 years) is an ideal tracer for old water and ice with mean residence times in the range of 10^5–10^6 years. In recent years, 81Kr-dating has been made available to the earth science community thanks to the development of Atom Trap Trace Analysis (ATTA), a laser-based atom counting technique. Further upgrades and improvements to the ATTA technique now allow us to demonstrate 81Kr/Kr measurements with relative uncertainties of 1% and place this new limit on anthropogenic 81Kr. As a result of this limit, we have removed a potential systematic constraint for 81Kr-dating.

  2. Analysis on laser plasma emission for characterization of colloids by video-based computer program

    NASA Astrophysics Data System (ADS)

    Putri, Kirana Yuniati; Lumbantoruan, Hendra Damos; Isnaeni

    2016-02-01

    Laser-induced breakdown detection (LIBD) is a sensitive technique for the characterization of colloids of small size and low concentration. There are two types of detection, optical and acoustic. Optical LIBD employs a CCD camera to capture the plasma emission and uses the information to quantify the colloids. This technique requires sophisticated technology which is often pricey. In order to build a simple, home-made LIBD system, a dedicated computer program based on MATLAB™ for analyzing laser plasma emission was developed. The analysis was conducted by counting the number of plasma emissions (breakdowns) during a certain period of time. The breakdown probability provided information on colloid size and concentration. A validation experiment showed that the computer program performed well in analyzing the plasma emissions. A graphical user interface (GUI) was also developed to make the program more user-friendly.
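
    The counting step at the heart of this analysis can be sketched in a few lines. This is a hypothetical illustration, not the authors' MATLAB program (which works on video frames rather than precomputed peak values); the function name and threshold are invented:

```python
# Hypothetical sketch: estimate breakdown probability from per-frame
# peak brightness values, as an optical LIBD analysis might do.
# Frames whose peak intensity exceeds a threshold count as breakdowns.

def breakdown_probability(frame_peaks, threshold):
    """Fraction of frames (one laser shot each) showing a plasma flash."""
    if not frame_peaks:
        raise ValueError("no frames supplied")
    breakdowns = sum(1 for peak in frame_peaks if peak > threshold)
    return breakdowns / len(frame_peaks)

# Example: 8 shots, 3 flashes above an (arbitrary) threshold of 100.
peaks = [12, 140, 8, 95, 210, 30, 101, 60]
print(breakdown_probability(peaks, 100))  # → 0.375
```

    The resulting probability is what the abstract says carries the colloid size and concentration information.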

  3. S-SPatt: simple statistics for patterns on Markov chains.

    PubMed

    Nuel, Grégory

    2005-07-01

    S-SPatt allows the counting of pattern occurrences in text files and, assuming these texts are generated by a random Markovian source, the computation of the P-value of a given observation using a simple binomial approximation.
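
    As a rough illustration of a binomial-tail P-value (not S-SPatt's actual implementation, which models the Markovian source properly), suppose each of n candidate positions carries the pattern independently with probability p; the over-representation P-value of an observed count is then the upper binomial tail:

```python
from math import comb

def binom_pvalue_upper(n, p, k_obs):
    """P(X >= k_obs) for X ~ Binomial(n, p): over-representation P-value."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(k_obs, n + 1))

# Toy numbers: 20 occurrences observed where about 10 were expected
# (1000 candidate positions, per-position probability 0.01).
print(binom_pvalue_upper(1000, 0.01, 20))
```

    A small value flags the pattern as over-represented relative to the background model.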

  4. Considerations for monitoring raptor population trends based on counts of migrants

    USGS Publications Warehouse

    Titus, K.; Fuller, M.R.; Ruos, J.L.; Meyburg, B-U.; Chancellor, R.D.

    1989-01-01

    Various problems were identified with standardized hawk count data as annually collected at six sites. Some of the hawk lookouts increased their hours of observation from 1979-1985, thereby confounding the total counts. Problems with data recording and missing data hamper coding of the data and their use with modern analytical techniques. Coefficients of variation among years in counts averaged about 40%. The advantages and disadvantages of various analytical techniques are discussed, including regression, non-parametric rank correlation trend analysis, and moving averages.

  5. A SIMPLE RADIO-CHROMATOGRAM SCANNER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McWeeny, D.J.; Burton, H.S.

    1962-07-01

    A sturdy, simple, and reliable radiochromatogram scanner is described. It is constructed from a Panax Universal Castle, a Panax 5054 rate meter, and a recording milliammeter. The castle houses two thin end-window G-M tubes, type GE-EHM-2, mounted one above the other with windows 1/4 in. apart. The 1-in. chromatogram passes continuously through a selection of slits permitting a choice of views by the G-M tubes. The background count is 10.5 counts per minute, and the detection limit for (35)S as a 3 mm spot on Whatman no. 1 paper is less than 0.2 nc. (T.R.H.)

  6. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE PAGES

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...

    2015-11-01

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time-tagged data for neutron and gamma-ray counting, and from these data the counting distributions.
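
    The "remarkably simple Monte Carlo realization" can be illustrated with a toy branching-process sketch. This is not the authors' code: the fission probability and multiplicity distribution below are invented, and timing and gamma rays are omitted; only the leaked-neutron counting distribution of a subcritical chain is simulated:

```python
import random

def simulate_chain(p_fission, nu_dist, rng):
    """Leaked-neutron count for one fission chain started by one neutron.

    Each neutron either induces a fission (probability p_fission),
    releasing a random number of new neutrons, or leaks and is counted.
    """
    leaked, population = 0, 1
    while population:
        population -= 1
        if rng.random() < p_fission:
            population += rng.choice(nu_dist)   # neutrons from this fission
        else:
            leaked += 1
    return leaked

rng = random.Random(1)
nu_dist = [2, 2, 3]          # toy multiplicity distribution (mean 7/3)
# Subcritical: p_fission * mean(nu) = 0.2 * 7/3 < 1, so chains terminate.
counts = [simulate_chain(0.2, nu_dist, rng) for _ in range(10000)]
mean_leaked = sum(counts) / len(counts)
print(round(mean_leaked, 2))   # ≈ 1.5 leaked neutrons per chain
```

    Histogramming `counts` gives a toy counting distribution of the kind the theory's differential equations describe analytically.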

  7. Effect of electron count and chemical complexity in the Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor.

    PubMed

    von Rohr, Fabian; Winiarski, Michał J; Tao, Jing; Klimczuk, Tomasz; Cava, Robert Joseph

    2016-11-15

    High-entropy alloys are made from random mixtures of principal elements on simple lattices, stabilized by a high mixing entropy. The recently discovered body-centered cubic (BCC) Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor appears to display properties of both simple crystalline intermetallics and amorphous materials; e.g., it has a well-defined superconducting transition along with an exceptional robustness against disorder. Here we show that the valence electron count dependence of the superconducting transition temperature in the high-entropy alloy falls between those of analogous simple solid solutions and amorphous materials and test the effect of alloy complexity on the superconductivity. We propose high-entropy alloys as excellent intermediate systems for studying superconductivity as it evolves between crystalline and amorphous materials.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for probabilities of time dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time-tagged data for neutron and gamma-ray counting, and from these data the counting distributions.

  9. A three-wavelength multi-channel brain functional imager based on digital lock-in photon-counting technique

    NASA Astrophysics Data System (ADS)

    Ding, Xuemei; Wang, Bingyuan; Liu, Dongyuan; Zhang, Yao; He, Jie; Zhao, Huijuan; Gao, Feng

    2018-02-01

    During the past two decades there has been a dramatic rise in the use of functional near-infrared spectroscopy (fNIRS) as a neuroimaging technique in cognitive neuroscience research. Diffuse optical tomography (DOT) and optical topography (OT) can be employed as the optical imaging techniques for brain activity investigation. However, most current imagers with analogue detection are limited by sensitivity and dynamic range. Although photon-counting detection can significantly improve detection sensitivity, the intrinsic nature of sequential excitations reduces temporal resolution. To improve temporal resolution, sensitivity and dynamic range, we develop a multi-channel continuous-wave (CW) system for brain functional imaging based on a novel lock-in photon-counting technique. The system consists of 60 light-emitting diode (LED) sources at three wavelengths (660 nm, 780 nm and 830 nm), which are modulated by current-stabilized square-wave signals at different frequencies, and 12 photomultiplier tubes (PMTs) employing the lock-in photon-counting technique. This design combines the ultra-high sensitivity of the photon-counting technique with the parallelism of the digital lock-in technique. We can therefore acquire the diffused light intensity for all the source-detector pairs (SD-pairs) in parallel. Performance assessments of the system, conducted using phantom experiments, demonstrate its excellent measurement linearity, negligible inter-channel crosstalk, strong noise robustness and high temporal resolution.
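
    The digital lock-in idea, square-wave-modulated sources at distinct frequencies separated by correlating the detector's count stream with a ±1 reference per source, can be sketched as below. This is a deterministic toy with invented frequencies, amplitudes and bin rates (real data would add Poisson photon noise), not the instrument's actual processing chain:

```python
# Toy digital lock-in: two square-wave-modulated channels summed on one
# photon-counting detector, then separated by correlating with zero-mean
# ±1 references. All numbers are illustrative only.

SAMPLE_RATE = 1000          # count-bin rate, Hz
N = 1000                    # 1 s of data: integer periods of both references

def square(freq, n):
    """0/1 on-off modulation waveform for bin n."""
    return 1.0 if (freq * n / SAMPLE_RATE) % 1.0 < 0.5 else 0.0

def reference(freq, n):
    """Zero-mean ±1 lock-in reference for bin n."""
    return 1.0 if (freq * n / SAMPLE_RATE) % 1.0 < 0.5 else -1.0

def demodulate(counts, freq):
    return sum(reference(freq, n) * c for n, c in enumerate(counts)) / len(counts)

f1, f2 = 10.0, 25.0         # modulation frequencies (no shared odd harmonics)
counts = [8 * square(f1, n) + 4 * square(f2, n) for n in range(N)]

print(demodulate(counts, f1) * 2)   # ≈ 8.0: channel-1 amplitude recovered
print(demodulate(counts, f2) * 2)   # ≈ 4.0: channel-2 amplitude recovered
```

    Because each reference is zero-mean and uncorrelated with the other channel's modulation over an integer number of periods, each demodulation isolates one source, which is what lets all source-detector pairs run in parallel.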

  10. An Intercomparison Between Radar Reflectivity and the IR Cloud Classification Technique for the TOGA-COARE Area

    NASA Technical Reports Server (NTRS)

    Carvalho, L. M. V.; Rickenbach, T.

    1999-01-01

    Satellite infrared (IR) and visible (VIS) images from the Tropical Ocean Global Atmosphere - Coupled Ocean Atmosphere Response Experiment (TOGA-COARE) are investigated through the use of clustering analysis. The clusters are obtained from the values of IR and VIS counts and the local variance for both channels. The clustering procedure is based on the standardized histogram of each variable obtained from 179 pairs of images. A new approach to classify high clouds using only IR and the clustering technique is proposed. This method allows the separation of the enhanced convection into two main classes: convective tops, more closely related to the most active core of the storm, and convective systems, which produce regions of merged, thick anvil clouds. The resulting classification of different portions of cloudiness is compared to the radar reflectivity field for intense events. Convective Systems and Convective Tops are followed during their life cycle using the IR clustering method. The areal coverage of precipitation and features related to convective and stratiform rain is obtained from the radar for each stage of the evolving Mesoscale Convective Systems (MCS). In order to compare the IR clustering method with a simple threshold technique, two IR thresholds (Tir) were used to identify different portions of cloudiness: Tir = 240 K, which roughly defines the extent of all cloudiness associated with the MCS, and Tir = 220 K, which indicates the presence of deep convection. It is shown that the IR clustering technique can be used as a simple alternative to identify the actual portion of convective and stratiform rainfall.

  11. A COMPARISON OF GALAXY COUNTING TECHNIQUES IN SPECTROSCOPICALLY UNDERSAMPLED REGIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Specian, Mike A.; Szalay, Alex S., E-mail: mspecia1@jhu.edu, E-mail: szalay@jhu.edu

    2016-11-01

    Accurate measures of galactic overdensities are invaluable for precision cosmology. Obtaining these measurements is complicated when members of one's galaxy sample lack radial depths, most commonly derived via spectroscopic redshifts. In this paper, we utilize the Sloan Digital Sky Survey's Main Galaxy Sample to compare seven methods of counting galaxies in cells when many of those galaxies lack redshifts. These methods fall into three categories: assigning galaxies discrete redshifts, scaling the numbers counted using regions' spectroscopic completeness properties, and employing probabilistic techniques. We split spectroscopically undersampled regions into three types: those inside the spectroscopic footprint, those outside but adjacent to it, and those distant from it. Through Monte Carlo simulations, we demonstrate that the preferred counting techniques are a function of region type, cell size, and redshift. We conclude by reporting optimal counting strategies under a variety of conditions.
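
    Of the three categories of methods, the completeness-scaling one is the simplest to sketch. The function below is a hypothetical illustration of that idea, not one of the paper's seven estimators:

```python
# Completeness scaling, in its crudest form: scale the number of galaxies
# with redshifts in a cell by the local spectroscopic completeness
# (the fraction of photometric targets that received a redshift).

def scaled_count(n_with_z, n_targets_total, n_targets_with_z):
    """Estimate the true cell count from a spectroscopically incomplete sample."""
    if n_targets_with_z == 0:
        raise ValueError("cell has no spectroscopic coverage")
    completeness = n_targets_with_z / n_targets_total
    return n_with_z / completeness

# 42 galaxies with redshifts in a cell whose region is 70% complete
# (700 of 1000 photometric targets have spectra):
print(round(scaled_count(42, 1000, 700), 3))  # → 60.0
```

    The paper's point is that which correction of this general kind works best depends on whether the cell is inside, adjacent to, or far from the spectroscopic footprint.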

  12. "Compacted" procedures for adults' simple addition: A review and critique of the evidence.

    PubMed

    Chen, Yalin; Campbell, Jamie I D

    2018-04-01

    We review recent empirical findings and arguments proffered as evidence that educated adults solve elementary addition problems (3 + 2, 4 + 1) using so-called compacted procedures (e.g., unconscious, automatic counting); a conclusion that could have significant pedagogical implications. We begin with the large-sample experiment reported by Uittenhove, Thevenot and Barrouillet (2016, Cognition, 146, 289-303), which tested 90 adults on the 81 single-digit addition problems from 1 + 1 to 9 + 9. They identified the 12 very-small addition problems with different operands both ≤ 4 (e.g., 4 + 3) as a distinct subgroup of problems solved by unconscious, automatic counting: these items yielded a near-perfectly linear increase in answer response time (RT) yoked to the sum of the operands. Using the data reported in the article, however, we show that there are clear violations of the sum-counting model's predictions among the very-small addition problems, and that there is no real RT boundary associated with addends ≤ 4. Furthermore, we show that a well-known associative retrieval model of addition facts, the network interference theory (Campbell, 1995), predicts the results observed for these problems with high precision. We also review the other types of evidence adduced for the compacted procedure theory of simple addition and conclude that these findings are unconvincing in their own right and only distantly consistent with automatic counting. We conclude that the cumulative evidence for fast compacted procedures for adults' simple addition does not justify revision of the long-standing assumption that direct memory retrieval is ultimately the most efficient process of simple addition for nonzero problems, let alone sufficient to recommend significant changes to basic addition pedagogy.

  13. Document retrieval on repetitive string collections.

    PubMed

    Gagie, Travis; Hartikainen, Aleksi; Karhu, Kalle; Kärkkäinen, Juha; Navarro, Gonzalo; Puglisi, Simon J; Sirén, Jouni

    2017-01-01

    Most of the fastest-growing string collections today are repetitive, that is, most of the constituent documents are similar to many others. As these collections keep growing, a key approach to handling them is to exploit their repetitiveness, which can reduce their space usage by orders of magnitude. We study the problem of indexing repetitive string collections in order to perform efficient document retrieval operations on them. Document retrieval problems are routinely solved by search engines on large natural language collections, but the techniques are less developed on generic string collections. The case of repetitive string collections is even less understood, and there are very few existing solutions. We develop two novel ideas, interleaved LCPs and precomputed document lists, that yield highly compressed indexes solving the problem of document listing (find all the documents where a string appears), top-k document retrieval (find the k documents where a string appears most often), and document counting (count the number of documents where a string appears). We also show that a classical data structure supporting the latter query becomes highly compressible on repetitive data. Finally, we show how the tools we developed can be combined to solve ranked conjunctive and disjunctive multi-term queries under the simple [Formula: see text] model of relevance. We thoroughly evaluate the resulting techniques in various real-life repetitiveness scenarios, and recommend the best choices for each case.
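
    For orientation, the three queries being optimized can be stated naively, as below. This uncompressed toy is deliberately the opposite of the paper's space-efficient indexes; it only pins down what document listing, top-k retrieval, and document counting mean:

```python
from collections import Counter

def doc_list(docs, pattern):
    """Document listing: all documents containing the pattern."""
    return [i for i, d in enumerate(docs) if pattern in d]

def doc_count(docs, pattern):
    """Document counting: how many documents contain the pattern."""
    return len(doc_list(docs, pattern))

def top_k(docs, pattern, k):
    """Top-k retrieval: the k documents with the most occurrences."""
    freqs = Counter({i: d.count(pattern) for i, d in enumerate(docs) if pattern in d})
    return [i for i, _ in freqs.most_common(k)]

docs = ["abracadabra", "banana", "cabana", "abba"]
print(doc_list(docs, "ab"))    # → [0, 2, 3]
print(doc_count(docs, "ab"))   # → 3
print(top_k(docs, "ab", 1))    # → [0] ("abracadabra" has two occurrences)
```

    The compressed structures in the paper answer the same three queries without storing the documents in this redundant form.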

  14. Negative Avalanche Feedback Detectors for Photon-Counting Optical Communications

    NASA Technical Reports Server (NTRS)

    Farr, William H.

    2009-01-01

    Negative Avalanche Feedback photon counting detectors with near-infrared spectral sensitivity offer an alternative to conventional Geiger-mode avalanche photodiode or phototube detectors for free space communications links at 1 and 1.55 microns. These devices demonstrate linear-mode photon counting without requiring any external reset circuitry and may even be operated at room temperature. We have now characterized the detection efficiency, dark count rate, after-pulsing, and single photon jitter for three variants of this new detector class, as well as operated these uniquely simple-to-use devices in actual photon-starved free space optical communications links.

  15. Improving inferences in population studies of rare species that are detected imperfectly

    USGS Publications Warehouse

    MacKenzie, D.I.; Nichols, J.D.; Sutton, N.; Kawanishi, K.; Bailey, L.L.

    2005-01-01

    For the vast majority of cases, it is highly unlikely that all the individuals of a population will be encountered during a study. Furthermore, it is unlikely that a constant fraction of the population is encountered over times, locations, or species to be compared. Hence, simple counts usually will not be good indices of population size. We recommend that detection probabilities (the probability of including an individual in a count) be estimated and incorporated into inference procedures. However, most techniques for estimating detection probability require moderate sample sizes, which may not be achievable when studying rare species. In order to improve the reliability of inferences from studies of rare species, we suggest two general approaches that researchers may wish to consider that incorporate the concept of imperfect detectability: (1) borrowing information about detectability or the other quantities of interest from other times, places, or species; and (2) using state variables other than abundance (e.g., species richness and occupancy). We illustrate these suggestions with examples and discuss the relative benefits and drawbacks of each approach.
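
    The core adjustment the authors advocate, dividing the raw count by an estimated detection probability rather than treating the count itself as an index of abundance, can be written in one line. The numbers below are illustrative, not from the paper:

```python
# Canonical count adjustment: N-hat = C / p-hat, where C is the number of
# individuals counted and p-hat the estimated detection probability.

def estimated_abundance(count, p_detect):
    """Estimate population size from a count with imperfect detection."""
    if not 0 < p_detect <= 1:
        raise ValueError("detection probability must be in (0, 1]")
    return count / p_detect

# 30 individuals counted where each is detected with probability 0.4:
print(estimated_abundance(30, 0.4))  # → 75.0
```

    The abstract's caveat is precisely that estimating `p_detect` reliably usually needs moderate sample sizes, which rare species rarely provide, hence the suggestions to borrow information across times, places, or species, or to switch to state variables like occupancy.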

  16. Effects of student pairing and public review on physical activity during school recess.

    PubMed

    Zerger, Heather M; Miller, Bryon G; Valbuena, Diego; Miltenberger, Raymond G

    2017-07-01

    The purpose of this study was to evaluate the effects of student pairing and feedback during recess on children's step counts. During baseline, participants wore a sealed pedometer during recess. During intervention, we paired participants with higher step counts with participants with lower step counts. We encouraged teams to compete for the highest step count each day and provided feedback on their performance during each recess session. Results showed a large mean increase in step count from baseline to intervention. These results suggest that children's steps during recess can be increased with a simple and cost-effective intervention. © 2017 Society for the Experimental Analysis of Behavior.

  17. [Acquisition of arithmetic knowledge].

    PubMed

    Fayol, Michel

    2008-01-01

    The focus of this paper is on contemporary research on the counting and arithmetic competencies that emerge during infancy, the preschool years, and the elementary school years. I provide a brief overview of the evolution of children's conceptual knowledge of arithmetic, the acquisition and use of counting, and how they solve simple arithmetic problems (e.g. 4 + 3).

  18. Triple-Label β Liquid Scintillation Counting

    PubMed Central

    Bukowski, Thomas R.; Moffett, Tyler C.; Revkin, James H.; Ploger, James D.; Bassingthwaighte, James B.

    2010-01-01

    The detection of radioactive compounds by liquid scintillation has revolutionized modern biology, yet few investigators make full use of the power of this technique. Even though multiple isotope counting is considerably more difficult than single isotope counting, many experimental designs would benefit from using more than one isotope. The development of accurate isotope counting techniques enabling the simultaneous use of three β-emitting tracers has facilitated studies in our laboratory using the multiple tracer indicator dilution technique for assessing rates of transmembrane transport and cellular metabolism. The details of sample preparation, and of stabilizing the liquid scintillation spectra of the tracers, are critical to obtaining good accuracy. Reproducibility is enhanced by obtaining detailed efficiency/quench curves for each particular set of tracers and solvent media. The numerical methods for multiple-isotope quantitation depend on avoiding error propagation (inherent to successive subtraction techniques) by using matrix inversion. Experimental data obtained from triple-label β counting illustrate reproducibility and good accuracy even when the relative amounts of different tracers in samples of protein/electrolyte solutions, plasma, and blood are changed. PMID:1514684
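
    The matrix-inversion step that avoids the error propagation of successive subtraction can be illustrated with a toy three-window system. The efficiency values and window counts below are invented for illustration, not the paper's calibration data:

```python
# Toy version of the matrix step in triple-label counting: counts observed
# in three energy windows are modeled as (efficiency matrix) x (activities),
# and all three activities are recovered in a single solve instead of by
# successive subtraction. Solved here by Cramer's rule, fine at 3x3 size.

def det3(m):
    """Determinant of a 3x3 matrix (expansion along the first row)."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def solve3(E, b):
    """Solve E x = b for three unknowns by Cramer's rule."""
    d = det3(E)
    out = []
    for j in range(3):
        m = [[b[i] if k == j else E[i][k] for k in range(3)] for i in range(3)]
        out.append(det3(m) / d)
    return out

# E[i][j]: counting efficiency of tracer j in energy window i (invented)
E = [[0.60, 0.10, 0.02],
     [0.05, 0.50, 0.15],
     [0.01, 0.08, 0.45]]
window_counts = [1340.0, 900.0, 1000.0]   # cpm observed in the three windows

activities = solve3(E, window_counts)
print([round(a) for a in activities])     # → [2000, 1000, 2000] (dpm)
```

    In practice the efficiency matrix comes from the quench curves the abstract describes, and its conditioning determines how well the three tracers can be separated.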

  19. Effect of electron count and chemical complexity in the Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    von Rohr, Fabian; Winiarski, Michał J.; Tao, Jing

    High-entropy alloys are made from random mixtures of principal elements on simple lattices, stabilized by a high mixing entropy. The recently discovered body-centered cubic (BCC) Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor appears to display properties of both simple crystalline intermetallics and amorphous materials; e.g., it has a well-defined superconducting transition along with an exceptional robustness against disorder. Here we show that the valence electron count dependence of the superconducting transition temperature in the high-entropy alloy falls between those of analogous simple solid solutions and amorphous materials and test the effect of alloy complexity on the superconductivity. We propose high-entropy alloys as excellent intermediate systems for studying superconductivity as it evolves between crystalline and amorphous materials.

  20. Effect of electron count and chemical complexity in the Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor

    PubMed Central

    von Rohr, Fabian; Winiarski, Michał J.; Tao, Jing; Klimczuk, Tomasz; Cava, Robert Joseph

    2016-01-01

    High-entropy alloys are made from random mixtures of principal elements on simple lattices, stabilized by a high mixing entropy. The recently discovered body-centered cubic (BCC) Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor appears to display properties of both simple crystalline intermetallics and amorphous materials; e.g., it has a well-defined superconducting transition along with an exceptional robustness against disorder. Here we show that the valence electron count dependence of the superconducting transition temperature in the high-entropy alloy falls between those of analogous simple solid solutions and amorphous materials and test the effect of alloy complexity on the superconductivity. We propose high-entropy alloys as excellent intermediate systems for studying superconductivity as it evolves between crystalline and amorphous materials. PMID:27803330

  1. Effect of electron count and chemical complexity in the Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor

    DOE PAGES

    von Rohr, Fabian; Winiarski, Michał J.; Tao, Jing; ...

    2016-11-01

    High-entropy alloys are made from random mixtures of principal elements on simple lattices, stabilized by a high mixing entropy. The recently discovered body-centered cubic (BCC) Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor appears to display properties of both simple crystalline intermetallics and amorphous materials; e.g., it has a well-defined superconducting transition along with an exceptional robustness against disorder. Here we show that the valence electron count dependence of the superconducting transition temperature in the high-entropy alloy falls between those of analogous simple solid solutions and amorphous materials and test the effect of alloy complexity on the superconductivity. We propose high-entropy alloys as excellent intermediate systems for studying superconductivity as it evolves between crystalline and amorphous materials.

  2. High throughput single cell counting in droplet-based microfluidics.

    PubMed

    Lu, Heng; Caen, Ouriel; Vrignon, Jeremy; Zonta, Eleonora; El Harrak, Zakaria; Nizard, Philippe; Baret, Jean-Christophe; Taly, Valérie

    2017-05-02

    Droplet-based microfluidics is extensively and increasingly used for high-throughput single-cell studies. However, the accuracy of the cell counting method directly impacts the robustness of such studies. We describe here a simple and precise method to accurately count a large number of adherent and non-adherent human cells as well as bacteria. Our microfluidic hemocytometer provides statistically relevant data on large populations of cells at a high-throughput, used to characterize cell encapsulation and cell viability during incubation in droplets.

  3. Polarimetric, Two-Color, Photon-Counting Laser Altimeter Measurements of Forest Canopy Structure

    NASA Technical Reports Server (NTRS)

    Harding, David J.; Dabney, Philip W.; Valett, Susan

    2011-01-01

    Laser altimeter measurements of forest stands with distinct structures and compositions have been acquired at 532 nm (green) and 1064 nm (near-infrared) wavelengths and parallel and perpendicular polarization states using the Slope Imaging Multi-polarization Photon Counting Lidar (SIMPL). The micropulse, single photon ranging measurement approach employed by SIMPL provides canopy structure measurements with high vertical and spatial resolution. Using a height distribution analysis method adapted from conventional, 1064 nm, full-waveform lidar remote sensing, the sensitivity of two parameters commonly used for above-ground biomass estimation are compared as a function of wavelength. The results for the height of median energy (HOME) and canopy cover are for the most part very similar, indicating biomass estimations using lidars operating at green and near-infrared wavelengths will yield comparable estimates. The expected detection of increasing depolarization with depth into the canopies due to volume multiple-scattering was not observed, possibly due to the small laser footprint and the small detector field of view used in the SIMPL instrument. The results of this work provide pathfinder information for NASA's ICESat-2 mission that will employ a 532 nm, micropulse, photon counting laser altimeter.

  4. Self-Reported Alcohol Consumption and Sexual Behavior in Males and Females: Using the Unmatched-Count Technique to Examine Reporting Practices of Socially Sensitive Subjects in a Sample of University Students

    ERIC Educational Resources Information Center

    Walsh, Jeffrey A.; Braithwaite, Jeremy

    2008-01-01

    This work, drawing on the literature on alcohol consumption, sexual behavior, and researching sensitive topics, tests the efficacy of the unmatched-count technique (UCT) in establishing higher rates of truthful self-reporting when compared to traditional survey techniques. Traditional techniques grossly underestimate the scope of problems…

  5. Rapid on-site monitoring of Legionella pneumophila in cooling tower water using a portable microfluidic system.

    PubMed

    Yamaguchi, Nobuyasu; Tokunaga, Yusuke; Goto, Satoko; Fujii, Yudai; Banno, Fumiya; Edagawa, Akiko

    2017-06-08

    Legionnaires' disease, predominantly caused by the bacterium Legionella pneumophila, has increased in prevalence worldwide. The most common mode of transmission of Legionella is inhalation of contaminated aerosols, such as those generated by cooling towers. Simple, rapid and accurate methods to enumerate L. pneumophila are required to prevent the spread of this organism. Here, we applied a microfluidic device for on-chip fluorescent staining and semi-automated counting of L. pneumophila in cooling tower water. We also constructed a portable system for rapid on-site monitoring and used it to enumerate target bacterial cells rapidly flowing in the microchannel. A fluorescently-labelled polyclonal antibody was used for the selective detection of L. pneumophila serogroup 1 in the samples. The counts of L. pneumophila in cooling tower water obtained using the system and fluorescence microscopy were similar. The detection limit of the system was 10^4 cells/ml, but lower numbers of L. pneumophila cells (10^1 to 10^3 cells/ml) could be detected following concentration of 0.5-3 L of the water sample by filtration. Our technique is rapid to perform (1.5 h), semi-automated (on-chip staining and counting), and portable for on-site measurement, and it may therefore be effective in the initial screening of Legionella contamination in freshwater.
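
    The gain from filtration is simple arithmetic: concentrating a large water volume into a small measured volume divides the instrument's direct limit by the concentration factor. The 10 ml concentrate volume below is an assumed figure for illustration, not stated in the abstract:

```python
# Back-of-envelope sketch of why filtration extends the system's reach.
# INSTRUMENT_LIMIT is the direct detection limit quoted in the abstract;
# the concentrate volume is an assumption made for this example.

INSTRUMENT_LIMIT = 1e4          # cells/ml, direct on-chip measurement

def effective_limit(sample_volume_ml, concentrate_volume_ml):
    """Detection limit in the original sample after concentration."""
    factor = sample_volume_ml / concentrate_volume_ml
    return INSTRUMENT_LIMIT / factor

print(effective_limit(3000, 10))   # 3 L into 10 ml: ≈ 33 cells/ml
print(effective_limit(500, 10))    # 0.5 L into 10 ml: 200.0 cells/ml
```

    Factors of 50-300, as here, move the limit from 10^4 cells/ml into the 10^1-10^3 cells/ml range the abstract reports.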

  6. Determination of (241)Pu by the method of disturbed radioactive equilibrium using 2πα-counting and precision gamma-spectrometry.

    PubMed

    Alekseev, I; Kuzmina, T

    2016-04-01

    A simple technique is proposed for determining the content of (241)Pu, based on the disturbance of radioactive equilibrium in the genetically related (237)U←(241)Pu→(241)Am decay chain, with the subsequent use of 2πα-counting and precision gamma-spectroscopy to monitor the restoration of that equilibrium. It has been shown that the data on the dynamics of accumulation of the daughter (241)Am, obtained from measurements of the α- and γ-spectra of the samples, correspond to the estimates calculated for a chain of two genetically related radionuclides, with differences in the estimates of (241)Pu radioactivity not exceeding 2%. Combining the different methods of registration (2πα-counting, semiconductor alpha- and gamma-spectrometry) enables the proposed method to be applied efficiently both for calibration of (241)Pu sources (from several hundred kBq and higher) and for radioisotopic analysis of plutonium mixtures. In doing so, deep purification of (241)Pu from its daughter decay products is required, owing to the unavailability of commercial detectors that would make it possible, based only on analysis of alpha-spectra, to quantify the content of (238)Pu and (241)Am. Copyright © 2016 Elsevier Ltd. All rights reserved.
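
    The ingrowth curve underlying the method is the two-member Bateman solution for the (241)Am activity grown into an initially pure (241)Pu sample. The sketch below uses standard literature half-lives and, for simplicity, ignores the tiny alpha branch to (237)U; it is an illustration of the physics, not the paper's procedure:

```python
from math import log, exp

T_HALF_PU241 = 14.33    # years, 241Pu (literature value)
T_HALF_AM241 = 432.6    # years, 241Am (literature value)

def am241_activity(a_pu0, t_years):
    """241Am activity at time t for initial 241Pu activity a_pu0 (pure at t=0).

    Two-member Bateman solution: A2(t) = A1(0) * l2/(l2-l1) * (e^-l1t - e^-l2t).
    """
    l1 = log(2) / T_HALF_PU241        # 241Pu decay constant
    l2 = log(2) / T_HALF_AM241        # 241Am decay constant
    return a_pu0 * l2 / (l2 - l1) * (exp(-l1 * t_years) - exp(-l2 * t_years))

# After 1 year, roughly 0.16% of the initial 241Pu activity appears as 241Am:
print(round(am241_activity(1000.0, 1.0), 2))  # → 1.56 (same units as a_pu0)
```

    Monitoring this slow, well-defined ingrowth by gamma-spectrometry is what lets the disturbed equilibrium be used as a clock for the (241)Pu content.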

  7. Isospectral discrete and quantum graphs with the same flip counts and nodal counts

    NASA Astrophysics Data System (ADS)

    Juul, Jonas S.; Joyner, Christopher H.

    2018-06-01

    The existence of non-isomorphic graphs which share the same Laplace spectrum (to be referred to as isospectral graphs) leads naturally to the following question: what additional information is required in order to resolve isospectral graphs? It was suggested by Band, Shapira and Smilansky that this might be achieved by either counting the number of nodal domains or the number of times the eigenfunctions change sign (the so-called flip count) (Band et al 2006 J. Phys. A: Math. Gen. 39 13999–14014; Band and Smilansky 2007 Eur. Phys. J. Spec. Top. 145 171–9). Recent examples of (discrete) isospectral graphs with the same flip count and nodal count have been constructed by Ammann by utilising Godsil–McKay switching (Ammann private communication). Here, we provide a simple alternative mechanism that produces systematic examples of both discrete and quantum isospectral graphs with the same flip and nodal counts.

  8. Validation of diffuse correlation spectroscopy sensitivity to nicotinamide-induced blood flow elevation in the murine hindlimb using the fluorescent microsphere technique

    NASA Astrophysics Data System (ADS)

    Proctor, Ashley R.; Ramirez, Gabriel A.; Han, Songfeng; Liu, Ziping; Bubel, Tracy M.; Choe, Regine

    2018-03-01

    Nicotinamide has been shown to affect blood flow in both tumor and normal tissues, including skeletal muscle. Intraperitoneal injection of nicotinamide was used as a simple intervention to test the sensitivity of noninvasive diffuse correlation spectroscopy (DCS) to changes in blood flow in the murine left quadriceps femoris skeletal muscle. DCS was then compared with the gold-standard fluorescent microsphere (FM) technique for validation. The nicotinamide dose-response experiment showed that relative blood flow measured by DCS increased following treatment with 500- and 1000-mg/kg nicotinamide. The DCS and FM technique comparison showed that the blood flow index measured by DCS was correlated with FM counts quantified by image analysis. The results of this study show that DCS is sensitive to nicotinamide-induced blood flow elevation in the murine left quadriceps femoris. Additionally, the results of the comparison were consistent with similar studies in higher-order animal models, suggesting that mouse models can be effectively employed to investigate the utility of DCS for various blood flow measurement applications.

  9. Tracking flow of leukocytes in blood for drug analysis

    NASA Astrophysics Data System (ADS)

    Basharat, Arslan; Turner, Wesley; Stephens, Gillian; Badillo, Benjamin; Lumpkin, Rick; Andre, Patrick; Perera, Amitha

    2011-03-01

    Modern microscopy techniques allow imaging of circulating blood components under vascular flow conditions. The resulting video sequences provide unique insights into the behavior of blood cells within the vasculature and can be used as a method to monitor and quantitate the recruitment of inflammatory cells at sites of vascular injury/inflammation, and potentially serve as a pharmacodynamic biomarker, helping screen new therapies and individualize doses and combinations of drugs. However, manual analysis of these video sequences is intractable, requiring hours per 400-second video clip. In this paper, we present an automated technique to analyze the behavior and recruitment of human leukocytes in whole blood under physiological conditions of shear through a simple multi-channel fluorescence microscope in real time. This technique detects and tracks the recruitment of leukocytes to a bioactive surface coated on a flow chamber. Rolling cells (cells which partially bind to the bioactive matrix) are detected and counted, and have their velocity measured and graphed. The challenges here include high cell density, appearance similarity, and a low (1 Hz) frame rate. Our approach performs frame-differencing-based motion segmentation, track initialization, and online tracking of individual leukocytes.
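
    The first stage the abstract names, frame-differencing motion segmentation, can be sketched with NumPy alone. This is a toy illustration under simple assumptions (grayscale frames, a fixed intensity threshold), not the authors' pipeline, which adds track initialization and online tracking:

```python
import numpy as np

def motion_mask(prev, curr, thresh=20):
    """Frame-differencing segmentation: flag pixels whose intensity
    changed by more than `thresh` between consecutive frames."""
    diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
    return diff > thresh

# Toy frames: a bright 5x5 "cell" moves 3 pixels to the right
prev = np.zeros((32, 32), dtype=np.uint8)
curr = np.zeros((32, 32), dtype=np.uint8)
prev[10:15, 10:15] = 200
curr[10:15, 13:18] = 200
mask = motion_mask(prev, curr)
print(mask.sum())   # changed pixels at the old and new (non-overlapping) positions
```

    At a 1 Hz frame rate, a rolling cell moves far between frames, so the difference mask marks both its old and new positions; the tracking stage is what associates those detections into a single trajectory.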

  10. Covariance mapping techniques

    NASA Astrophysics Data System (ADS)

    Frasinski, Leszek J.

    2016-08-01

    Recent technological advances in the generation of intense femtosecond pulses have made covariance mapping an attractive analytical technique. The laser pulses available are so intense that often thousands of ionisation and Coulomb explosion events will occur within each pulse. To understand the physics of these processes the photoelectrons and photoions need to be correlated, and covariance mapping is well suited for operating at the high counting rates of these laser sources. Partial covariance is particularly useful in experiments with x-ray free electron lasers, because it is capable of suppressing pulse fluctuation effects. A variety of covariance mapping methods is described: simple, partial (single- and multi-parameter), sliced, contingent and multi-dimensional. The relationship to coincidence techniques is discussed. Covariance mapping has been used in many areas of science and technology: inner-shell excitation and Auger decay, multiphoton and multielectron ionisation, time-of-flight and angle-resolved spectrometry, infrared spectroscopy, nuclear magnetic resonance imaging, stimulated Raman scattering, directional gamma ray sensing, welding diagnostics and brain connectivity studies (connectomics). This review gives practical advice for implementing the technique and interpreting the results, including its limitations and instrumental constraints. It also summarises recent theoretical studies, highlights unsolved problems and outlines a personal view on the most promising research directions.
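
    A simple covariance map is just the shot-to-shot covariance between every pair of spectral channels; correlated fragments appear as islands above the background. A minimal simulated example (the channel numbers and rates below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n_shots, n_ch = 4000, 16

# Each "laser shot": a Poisson number of dissociation events, each of which
# deposits one count in ion channel 3 and one in electron channel 7
# (hypothetical channels), plus independent Poisson noise everywhere.
X = rng.poisson(1.0, size=(n_shots, n_ch)).astype(float)
Y = rng.poisson(1.0, size=(n_shots, n_ch)).astype(float)
events = rng.poisson(2.0, size=n_shots)
X[:, 3] += events
Y[:, 7] += events

# Covariance map: cov(X_i, Y_j) = <X_i Y_j> - <X_i><Y_j>
cov_map = (X.T @ Y) / n_shots - np.outer(X.mean(0), Y.mean(0))
print(cov_map[3, 7])   # stands far above the near-zero background
```

    Partial covariance, as used at free electron lasers, extends this by additionally regressing out a fluctuating parameter such as pulse energy.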

  11. Quantification of breast density with spectral mammography based on a scanned multi-slit photon-counting detector: a feasibility study.

    PubMed

    Ding, Huanjun; Molloi, Sabee

    2012-08-07

    A simple and accurate measurement of breast density is crucial for understanding its impact in breast cancer risk models. The feasibility of quantifying volumetric breast density with a photon-counting spectral mammography system has been investigated using both computer simulations and physical phantom studies. A computer simulation model involving polyenergetic spectra from a tungsten anode x-ray tube and a Si-based photon-counting detector was evaluated for breast density quantification. The figure-of-merit (FOM), defined as the signal-to-noise ratio of the dual energy image with respect to the square root of the mean glandular dose, was chosen to optimize the imaging protocols in terms of tube voltage and splitting energy. A scanning multi-slit photon-counting spectral mammography system was employed in the experimental study to quantitatively measure breast density using dual energy decomposition with glandular and adipose equivalent phantoms of uniform thickness. Four different phantom studies were designed to evaluate the accuracy of the technique, each of which addressed one specific variable in the phantom configurations: thickness, density, area, and shape. In addition to the standard calibration fitting function used for dual energy decomposition, a modified fitting function has been proposed, which incorporates the tube voltage used in the imaging task as a third variable in dual energy decomposition. For an average-sized 4.5 cm thick breast, the FOM was maximized with a tube voltage of 46 kVp and a splitting energy of 24 keV. To be consistent with the tube voltage used in current clinical screening exams (∼32 kVp), the optimal splitting energy was proposed to be 22 keV, which offered a FOM greater than 90% of the optimal value.
In the experimental investigation, the root-mean-square (RMS) error in breast density quantification for all four phantom studies was estimated to be approximately 1.54% using the standard calibration function. The results from the modified fitting function, which integrated the tube voltage as a variable in the calibration, indicated an RMS error of approximately 1.35% across all four studies. The results of the current study suggest that photon-counting spectral mammography systems may potentially be implemented for an accurate quantification of volumetric breast density, with an RMS error of less than 2%, using the proposed dual energy imaging technique.
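
    The dual energy decomposition at the heart of this record can be sketched as a two-material Beer-Lambert inversion: two log-attenuation measurements (low- and high-energy bins) determine the glandular and adipose thicknesses through a 2 × 2 linear solve. The attenuation coefficients below are invented placeholders, not calibrated values, and a real system uses fitted calibration functions rather than this idealized inversion:

```python
import numpy as np

# Hypothetical effective linear attenuation coefficients (1/cm) for the two
# basis materials in the two energy bins of a photon-counting detector.
mu = np.array([[0.80, 0.75],    # low-energy bin:  [glandular, adipose]
               [0.45, 0.40]])   # high-energy bin: [glandular, adipose]

def decompose(log_atten):
    """Solve the 2x2 Beer-Lambert system for the basis-material thicknesses."""
    return np.linalg.solve(mu, log_atten)

t_true = np.array([1.5, 3.0])        # cm glandular, cm adipose
log_atten = mu @ t_true              # ideal, noise-free measurements
t_g, t_a = decompose(log_atten)
density = t_g / (t_g + t_a)          # volumetric breast density
print(t_g, t_a, density)
```

    With noise-free inputs the solve recovers the thicknesses exactly; in practice the conditioning of `mu` (how different the two bins' spectra are) governs how noise propagates into the density estimate.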

  12. A simple and inexpensive method for maintaining a defined flora mouse colony.

    PubMed

    Sedlacek, R S; Mason, K A

    1977-10-01

    The use of autoclaved cages, feed, bedding, water, and filter caps combined with aseptic techniques of animal husbandry in an existing mouse colony was ineffective in maintaining a defined flora colony. The addition of a laminar air flow bench equipped with a high efficiency particulate air filter provided a sterile environment in which to manipulate mice when the filter caps were removed. The installation of a duct to direct all air entering the room through the bench filter reduced the airborne bacterial counts in the room. This modification combined with the culling or marking of infected cages so that no future breeders would be taken from these cages eliminated a number of bacterial contaminants (Staphylococcus aureus, S epidermidis, and streptococci) from the colony.

  13. Immature germ cells in semen - correlation with total sperm count and sperm motility.

    PubMed

    Patil, Priya S; Humbarwadi, Rajendra S; Patil, Ashalata D; Gune, Anita R

    2013-07-01

    Current data regarding infertility suggest that male factors contribute up to 30% of the total cases of infertility. Semen analysis reveals the presence of spermatozoa as well as a number of non-sperm cells, presently mentioned in routine semen reports as "round cells" without further differentiating them into leucocytes or immature germ cells. The aim of this work was to study a simple, cost-effective, and convenient method for differentiating the round cells in semen into immature germ cells and leucocytes and correlating them with total sperm counts and motility. Semen samples from 120 males, who had come for investigation for infertility, were collected, semen parameters recorded, and stained smears studied for different round cells. Statistical analysis of the data was done to correlate total sperm counts and sperm motility with the occurrence of immature germ cells and leucocytes. The average shedding of immature germ cells in different groups with normal and low sperm counts was compared. The clinical significance of "round cells" in semen and their differentiation into leucocytes and immature germ cells are discussed. Round cells in semen can be differentiated into immature germ cells and leucocytes using simple staining methods. The differential counts mentioned in a semen report give valuable and clinically relevant information. In this study, we observed a negative correlation between total count and immature germ cells, as well as between sperm motility and shedding of immature germ cells. The latter correlation was statistically significant (P < 0.001).

  14. A microchip CD4 counting method for HIV monitoring in resource-poor settings.

    PubMed

    Rodriguez, William R; Christodoulides, Nicolaos; Floriano, Pierre N; Graham, Susan; Mohanty, Sanghamitra; Dixon, Meredith; Hsiang, Mina; Peter, Trevor; Zavahir, Shabnam; Thior, Ibou; Romanovicz, Dwight; Bernard, Bruce; Goodey, Adrian P; Walker, Bruce D; McDevitt, John T

    2005-07-01

    More than 35 million people in developing countries are living with HIV infection. An enormous global effort is now underway to bring antiretroviral treatment to at least 3 million of those infected. While drug prices have dropped considerably, the cost and technical complexity of laboratory tests essential for the management of HIV disease, such as CD4 cell counts, remain prohibitive. New, simple, and affordable methods for measuring CD4 cells that can be implemented in resource-scarce settings are urgently needed. Here we describe the development of a prototype for a simple, rapid, and affordable method for counting CD4 lymphocytes. Microliter volumes of blood without further sample preparation are stained with fluorescent antibodies, captured on a membrane within a miniaturized flow cell and imaged through microscope optics with the type of charge-coupled device developed for digital camera technology. An associated computer algorithm converts the raw digital image into absolute CD4 counts and CD4 percentages in real time. The accuracy of this prototype system was validated through testing in the United States and Botswana, and showed close agreement with standard flow cytometry (r = 0.95) over a range of absolute CD4 counts, and the ability to discriminate clinically relevant CD4 count thresholds with high sensitivity and specificity. Advances in the adaptation of new technologies to biomedical detection systems, such as the one described here, promise to make complex diagnostics for HIV and other infectious diseases a practical global reality.

  15. Mapping the layer count of few-layer hexagonal boron nitride at high lateral spatial resolutions

    NASA Astrophysics Data System (ADS)

    Mohsin, Ali; Cross, Nicholas G.; Liu, Lei; Watanabe, Kenji; Taniguchi, Takashi; Duscher, Gerd; Gu, Gong

    2018-01-01

    Layer count control and uniformity of two-dimensional (2D) layered materials are critical to the investigation of their properties and to their electronic device applications, but methods to map 2D material layer count at nanometer-level lateral spatial resolutions have been lacking. Here, we demonstrate a method based on two complementary techniques widely available in transmission electron microscopes (TEMs) to map the layer count of multilayer hexagonal boron nitride (h-BN) films. The mass-thickness contrast in high-angle annular dark-field (HAADF) imaging in the scanning transmission electron microscope (STEM) mode allows for thickness determination in atomically clean regions with high spatial resolution (sub-nanometer), but is limited by surface contamination. As a complement, another technique based on the boron K ionization edge in the electron energy loss spectroscopy (EELS) spectrum of h-BN is developed to quantify the layer count so that surface contamination does not cause an overestimate, albeit at a lower spatial resolution (nanometers). The two techniques agree remarkably well in atomically clean regions, with discrepancies within ±1 layer. For the first time, the layer count uniformity on the scale of nanometers is quantified for a 2D material. The methodology is applicable to layer count mapping of other 2D layered materials, paving the way toward the synthesis of multilayer 2D materials with homogeneous layer count.

  16. Statistical aspects of point count sampling

    USGS Publications Warehouse

    Barker, R.J.; Sauer, J.R.; Ralph, C.J.; Sauer, J.R.; Droege, S.

    1995-01-01

    The dominant feature of point counts is that they do not census birds, but instead provide incomplete counts of individuals present within a survey plot. Considering a simple model for point count sampling, we demonstrate that use of these incomplete counts can bias estimators and testing procedures, leading to inappropriate conclusions. A large portion of the variability in point counts is caused by the incomplete counting, and this within-count variation can be confounded with ecologically meaningful variation. We recommend caution in the analysis of estimates obtained from point counts. Using our model, we also consider optimal allocation of sampling effort. The critical step in the optimization process is determining the goals of the study and the methods that will be used to meet these goals. By explicitly defining the constraints on sampling and by estimating the relationship between precision and bias of estimators and time spent counting, we can predict the optimal time at a point for each of several monitoring goals. In general, time spent at a point will differ depending on the goals of the study.
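
    The incomplete-counting effect can be illustrated with a hypothetical detection model (a minimal sketch, not the authors' exact formulation): if each of N birds present is detected independently with probability p, a raw point count has expectation Np, not N, and carries extra binomial within-count variance.

```python
import numpy as np

rng = np.random.default_rng(1)

def point_counts(n_birds, p_detect, n_surveys):
    """Incomplete counts: each of `n_birds` present is detected
    independently with probability `p_detect` during a survey."""
    return rng.binomial(n_birds, p_detect, size=n_surveys)

counts = point_counts(n_birds=50, p_detect=0.6, n_surveys=1000)
print(counts.mean())   # near N*p = 30: raw counts are biased low
print(counts.var())    # near N*p*(1-p) = 12: within-count variance
```

    Any ecological comparison of raw counts between sites or years implicitly assumes p is constant; the within-count variance above is what can masquerade as ecologically meaningful variation.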

  17. An Isotopic Dilution Experiment Using Liquid Scintillation: A Simple Two-System, Two-Phase Analysis.

    ERIC Educational Resources Information Center

    Moehs, Peter J.; Levine, Samuel

    1982-01-01

    A simple isotopic dilution analysis whose principles apply to methods of more complex radioanalyses is described. Suitable for clinical and instrumental analysis chemistry students, the experiment keeps manipulations to a minimum, involving only aqueous extraction before counting. Background information, procedures, and results are discussed.…
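
    The principle behind an isotope dilution analysis can be shown with the standard dilution relation: a spike of known mass and specific activity is mixed into the sample, and the drop in specific activity of the recovered analyte reveals the unknown mass. This is the generic textbook form; the masses and count rates below are invented for illustration:

```python
def isotope_dilution_mass(m_spike_mg, s_spike_cpm_per_mg, s_mix_cpm_per_mg):
    """Classic isotope-dilution relation: the unlabelled analyte mass that
    dilutes a spike of known specific activity down to the measured value."""
    return m_spike_mg * (s_spike_cpm_per_mg / s_mix_cpm_per_mg - 1.0)

# A 2 mg spike at 5000 cpm/mg diluted to 1000 cpm/mg implies 8 mg of analyte
print(isotope_dilution_mass(2.0, 5000.0, 1000.0))   # → 8.0
```

    Because only specific activities (ratios) enter, the recovery in the extraction step need not be quantitative, which is what makes the method attractive for a teaching laboratory.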

  18. Radionuclides in haematology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lewis, S.M.; Bayly, R.J.

    1986-01-01

    This book contains the following chapters: Some prerequisites to the use of radionuclides in haematology; Instrumentation and counting techniques; In vitro techniques; Cell labelling; Protein labelling; Autoradiography; Imaging and quantitative scanning; Whole body counting; Absorption and excretion studies; Blood volume studies; Plasma clearance studies; and Radionuclide blood cell survival studies.

  19. Using the One-More-Than Technique to Teach Money Counting to Individuals with Moderate Mental Retardation: A Systematic Replication.

    ERIC Educational Resources Information Center

    Denny, Paula J.; Test, David W.

    1995-01-01

    This study extended use of the One-More-Than technique by using a "cents-pile modification"; one-, five-, and ten-dollar bills; and mixed training of all dollar amounts. Three high school students with moderate mental retardation each learned to use the technique to count out nontrained amounts and to make community purchases. (Author/PB)

  20. Better Than Counting: Density Profiles from Force Sampling

    NASA Astrophysics Data System (ADS)

    de las Heras, Daniel; Schmidt, Matthias

    2018-05-01

    Calculating one-body density profiles in equilibrium via particle-based simulation methods involves counting of events of particle occurrences at (histogram-resolved) space points. Here, we investigate an alternative method based on a histogram of the local force density. Via an exact sum rule, the density profile is obtained with a simple spatial integration. The method circumvents the inherent ideal gas fluctuations. We have tested the method in Monte Carlo, Brownian dynamics, and molecular dynamics simulations. The results carry a statistical uncertainty smaller than that of the standard counting method, reducing therefore the computation time.
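
    The idea can be reproduced in a one-dimensional toy system: sample an ideal gas in a harmonic trap, histogram the local force density, and integrate the sum rule dρ/dx = βF(x) once over space. This is a minimal noninteracting sketch with β = 1, not the paper's interacting simulations:

```python
import numpy as np

rng = np.random.default_rng(2)
beta = 1.0                          # 1/kT
n, nbins = 400_000, 80
x = rng.normal(size=n)              # Boltzmann samples for V(x) = x**2 / 2
force = -x                          # f = -dV/dx

edges = np.linspace(-4.0, 4.0, nbins + 1)
dx = edges[1] - edges[0]
centers = 0.5 * (edges[:-1] + edges[1:])

# standard estimator: count particle occurrences per bin
rho_count = np.histogram(x, edges)[0] / (n * dx)

# force sampling: histogram the local force density F(x), then integrate
# the sum rule d(rho)/dx = beta * F(x) from the left edge to each bin center
f_density = np.histogram(x, edges, weights=force)[0] / (n * dx)
rho_force = beta * dx * (np.cumsum(f_density) - 0.5 * f_density)

exact = np.exp(-centers**2 / 2) / np.sqrt(2 * np.pi)
print(np.abs(rho_count - exact).max(), np.abs(rho_force - exact).max())
```

    Both estimators converge to the exact Gaussian profile here; the force-based route is typically the smoother one because the spatial integration averages out the ideal-gas shot noise that the counting histogram keeps.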

  1. Probabilistic techniques for obtaining accurate patient counts in Clinical Data Warehouses

    PubMed Central

    Myers, Risa B.; Herskovic, Jorge R.

    2011-01-01

    Proposal and execution of clinical trials, computation of quality measures and discovery of correlations between medical phenomena are all applications where an accurate count of patients is needed. However, existing sources of this type of patient information, including Clinical Data Warehouses (CDW), may be incomplete or inaccurate. This research explores applying probabilistic techniques, supported by the MayBMS probabilistic database, to obtain accurate patient counts from a clinical data warehouse containing synthetic patient data. We present a synthetic clinical data warehouse, and populate it with simulated data using a custom patient data generation engine. We then implement, evaluate and compare different techniques for obtaining patient counts. We model billing as a test for the presence of a condition. We compute billing’s sensitivity and specificity both by conducting a “Simulated Expert Review”, where a representative sample of records is reviewed and labeled by experts, and by obtaining the ground truth for every record. We compute the posterior probability of a patient having a condition through a “Bayesian Chain”, using Bayes’ Theorem to calculate the probability of a patient having a condition after each visit. The second method is a “one-shot” approach that computes the probability of a patient having a condition based on whether the patient is ever billed for the condition. Our results demonstrate the utility of probabilistic approaches, which improve on the accuracy of raw counts. In particular, the simulated review paired with a single application of Bayes’ Theorem produces the best results, with an average error rate of 2.1% compared to 43.7% for the straightforward billing counts. Overall, this research demonstrates that Bayesian probabilistic approaches improve patient counts on simulated patient populations. 
We believe that total patient counts based on billing data are one of the many possible applications of our Bayesian framework. Use of these probabilistic techniques will enable more accurate patient counts and better results for applications requiring this metric. PMID:21986292
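
    Treating billing as an imperfect test, the "Bayesian Chain" described above is a repeated application of Bayes' theorem. A sketch with illustrative sensitivity and specificity values (not the paper's):

```python
def posterior(prior, billed, sens=0.80, spec=0.95):
    """One Bayes'-theorem update of P(condition) after observing whether
    a visit was billed for it (sens/spec are assumed, illustrative values)."""
    like_pos = sens if billed else 1 - sens        # P(observation | condition)
    like_neg = (1 - spec) if billed else spec      # P(observation | no condition)
    num = like_pos * prior
    return num / (num + like_neg * (1 - prior))

# "Bayesian chain": update once per visit, starting from a prevalence prior
p = 0.10
for billed in [True, True, False]:
    p = posterior(p, billed)
print(round(p, 3))
```

    Summing such posteriors over all patients yields an expected patient count that corrects for billing's imperfection, rather than simply counting everyone ever billed.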

  2. Single molecule photobleaching (SMPB) technology for counting of RNA, DNA, protein and other molecules in nanoparticles and biological complexes by TIRF instrumentation.

    PubMed

    Zhang, Hui; Guo, Peixuan

    2014-05-15

    Direct counting of biomolecules within biological complexes or nanomachines is demanding. Single molecule counting using optical microscopy is challenging due to the diffraction limit. The single molecule photobleaching (SMPB) technology for direct counting developed by our team (Shu et al., 2007 [18]; Zhang et al., 2007 [19]) offers a simple and straightforward method to determine the stoichiometry of molecules or subunits within biocomplexes or nanomachines at nanometer scales. Stoichiometry is determined by real-time observation of the number of descending steps resulting from the photobleaching of individual fluorophores. This technology has now been used extensively for single molecule counting of protein, RNA, and other macromolecules in a variety of complexes or nanostructures. Here, we elucidate the SMPB technology, using the counting of RNA molecules within a bacteriophage phi29 DNA-packaging biomotor as an example. The method described here can be applied to the single molecule counting of other molecules in other systems. The construction of a concise, simple and economical single molecule total internal reflection fluorescence (TIRF) microscope combining prism-type and objective-type TIRF is described. The imaging system contains a deep-cooled sensitive EMCCD camera with single-fluorophore detection sensitivity, a laser combiner for simultaneous dual-color excitation, and a Dual-View™ imager to split the multiple outcome signals to different detector channels based on their wavelengths. The methodology of the single molecule photobleaching assay used to elucidate the stoichiometry of RNA on the phi29 DNA packaging motor and the mechanism of protein/RNA interaction is described. Different methods for single-fluorophore labeling of RNA molecules are reviewed. The process of statistical modeling to reveal the true copy number of the biomolecules based on the binomial distribution is also described. Copyright © 2014 Elsevier Inc. All rights reserved.
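
    The binomial modeling step mentioned at the end can be sketched as a maximum-likelihood estimate of the copy number: if each of n subunits carries a working fluorophore with probability p, the observed photobleaching step counts follow Binomial(n, p). The copy number and labelling efficiency below are simulated illustrations, not the phi29 values:

```python
import math
import random

def loglik(n, p, steps):
    """Log-likelihood of copy number n given photobleaching step counts,
    with each subunit labelled (hence countable) with probability p."""
    if max(steps) > n:
        return float("-inf")
    return sum(math.log(math.comb(n, k)) + k * math.log(p)
               + (n - k) * math.log(1 - p) for k in steps)

# Simulated complexes with 6 copies each and 70% labelling efficiency (assumed)
random.seed(3)
steps = [sum(random.random() < 0.7 for _ in range(6)) for _ in range(500)]
best_n = max(range(1, 12), key=lambda n: loglik(n, 0.7, steps))
print(best_n)   # → 6
```

    This is why a histogram of step counts over many complexes, rather than any single trace, is needed: incomplete labelling makes most individual traces show fewer steps than subunits.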

  3. The Hole-Count Test Revisited: Effects of Test Specimen Thickness

    NASA Technical Reports Server (NTRS)

    Lyman, C. E.; Ackland, D. W.; Williams, D. B.; Goldstein, J. I.

    1989-01-01

    For historical reasons the hole count, an important performance test for the Analytical Electron Microscope (AEM), is somewhat arbitrary, yielding different numbers for different investigators. This was not a problem a decade ago, when AEM specimens were often bathed with large fluxes of stray electrons and hard x rays. At that time the presence or absence of a thick Pt second condenser (C2) aperture could be detected by a simple comparison of the x-ray spectrum taken 'somewhere in the hole' with a spectrum collected on a 'typical thickness' of Mo or Ag foil. A high hole count of about 10-20% indicated that the electron column needed modifications, whereas a hole count of 1-2% was accepted for most AEM work. The absolute level of the hole count is a function of test specimen atomic number, overall specimen shape, and thin-foil thickness. In order that equivalent results may be obtained for any AEM in any laboratory in the world, this test must become standardized. The hole-count test we seek must be as simple and as nonsubjective as the graphite 0.344 nm lattice-line-resolution test. This lattice-resolution test spurred manufacturers to improve the image resolution of the TEM significantly in the 1970s and led to the even more stringent resolution tests of today. A similar phenomenon for AEM instruments would be welcome. The hole-count test can also indicate whether the spurious x-ray signal is generated by high-energy continuum x rays (bremsstrahlung) generated in the electron column (high K-line to L-line ratio) or by uncollimated electrons passing through or around the C2 aperture (low K/L ratio).

  4. Development of a nematode offspring counting assay for rapid and simple soil toxicity assessment.

    PubMed

    Kim, Shin Woong; Moon, Jongmin; Jeong, Seung-Woo; An, Youn-Joo

    2018-05-01

    Since the introduction of standardized nematode toxicity assays by the American Society for Testing and Materials (ASTM) and International Organization for Standardization (ISO), many studies have reported their use. Given that the currently used standardized nematode toxicity assays have certain limitations, in this study, we examined the use of a novel nematode offspring counting assay for evaluating soil ecotoxicity based on a previous soil-agar isolation method used to recover live adult nematodes. In this new assay, adult Caenorhabditis elegans were exposed to soil using a standardized toxicity assay procedure, and the resulting offspring in test soils attracted by a microbial food source in agar plates were counted. This method differs from previously used assays in terms of its endpoint, namely, the number of nematode offspring. The applicability of the bioassay was demonstrated using metal-spiked soils, which revealed metal concentration-dependent responses, and with 36 field soil samples characterized by different physicochemical properties and containing various metals. Principal component analysis revealed that texture fraction (clay, sand, and silt) and electrical conductivity values were the main factors influencing the nematode offspring counting assay, and these findings warrant further investigation. The nematode offspring counting assay is a rapid and simple process that can provide multi-directional toxicity assessment when used in conjunction with other standard methods. Copyright © 2018 Elsevier Ltd. All rights reserved.

  5. Chi-squared and C statistic minimization for low count per bin data

    NASA Astrophysics Data System (ADS)

    Nousek, John A.; Shue, David R.

    1989-07-01

    Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.

  6. Chi-squared and C statistic minimization for low count per bin data. [sampling in X ray astronomy

    NASA Technical Reports Server (NTRS)

    Nousek, John A.; Shue, David R.

    1989-01-01

    Results are presented from a computer simulation comparing two statistical fitting techniques on data samples with large and small counts per bin; the results are then related specifically to X-ray astronomy. The Marquardt and Powell minimization techniques are compared by using both to minimize the chi-squared statistic. In addition, Cash's C statistic is applied, with Powell's method, and it is shown that the C statistic produces better fits in the low-count regime than chi-squared.
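
    The low-count behaviour reported in this record can be reproduced in a few lines: fit a constant rate to Poisson data by minimizing chi-squared (using the common σ² = max(d, 1) weighting, one of several conventions) versus minimizing the Cash C statistic, whose minimum recovers the Poisson maximum likelihood:

```python
import numpy as np

rng = np.random.default_rng(4)
data = rng.poisson(0.8, size=200)            # low counts per bin

mus = np.linspace(0.05, 3.0, 600)            # candidate constant rates

# Chi-squared with the common sigma^2 = max(d, 1) weighting
chi2 = [((data - m) ** 2 / np.maximum(data, 1)).sum() for m in mus]

# Cash's C statistic (up to data-only terms): C = 2 * sum_i (m - d_i * ln m)
cash = [2 * (m * data.size - data.sum() * np.log(m)) for m in mus]

mu_chi2 = mus[np.argmin(chi2)]
mu_cash = mus[np.argmin(cash)]
print(mu_chi2, mu_cash, data.mean())         # chi-squared lands below the mean
```

    The C-statistic minimum sits at the sample mean (the Poisson MLE), while the chi-squared fit is dragged low by the heavily weighted empty bins, which is the low-count bias the simulation study describes.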

  7. A technique for verifying the input response function of neutron time-of-flight scintillation detectors using cosmic rays.

    PubMed

    Bonura, M A; Ruiz, C L; Fehl, D L; Cooper, G W; Chandler, G; Hahn, K D; Nelson, A J; Styron, J D; Torres, J A

    2014-11-01

    An accurate interpretation of DD or DT fusion neutron time-of-flight (nTOF) signals from current-mode detectors employed at the Z facility at Sandia National Laboratories requires that the instrument response functions (IRFs) be deconvolved from the measured nTOF signals. A calibration facility that produces detectable sub-ns radiation pulses is typically used to measure the IRF of such detectors. This work, however, reports on a simple method that utilizes cosmic radiation to measure the IRF of nTOF detectors operated in pulse-counting mode. The characterizing metrics reported here are the throughput delay and the full width at half maximum. This simple approach yields IRF results consistent with those obtained for the same detectors in a 2007 calibration at a LINAC bremsstrahlung accelerator (Idaho State University). In particular, the IRF metrics from these two approaches, and their dependence on the photomultiplier bias, agree to within a few per cent. This information may thus be used to verify whether the IRF for a given nTOF detector employed at Z has changed since its original current-mode calibration and warrants re-measurement.
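
    Of the two metrics reported, the full width at half maximum is straightforward to extract from a sampled pulse. A minimal sketch using linear interpolation at the half-maximum crossings, exercised here on a Gaussian test pulse rather than a real IRF:

```python
import numpy as np

def fwhm(t, y):
    """Full width at half maximum of a sampled single-peaked pulse,
    with linear interpolation at the two half-maximum crossings."""
    half = y.max() / 2.0
    above = np.where(y >= half)[0]
    i, j = above[0], above[-1]
    t_rise = np.interp(half, [y[i - 1], y[i]], [t[i - 1], t[i]])
    t_fall = np.interp(half, [y[j + 1], y[j]], [t[j + 1], t[j]])
    return t_fall - t_rise

# Gaussian test pulse: analytic FWHM = 2*sqrt(2*ln 2)*sigma ≈ 2.355*sigma
t = np.linspace(-10, 10, 2001)
y = np.exp(-t**2 / (2 * 1.5**2))
print(fwhm(t, y))   # ≈ 2.355 * 1.5
```

    For pulse-counting IRF data the same routine applies to the averaged single-event waveform; the throughput delay is then the offset of the pulse peak relative to the trigger.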

  8. Single Particle-Inductively Coupled Plasma Mass Spectroscopy Analysis of Metallic Nanoparticles in Environmental Samples with Large Dissolved Analyte Fractions.

    PubMed

    Schwertfeger, D M; Velicogna, Jessica R; Jesmer, Alexander H; Scroggins, Richard P; Princz, Juliska I

    2016-10-18

    There is an increasing interest in using single particle-inductively coupled plasma mass spectroscopy (SP-ICPMS) to help quantify exposure to engineered nanoparticles, and their transformation products, released into the environment. Hindering the use of this analytical technique for environmental samples is the presence of high levels of dissolved analyte, which impedes resolution of the particle signal from the dissolved. While sample dilution is often necessary to achieve the low analyte concentrations required for SP-ICPMS analysis, and to reduce the occurrence of matrix effects on the analyte signal, it is used here also to reduce the dissolved signal relative to the particulate, while maintaining a matrix chemistry that promotes particle stability. We propose a simple, systematic dilution series approach whereby the first dilution is used to quantify the dissolved analyte, the second is used to optimize the particle signal, and the third is used as an analytical quality control. Using simple suspensions of well-characterized Au and Ag nanoparticles spiked with the dissolved analyte form, as well as suspensions of complex environmental media (i.e., extracts from soils previously contaminated with engineered silver nanoparticles), we show how this dilution series technique improves resolution of the particle signal, which in turn improves the accuracy of particle counts, quantification of particulate mass and determination of particle size. The technique proposed here is meant to offer a systematic and reproducible approach to the SP-ICPMS analysis of environmental samples and improve the quality and consistency of data generated from this relatively new analytical tool.
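
    A common way to resolve particle pulses from the dissolved baseline in an SP-ICPMS time scan is an iterative mean-plus-kσ threshold. The sketch below applies it to simulated data; the count levels and the choice k = 5 are illustrative assumptions, not the authors' protocol:

```python
import numpy as np

rng = np.random.default_rng(5)

# Simulated time scan: Poisson dissolved background plus sparse particle
# pulses (the counts per dwell are invented, not instrument-specific)
signal = rng.poisson(5.0, size=5000).astype(float)
particle_idx = rng.choice(5000, size=40, replace=False)
signal[particle_idx] += rng.normal(200.0, 30.0, size=40)

def split_particles(signal, k=5.0, iters=10):
    """Iterative mean + k*sigma threshold: flag outliers, recompute the
    baseline statistics from the remainder, repeat until stable."""
    mask = np.zeros(signal.size, dtype=bool)
    for _ in range(iters):
        base = signal[~mask]
        new_mask = signal > base.mean() + k * base.std()
        if (new_mask == mask).all():
            break
        mask = new_mask
    return mask

mask = split_particles(signal)
print(mask.sum(), signal[~mask].mean())   # particle events vs dissolved baseline
```

    The dilution series in this record attacks the harder real-world case this toy sidesteps: when the dissolved baseline is high, its shot noise overlaps the particle pulse heights and no threshold separates them until the dissolved fraction is diluted down.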

  9. Strategy Choices in Simple and Complex Addition: Contributions of Working Memory and Counting Knowledge for Children with Mathematical Disability

    ERIC Educational Resources Information Center

    Geary, David C.; Hoard, Mary K.; Byrd-Craven, Jennifer; DeSoto, M. Catherine

    2004-01-01

    Groups of first-grade (mean age = 82 months), third-grade (mean age = 107 months), and fifth-grade (mean age = 131 months) children with a learning disability in mathematics (MD, n=58) and their normally achieving peers (n = 91) were administered tasks that assessed their knowledge of counting principles, working memory, and the strategies used to…

  10. Assessment of the atrial electromechanical properties of patients with human immunodeficiency virus.

    PubMed

    Ertem, Ahmet G; Yayla, Çağrı; Açar, Burak; Ünal, Sefa; Erdol, Mehmet A; Sonmezer, Meliha Ç; Kaya Kiliç, Esra; Ataman Hatipoglu, Çiğdem; Gokaslan, Serkan; Kafes, Habibe; Akboga, Mehmet K; Aladag, Pelin; Demirtas, Koray; Tulek, Necla; Erdinç, Fatma S; Aydogdu, Sinan

    The relationship between atrial fibrillation and human immunodeficiency virus (HIV) infection was evaluated. Electro-echocardiographic methods can be used to predict the development of atrial fibrillation (AF). In this study, we aimed to investigate the atrial electromechanical delay (AEMD) parameters of HIV (+) patients. Forty-two HIV (+) patients and 40 HIV (-) healthy volunteers were prospectively enrolled in this study. The electromechanical properties of the subjects' atria were evaluated with tissue Doppler imaging. The left-AEMD, right-AEMD and inter-AEMD were increased in the HIV (+) patients relative to the controls (p=0.003, p<0.001, and p<0.001, respectively). The CD4 count was inversely correlated with the inter-AEMD (r=-0.428, p<0.001). The CD4 count was an independent predictor of the inter-AEMD (β=0.523, p=0.007). Our study demonstrated that both the inter- and intra-atrial electromechanical delays were prolonged in the patients with HIV. This non-invasive and simple technique may provide significant contributions to the assessment of the risk of atrial arrhythmia in patients with HIV. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  11. For Mole Problems, Call Avogadro: 602-1023.

    ERIC Educational Resources Information Center

    Uthe, R. E.

    2002-01-01

    Describes techniques to help introductory students become familiar with Avogadro's number and mole calculations. Techniques involve estimating numbers of common objects then calculating the length of time needed to count large numbers of them. For example, the immense amount of time required to count a mole of sand grains at one grain per second…
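    The back-of-the-envelope arithmetic the abstract alludes to is easy to reproduce: at one grain per second, counting a mole of sand grains takes Avogadro's number of seconds.

    ```python
    # Time needed to count a mole of objects at one per second.
    AVOGADRO = 6.022e23            # objects per mole
    SECONDS_PER_YEAR = 365.25 * 24 * 3600

    years = AVOGADRO / SECONDS_PER_YEAR
    print(f"{years:.2e} years")    # ~1.91e16 years, roughly a million
                                   # times the age of the universe
    ```
    
    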

  12. High-Density Liquid-State Machine Circuitry for Time-Series Forecasting.

    PubMed

    Rosselló, Josep L; Alomar, Miquel L; Morro, Antoni; Oliver, Antoni; Canals, Vincent

    2016-08-01

    Spiking neural networks (SNN) are the latest generation of neural networks, which try to mimic the real behavior of biological neurons. Although most research in this area is done through software applications, it is in hardware implementations that the intrinsic parallelism of these computing systems is more efficiently exploited. Liquid state machines (LSM) have arisen as a strategic technique for implementing recurrent SNN designs with a simple learning methodology. In this work, we show a new low-cost methodology to implement high-density LSMs by using Boolean gates. The proposed method is based on the use of probabilistic computing concepts to reduce hardware requirements, thus considerably increasing the neuron count per chip. The result is a highly functional system that is applied to high-speed time series forecasting.

  13. A comparative evaluation of Oratest with the microbiological method of assessing caries activity in children

    PubMed Central

    Sundaram, Meenakshi; Nayak, Ullal Anand; Ramalingam, Krishnakumar; Reddy, Venugopal; Rao, Arun Prasad; Mathian, Mahesh

    2013-01-01

    Aims: The aim of this study is to determine whether Oratest can be used as a diagnostic tool for assessing caries activity, by evaluating its relationship to existing caries status and the salivary Streptococcus mutans level. Materials and Methods: The study sample consisted of 90 students divided into two groups: Group I (test group) with 60 children and Group II (control group) with 30 children. Unstimulated saliva for the estimation of S. mutans was sampled as per the method suggested by Kohler and Bratthall. The plates were then incubated. Rough-surface colonies identified as S. mutans on a pre-determined area of the spatula tip (approximately 1.5 cm2) were counted for each side of the spatula pressed against mitis salivarius bacitracin agar, using a digital colony counter. The results were expressed in colony-forming units (CFU). Oratest was carried out on the same patients after collection of the salivary sample for the microbiological method, to evaluate the relationship between the two tests. Statistical Analysis Used: ANOVA, Pearson chi-square test, Pearson's correlation analysis, Mann-Whitney U test and Student's independent t-test. Results: In both the control and test groups, when the S. mutans count (CFU) and Oratest time (minutes) were correlated using Pearson's correlation analysis, the S. mutans count was found to be in a statistically significant negative linear relationship with the Oratest time. When the caries status of the children who participated in the test group was correlated with the S. mutans count (CFU) and Oratest time, caries status was found to be in a statistically significant positive linear relationship with the S. mutans count and in a significant negative linear relationship with the Oratest time. 
Conclusions: The test proved to be a simple, inexpensive and rapid technique for assessing caries activity, since a significant relationship exists clinically with caries status and microbiologically with the S. mutans count of the individual. PMID:23946577

  14. Analysis of radioactive strontium-90 in food by Čerenkov liquid scintillation counting.

    PubMed

    Pan, Jingjing; Emanuele, Kathryn; Maher, Eileen; Lin, Zhichao; Healey, Stephanie; Regan, Patrick

    2017-08-01

    A simple liquid scintillation counting method using DGA/TRU resins for removal of matrix/radiometric interferences, Čerenkov counting for measuring 90Y, and EDXRF for quantifying Y recovery was validated for analyzing 90Sr in various foods. Analysis of samples containing energetic β emitters required using TRU resin to avoid false detection and positive bias. An additional 34% increase in Y recovery was obtained by stirring the resin while eluting Y with H2C2O4. The method showed acceptable accuracy (±10%), precision (10%), and detectability (~0.09 Bq/kg). Published by Elsevier Ltd.

  15. Assessment of background hydrogen by the Monte Carlo computer code MCNP-4A during measurements of total body nitrogen.

    PubMed

    Ryde, S J; al-Agel, F A; Evans, C J; Hancock, D A

    2000-05-01

    The use of a hydrogen internal standard to enable the estimation of absolute mass during measurement of total body nitrogen by in vivo neutron activation is an established technique. Central to the technique is a determination of the H prompt gamma ray counts arising from the subject. In practice, interference counts from other sources--e.g., neutron shielding--are included. This study reports use of the Monte Carlo computer code, MCNP-4A, to investigate the interference counts arising from shielding both with and without a phantom containing a urea solution. Over a range of phantom size (depth 5 to 30 cm, width 20 to 40 cm), the counts arising from shielding increased by between 4% and 32% compared with the counts without a phantom. For any given depth, the counts increased approximately linearly with width. For any given width, there was little increase for depths exceeding 15 centimeters. The shielding counts comprised between 15% and 26% of those arising from the urea phantom. These results, although specific to the Swansea apparatus, suggest that extraneous hydrogen counts can be considerable and depend strongly on the subject's size.

  16. Quality assurance for sperm concentration using latex beads.

    PubMed

    Peters, A J; Zaneveld, L J; Jeyendran, R S

    1993-10-01

    To provide a simple, universally applicable method of quality assurance for sperm counting, thereby reducing inter-counting-chamber variation. By using a known concentration of latex beads, the sperm:bead ratio can be used to calculate the actual sperm count. The mean sperm and bead counts were determined in both a Spot-lite hemocytometer (Baxter Diagnostics, McGaw Park, IL) and a Makler chamber (Polymedco Inc., Yorktown, NY) from 21 different ejaculates mixed with a known concentration of beads. The hemocytometer chamber was used as the standard counting chamber because it consistently yielded a low variation in sperm count. The adjusted sperm concentration of the Makler chamber was calculated using the following formula: [hemocytometer beads]/[Makler beads] x [Makler sperm]. Observed mean +/- SD sperm counts were significantly different between the hemocytometer chamber (110.6 +/- 66.2 x 10(6)/mL) and the Makler chamber (173.3 +/- 103.5 x 10(6)/mL). However, calculated Makler chamber sperm counts (118.1 +/- 76.1 x 10(6)/mL) were not statistically different from observed hemocytometer sperm counts. This novel approach to sperm counting, using a known concentration of latex beads as a reference material, can be used to reduce variation in sperm counting between observers, counting chambers, and possibly computerized sperm analyzers.
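    The bead-ratio correction quoted in the abstract is a one-line calculation. A minimal sketch; the formula is from the abstract, but the counts below are hypothetical illustration values:

    ```python
    def adjusted_sperm_count(hemocytometer_beads, makler_beads, makler_sperm):
        """Correct a Makler-chamber sperm count using the bead ratio:
        [hemocytometer beads] / [Makler beads] x [Makler sperm]."""
        return (hemocytometer_beads / makler_beads) * makler_sperm

    # Hypothetical example: the Makler chamber counted twice as many beads
    # as the hemocytometer, so its sperm count is scaled down accordingly.
    adjusted = adjusted_sperm_count(100, 200, 180.0)  # counts x 10^6/mL
    print(adjusted)  # 90.0
    ```
    
    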

  17. Rapid Membrane Filtration-Epifluorescent Microscopy Technique for Direct Enumeration of Bacteria in Raw Milk

    PubMed Central

    Pettipher, Graham L.; Mansell, Roderick; McKinnon, Charles H.; Cousins, Christina M.

    1980-01-01

    Membrane filtration and epifluorescent microscopy were used for the direct enumeration of bacteria in raw milk. Somatic cells were lysed by treatment with trypsin and Triton X-100 so that 2 ml of milk containing up to 5 × 10(6) somatic cells/ml could be filtered. The majority of the bacteria (ca. 80%) remained intact and were concentrated on the membrane. After being stained with acridine orange, the bacteria fluoresced under ultraviolet light and could easily be counted. The clump count of orange-fluorescing cells on the membrane correlated well (r = 0.91) with the corresponding plate count for farm, tanker, and silo milks. Differences between counts obtained by different operators and between the membrane clump count and plate count were not significant. The technique is rapid, taking less than 25 min, inexpensive, costing less than 50 cents per sample, and is suitable for milks containing 5 × 10(3) to 5 × 10(8) bacteria per ml. PMID:16345515

  18. Evaluation of accuracy and precision of a smartphone based automated parasite egg counting system in comparison to the McMaster and Mini-FLOTAC methods.

    PubMed

    Scare, J A; Slusarewicz, P; Noel, M L; Wielgus, K M; Nielsen, M K

    2017-11-30

    Fecal egg counts are emphasized for guiding equine helminth parasite control regimens due to the rise of anthelmintic resistance. This, however, poses further challenges, since egg counting results are prone to issues such as operator dependency, method variability, equipment requirements, and time commitment. The use of image analysis software for performing fecal egg counts is promoted in recent studies to reduce the operator dependency associated with manual counts. In an attempt to remove operator dependency associated with current methods, we developed a diagnostic system that utilizes a smartphone and employs image analysis to generate automated egg counts. The aims of this study were (1) to determine precision of the first smartphone prototype, the modified McMaster and ImageJ; (2) to determine precision, accuracy, sensitivity, and specificity of the second smartphone prototype, the modified McMaster, and Mini-FLOTAC techniques. Repeated counts on fecal samples naturally infected with equine strongyle eggs were performed using each technique to evaluate precision. Triplicate counts on 36 egg count negative samples and 36 samples spiked with strongyle eggs at 5, 50, 500, and 1000 eggs per gram were performed using a second smartphone system prototype, Mini-FLOTAC, and McMaster to determine technique accuracy. Precision across the techniques was evaluated using the coefficient of variation. In regards to the first aim of the study, the McMaster technique performed with significantly less variance than the first smartphone prototype and ImageJ (p<0.0001). The smartphone and ImageJ performed with equal variance. In regards to the second aim of the study, the second smartphone system prototype had significantly better precision than the McMaster (p<0.0001) and Mini-FLOTAC (p<0.0001) methods, and the Mini-FLOTAC was significantly more precise than the McMaster (p=0.0228). 
Mean accuracies for the Mini-FLOTAC, McMaster, and smartphone system were 64.51%, 21.67%, and 32.53%, respectively. The Mini-FLOTAC was significantly more accurate than the McMaster (p<0.0001) and the smartphone system (p<0.0001), while the smartphone and McMaster counts did not have statistically different accuracies. Overall, the smartphone system compared favorably to manual methods with regards to precision, and reasonably with regards to accuracy. With further refinement, this system could become useful in veterinary practice. Copyright © 2017 Elsevier B.V. All rights reserved.
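    Accuracy in spiking studies of this kind is typically computed as recovery: the mean observed eggs per gram (EPG) as a percentage of the known spiked EPG. A minimal sketch under that assumption; the triplicate counts below are hypothetical, not the study's data:

    ```python
    def recovery_percent(observed_epg, spiked_epg):
        """Mean observed EPG as a percentage of the known spiked EPG."""
        mean_observed = sum(observed_epg) / len(observed_epg)
        return 100.0 * mean_observed / spiked_epg

    # Hypothetical triplicate counts on a sample spiked at 500 EPG:
    print(recovery_percent([300, 320, 340], 500))  # 64.0
    ```
    
    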

  19. Simple Counts of ADL Dependencies Do Not Adequately Reflect Older Adults' Preferences toward States of Functional Impairment

    PubMed Central

    Sims, Tamara; Holmes, Tyson H.; Bravata, Dena M.; Garber, Alan M.; Nelson, Lorene M.; Goldstein, Mary K.

    2008-01-01

    Objective To use unweighted counts of dependencies in Activities of Daily Living (ADLs) to assess the impact of functional impairment requires an assumption of equal preferences for each ADL dependency. To test this assumption, we analyzed standard gamble utilities of single and combination ADL dependencies among older adults. Study Design and Setting: Four hundred older adults used multimedia software (FLAIR1) to report standard gamble utilities for their current health and hypothetical health states of dependency in each of 7 ADLs and 8 of 30 combinations of ADL dependencies. Results Utilities for health states of multiple ADL dependencies were often greater than for states of single ADL dependencies. Dependence in eating, the ADL dependency with the lowest utility rating of the single ADL dependencies, ranked lower than 7 combination states. Similarly, some combination states with fewer ADL dependencies had lower utilities than those with more ADL dependencies. These findings were consistent across groups by gender, age, and education. Conclusion Our results suggest that the count of ADL dependencies does not adequately represent the utility for a health state. Cost-effectiveness analyses and other evaluations of programs that prevent or treat functional dependency should apply utility-weights rather than relying on simple ADL counts. PMID:18722749

  20. Real-time direct measurement of liquid (water) evaporation by a simple disturbance-inhibited interferometry technique

    NASA Astrophysics Data System (ADS)

    Kim, Yong Gi

    2017-11-01

    A real-time in-situ interferometry method was proposed to measure water (liquid) evaporation directly over the liquid surface inside the reservoir. The direct evaporation measurement relied on counting the number of sinusoidal fringes. As the water inside the reservoir evaporated, the depth of the water decreased slightly and thus the optical path length changed. Evaporation signals were determined as a function of the focusing position of the signal beam over the liquid surface. In interferometry, the most limiting factors are surface disturbances and vibrations over the liquid surface. This limiting factor was simply inhibited by placing a long cylindrical aluminum tube around the signal beam of the interferometer over the liquid surface. A small-diameter cylindrical Al tube diminished vibrations and wind-induced surface ripples more effectively than a larger one. Water evaporation was successfully measured in real time with warm and cold water, even under windy conditions created with an electric fan. The experimental results demonstrated that the interferometry technique allows determination of liquid evaporation in real time. The interferometric technique opens up a new possibility for liquid evaporation measurement even under several environmental disturbances, such as vibration, surface disturbance, temperature change and wind.
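    The abstract does not give the fringe-to-depth conversion, but in a reflection interferometer of this kind one full fringe typically corresponds to a half-wavelength change in surface height, since the beam traverses the change twice. A sketch under that assumption, using a hypothetical He-Ne wavelength of 632.8 nm:

    ```python
    # Depth change inferred from a fringe count, assuming one fringe per
    # half-wavelength of surface-height change (double-pass reflection).
    WAVELENGTH_NM = 632.8  # hypothetical He-Ne laser wavelength

    def evaporated_depth_um(fringe_count):
        """Evaporated water depth in micrometers for a given fringe count."""
        return fringe_count * (WAVELENGTH_NM / 2) / 1000.0  # nm -> um

    print(evaporated_depth_um(100))  # ~31.64 um of water lost
    ```
    
    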

  1. Can simple mobile phone applications provide reliable counts of respiratory rates in sick infants and children? An initial evaluation of three new applications.

    PubMed

    Black, James; Gerdtz, Marie; Nicholson, Pat; Crellin, Dianne; Browning, Laura; Simpson, Julie; Bell, Lauren; Santamaria, Nick

    2015-05-01

    Respiratory rate is an important sign that is commonly either not recorded or recorded incorrectly. Mobile phone ownership is increasing even in resource-poor settings. Phone applications may improve the accuracy and ease of counting of respiratory rates. The study assessed the reliability and initial users' impressions of four mobile phone respiratory timer approaches, compared to a 60-second count by the same participants. Three mobile applications (applying four different counting approaches plus a standard 60-second count) were created using the Java Mobile Edition and tested on Nokia C1-01 phones. Apart from the 60-second timer application, the others included a counter based on the time for ten breaths, and three based on the time interval between breaths ('Once-per-Breath', in which the user presses for each breath and the application calculates the rate after 10 or 20 breaths, or after 60s). Nursing and physiotherapy students used the applications to count respiratory rates in a set of brief video recordings of children with different respiratory illnesses. Limits of agreement (compared to the same participant's standard 60-second count), intra-class correlation coefficients and standard errors of measurement were calculated to compare the reliability of the four approaches, and a usability questionnaire was completed by the participants. There was considerable variation in the counts, with large components of the variation related to the participants and the videos, as well as the methods. None of the methods was entirely reliable, with no limits of agreement better than -10 to +9 breaths/min. Some of the methods were superior to the others, with ICCs from 0.24 to 0.92. By ICC the Once-per-Breath 60-second count and the Once-per-Breath 20-breath count were the most consistent, better even than the 60-second count by the participants. The 10-breath approaches performed least well. 
Users' initial impressions were positive, with little difference between the applications found. This study provides evidence that applications running on simple phones can be used to count respiratory rates in children. The Once-per-Breath methods are the most reliable, outperforming the 60-second count. For children with raised respiratory rates the 20-breath version of the Once-per-Breath method is faster, so it is a more suitable option where health workers are under time pressure. Copyright © 2015 Elsevier Ltd. All rights reserved.
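    The 'Once-per-Breath' approach, in which the user presses once per breath and the application computes the rate from the elapsed time, reduces to simple arithmetic. A minimal sketch, assuming each tap timestamp marks one breath; the timestamps below are invented:

    ```python
    def respiratory_rate(tap_times_s):
        """Breaths per minute from per-breath tap timestamps (seconds).
        n taps span n-1 breath intervals."""
        breaths = len(tap_times_s) - 1
        elapsed = tap_times_s[-1] - tap_times_s[0]
        return 60.0 * breaths / elapsed

    # Hypothetical: 11 taps over 15 s -> 10 breaths in 15 s = 40/min,
    # a raised respiratory rate for a child.
    taps = [1.5 * i for i in range(11)]
    print(respiratory_rate(taps))  # 40.0
    ```
    
    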

  2. A clocked high-pass-filter-based offset cancellation technique for high-gain biomedical amplifiers

    NASA Astrophysics Data System (ADS)

    Pal, Dipankar; Goswami, Manish

    2010-05-01

    In this article, a simple offset cancellation technique based on a clocked high-pass filter with extremely low output offset is presented. The configuration uses the on-resistance of a complementary metal oxide semiconductor (CMOS) transmission gate (X-gate) and tunes the lower 3-dB cut-off frequency with a matched pair of floating capacitors. The results compare favourably with the more complex auto-zeroing and chopper stabilisation techniques of offset cancellation in terms of power dissipation, component count and bandwidth, while reporting inferior output noise performance. The design is suitable for use in biomedical amplifier systems for applications such as ENG-recording. The system is simulated in Spectre Cadence 5.1.41 using 0.6 μm CMOS technology and the total block gain is ∼83.0 dB while the phase error is <5°. The power consumption is 10.2 mW and the output offset obtained for an input monotone signal of 5 μVpp is 1.28 μV. The input-referred root mean square noise voltage between 1 and 5 kHz is 26.32 nV/√Hz.

  3. Early endothelial damage detected by circulating particles in baboons fed a diet high in simple carbohydrates in conjunction with saturated or unsaturated fat.

    PubMed

    Shi, Qiang; Hodara, Vida; Meng, Qinghe; Voruganti, V Saroja; Rice, Karen; Michalek, Joel E; Comuzzie, Anthony G; VandeBerg, John L

    2014-01-01

    Studies have shown that high-fat diets cause blood vessel damage; however, assessing pathological effects accurately and efficiently is difficult. In this study, we measured particle levels of static endothelium (CD31+ and CD105+) and activated endothelium (CD62E+, CD54+ and CD106+) in plasma. We determined individual responses to two dietary regimens in two groups of baboons. One group (n = 10) was fed a diet high in simple carbohydrates and saturated fats (the HSF diet) and the other (n = 8) received a diet high in simple carbohydrates and unsaturated fats (the HUF diet). Plasma samples were collected at 0, 3, and 7 weeks. The percentages of CD31+ and CD62E+ particles were elevated at 3 weeks in animals fed either diet, but these elevations were statistically significant only in animals fed the HUF diet. Surprisingly, both percentages and counts of CD31+ particles were significantly lower at week 7 compared to weeks 0 and 3 in the HSF group. The median absolute counts of CD105+ particles were progressively elevated over time in the HSF group, with a significant increase from week 0 to 7; the pattern was somewhat different for the HUF group, with a significant increase from week 3 to 7. The counts of CD54+ particles exhibited wide variation in both groups during the dietary challenge, while the median counts of CD106+ particles were significantly lower at week 3 than at weeks 0 and 7. Endothelial particles exhibited time-dependent changes, suggesting they were behaving as quantifiable surrogates for the early detection of vascular damage caused by dietary factors.

  4. Determination of mammalian cell counts, cell size and cell health using the Moxi Z mini automated cell counter.

    PubMed

    Dittami, Gregory M; Sethi, Manju; Rabbitt, Richard D; Ayliffe, H Edward

    2012-06-21

    Particle and cell counting is used for a variety of applications including routine cell culture, hematological analysis, and industrial controls(1-5). A critical breakthrough in cell/particle counting technologies was the development of the Coulter technique by Wallace Coulter over 50 years ago. The technique involves the application of an electric field across a micron-sized aperture and hydrodynamically focusing single particles through the aperture. The resulting occlusion of the aperture by the particles yields a measurable change in electric impedance that can be directly and precisely correlated to cell size/volume. The recognition of the approach as the benchmark in cell/particle counting stems from the extraordinary precision and accuracy of its particle sizing and counts, particularly as compared to manual and imaging based technologies (accuracies on the order of 98% for Coulter counters versus 75-80% for manual and vision-based systems). This can be attributed to the fact that, unlike imaging-based approaches to cell counting, the Coulter Technique makes a true three-dimensional (3-D) measurement of cells/particles which dramatically reduces count interference from debris and clustering by calculating precise volumetric information about the cells/particles. Overall this provides a means for enumerating and sizing cells in a more accurate, less tedious, less time-consuming, and less subjective means than other counting techniques(6). Despite the prominence of the Coulter technique in cell counting, its widespread use in routine biological studies has been prohibitive due to the cost and size of traditional instruments. Although a less expensive Coulter-based instrument has been produced, it has limitations as compared to its more expensive counterparts in the correction for "coincidence events" in which two or more cells pass through the aperture and are measured simultaneously. 
Another limitation of existing Coulter technologies is the lack of metrics on the overall health of cell samples. Consequently, additional techniques must often be used in conjunction with Coulter counting to assess cell viability. This extends experimental setup time and cost, since the traditional methods of viability assessment require cell staining and/or use of expensive and cumbersome equipment such as a flow cytometer. The Moxi Z mini automated cell counter, described here, is an ultra-small benchtop instrument that combines the accuracy of the Coulter Principle with a thin-film sensor technology to enable precise sizing and counting of particles ranging from 3-25 microns, depending on the cell counting cassette used. The Type M cassette can be used to count particles with average diameters of 4-25 microns (dynamic range 2-34 microns), and the Type S cassette can be used to count particles with an average diameter of 3-20 microns (dynamic range 2-26 microns). Since the system uses a volumetric measurement method, the 4-25 micron range corresponds to a cell volume range of 34-8,180 fL and the 3-20 micron range corresponds to a cell volume range of 14-4,200 fL, which is relevant when non-spherical particles are being measured. To perform mammalian cell counts using the Moxi Z, the cells to be counted are first diluted with ORFLO or a similar diluent. A cell counting cassette is inserted into the instrument, and the sample is loaded into the port of the cassette. Thousands of cells are pulled single-file through a "Cell Sensing Zone" (CSZ) in the thin-film membrane over 8-15 seconds. Following the run, the instrument uses proprietary curve-fitting in conjunction with a proprietary software algorithm to provide coincidence event correction along with an assessment of overall culture health by determining the ratio of the number of cells in the population of interest to the total number of particles. 
The total particle counts include shrunken and broken down dead cells, as well as other debris and contaminants. The results are presented in histogram format with an automatic curve fit, with gates that can be adjusted manually as needed. Ultimately, the Moxi Z enables counting with a precision and accuracy comparable to a Coulter Z2, the current gold standard, while providing additional culture health information. Furthermore it achieves these results in less time, with a smaller footprint, with significantly easier operation and maintenance, and at a fraction of the cost of comparable technologies.
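    The diameter-to-volume conversions quoted above follow directly from the volume of a sphere: 1 cubic micron equals 1 fL, so V = (4/3)π(d/2)³. A quick check of the cassette ranges:

    ```python
    import math

    def sphere_volume_fl(diameter_um):
        """Volume in femtoliters of a sphere of the given diameter in
        microns (1 cubic micron = 1 fL)."""
        return (4.0 / 3.0) * math.pi * (diameter_um / 2.0) ** 3

    # Type M cassette: 4-25 micron diameters -> ~34 to ~8,180 fL
    print(round(sphere_volume_fl(4)))   # 34
    print(round(sphere_volume_fl(25)))  # 8181
    # Type S cassette: 3-20 micron diameters -> ~14 to ~4,190 fL
    print(round(sphere_volume_fl(3)))   # 14
    print(round(sphere_volume_fl(20)))  # 4189
    ```
    
    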

  5. Validation of the FFM PD count technique for screening personality pathology in later middle-aged and older adults.

    PubMed

    Van den Broeck, Joke; Rossi, Gina; De Clercq, Barbara; Dierckx, Eva; Bastiaansen, Leen

    2013-01-01

    Research on the applicability of the five factor model (FFM) to capture personality pathology coincided with the development of an FFM personality disorder (PD) count technique, which has been validated in adolescent, young, and middle-aged samples. This study extends the literature by validating this technique in an older sample. Five alternative FFM PD counts based upon the Revised NEO Personality Inventory (NEO PI-R) are computed and evaluated in terms of both convergent and divergent validity with the Assessment of DSM-IV Personality Disorders Questionnaire (ADP-IV; DSM-IV, Diagnostic and Statistical Manual of Mental Disorders - Fourth edition). For the best-working count for each PD, normative data are presented, from which cut-off scores are derived. The validity of these cut-offs and their usefulness as a screening tool is tested against both a categorical (i.e., the DSM-IV - Text Revision) and a dimensional (i.e., the Dimensional Assessment of Personality Pathology; DAPP) measure of personality pathology. All but the Antisocial and Obsessive-Compulsive counts exhibited adequate convergent and divergent validity, supporting the use of this method in older adults. Using the ADP-IV and the DAPP - Short Form as validation criteria, results corroborate the use of the FFM PD count technique to screen for PDs in older adults, in particular for the Paranoid, Borderline, Histrionic, Avoidant, and Dependent PDs. Given the age-neutrality of the NEO PI-R and the considerable lack of valid personality assessment tools, current findings appear to be promising for the assessment of pathology in older adults.

  6. Ultrasound-guided fine-needle aspiration biopsy of clinically suspicious thyroid nodules with an automatic aspirator: a novel technique.

    PubMed

    Nagarajah, James; Sheu-Grabellus, Sien-Yi; Farahati, Jamshid; Kamruddin, Kamer A; Bockisch, Andreas; Schmid, Kurt Werner; Görges, Rainer

    2012-07-01

    Fine-needle aspiration biopsy (FNAB) is a simple technique for the investigation of suspicious thyroid nodules. However, low success rates are reported in the literature. The aim of this prospective study was to compare the clinical performance and impact of an automatic aspirator, referred to here as Aspirator 3, to those of the manual technique for the FNAB of clinically suspicious thyroid nodules. One hundred nine consecutive patients with 121 clinically suspicious thyroid nodules underwent biopsy twice at the same site, with the clinically approved Aspirator 3 and with the manual technique. The number of follicular cell formations and the total number of follicular cells in the aspirate were counted using the ThinPrep® method. With the Aspirator 3, the total number and the mean number of extracted cell formations were significantly higher than the values achieved with the manual technique (total: 3222 vs. 1951, p=0.02; mean: 27 vs. 16). The total number of cells that were biopsied was also higher when the Aspirator 3 was utilized (47,480 vs. 23,080, p=0.005). Overall, the Aspirator 3 was superior in 65 biopsies, and the manual technique was superior in 39 biopsies. In terms of cell formations and the total number of cells aspirated, the Aspirator 3 was superior to the manual technique. Further, the Aspirator 3 was more convenient to use and had greater precision in needle guidance.

  7. Using DNA to test the utility of pellet-group counts as an index of deer counts

    Treesearch

    T. J. Brinkman; D. K. Person; W. Smith; F. Stuart Chapin; K. McCoy; M. Leonawicz; K. Hundertmark

    2013-01-01

    Despite widespread use of fecal pellet-group counts as an index of ungulate density, techniques used to convert pellet-group numbers to ungulate numbers rarely are based on counts of known individuals, seldom evaluated across spatial and temporal scales, and precision is infrequently quantified. Using DNA from fecal pellets to identify individual deer, we evaluated the...

  8. Mapping the acquisition of the number word sequence in the first year of school

    NASA Astrophysics Data System (ADS)

    Gould, Peter

    2017-03-01

    Learning to count and to produce the correct sequence of number words in English is not a simple process. In NSW government schools taking part in Early Action for Success, over 800 students in each of the first 3 years of school were assessed every 5 weeks over the school year to determine the highest correct oral count they could produce. Rather than displaying a steady increase in the accurate sequence of the number words produced, the kindergarten data reported here identified clear, substantial hurdles in the acquisition of the counting sequence. The large-scale, longitudinal data also provided evidence of learning to count through the teens being facilitated by the semi-regular structure of the number words in English. Instead of occurring as hurdles to starting the next counting sequence, number words corresponding to some multiples of ten (10, 20 and 100) acted as if they were rest points. These rest points appear to be artefacts of how the counting sequence is acquired.

  9. [Reassessment of a combination of cerebrospinal fluid scintigraphy and nasal pledget counts in patients with suspected rhinorrhea].

    PubMed

    Kosuda, S; Arai, S; Hohshito, Y; Tokumitsu, H; Kusano, S; Ishihara, S; Shima, K

    1998-07-01

    A combination study of cerebrospinal fluid scintigraphy and nasal pledget counting was performed using 37 MBq of 111In-DTPA in 12 patients with suspected rhinorrhea. A pledget was inserted into each nasal cavity and left to dwell for 6 hours, with the patient prone for at least 30 minutes. A total of 18 studies was performed, and the nasal pledget counting method successfully diagnosed all cases of CSF rhinorrhea. Diagnosis was possible when pledget counts were greater than 1 kcpm. In patients with persistent, intermittent, and occult/no nasal discharge, rhinorrhea was found in 100% (5/5), 60% (3/5), and 25% (2/8) of cases, respectively. Only two cases exhibited positive scintigraphy. MRI or CT cisternography should be performed first in patients with persistent discharge, but in patients with intermittent or occult discharge the pledget counting method might take priority over other diagnostic modalities. In conclusion, the nasal pledget counting method is a simple and useful tool for detecting CSF rhinorrhea.

  10. Quantification of breast density with spectral mammography based on a scanned multi-slit photon-counting detector: A feasibility study

    PubMed Central

    Ding, Huanjun; Molloi, Sabee

    2012-01-01

    Purpose: A simple and accurate measurement of breast density is crucial for understanding its impact in breast cancer risk models. The feasibility of quantifying volumetric breast density with a photon-counting spectral mammography system was investigated using both computer simulations and physical phantom studies. Methods: A computer simulation model involving polyenergetic spectra from a tungsten-anode x-ray tube and a Si-based photon-counting detector was evaluated for breast density quantification. The figure-of-merit (FOM), defined as the signal-to-noise ratio (SNR) of the dual energy image with respect to the square root of the mean glandular dose (MGD), was chosen to optimize the imaging protocols in terms of tube voltage and splitting energy. A scanning multi-slit photon-counting spectral mammography system was employed in the experimental study to quantitatively measure breast density using dual energy decomposition with glandular- and adipose-equivalent phantoms of uniform thickness. Four phantom studies were designed to evaluate the accuracy of the technique, each addressing one specific variable in the phantom configurations: thickness, density, area, and shape. In addition to the standard calibration fitting function used for dual energy decomposition, a modified fitting function was proposed that introduced the tube voltage used in the imaging task as a third variable in the decomposition. Results: For an average-sized breast of 4.5 cm thickness, the FOM was maximized with a tube voltage of 46 kVp and a splitting energy of 24 keV. To be consistent with the tube voltage used in current clinical screening exams (~32 kVp), a splitting energy of 22 keV was proposed, which offered a FOM greater than 90% of the optimal value.
In the experimental investigation, the root-mean-square (RMS) error in breast density quantification across all four phantom studies was approximately 1.54% using the standard calibration function. The modified fitting function, which integrated the tube voltage as a variable in the calibration, yielded an RMS error of approximately 1.35% across all four studies. Conclusions: The results of the current study suggest that photon-counting spectral mammography systems may potentially be implemented for an accurate quantification of volumetric breast density, with an RMS error of less than 2%, using the proposed dual energy imaging technique. PMID:22771941
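
    The dual energy decomposition at the heart of this approach can be illustrated with a toy two-material model. The sketch below treats the low- and high-energy log-attenuation measurements as a 2x2 linear system in the glandular and adipose thicknesses. The attenuation coefficients are purely hypothetical, and the study's actual calibration uses fitted empirical functions rather than a direct matrix solve:

```python
import numpy as np

def decompose_two_materials(log_atten_low, log_atten_high, mu):
    """Solve mu @ t = log_atten for the two material thicknesses t (cm).

    Rows of mu are the energy bins (low, high); columns are the
    materials (glandular, adipose).  Coefficients are in 1/cm.
    """
    b = np.array([log_atten_low, log_atten_high], dtype=float)
    return np.linalg.solve(np.asarray(mu, dtype=float), b)

# Hypothetical effective attenuation coefficients, for illustration only.
mu = np.array([[0.80, 0.55],   # low-energy bin
               [0.45, 0.35]])  # high-energy bin

t_gland, t_adipose = decompose_two_materials(2.95, 1.75, mu)
density = t_gland / (t_gland + t_adipose)  # volumetric breast density
```

    In practice the calibration absorbs beam-hardening and detector effects that this idealized monoenergetic model ignores, which is why the study fits calibration functions to phantom measurements instead.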

  11. Recent trends in counts of migrant hawks from northeastern North America

    USGS Publications Warehouse

    Titus, K.; Fuller, M.R.

    1990-01-01

    Using simple regression, pooled-sites route-regression, and nonparametric rank-trend analyses, we evaluated trends in counts of hawks migrating past 6 eastern hawk lookouts from 1972 to 1987. The indexing variable was the total count for a season. Bald eagle (Haliaeetus leucocephalus), peregrine falcon (Falco peregrinus), merlin (F. columbarius), osprey (Pandion haliaetus), and Cooper's hawk (Accipiter cooperii) counts increased using route-regression and nonparametric methods (P < 0.10). We found no consistent trends (P > 0.10) in counts of sharp-shinned hawks (A. striatus), northern goshawks (A. gentilis), red-shouldered hawks (Buteo lineatus), red-tailed hawks (B. jamaicensis), rough-legged hawks (B. lagopus), and American kestrels (F. sparverius). Broad-winged hawk (B. platypterus) counts declined (P < 0.05) based on the route-regression method. Empirical comparisons of our results with those for well-studied species such as the peregrine falcon, bald eagle, and osprey indicated agreement with nesting surveys. We suggest that counts of migrant hawks are a useful and economical method for detecting long-term trends in species across regions, particularly for species that otherwise cannot be easily surveyed.
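
    The simplest of the three trend analyses can be sketched as a log-linear regression of seasonal totals against year, which yields an estimated proportional change per year. The seasonal totals below are invented for illustration; the paper's pooled-sites route-regression is more elaborate:

```python
import numpy as np

def annual_trend(years, season_totals):
    """Log-linear trend in yearly migration counts.

    Fits log(count) = a + b*year by least squares; exp(b) - 1 is the
    estimated proportional change per year.
    """
    b, a = np.polyfit(np.asarray(years, float),
                      np.log(np.asarray(season_totals, float)), 1)
    return np.exp(b) - 1.0

# Invented seasonal totals for one lookout, 1972-1987, simulating
# roughly 4% growth per year plus rounding to whole birds.
years = np.arange(1972, 1988)
counts = np.round(120.0 * 1.04 ** (years - 1972))
trend = annual_trend(years, counts)  # close to +0.04 per year
```

    Real count series would also need the significance testing and multi-site pooling the paper describes before a trend could be claimed.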

  12. Stimulus novelty, task relevance and the visual evoked potential in man

    NASA Technical Reports Server (NTRS)

    Courchesne, E.; Hillyard, S. A.; Galambos, R.

    1975-01-01

    The effect of task relevance on P3 (waveform of human evoked potential) waves and the methodologies used to deal with them are outlined. Visual evoked potentials (VEPs) were recorded from normal adult subjects performing in a visual discrimination task. Subjects counted the number of presentations of the numeral 4 which was interposed rarely and randomly within a sequence of tachistoscopically flashed background stimuli. Intrusive, task-irrelevant (not counted) stimuli were also interspersed rarely and randomly in the sequence of 2s; these stimuli were of two types: simples, which were easily recognizable, and novels, which were completely unrecognizable. It was found that the simples and the counted 4s evoked posteriorly distributed P3 waves while the irrelevant novels evoked large, frontally distributed P3 waves. These large, frontal P3 waves to novels were also found to be preceded by large N2 waves. These findings indicate that the P3 wave is not a unitary phenomenon but should be considered in terms of a family of waves, differing in their brain generators and in their psychological correlates.

  13. Image-based spectral distortion correction for photon-counting x-ray detectors

    PubMed Central

    Ding, Huanjun; Molloi, Sabee

    2012-01-01

    Purpose: To investigate the feasibility of using an image-based method to correct for distortions induced by various artifacts in the x-ray spectrum recorded with photon-counting detectors for their application in breast computed tomography (CT). Methods: The polyenergetic incident spectrum was simulated with the tungsten anode spectral model using the interpolating polynomials (TASMIP) code and carefully calibrated to match the x-ray tube in this study. Experiments were performed on a Cadmium-Zinc-Telluride (CZT) photon-counting detector with five energy thresholds. Energy bins were adjusted to evenly distribute the recorded counts above the noise floor. BR12 phantoms of various thicknesses were used for calibration. A nonlinear function was selected to fit the count correlation between the simulated and the measured spectra in the calibration process. To evaluate the proposed spectral distortion correction method, an empirical fitting derived from the calibration process was applied on the raw images recorded for polymethyl methacrylate (PMMA) phantoms of 8.7, 48.8, and 100.0 mm. Both the corrected counts and the effective attenuation coefficient were compared to the simulated values for each of the five energy bins. The feasibility of applying the proposed method to quantitative material decomposition was tested using a dual-energy imaging technique with a three-material phantom that consisted of water, lipid, and protein. The performance of the spectral distortion correction method was quantified using the relative root-mean-square (RMS) error with respect to the expected values from simulations or areal analysis of the decomposition phantom. Results: The implementation of the proposed method reduced the relative RMS error of the output counts in the five energy bins with respect to the simulated incident counts from 23.0%, 33.0%, and 54.0% to 1.2%, 1.8%, and 7.7% for 8.7, 48.8, and 100.0 mm PMMA phantoms, respectively. 
The accuracy of the effective attenuation coefficient of PMMA estimate was also improved with the proposed spectral distortion correction. Finally, the relative RMS error of water, lipid, and protein decompositions in dual-energy imaging was significantly reduced from 53.4% to 6.8% after correction was applied. Conclusions: The study demonstrated that dramatic distortions in the recorded raw image yielded from a photon-counting detector could be expected, which presents great challenges for applying the quantitative material decomposition method in spectral CT. The proposed semi-empirical correction method can effectively reduce these errors caused by various artifacts, including pulse pileup and charge sharing effects. Furthermore, rather than detector-specific simulation packages, the method requires a relatively simple calibration process and knowledge about the incident spectrum. Therefore, it may be used as a generalized procedure for the spectral distortion correction of different photon-counting detectors in clinical breast CT systems. PMID:22482608

  14. Complexity and Approximability of Quantified and Stochastic Constraint Satisfaction Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunt, H. B.; Marathe, M. V.; Stearns, R. E.

    2001-01-01

    Let D be an arbitrary (not necessarily finite) nonempty set, let C be a finite set of constant symbols denoting arbitrary elements of D, and let S and T be arbitrary finite sets of finite-arity relations on D. We denote the problem of determining the satisfiability of finite conjunctions of relations in S applied to variables (to variables and symbols in C) by SAT(S) (by SATc(S)). Here, we study simultaneously the complexity of decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. We present simple yet general techniques to characterize simultaneously the complexity or efficient approximability of a number of versions/variants of the problems SAT(S), Q-SAT(S), S-SAT(S), MAX-Q-SAT(S), etc., for many different such D, C, S, T. These versions/variants include decision, counting, maximization and approximate maximization problems, for unquantified, quantified and stochastically quantified formulas. Our unified approach is based on the following two basic concepts: (i) strongly-local replacements/reductions and (ii) relational/algebraic representability. Some of the results extend earlier results in [Pa85, LMP99, CF+93, CF+94]. Our techniques and results reported here also provide significant steps towards obtaining dichotomy theorems for a number of the problems above, including the problems MAX-Q-SAT(S) and MAX-S-SAT(S). The discovery of such dichotomy theorems, for unquantified formulas, has received significant recent attention in the literature [CF+93, CF+94, Cr95, KSW97].
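
    To make the SAT(S) notation concrete, here is a minimal brute-force sketch, not the paper's technique (which concerns complexity characterizations), that decides and counts satisfying assignments for a conjunction of relations drawn from a finite set S over a finite domain D:

```python
from itertools import product

def sat_count(domain, variables, constraints):
    """Count satisfying assignments of a conjunction of constraints.

    Each constraint is a pair (relation, vars), where the relation is
    a set of tuples over `domain`.  Brute force: exponential in the
    number of variables, so only for tiny instances.
    """
    n = 0
    for values in product(domain, repeat=len(variables)):
        assignment = dict(zip(variables, values))
        if all(tuple(assignment[v] for v in vs) in rel
               for rel, vs in constraints):
            n += 1
    return n

# SAT(S) with S = {NOT-EQUAL} over D = {0, 1, 2}: proper 3-colourings
# of a triangle on variables x, y, z.
neq = {(i, j) for i in range(3) for j in range(3) if i != j}
count = sat_count(range(3), "xyz", [(neq, "xy"), (neq, "yz"), (neq, "xz")])
# count == 6, i.e. 3! proper colourings
```

    A positive count answers the decision problem SAT(S); the count itself is the counting version #SAT(S) studied in the abstract.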

  15. LEGO bricks used as chemotactic chambers: evaluation by a computer-assisted image analysis technique.

    PubMed

    Azzarà, A; Chimenti, M

    2004-01-01

    One of the main techniques used to explore neutrophil motility employs micropore filters in chemotactic chambers. Many new models have been proposed in order to perform multiple microassays in a rapid, inexpensive and reproducible way. In this work, LEGO bricks were used as chemotactic chambers in the evaluation of neutrophil random motility and chemotaxis and compared with conventional Boyden chambers in a "time-response" experiment. Neutrophil motility throughout the filters was evaluated by means of an image-processing workstation, in which a dedicated algorithm recognizes and counts the cells in several fields and focal planes throughout the whole filter; correlates counts and depth values; performs a statistical analysis of the data; calculates the true value of neutrophil migration; determines the distribution of cells; and displays the migration pattern. By this method, we found that the distances travelled by the cells in conventional chambers and in LEGO bricks were identical, both in random migration and under chemotactic conditions. Moreover, no interference with the physiological behaviour of neutrophils was detectable. In fact, the kinetics of migration was identical both in random migration (characterized by a Gaussian pattern) and in chemotaxis (characterized by a typical stimulation peak, previously identified by our workstation). In conclusion, LEGO bricks are extremely precise devices. They are simple to use, require only small amounts of chemoattractant solution and cell suspension, and each brick by itself supplies a triplicate test. LEGO bricks are inexpensive, fast and suitable for routine diagnostic activity or for research investigations in every laboratory.

  16. Counting Synapses Using FIB/SEM Microscopy: A True Revolution for Ultrastructural Volume Reconstruction.

    PubMed

    Merchán-Pérez, Angel; Rodriguez, José-Rodrigo; Alonso-Nanclares, Lidia; Schertel, Andreas; Defelipe, Javier

    2009-01-01

    The advent of transmission electron microscopy (TEM) in the 1950s represented a fundamental step in the study of neuronal circuits. The application of this technique soon led to the realization that the number of synapses changes during the course of normal life, as well as under certain pathological or experimental circumstances. Since then, one of the main goals in the neurosciences has been to define simple and accurate methods to estimate the magnitude of these changes. In contrast to the analysis of single sections, TEM reconstructions are extremely time-consuming and difficult. Therefore, most quantitative studies use stereological methods to define the three-dimensional characteristics of synaptic junctions that are studied in two dimensions. Here, to count the exact number of synapses per unit volume, we have applied a new three-dimensional reconstruction method that combines focused ion beam milling and scanning electron microscopy (FIB/SEM). We show that the images obtained with FIB/SEM are similar to those obtained with TEM, but with the advantage that FIB/SEM permits serial reconstructions of large volumes of tissue to be generated rapidly and automatically. Furthermore, we compared the estimates of the number of synapses obtained with stereological methods with the values obtained by FIB/SEM reconstructions. We conclude that FIB/SEM not only provides the actual number of synapses per volume but is also much easier and faster to use than other currently available TEM methods. More importantly, it also avoids most of the errors introduced by stereological methods and overcomes the difficulties associated with these techniques.

  17. Primary Standardization of 152Eu by 4πβ(LS)–γ(NaI) coincidence counting and the CIEMAT-NIST method

    NASA Astrophysics Data System (ADS)

    Ruzzarin, A.; da Cruz, P. A. L.; Ferreira Filho, A. L.; Iwahara, A.

    2018-03-01

    The 4πβ-γ coincidence counting and CIEMAT/NIST liquid scintillation methods were used in the standardization of a solution of 152Eu. In the CIEMAT/NIST method, measurements were performed in a Wallac 1414 liquid scintillation counter. In the 4πβ-γ coincidence counting, the solution was standardized using a coincidence method with beta-efficiency extrapolation. A simple 4πβ-γ coincidence system was used, with an acrylic scintillation cell coupled to two coincident photomultipliers at 180° to each other and a NaI(Tl) detector. The activity concentrations obtained were 156.934 ± 0.722 and 157.403 ± 0.113 kBq/g for the CIEMAT/NIST and 4πβ-γ coincidence counting methods, respectively.
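
    The beta-efficiency extrapolation can be sketched as a linear fit: the apparent activity Nβ·Nγ/Nc is plotted against the inefficiency parameter (1-εβ)/εβ, with εβ estimated as Nc/Nγ, and extrapolated to εβ = 1. The count data below are synthetic with an assumed linear efficiency dependence; real measurements require dead-time, resolving-time and background corrections that are omitted here:

```python
import numpy as np

def extrapolated_activity(n_beta, n_gamma, n_coinc):
    """Beta-efficiency extrapolation for 4πβ-γ coincidence counting.

    εβ ≈ Nc/Nγ; fit Nβ·Nγ/Nc against (1 - εβ)/εβ and return the
    intercept, the activity estimate at εβ = 1.
    """
    n_beta = np.asarray(n_beta, float)
    n_gamma = np.asarray(n_gamma, float)
    n_coinc = np.asarray(n_coinc, float)
    eff = n_coinc / n_gamma
    x = (1.0 - eff) / eff
    y = n_beta * n_gamma / n_coinc
    slope, intercept = np.polyfit(x, y, 1)
    return intercept

# Synthetic example: true activity 157.0 (arbitrary rate units), with
# counts generated over a range of beta efficiencies and a linear
# efficiency dependence of slope k.
true_a, k = 157.0, 5.0
eff = np.linspace(0.6, 0.95, 8)
n_gamma = np.full_like(eff, 900.0)
n_coinc = eff * n_gamma
n_beta = (true_a + k * (1.0 - eff) / eff) * n_coinc / n_gamma
activity = extrapolated_activity(n_beta, n_gamma, n_coinc)  # ≈ 157.0
```

    Varying the beta efficiency experimentally (e.g. by electronic discrimination) supplies the range of x values needed for the extrapolation.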

  18. Comparison of McMaster and FECPAKG2 methods for counting nematode eggs in the faeces of alpacas.

    PubMed

    Rashid, Mohammed H; Stevenson, Mark A; Waenga, Shea; Mirams, Greg; Campbell, Angus J D; Vaughan, Jane L; Jabbar, Abdul

    2018-05-02

    This study aimed to compare the FECPAKG2 and McMaster techniques for counting gastrointestinal nematode eggs in the faeces of alpacas using two flotation solutions (saturated sodium chloride and saturated sucrose). Faecal egg counts from the two techniques were compared using Lin's concordance correlation coefficient and Bland and Altman statistics. Results showed moderate to good agreement between the two methods, with better agreement achieved when saturated sucrose is used as the flotation fluid, particularly when faecal egg counts are less than 1000 eggs per gram of faeces. To the best of our knowledge, this is the first study to assess agreement between the McMaster and FECPAKG2 methods for estimating faecal egg counts in South American camelids.
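
    Lin's concordance correlation coefficient, used here to compare the two counting methods, can be computed directly from its definition; it penalises both poor correlation and systematic bias between methods. The eggs-per-gram counts below are hypothetical:

```python
import numpy as np

def lins_ccc(x, y):
    """Lin's concordance correlation coefficient between paired counts.

    CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
    Equals 1 only for perfect agreement (y == x), unlike Pearson's r.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.mean((x - x.mean()) * (y - y.mean()))
    return 2.0 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Hypothetical eggs-per-gram counts from the two methods on 8 samples.
mcmaster = [150, 300, 450, 600, 900, 1200, 1800, 2400]
fecpak   = [120, 330, 420, 660, 870, 1260, 1700, 2500]
ccc = lins_ccc(mcmaster, fecpak)  # close to 1 indicates good agreement
```

    A complete comparison would pair this with Bland and Altman limits of agreement, as the study does, since a single coefficient can hide count-dependent bias.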

  19. Repeatability of paired counts.

    PubMed

    Alexander, Neal; Bethony, Jeff; Corrêa-Oliveira, Rodrigo; Rodrigues, Laura C; Hotez, Peter; Brooker, Simon

    2007-08-30

    The Bland and Altman technique is widely used to assess the variation between replicates of a method of clinical measurement. It yields the repeatability, i.e. the value within which 95 per cent of repeat measurements lie. Valid use of the technique requires that the variance is constant over the data range. This is not usually the case for counts of items such as CD4 cells or parasites, nor is the log transformation applicable to zero counts. We investigate the properties of generalized differences based on Box-Cox transformations. As an example, in a data set of hookworm eggs counted by the Kato-Katz method, the square root transformation is found to stabilize the variance. We show how to back-transform the repeatability on the square root scale to the repeatability of the counts themselves, as an increasing function of the square mean root egg count, i.e. the square of the average of square roots. As well as being more easily interpretable, the back-transformed results highlight the dependence of the repeatability on the sample volume used.
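
    A minimal sketch of the approach: compute the repeatability of the square-root-transformed counts, then back-transform it to the count scale as a function of the square mean root count, using x1 - x2 = (√x1 - √x2)(√x1 + √x2) with (√x1 + √x2)/2 equal to the root of the square mean root. The duplicate counts below are invented, and the 1.96·SD form is a simplification of the full Bland and Altman treatment:

```python
import numpy as np

def sqrt_scale_repeatability(counts1, counts2):
    """Repeatability of paired counts on the square-root scale.

    Differences of sqrt-transformed counts are assumed to have
    roughly constant variance (the Bland-Altman requirement); the
    repeatability is taken as 1.96 * SD of those differences.
    """
    d = np.sqrt(np.asarray(counts1, float)) - np.sqrt(np.asarray(counts2, float))
    return 1.96 * np.std(d, ddof=1)

def back_transformed_repeatability(r_sqrt, square_mean_root):
    """Repeatability on the count scale at a given square mean root count."""
    return 2.0 * r_sqrt * np.sqrt(square_mean_root)

# Invented duplicate Kato-Katz egg counts from the same specimens.
c1 = np.array([0, 24, 48, 96, 120, 240, 480, 960])
c2 = np.array([4, 12, 60, 72, 144, 216, 528, 912])
r = sqrt_scale_repeatability(c1, c2)
# repeatability of the counts at a square mean root count of 100 eggs:
limit = back_transformed_repeatability(r, 100.0)
```

    Note how the back-transformed limit grows with the square mean root count, matching the paper's point that repeatability on the count scale is not a single constant.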

  20. Effect of a syringe aspiration technique versus a mechanical suction technique and use of N-butylscopolammonium bromide on the quantity and quality of bronchoalveolar lavage fluid samples obtained from horses with the summer pasture endophenotype of equine asthma.

    PubMed

    Bowser, Jacquelyn E; Costa, Lais R R; Rodil, Alba U; Lopp, Christine T; Johnson, Melanie E; Wills, Robert W; Swiderski, Cyprianna E

    2018-03-01

    OBJECTIVE To evaluate the effect of 2 bronchoalveolar lavage (BAL) sampling techniques and the use of N-butylscopolammonium bromide (NBB) on the quantity and quality of BAL fluid (BALF) samples obtained from horses with the summer pasture endophenotype of equine asthma. ANIMALS 8 horses with the summer pasture endophenotype of equine asthma. PROCEDURES BAL was performed bilaterally (right and left lung sites) with a flexible videoendoscope passed through the left or right nasal passage. During lavage of the first lung site, a BALF sample was collected by means of either gentle syringe aspiration or mechanical suction with a pressure-regulated wall-mounted suction pump. The endoscope was then maneuvered into the contralateral lung site, and lavage was performed with the alternate fluid retrieval technique. For each horse, BAL was performed bilaterally once with and once without premedication with NBB (21-day interval). The BALF samples retrieved were evaluated for volume, total cell count, differential cell count, RBC count, and total protein concentration. RESULTS Use of syringe aspiration significantly increased total BALF volume (mean volume increase, 40 mL [approx 7.5% yield]) and decreased total RBC count (mean decrease, 142 cells/μL), compared with use of mechanical suction. The BALF nucleated cell count and differential cell count did not differ between BAL procedures. Use of NBB had no effect on BALF retrieval. CONCLUSIONS AND CLINICAL RELEVANCE Results indicated that retrieval of BALF by syringe aspiration may increase yield and reduce barotrauma in horses at increased risk of bronchoconstriction and bronchiolar collapse. Further studies to determine the usefulness of NBB and other bronchodilators during BAL procedures in horses are warranted.

  1. Bacteriological quality of drinking water from source to household in Ibadan, Nigeria.

    PubMed

    Oloruntoba, E O; Sridhar, M K C

    2007-06-01

    The bacteriological quality of drinking water from well, spring, borehole, and tap sources, and that stored in containers by urban households in Ibadan, was assessed during wet and dry seasons. The most probable number (MPN) technique was used to detect and enumerate coliforms in water samples. Results showed that the majority of households relied on wells, which were found to be the most contaminated of all the sources. At the household level, water quality significantly deteriorated after collection and storage as a result of poor handling. Furthermore, there was significant seasonal variation in E. coli counts at the source (P=0.013) and household (P=0.001) levels. The study concludes that there is a need to improve the microbial quality of drinking water at the source and household levels through hygiene education and the provision of simple, acceptable, low-cost treatment methods.

  2. The application of laser Rayleigh scattering to gas density measurements in hypersonic helium flows

    NASA Technical Reports Server (NTRS)

    Hoppe, J. C.; Honaker, W. C.

    1979-01-01

    Measurements of the mean static free-stream gas density have been made in two Langley Research Center helium facilities, the 3-inch leg of the high-Reynolds-number helium complex and the 22-inch hypersonic helium tunnel. Rayleigh scattering of a CW argon ion laser beam at 514.5 nm provided the basic physical mechanism. The behavior of the scattered signal was linear, confirmed by a preliminary laboratory study. That study also revealed the need to introduce baffles to reduce stray light. A relatively simple optical system and associated photon-counting electronics were utilized to obtain data for densities from 10 to the 23rd to 10 to the 25th per cu m. The major purpose, to confirm the applicability of this technique in the hypersonic helium flow, was accomplished.

  3. A Fluorescent Marking and Re-count Technique Using the Invasive Earthworm, Pontoscolex corethrurus (Annelida: Oligochaeta)

    Treesearch

    Grizelle Gonzalez; Elianid Espinoza; Zhigang Liu; Xiaoming Zou

    2006-01-01

    We used a fluorescence technique to mark and re-count the invasive earthworm, Pontoscolex corethrurus from PVC tubes established in a forest and a recently abandoned pasture in Puerto Rico to test the effects of the labeling treatment on earthworm population survival over time. A fluorescent marker was injected into the earthworms in the middle third section of the...

  4. Reliable enumeration of malaria parasites in thick blood films using digital image analysis.

    PubMed

    Frean, John A

    2009-09-23

    Quantitation of malaria parasite density is an important component of laboratory diagnosis of malaria. Microscopy of Giemsa-stained thick blood films is the conventional method for parasite enumeration. Accurate and reproducible parasite counts are difficult to achieve, because of inherent technical limitations and human inconsistency. Inaccurate parasite density estimation may have adverse clinical and therapeutic implications for patients, and for endpoints of clinical trials of anti-malarial vaccines or drugs. Digital image analysis provides an opportunity to improve performance of parasite density quantitation. Accurate manual parasite counts were done on 497 images of a range of thick blood films with varying densities of malaria parasites, to establish a uniformly reliable standard against which to assess the digital technique. By utilizing descriptive statistical parameters of parasite size frequency distributions, particle counting algorithms of the digital image analysis programme were semi-automatically adapted to variations in parasite size, shape and staining characteristics, to produce optimum signal/noise ratios. A reliable counting process was developed that requires no operator decisions that might bias the outcome. Digital counts were highly correlated with manual counts for medium to high parasite densities, and slightly less well correlated with conventional counts. At low densities (fewer than 6 parasites per analysed image) signal/noise ratios were compromised and correlation between digital and manual counts was poor. Conventional counts were consistently lower than both digital and manual counts. Using open-access software and avoiding custom programming or any special operator intervention, accurate digital counts were obtained, particularly at high parasite densities that are difficult to count conventionally. The technique is potentially useful for laboratories that routinely perform malaria parasite enumeration. 
The technique requires only a digital microscope camera, a personal computer, and good-quality staining of slides, requirements that are reasonably easy to meet.
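
    The core counting step, thresholding followed by size filtering to reject noise, can be sketched as a connected-component count on a binary image. This is an illustrative stand-in written from scratch, not the open-access software the study used:

```python
import numpy as np

def count_particles(binary, min_size=3):
    """Count 4-connected foreground regions of at least `min_size` pixels.

    Flood-fill labelling; the size threshold discards small specks,
    mimicking the size-based noise filtering described above.
    """
    binary = np.asarray(binary, bool)
    seen = np.zeros_like(binary)
    count = 0
    for i, j in zip(*np.nonzero(binary)):
        if seen[i, j]:
            continue
        stack, size = [(i, j)], 0       # flood-fill one component
        seen[i, j] = True
        while stack:
            y, x = stack.pop()
            size += 1
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < binary.shape[0] and 0 <= nx < binary.shape[1]
                        and binary[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
        if size >= min_size:
            count += 1
    return count

# Toy thresholded field: two parasite-sized blobs and one noise pixel.
img = np.zeros((8, 8), bool)
img[1:3, 1:3] = True   # blob of 4 pixels
img[5:7, 4:7] = True   # blob of 6 pixels
img[0, 7] = True       # single noise pixel, filtered out
n = count_particles(img)  # -> 2
```

    In the study the size threshold is adapted semi-automatically from the parasite size frequency distribution rather than fixed in advance.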

  5. Statistical precision of the intensities retrieved from constrained fitting of overlapping peaks in high-resolution mass spectra

    DOE PAGES

    Cubison, M. J.; Jimenez, J. L.

    2015-06-05

    Least-squares fitting of overlapping peaks is often needed to separately quantify ions in high-resolution mass spectrometer data. A statistical simulation approach is used to assess the statistical precision of the retrieved peak intensities. The sensitivity of the fitted peak intensities to statistical noise due to ion counting is probed for synthetic data systems consisting of two overlapping ion peaks whose positions are pre-defined and fixed in the fitting procedure. The fitted intensities are sensitive to imperfections in the m/Q calibration. These propagate as a limiting precision in the fitted intensities that may greatly exceed the precision arising from counting statistics. The precision on the fitted peak intensity falls into one of three regimes. In the "counting-limited regime" (regime I), above a peak separation χ ~ 2 to 3 half-widths at half-maximum (HWHM), the intensity precision is similar to that due to counting error for an isolated ion. For smaller χ and higher ion counts (~ 1000 and higher), the intensity precision rapidly degrades as the peak separation is reduced ("calibration-limited regime", regime II). Alternatively, for χ < 1.6 but lower ion counts (e.g. 10–100), the intensity precision is dominated by the additional ion count noise from the overlapping ion and is not affected by the imprecision in the m/Q calibration ("overlapping-limited regime", regime III). The transition between the counting and m/Q calibration-limited regimes is shown to be weakly dependent on resolving power and data spacing and can thus be approximated by a simple parameterisation based only on peak intensity ratios and separation. A simple equation can be used to find potentially problematic ion pairs when evaluating results from fitted spectra containing many ions. Longer integration times can improve the precision in regimes I and III, but a given ion pair can only be moved out of regime II through increased spectrometer resolving power.
As a result, studies presenting data obtained from least-squares fitting procedures applied to mass spectral peaks should explicitly consider these limits on statistical precision.
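
    With peak positions and widths fixed, the least-squares fit described above is linear in the peak intensities. A noise-free sketch for two overlapping Gaussian peaks on a synthetic m/Q axis (all values invented for illustration):

```python
import numpy as np

def fit_two_peaks(mz, signal, mu1, mu2, hwhm):
    """Least-squares intensities of two fixed-position Gaussian peaks.

    With positions and width held fixed in the fit, the problem is
    linear in the two intensities, so an ordinary linear solve suffices.
    """
    sigma = hwhm / np.sqrt(2.0 * np.log(2.0))
    basis = np.column_stack([np.exp(-0.5 * ((mz - mu1) / sigma) ** 2),
                             np.exp(-0.5 * ((mz - mu2) / sigma) ** 2)])
    intensities, *_ = np.linalg.lstsq(basis, signal, rcond=None)
    return intensities

# Synthetic pair separated by 2 HWHM, i.e. the counting-limited regime I.
hwhm = 0.02
sigma = hwhm / np.sqrt(2.0 * np.log(2.0))
mz = np.linspace(99.8, 100.4, 400)
signal = (1000.0 * np.exp(-0.5 * ((mz - 100.00) / sigma) ** 2)
          + 400.0 * np.exp(-0.5 * ((mz - 100.04) / sigma) ** 2))
I1, I2 = fit_two_peaks(mz, signal, 100.00, 100.04, hwhm)
```

    Repeating this fit on many Poisson-noise realisations, and with small perturbations of mu1 and mu2 to mimic m/Q calibration error, reproduces the regime behaviour the abstract describes.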

  6. Raman Spectroscopy for Mineral Identification and Quantification for in situ Planetary Surface Analysis: A Point Count Method

    NASA Technical Reports Server (NTRS)

    Haskin, Larry A.; Wang, Alian; Rockow, Kaylynn M.; Jolliff, Bradley L.; Korotev, Randy L.; Viskupic, Karen M.

    1997-01-01

    Quantification of mineral proportions in rocks and soils by Raman spectroscopy on a planetary surface is best done by taking many narrow-beam spectra from different locations on the rock or soil, with each spectrum yielding peaks from only one or two minerals. The proportion of each mineral in the rock or soil can then be determined from the fraction of the spectra that contain its peaks, in analogy with the standard petrographic technique of point counting. The method can also be used for nondestructive laboratory characterization of rock samples. Although Raman peaks for different minerals seldom overlap each other, it is impractical to obtain proportions of constituent minerals by Raman spectroscopy through analysis of peak intensities in a spectrum obtained by broad-beam sensing of a representative area of the target material. That is because the Raman signal strength produced by a mineral in a rock or soil is not related in a simple way through the Raman scattering cross section of that mineral to its proportion in the rock, and the signal-to-noise ratio of a Raman spectrum is poor when a sample is stimulated by a low-power laser beam of broad diameter. Results obtained by the Raman point-count method are demonstrated for a lunar thin section (14161,7062) and a rock fragment (15273,7039). Major minerals (plagioclase and pyroxene), minor minerals (cristobalite and K-feldspar), and accessory minerals (whitlockite, apatite, and baddeleyite) were easily identified. Identification of the rock types, KREEP basalt or melt rock, from the 100-location spectra was straightforward.
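
    The point-count bookkeeping itself is simple: tally, over many narrow-beam spectra, how often each mineral's peaks are identified, and normalise the tallies. A sketch with invented spot identifications, where a spot showing two minerals contributes one point to each (a simplification of the paper's treatment of mixed spectra):

```python
from collections import Counter

def point_count(spectra_minerals):
    """Modal proportions from narrow-beam Raman point counting.

    `spectra_minerals` lists, for each spot spectrum, the minerals
    whose peaks were identified.  Each identification is one "point",
    and proportions are normalised occurrence fractions, in analogy
    with petrographic point counting.
    """
    tally = Counter(m for spot in spectra_minerals for m in spot)
    total = sum(tally.values())
    return {m: n / total for m, n in tally.items()}

# Hypothetical 10-spot traverse; one spot shows two minerals.
spots = [["plagioclase"], ["pyroxene"], ["plagioclase", "pyroxene"],
         ["plagioclase"], ["pyroxene"], ["plagioclase"],
         ["pyroxene"], ["plagioclase"], ["apatite"], ["pyroxene"]]
proportions = point_count(spots)
# plagioclase and pyroxene 5/11 each, apatite 1/11
```

    As with petrographic point counting, the statistical precision of the proportions improves with the number of spots, so a traverse of ~100 locations (as in the paper) is typical.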

  7. Generating Discrete Power-Law Distributions from a Death- Multiple Immigration Population Process

    NASA Astrophysics Data System (ADS)

    Matthews, J. O.; Jakeman, E.; Hopcraft, K. I.

    2003-04-01

    We consider the evolution of a simple population process governed by deaths and multiple immigrations that arrive with rates particular to their order. For a particular choice of rates, the equilibrium solution has a discrete power-law form. The model is a generalization of a process investigated previously in which immigrants arrived in pairs [1]. The general properties of this model are discussed in a companion paper. The population is initiated with precisely M individuals present and evolves to an equilibrium distribution with a power-law tail. The power-law tail of the equilibrium distribution is established immediately, however, so that moments and correlation properties of the population are undefined for any non-zero time. The technique we develop to characterize this process utilizes external monitoring that counts the emigrants leaving the population in specified time intervals. This counting distribution also possesses a power-law tail for all sampling times, and the resulting time series exhibits two features worthy of note: first, a large variation in the strength of the signal, reflecting the power-law PDF; and second, intermittency of the emissions. We show that counting with a detector of finite dynamic range naturally regularizes the fluctuations, in effect 'clipping' the events. All previously undefined characteristics, such as the mean, the autocorrelation, and the distributions of the time to the first event and of the time between events, become well defined and are derived. These properties, although obtained by discarding much data, nevertheless possess embedded power-law regimes that characterize the population in a way analogous to the box-averaging determination of fractal dimension.
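
    The regularizing effect of a finite dynamic range can be illustrated numerically: for a power-law count distribution whose mean diverges, the clipped mean is finite and stable. The sampler below is a crude floor-of-Pareto stand-in for the full death-multiple-immigration process, adequate only to show the tail behaviour:

```python
import random

def powerlaw_count(alpha, rng):
    """Sample a heavy-tailed count with P(X >= k) = k**(1 - alpha).

    Inverse transform on a continuous Pareto followed by a floor;
    crude, but it reproduces the discrete power-law tail.
    """
    u = 1.0 - rng.random()          # uniform on (0, 1]
    return int(u ** (-1.0 / (alpha - 1.0)))

def clipped_mean(alpha, clip, n, seed=1):
    """Mean count seen by a detector whose dynamic range ends at `clip`."""
    rng = random.Random(seed)
    return sum(min(powerlaw_count(alpha, rng), clip) for _ in range(n)) / n

# With alpha = 2 the raw mean diverges, yet the clipped mean is finite
# and stable, illustrating the regularization described above.
m = clipped_mean(2.0, clip=1000, n=100_000)
```

    For this sampler the clipped mean approaches the harmonic sum over the detector range, so it grows only logarithmically with the clip level rather than diverging.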

  8. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors.

    PubMed

    Dutton, Neale A W; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K

    2016-07-20

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed.

  9. Application of a first impression triage in the Japan railway west disaster.

    PubMed

    Hashimoto, Atsunori; Ueda, Takahiro; Kuboyama, Kazutoshi; Yamada, Taihei; Terashima, Mariko; Miyawaki, Atsushi; Nakao, Atsunori; Kotani, Joji

    2013-01-01

    On April 25, 2005, a Japanese express train derailed into a building, resulting in 107 deaths and 549 injuries. We used "First Impression Triage (FIT)", our new triage strategy based on general inspection and palpation without counting pulse/respiratory rates, and determined the feasibility of FIT in the chaotic situation of treating a large number of injured people in a brief time period. The subjects included 39 patients who required hospitalization among 113 victims transferred to our hospital. After initial assessment with FIT by an emergency physician, patients were retrospectively reassessed with the preexisting modified Simple Triage and Rapid Treatment (START) methodology, based on Injury Severity Score, probability of survival, and ICU stay. FIT resulted in a shorter waiting time for triage. FIT designations comprised 11 red (immediate) and 28 yellow (delayed), while START assigned six patients to red and 32 to yellow. There were no statistical differences between FIT and START in the accuracy rate calculated by means of probability of survival and ICU stay. Overall validity and reliability of FIT determined by outcome assessment were similar to those of START. FIT would be a simple and accurate technique to quickly triage a large number of patients.

  10. Connectivity algorithm with depth first search (DFS) on simple graphs

    NASA Astrophysics Data System (ADS)

    Riansanti, O.; Ihsan, M.; Suhaimi, D.

    2018-01-01

    This paper discusses an algorithm to detect connectivity of a simple graph using Depth First Search (DFS). The DFS implementation in this paper differs from those in other research in how it counts the number of visited vertices. The algorithm visits the source vertex and then its adjacent vertices, continuing until the last vertex adjacent to a previously visited vertex, and obtains s as the difference between the number of vertices and the number of visited vertices. Any simple graph is connected if s equals 0 and disconnected if s is greater than 0. The complexity of the algorithm is O(n²).
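    The connectivity test described above can be sketched directly (a hedged reconstruction of the idea from the abstract, using an iterative DFS over an adjacency-list graph):

```python
def is_connected(n, adj):
    """Connectivity test for a simple graph with vertices 0..n-1 via
    iterative depth-first search. Following the abstract's idea,
    s = n - (number of visited vertices); the graph is connected
    iff s == 0."""
    if n == 0:
        return True
    visited = set()
    stack = [0]                       # arbitrary source vertex
    while stack:
        v = stack.pop()
        if v not in visited:
            visited.add(v)
            stack.extend(adj.get(v, []))
    s = n - len(visited)
    return s == 0

# Path 0-1-2 is connected; adding an isolated vertex 3 disconnects it
path = {0: [1], 1: [0, 2], 2: [1]}
```

    The O(n²) bound quoted in the abstract corresponds to an adjacency-matrix scan; with adjacency lists as above, DFS runs in O(n + m) for m edges.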

  11. Progress in the development of the neutron flux monitoring system of the French GEN-IV SFR: simulations and experimental validations [ANIMMA-2015-IO-392]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jammes, C.; Filliatre, P.; Izarra, G. de

    France has a long experience of about 50 years in designing, building and operating sodium-cooled fast reactors (SFR) such as RAPSODIE, PHENIX and SUPER PHENIX. Fast reactors feature the double capability of reducing nuclear waste and saving nuclear energy resources by burning actinides. Since this reactor type is one of those selected by the Generation IV International Forum, the French government asked, in the year 2006, CEA, namely the French Alternative Energies and Atomic Energy Commission, to lead the development of an innovative GEN-IV nuclear-fission power demonstrator. The major objective is to improve the safety and availability of an SFR. The neutron flux monitoring (NFM) system of any reactor must, in any situation, permit both reactivity control and power level monitoring from startup to full power. It also has to monitor possible changes in neutron flux distribution within the core region in order to prevent any local melting accident. The neutron detectors will have to be installed inside the reactor vessel because locations outside the vessel suffer from severe disadvantages: radially, the neutron shield that is also contained in the reactor vessel will cause unacceptable losses in neutron flux; below the core, the presence of a core-catcher prevents the insertion of neutron guides; and above the core, the distance is too large to obtain decent neutron signals outside the vessel. Another important point is to limit the number of detectors placed in the vessel in order to ease their installation. In this paper, we show that the architecture of the NFM system will rely on high-temperature fission chambers (HTFC) featuring wide-range flux monitoring capability. The definition of such a system is presented, and the technological options are justified with the use of simulation and experimental results.
    Firstly, neutron-transport calculations allow us to propose two in-vessel regions, namely the above-core and under-core structures. We verify that they comply with the main objective, that is, the monitoring of neutron power and flux distribution. HTFCs placed in these two regions can detect an inadvertent control rod withdrawal, which is a postulated initiating event for the safety demonstration. Secondly, we show that HTFC reliability is enhanced thanks to a more robust physical design and to the demonstration that the mineral insulation is insensitive to any increase in temperature. Indeed, the HTFC insulation is subject to partial discharges at high temperature when the electric field between its electrodes is greater than about 200 V/mm. These discharges give rise to signals similar to the neutron pulses generated by a fission chamber itself, which may bias the HTFC count rate at start-up only. However, as displayed in Figure 1, we have experimentally verified that one can discriminate neutron pulses from partial discharges using online estimation of pulse width. Thirdly, we propose to estimate the count rate of an HTFC using the third-order cumulant of its signal, which is described by a filtered Poisson process. For such a stochastic process, it is known that any cumulant, also called a cumulative moment, is proportional to the process intensity, which is here the count rate of the fission chamber. One recalls that the so-called Campbelling mode of such a detector is actually based on the signal variance, which is the second-order cumulant. The use of this extended Campbelling mode based on the third-order cumulant will make it possible to ensure the linearity of the HTFC response over the entire neutron flux range using a signal processing technique that is simple enough to satisfy design constraints on electric devices important for nuclear safety.
    We also show that this technique, named the high-order Campbelling (HOC) method, is significantly more robust than another technique based on a change in the HTFC filling gas, which consists of adding a few percent of nitrogen. Finally, we also present an experimental campaign devoted to the required calibration process of the HOC method. The Campbelling results show good agreement with the simple pulse-counting estimation at low count rates. It is also shown that the HOC technique provides a linear estimation of the count rates at higher power levels as well.
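    The linearity of the third-order cumulant in the count rate for a filtered Poisson process can be checked with a short simulation. This is a sketch of the statistical idea only; the pulse shape, rates and sampling step are invented, and the actual HOC signal processing is more involved:

```python
import numpy as np

rng = np.random.default_rng(0)

def third_cumulant(rate, duration=500.0, dt=1e-3):
    """Simulate a filtered Poisson (shot-noise) signal and return its
    third central moment. For such a process every cumulant is
    proportional to the event rate, which is the basis of the
    high-order Campbelling (HOC) estimate."""
    arrivals = rng.poisson(rate * dt, size=int(duration / dt))
    pulse = np.exp(-np.arange(20) * 0.3)      # assumed pulse shape
    signal = np.convolve(arrivals, pulse, mode="same")
    return np.mean((signal - signal.mean()) ** 3)

k3_low = third_cumulant(1000.0)    # counts/s, illustrative
k3_high = third_cumulant(3000.0)
ratio = k3_high / k3_low           # expected near 3
```

    Tripling the event rate roughly triples the third cumulant, whereas the mean and variance (the classical Campbelling statistic) would scale the same way; the advantage of the third order lies in its different sensitivity to parasitic signals, which this toy model does not capture.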

  12. Development of a practical image-based scatter correction method for brain perfusion SPECT: comparison with the TEW method.

    PubMed

    Shidahara, Miho; Watabe, Hiroshi; Kim, Kyeong Min; Kato, Takashi; Kawatsu, Shoji; Kato, Rikio; Yoshimura, Kumiko; Iida, Hidehiro; Ito, Kengo

    2005-10-01

    An image-based scatter correction (IBSC) method was developed to convert scatter-uncorrected into scatter-corrected SPECT images. The purpose of this study was to validate this method by means of phantom simulations and human studies with 99mTc-labeled tracers, based on comparison with the conventional triple energy window (TEW) method. The IBSC method corrects scatter on the reconstructed image I(μb)AC, obtained with Chang's attenuation correction factor using the broad-beam attenuation coefficient μb. The scatter component image is estimated by convolving I(μb)AC with a scatter function, followed by multiplication with an image-based scatter fraction function. The IBSC method was evaluated with Monte Carlo simulations and 99mTc-ethyl cysteinate dimer SPECT human brain perfusion studies obtained from five volunteers. The image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were compared. Using data obtained from the simulations, the image counts and contrast of the scatter-corrected images obtained by the IBSC and TEW methods were found to be nearly identical for both gray and white matter. In human brain images, no significant differences in image contrast were observed between the IBSC and TEW methods. The IBSC method is a simple scatter correction technique feasible for use in clinical routine.
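    The convolve-scale-subtract structure of an image-based scatter correction can be sketched in a few lines. The smoothing kernel and scatter fraction below are invented for the toy example and stand in for the paper's scatter function and scatter fraction function:

```python
import numpy as np

def _smooth(img, size=7):
    """Separable moving-average blur standing in for the scatter function."""
    kernel = np.ones(size) / size
    out = np.apply_along_axis(lambda r: np.convolve(r, kernel, mode="same"), 1, img)
    return np.apply_along_axis(lambda c: np.convolve(c, kernel, mode="same"), 0, out)

def ibsc_correct(img_ac, scatter_fraction=0.3):
    """Image-based scatter correction sketch: estimate the scatter
    component by convolving the attenuation-corrected image with a
    broad scatter kernel, scale it by a scatter fraction, and
    subtract it from the image."""
    corrected = img_ac - scatter_fraction * _smooth(img_ac)
    return np.clip(corrected, 0.0, None)   # counts cannot be negative

# Toy image: a bright disc (grey-matter stand-in) on a dim background
y, x = np.mgrid[0:64, 0:64]
img = 100.0 * (((x - 32) ** 2 + (y - 32) ** 2) < 15 ** 2) + 10.0
out = ibsc_correct(img)
```

    The attraction of this scheme, as the abstract notes, is that only the reconstructed image itself is needed, with no extra energy-window acquisition as in TEW.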

  13. A novel method for flow pattern identification in unstable operational conditions using gamma ray and radial basis function.

    PubMed

    Roshani, G H; Nazemi, E; Roshani, M M

    2017-05-01

    Changes of fluid properties (especially density) strongly affect the performance of radiation-based multiphase flow meters and can cause errors in recognizing the flow pattern and determining the void fraction. In this work, we propose a methodology based on a combination of multi-beam gamma-ray attenuation and dual-modality densitometry techniques using RBF neural networks in order to recognize the flow regime and determine the void fraction in gas-liquid two-phase flows independent of liquid-phase changes. The proposed system consists of one 137Cs source, two transmission detectors and one scattering detector. The registered counts in the two transmission detectors were used as the inputs of one primary Radial Basis Function (RBF) neural network for recognizing the flow regime independent of liquid-phase density. Then, after flow regime identification, three RBF neural networks were utilized for determining the void fraction independent of liquid-phase density. The registered counts in the scattering detector and the first transmission detector were used as the inputs of these three RBF neural networks. Using this simple methodology, all the flow patterns were correctly recognized and the void fraction was predicted independent of liquid-phase density with a mean relative error (MRE) of less than 3.28%. Copyright © 2017 Elsevier Ltd. All rights reserved.
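    A radial basis function network of the kind used here is simple enough to build from scratch. The following is a minimal sketch with output weights trained by linear least squares (a common simple training scheme; the paper's networks and data are not reproduced, and the detector counts and void fractions below are hypothetical):

```python
import numpy as np

class RBFNet:
    """Minimal Gaussian radial-basis-function network: hidden units
    are Gaussian bumps at fixed centers; output weights are fit by
    linear least squares."""

    def __init__(self, centers, width):
        self.centers = np.asarray(centers, float)
        self.width = width

    def _phi(self, X):
        X = np.asarray(X, float)
        d2 = ((X[:, None, :] - self.centers[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * self.width ** 2))

    def fit(self, X, y):
        self.w, *_ = np.linalg.lstsq(self._phi(X), np.asarray(y, float),
                                     rcond=None)
        return self

    def predict(self, X):
        return self._phi(X) @ self.w

# Hypothetical mapping from two transmission-detector counts to void fraction
X = np.array([[100, 200], [150, 180], [200, 120], [250, 80]], float)
y = np.array([0.1, 0.3, 0.6, 0.9])
net = RBFNet(centers=X, width=60.0).fit(X, y)
pred = net.predict(X)
```

    With centers placed at the training points and as many basis functions as samples, the network interpolates the training data exactly; real use would hold out test data and tune the width.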

  14. Rapid Quantitative Detection of Lactobacillus sakei in Meat and Fermented Sausages by Real-Time PCR

    PubMed Central

    Martín, Belén; Jofré, Anna; Garriga, Margarita; Pla, Maria; Aymerich, Teresa

    2006-01-01

    A quick and simple method for quantitative detection of Lactobacillus sakei in fermented sausages was successfully developed. It is based on Chelex-100-based DNA purification and real-time PCR enumeration using a TaqMan fluorescence probe. Primers and probes were designed in the L. sakei 16S-23S rRNA intergenic transcribed spacer region, and the assay was evaluated using L. sakei genomic DNA and an artificially inoculated sausage model. The detection limit of this technique was approximately 3 cells per reaction mixture using both purified DNA and the inoculated sausage model. The quantification limit was established at 30 cells per reaction mixture in both models. The assay was then applied to enumerate L. sakei in real samples, and the results were compared to the MRS agar count method followed by confirmation of the percentage of L. sakei colonies. The results obtained by real-time PCR were not statistically significantly different than those obtained by plate count on MRS agar (P > 0.05), showing a satisfactory agreement between both methods. Therefore, the real-time PCR assay developed can be considered a promising rapid alternative method for the quantification of L. sakei and evaluation of the implantation of starter strains of L. sakei in fermented sausages. PMID:16957227

  15. Seabird species vary in behavioural response to drone census.

    PubMed

    Brisson-Curadeau, Émile; Bird, David; Burke, Chantelle; Fifield, David A; Pace, Paul; Sherley, Richard B; Elliott, Kyle H

    2017-12-20

    Unmanned aerial vehicles (UAVs) provide an opportunity to rapidly census wildlife in remote areas while removing some of the hazards. However, wildlife may respond negatively to the UAVs, thereby skewing counts. We surveyed four species of Arctic cliff-nesting seabirds (glaucous gull Larus hyperboreus, Iceland gull Larus glaucoides, common murre Uria aalge and thick-billed murre Uria lomvia) using a UAV and compared censusing techniques to ground photography. An average of 8.5% of murres flew off in response to the UAV, but >99% of those birds were non-breeders. We were unable to detect any impact of the UAV on breeding success of murres, except at a site where aerial predators were abundant and several birds lost their eggs to predators following UAV flights. Furthermore, we found little evidence for habituation by murres to the UAV. Most gulls flew off in response to the UAV, but returned to the nest within five minutes. Counts of gull nests and adults were similar between UAV and ground photography; however, the UAV detected up to 52.4% more chicks, because chicks were camouflaged and invisible to ground observers. UAVs provide a less hazardous and potentially more accurate method for surveying wildlife. We provide some simple recommendations for their use.

  16. Rapid quantitative detection of Lactobacillus sakei in meat and fermented sausages by real-time PCR.

    PubMed

    Martín, Belén; Jofré, Anna; Garriga, Margarita; Pla, Maria; Aymerich, Teresa

    2006-09-01

    A quick and simple method for quantitative detection of Lactobacillus sakei in fermented sausages was successfully developed. It is based on Chelex-100-based DNA purification and real-time PCR enumeration using a TaqMan fluorescence probe. Primers and probes were designed in the L. sakei 16S-23S rRNA intergenic transcribed spacer region, and the assay was evaluated using L. sakei genomic DNA and an artificially inoculated sausage model. The detection limit of this technique was approximately 3 cells per reaction mixture using both purified DNA and the inoculated sausage model. The quantification limit was established at 30 cells per reaction mixture in both models. The assay was then applied to enumerate L. sakei in real samples, and the results were compared to the MRS agar count method followed by confirmation of the percentage of L. sakei colonies. The results obtained by real-time PCR were not statistically significantly different than those obtained by plate count on MRS agar (P > 0.05), showing a satisfactory agreement between both methods. Therefore, the real-time PCR assay developed can be considered a promising rapid alternative method for the quantification of L. sakei and evaluation of the implantation of starter strains of L. sakei in fermented sausages.
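    Real-time PCR enumeration of the kind described rests on a log-linear standard curve relating threshold cycle (Ct) to the log of the starting cell number. A hedged sketch of that calibration and its inversion (the Ct values and cell numbers below are illustrative, not from the paper):

```python
import numpy as np

# Hypothetical standard curve: Ct values for known L. sakei cell
# numbers per reaction (numbers illustrative, not from the paper)
cells = np.array([3e1, 3e2, 3e3, 3e4, 3e5])
ct = np.array([33.1, 29.8, 26.4, 23.1, 19.7])

# Ct is linear in log10(cells): Ct = intercept + slope * log10(N)
slope, intercept = np.polyfit(np.log10(cells), ct, 1)

def quantify(ct_unknown):
    """Invert the standard curve to estimate cells per reaction."""
    return 10.0 ** ((ct_unknown - intercept) / slope)

amplification_efficiency = 10.0 ** (-1.0 / slope) - 1.0  # ~1.0 is ideal
```

    A slope near -3.3 cycles per decade corresponds to an amplification efficiency near 100%; the ~30-cell quantification limit quoted in the abstract is where such curves typically stop being reliably linear.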

  17. Adaptable Assignment

    DOT National Transportation Integrated Search

    1997-01-01

    This paper reports on a practical, simple method for adjusting a vehicle trip table so that the resulting assignments more closely match available traffic counts. "Practical" means that this is not purely a research effort - the procedure described h...

  18. The role of awareness of repetition during the development of automaticity in a dot-counting task

    PubMed Central

    Shadbolt, Emma

    2018-01-01

    This study examined whether being aware of the repetition of stimuli in a simple numerosity task could aid the development of automaticity. The numerosity task used in this study was a simple counting task. Thirty-four participants were divided into two groups. One group was instructed that the stimuli would repeat many times throughout the experiment. The results showed no significant differences in the way automatic processing developed between the groups. Similarly, there was no correlation between the point at which automatic processing developed and the point at which participants felt they benefitted from the repetition of stimuli. These results suggest that extra-trial features of a task may have no effect on the development of automaticity, a finding consistent with the instance theory of automatisation. PMID:29404220

  19. Neutron Detection With Ultra-Fast Digitizer and Pulse Identification Techniques on DIII-D

    NASA Astrophysics Data System (ADS)

    Zhu, Y. B.; Heidbrink, W. W.; Piglowski, D. A.

    2013-10-01

    A prototype system for neutron detection with an ultra-fast digitizer and pulse identification techniques has been implemented on the DIII-D tokamak. The system consists of a cylindrical neutron fission chamber, a charge sensitive amplifier, and a GaGe Octopus 12-bit CompuScope digitizer card installed in a Linux computer. Digital pulse identification techniques have been successfully performed at a maximum data acquisition rate of 50 MSPS with 2 GS of on-board memory. Compared to the traditional approach with fast nuclear electronics for pulse counting, this straightforward digital solution has many advantages, including reduced expense, improved accuracy, higher counting rate, and easier maintenance. The system also provides the capability of neutron-gamma pulse shape discrimination and pulse height analysis. Plans for the upgrade of the old DIII-D neutron counting system with these techniques will be presented. Work supported by the US Department of Energy under SC-G903402, and DE-FC02-04ER54698.
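    The core of digital pulse identification is finding pulses in a digitized waveform. A minimal sketch using threshold crossing with hysteresis (invented trace and parameters; the DIII-D system's actual algorithms, shaping and discrimination are more sophisticated):

```python
import numpy as np

def count_pulses(waveform, arm=0.5, rearm=0.2):
    """Count pulses by threshold crossing with hysteresis: a pulse is
    registered when the signal rises above `arm`, and the counter is
    re-armed only once the signal falls back below `rearm`. Real
    systems add shaping, pile-up rejection and pulse-height analysis."""
    count, armed = 0, True
    for v in waveform:
        if armed and v > arm:
            count += 1
            armed = False
        elif not armed and v < rearm:
            armed = True
    return count

# Synthetic digitized trace: three exponential pulses on a noisy baseline
rng = np.random.default_rng(1)
trace = 0.02 * rng.standard_normal(2000)
for start in (300, 900, 1500):
    trace[start:start + 200] += np.exp(-np.arange(200) / 30.0)

n = count_pulses(trace)
```

    The hysteresis gap between the two thresholds is what keeps baseline noise near the threshold from being double-counted as extra pulses.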

  20. Predictive Analytics In Healthcare: Medications as a Predictor of Medical Complexity.

    PubMed

    Higdon, Roger; Stewart, Elizabeth; Roach, Jared C; Dombrowski, Caroline; Stanberry, Larissa; Clifton, Holly; Kolker, Natali; van Belle, Gerald; Del Beccaro, Mark A; Kolker, Eugene

    2013-12-01

    Children with special healthcare needs (CSHCN) require health and related services that exceed those required by most hospitalized children. A small but growing and important subset of the CSHCN group includes medically complex children (MCCs). MCCs typically have comorbidities and disproportionately consume healthcare resources. To enable strategic planning for the needs of MCCs, simple screens to identify potential MCCs rapidly in a hospital setting are needed. We assessed whether the number of medications used and the class of those medications correlated with MCC status. Retrospective analysis of medication data from the inpatients at Seattle Children's Hospital found that the numbers of inpatient and outpatient medications significantly correlated with MCC status. Numerous variables based on counts of medications, use of individual medications, and use of combinations of medications were considered, resulting in a simple model based on three different counts of medications: outpatient and inpatient drug classes and individual inpatient drug names. The combined model was used to rank the patient population for medical complexity. As a result, simple, objective admission screens for predicting the complexity of patients based on the number and type of medications were implemented.

  1. Measuring Transmission Efficiencies Of Mass Spectrometers

    NASA Technical Reports Server (NTRS)

    Srivastava, Santosh K.

    1989-01-01

    Coincidence counts yield absolute efficiencies. The system measures mass-dependent transmission efficiencies of mass spectrometers, using coincidence-counting techniques reminiscent of those used for many years in the calibration of detectors for subatomic particles. Coincidences between detected ions and the electrons producing them are counted during operation of the mass spectrometer. Under certain assumptions regarding inelastic scattering of electrons, the electron/ion-coincidence count is a direct measure of the transmission efficiency of the spectrometer. When fully developed, the system will be compact, portable, and suitable for routine calibration of mass spectrometers.
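    The arithmetic behind the coincidence measurement is a simple ratio, which a one-function sketch makes explicit (counts are hypothetical; the real measurement involves the scattering assumptions noted above):

```python
def transmission_efficiency(n_coincidences, n_electrons):
    """Coincidence estimate of spectrometer transmission: each detected
    electron marks one ion created, so the fraction of detected
    electrons that are accompanied by a detected ion estimates the ion
    transmission (folded with ion-detector efficiency, ignored here)."""
    return n_coincidences / n_electrons

eff = transmission_efficiency(1200, 5000)   # hypothetical counts
```

    Because the estimate is a ratio of counts from the same events, it is absolute: no independently calibrated ion source is needed.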

  2. Pressure-based impact method to count bedload particles

    NASA Astrophysics Data System (ADS)

    Antico, Federica; Mendes, Luís; Aleixo, Rui; Ferreira, Rui M. L.

    2017-04-01

    Bedload transport processes determine morphological changes in fluvial, estuarine and coastal domains, thus impacting the diversity and quality of ecosystems and human activities such as river management, coastal protection or dam operation. In spite of the advancements made in the last 60 years, driven by the improvements in measurement techniques, research efforts on grain-scale mechanics of bedload are still required, especially to clarify the intermittent nature of bedload, its stochastic structure and its scale dependence. A new impact-based device to measure bedload transport - MiCas system - is presented in this work. It was designed to meet the following key requirements: simple data output composed of time instant and location of impacts; no need for post-processing - impacts determined through hardware and firmware; capable of computing simple statistics in real time such as cumulative particle counting and discrete lateral distribution of cumulative particle counts; able to run for very large time periods (days, weeks); ability to detect particle impacts of large size fractions that are separated by a few milliseconds; composed of robust and relatively cheap components. The system's firmware analyses pressure time series, namely recognizing the imprints of impacts of individual particles as they hit pressurized membranes. A pattern analysis algorithm is used to identify the impact events. The implementation of this principle in a dedicated microprocessor allows for the real-time measurement of particle hits and cumulative particle count. To validate the results obtained by the MiCas system, experiments were carried out in the 12.5 m long and 40.5 cm wide glass-sided flume of the Laboratory of Hydraulics and Environment of Instituto Superior Técnico, Lisbon. This flume has two independent circuits for water and sediment recirculation.
    A cohesionless granular bed, composed of 4 layers of 5 mm glass beads, subjected to a steady-uniform turbulent open-channel flow, was analysed. All tests featured a period of 90 s of data collection. For a detailed description of the laboratory facilities and test conditions see Mendes et al. (2016). Results from the MiCas system were compared with those obtained from the analysis of high-speed video footage. The results showed good agreement between both techniques. The measurements carried out allowed us to determine that the MiCas system is able to track particle impacts in real time within an error margin of 2.0%. From different tests with the same conditions it was possible to determine the repeatability of the MiCas system. Derived quantities such as bedload transport rates, Eulerian auto-correlation functions and structure functions are also in close agreement with measurements based on optical methods. The main advantages of the MiCas system relative to digital image processing methods are: a) independence from optical access, thus avoiding problems with light intensity variations and oscillating free surfaces; b) the small volume of data associated with particle counting, which allows for the acquisition of very long data series (hours, days) of particle impacts. In the considered cases, it would take more than two hours to generate 1 MB of data. For the current validation tests, 90 s of acquisition generated 25 GB of images but only 11 kB of MiCas data. On the other hand, the time necessary to process the digital images may correspond to days, effectively limiting their usage to short time series; c) the possibility of real-time measurements, allowing for the detection of problems during the experiments and minimizing some post-processing steps. This research was partially supported by Portuguese and European funds, within programs COMPETE2020 and PORL-FEDER, through project PTDC/ECM-HID/6387/2014 granted by the National Foundation for Science and Technology (FCT).
References Mendes L., Antico F., Sanches P., Alegria F., Aleixo R., and Ferreira RML. (2016). A particle counting system for calculation of bedload fluxes. Measurement Science and Technology. DOI: http://dx.doi.org/10.1088/0957-0233/27/12/125305
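    The two real-time statistics named above, the cumulative particle count and the discrete lateral distribution of counts across membranes, are straightforward to compute from the system's minimal (time, location) output. A sketch with an assumed record format, not the actual MiCas firmware:

```python
from bisect import bisect_right
from collections import Counter

def lateral_distribution(impacts, n_membranes):
    """Discrete lateral distribution of cumulative particle counts.
    `impacts` is a list of (time_s, membrane_index) tuples, the kind of
    minimal output the abstract describes (exact format assumed)."""
    c = Counter(m for _, m in impacts)
    return [c.get(i, 0) for i in range(n_membranes)]

def cumulative_count(impacts, t):
    """Number of impacts registered up to and including time t."""
    times = sorted(ts for ts, _ in impacts)
    return bisect_right(times, t)

# Five hypothetical impacts across four membranes
impacts = [(0.012, 0), (0.013, 2), (0.020, 1), (0.034, 2), (0.051, 2)]
dist = lateral_distribution(impacts, 4)
```

    The tiny size of such records is exactly why 90 s of acquisition produced kilobytes of MiCas data against gigabytes of imagery.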

  3. Nucleated red blood cells in growth-restricted fetuses: associations with short-term neonatal outcome.

    PubMed

    Minior, V K; Bernstein, P S; Divon, M Y

    2000-01-01

    To determine the utility of the neonatal nucleated red blood cell (NRBC) count as an independent predictor of short-term perinatal outcome in growth-restricted fetuses. Hospital charts of neonates with a discharge diagnosis indicating a birth weight <10th percentile were reviewed for perinatal outcome. We studied all eligible neonates who had a complete blood count on the first day of life. After multiple gestations, anomalous fetuses and diabetic pregnancies were excluded; 73 neonates comprised the study group. Statistical analysis included ANOVA, simple and stepwise regression. Elevated NRBC counts were significantly associated with cesarean section for non-reassuring fetal status, neonatal intensive care unit admission and duration of neonatal intensive care unit stay, respiratory distress and intubation, thrombocytopenia, hyperbilirubinemia, intraventricular hemorrhage and neonatal death. Stepwise regression analysis including gestational age at birth, birth weight and NRBC count demonstrated that in growth-restricted fetuses, NRBC count was the strongest predictor of neonatal intraventricular hemorrhage, neonatal respiratory distress and neonatal death. An elevated NRBC count independently predicts adverse perinatal outcome in growth-restricted fetuses. Copyright 2000 S. Karger AG, Basel.

  4. Clustering method for counting passengers getting in a bus with single camera

    NASA Astrophysics Data System (ADS)

    Yang, Tao; Zhang, Yanning; Shao, Dapei; Li, Ying

    2010-03-01

    Automatic counting of passengers is very important for both business and security applications. We present a single-camera-based vision system that is able to count passengers in a highly crowded situation at the entrance of a traffic bus. The unique characteristics of the proposed system include the following. First, a novel feature-point-tracking and online-clustering-based passenger counting framework, which performs much better than background-modeling- and foreground-blob-tracking-based methods. Second, a simple and highly accurate clustering algorithm is developed that projects the high-dimensional feature point trajectories into a 2-D feature space by their appearance and disappearance times and counts the number of people through online clustering. Finally, all test video sequences in the experiment are captured from a real traffic bus in Shanghai, China. The results show that the system can process two 320×240 video sequences at a frame rate of 25 fps simultaneously, and can count passengers reliably in various difficult scenarios with complex interaction and occlusion among people. The method achieves accuracy rates of up to 96.5%.
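    The second idea, projecting each tracked feature point to its (appearance, disappearance) times and clustering in that 2-D space, can be sketched with a greedy online scheme (a hedged illustration of the projection-and-cluster idea; the paper's actual clustering algorithm and threshold are not specified here, so `eps` and the tracks are invented):

```python
def cluster_trajectories(tracks, eps=5.0):
    """Greedy online clustering sketch: each feature-point track is
    reduced to its (appearance, disappearance) frame numbers; a track
    whose 2-D projection lies within `eps` of an existing cluster
    centre joins that cluster, otherwise it opens a new cluster
    (one cluster ~ one passenger)."""
    clusters = []   # each entry: [sum_appear, sum_disappear, count]
    for a, d in tracks:
        for c in clusters:
            ca, cd = c[0] / c[2], c[1] / c[2]
            if abs(a - ca) <= eps and abs(d - cd) <= eps:
                c[0] += a; c[1] += d; c[2] += 1
                break
        else:
            clusters.append([a, d, 1])
    return len(clusters)

# Two passengers: tracks appearing/disappearing at similar frames
tracks = [(10, 40), (11, 42), (9, 41),    # passenger 1
          (60, 95), (62, 93), (61, 96)]   # passenger 2
n_people = cluster_trajectories(tracks)
```

    The appeal of the projection is that feature points on the same person enter and leave the camera's view at nearly the same frames, so the hard high-dimensional trajectory-grouping problem collapses to 2-D clustering.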

  5. Monitoring planktivorous seabird populations: Validating surface counts of crevice-nesting auklets using mark-resight techniques

    USGS Publications Warehouse

    Sheffield, L.M.; Gall, Adrian E.; Roby, D.D.; Irons, D.B.; Dugger, K.M.

    2006-01-01

    Least Auklets (Aethia pusilla (Pallas, 1811)) are the most abundant species of seabird in the Bering Sea and offer a relatively efficient means of monitoring secondary productivity in the marine environment. Counting auklets on surface plots is the primary method used to track changes in numbers of these crevice-nesters, but counts can be highly variable and may not be representative of the number of nesting individuals. We compared average maximum counts of Least Auklets on surface plots with density estimates based on mark–resight data at a colony on St. Lawrence Island, Alaska, during 2001–2004. Estimates of breeding auklet abundance from mark–resight averaged 8 times greater than those from maximum surface counts. Our results also indicate that average maximum surface counts are poor indicators of breeding auklet abundance and do not vary consistently with auklet nesting density across the breeding colony. Estimates of Least Auklet abundance from mark–resight were sufficiently precise to meet management goals for tracking changes in seabird populations. We recommend establishing multiple permanent banding plots for mark–resight studies on colonies selected for intensive long-term monitoring. Mark–resight is more likely to detect biologically significant changes in size of auklet breeding colonies than traditional surface count techniques.
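    The gap between surface counts and mark-resight estimates can be illustrated with the textbook mark-resight formula, Chapman's bias-corrected Lincoln-Petersen estimator (the study's mark-resight model is more elaborate; the numbers below are hypothetical, not the St. Lawrence Island data):

```python
def chapman_estimate(marked, resight_total, resight_marked):
    """Chapman's bias-corrected Lincoln-Petersen estimator:
    N ~ (M+1)(C+1)/(R+1) - 1, where M birds are marked, C birds are
    seen on a resight survey, and R of those are marked."""
    return (marked + 1) * (resight_total + 1) / (resight_marked + 1) - 1

# e.g. 100 banded auklets; a resight survey records 150 birds, 12 banded
n_hat = chapman_estimate(100, 150, 12)
```

    Because few resighted birds are marked, the estimated population is several times the raw survey count, the same direction of disparity the study reports between mark-resight estimates and maximum surface counts.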

  6. Use of alpha spectroscopy for conducting rapid surveys of transuranic activity on air sample filters and smears.

    PubMed

    Hayes, Robert B; Peña, Adan M; Goff, Thomas E

    2005-08-01

    This paper demonstrates the utility of a portable alpha Continuous Air Monitor (CAM) as a bench-top scaler counter for multiple sample types. These include using the CAM to count fixed air sample filters and radiological smears. In counting radiological smears, the CAM is used very much like a gas flow proportional counter (GFPC), albeit with a lower efficiency. Due to the typically low background in this configuration, the minimum detectable activity for a 5-min count should be in the range of about 10 dpm, which is acceptably below the 20 dpm limit for transuranic isotopes. When counting fixed air sample filters, the CAM algorithm along with other measurable characteristics can be used to identify and quantify the presence of transuranic isotopes in the samples. When the radiological control technician wants to take some credit from naturally occurring radioactive material contributions due to radon progeny producing higher energy peaks (as in the case with a fixed air sample filter), then more elaborate techniques are required. The techniques presented here will generate a decision level of about 43 dpm for such applications. The calibration for this application should alternatively be done using the default values of channels 92-126 for region of interest 1. This can be done within 10 to 15 min, resulting in a method to rapidly evaluate air filters for transuranic activity. When compared to the 1-h count technique described by , the present work demonstrates that more than two thirds of samples can be rapidly shown (within 10 to 15 min) to be within regulatory compliance limits. In both cases, however, spectral quality checks are required to ensure that sample self-attenuation is not a significant bias in the activity estimates. This will allow the same level of confidence when using these techniques for activity quantification as is presently available for air monitoring activity quantification using CAMs.
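    The ~10 dpm figure quoted for a 5-min smear count is the order of magnitude given by Currie's standard detection formula. A generic sketch (the CAM's actual background and efficiency are not stated in the abstract, so the inputs are illustrative):

```python
from math import sqrt

def currie_mda(background_counts, count_time_min, efficiency):
    """Currie's minimum detectable activity in dpm for a paired count:
    MDA = (2.71 + 4.65*sqrt(B)) / (efficiency * t), with B background
    counts accumulated in counting time t. Inputs are illustrative."""
    return (2.71 + 4.65 * sqrt(background_counts)) / (efficiency * count_time_min)

# e.g. 2 background counts in a 5-min count at 25% alpha efficiency
mda = currie_mda(2, 5.0, 0.25)
```

    With an alpha background of only a couple of counts in 5 minutes, the MDA lands below the 20 dpm transuranic limit, consistent with the abstract's claim.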

  7. Calendar methods of fertility regulation: a rule of thumb.

    PubMed

    Colombo, B; Scarpa, B

    1996-01-01

    "[Many] illiterate women, particularly in the third world, find [it] difficult to apply usual calendar methods for the regulation of fertility. Some of them are even unable to make simple subtractions. In this paper we are therefore trying to evaluate the applicability and the efficiency of an extremely simple rule which entails only [the ability to count] a number of days, and always the same way." (SUMMARY IN ITA) excerpt

  8. Methods for assessing long-term mean pathogen count in drinking water and risk management implications.

    PubMed

    Englehardt, James D; Ashbolt, Nicholas J; Loewenstine, Chad; Gadzinski, Erik R; Ayenu-Prah, Albert Y

    2012-06-01

    Recently, pathogen counts in drinking and source waters were shown theoretically to have the discrete Weibull (DW) or closely related discrete growth distribution (DGD). The result was demonstrated against nine short-term and three simulated long-term water quality datasets. These distributions are highly skewed, such that available datasets seldom represent the rare but important high-count events, making estimation of the long-term mean difficult. In the current work the methods, and data record length, required to assess long-term mean microbial count were evaluated by simulation of representative DW and DGD waterborne pathogen count distributions. In addition, microbial count data were analyzed spectrally for correlation and cycles. In general, longer data records were required for more highly skewed distributions, conceptually associated with more highly treated water. In particular, 500-1,000 random samples were required for reliable assessment of the population mean to within ±10%, though 50-100 samples produced an estimate within one log (45%) below. A simple correlated first-order model was shown to produce count series with a 1/f signal, and such periodicity over many scales was shown in empirical microbial count data, for consideration in sampling. A tiered management strategy is recommended, including a plan for rapid response to unusual levels of routinely monitored water quality indicators.
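    The skewness problem described above can be illustrated by sampling a discrete Weibull distribution through its inverse CDF. The type-I parameterization P(X ≥ x) = q^(x^β) and the parameter values below are assumptions for illustration, not fitted to any water-quality dataset:

```python
import math
import random

def dw_quantile(u, q, beta):
    """Inverse CDF of the discrete Weibull (type I) distribution,
    parameterized by P(X >= x) = q**(x**beta), x = 0, 1, 2, ...
    Requires 0 < u < 1."""
    return math.ceil((math.log(1.0 - u) / math.log(q)) ** (1.0 / beta)) - 1

def dw_sample(n, q, beta, rng):
    """Draw n counts; heavily right-skewed when beta < 1."""
    return [dw_quantile(min(max(rng.random(), 1e-12), 1.0 - 1e-12), q, beta)
            for _ in range(n)]

# Illustrative parameters: a short record can badly miss the long-term
# mean because rare high counts dominate it.
rng = random.Random(42)
counts = dw_sample(5000, q=0.5, beta=0.4, rng=rng)
print(sum(counts[:50]) / 50, sum(counts) / 5000)
```

Comparing the 50-sample and 5,000-sample means for several seeds shows the record-length sensitivity the abstract quantifies.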

  9. Analysis of calibration data for the uranium active neutron coincidence counting collar with attention to errors in the measured neutron coincidence rate

    DOE PAGES

    Croft, Stephen; Burr, Thomas Lee; Favalli, Andrea; ...

    2015-12-10

    We report that the declared linear density of 238U and 235U in fresh low enriched uranium light water reactor fuel assemblies can be verified for nuclear safeguards purposes using a neutron coincidence counter collar in passive and active mode, respectively. The active mode calibration of the Uranium Neutron Collar – Light water reactor fuel (UNCL) instrument is normally performed using a non-linear fitting technique. The fitting technique relates the measured neutron coincidence rate (the predictor) to the linear density of 235U (the response) in order to estimate model parameters of the nonlinear Padé equation, which traditionally is used to model the calibration data. Alternatively, following a simple data transformation, the fitting can also be performed using standard linear fitting methods. This paper compares performance of the nonlinear technique to the linear technique, using a range of possible error variance magnitudes in the measured neutron coincidence rate. We develop the required formalism and then apply the traditional (nonlinear) and alternative (linear) approaches to the same experimental and corresponding simulated representative datasets. Lastly, we find that, in this context, because of the magnitude of the errors in the predictor, it is preferable not to transform to a linear model, and it is preferable not to adjust for the errors in the predictor when inferring the model parameters.
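    The "simple data transformation" mentioned above can be illustrated with a Padé form R = a·m/(1 + b·m): dividing through gives m/R = 1/a + (b/a)·m, which is linear in m. The parameter values below are illustrative, not the UNCL calibration constants, and the data are noise-free purely to show the algebra (the paper's point is precisely that with noisy predictors this transformation can be inferior):

```python
import numpy as np

def pade_rate(m, a, b):
    """Padé-form calibration curve: coincidence rate R vs. 235U linear
    density m, R = a*m / (1 + b*m)."""
    return a * m / (1.0 + b * m)

# Synthetic "calibration" points (illustrative parameters only)
a_true, b_true = 50.0, 0.02
m = np.array([5.0, 10.0, 15.0, 20.0, 25.0, 30.0])
rate = pade_rate(m, a_true, b_true)          # noise-free for clarity

# Linearizing transform: m/R = 1/a + (b/a)*m  ->  straight line in m
y = m / rate
slope, intercept = np.polyfit(m, y, 1)
a_est = 1.0 / intercept
b_est = slope * a_est
print(a_est, b_est)
```

With noise-free data the transform recovers the parameters exactly; with errors in the measured rate (the predictor), the transformed errors become non-homoscedastic, which is why the abstract favors the direct nonlinear fit.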

  10. Prevalence of thrombocytopenia before and after initiation of HAART among HIV infected patients at black lion specialized hospital, Addis Ababa, Ethiopia: a cross sectional study.

    PubMed

    Woldeamanuel, Gashaw Garedew; Wondimu, Diresibachew Haile

    2018-01-01

    Hematological abnormalities are common in HIV positive patients. Of these, thrombocytopenia is a known complication which has been associated with a variety of bleeding disorders. However, its magnitude and related factors have not been well characterized in the era of highly active antiretroviral therapy (HAART) in Ethiopia. Therefore, this study aimed to assess the prevalence of thrombocytopenia before and after initiation of HAART among HIV positive patients attending Black Lion Specialized Hospital, Addis Ababa, Ethiopia. A cross sectional study was conducted from February to April 2017 in Black Lion Specialized Hospital, Addis Ababa, Ethiopia. A total of 176 patients on HAART were selected using a simple random sampling technique. Socio-demographic and clinical characteristics of the study patients were collected using a structured questionnaire. Platelet counts and CD4+ T cell counts were measured using a Sysmex XT 2000i hematology analyzer and a BD FACSCount CD4 analyzer, respectively. Statistical analysis of the data (paired t-test and binary logistic regression) was done using SPSS version 20. A p-value < 0.05 was considered statistically significant. A total of 176 patients (age > 18 years) were enrolled in this study, with a mean age of 40.08 ± 9.38 years. There was a significant increase in mean platelet counts (218.44 ± 106.6 × 10³/μl vs 273.65 ± 83.8 × 10³/μl, p < 0.001) after six months of HAART compared with baseline. The prevalence of thrombocytopenia before and after HAART initiation was 25% and 5.7%, respectively. HIV patients with CD4 counts < 200 cells/μl were more likely to have thrombocytopenia than those with CD4 counts ≥ 350 cells/μl, although this association was not statistically significant. This study has shown that the prevalence of thrombocytopenia decreased significantly after HAART initiation.
    Nevertheless, a number of study participants still had thrombocytopenia after initiation of HAART. Therefore, continuous screening for thrombocytopenia among HIV infected patients should be performed to decrease the risk of morbidity and mortality.

  11. Comparative evaluation of human heat stress indices on selected hospital admissions in Sydney, Australia.

    PubMed

    Goldie, James; Alexander, Lisa; Lewis, Sophie C; Sherwood, Steven

    2017-08-01

    To find appropriate regression model specifications for counts of the daily hospital admissions of a Sydney cohort and determine which human heat stress indices best improve the models' fit. We built parent models of eight daily counts of admission records using weather station observations, census population estimates and public holiday data. We added heat stress indices; models with lower Akaike Information Criterion scores were judged a better fit. Five of the eight parent models demonstrated adequate fit. Daily maximum Simplified Wet Bulb Globe Temperature (sWBGT) consistently improved fit more than most other indices; temperature and heatwave indices also modelled some health outcomes well. Humidity and heat-humidity indices better fit counts of patients who died following admission. Maximum sWBGT is an ideal measure of heat stress for these types of Sydney hospital admissions. Simple temperature indices are a good fallback where a narrower range of conditions is investigated. Implications for public health: This study confirms the importance of selecting appropriate heat stress indices for modelling. Epidemiologists projecting Sydney hospital admissions should use maximum sWBGT as a common measure of heat stress. Health organisations interested in short-range forecasting may prefer simple temperature indices. © 2017 The Authors.
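    As a concrete illustration of the heat-stress index this record recommends, one widely used approximation for the simplified WBGT (attributed to the Australian Bureau of Meteorology; the paper may use a different variant) combines air temperature with water vapour pressure. The Magnus saturation coefficients and example conditions below are assumptions for illustration:

```python
import math

def vapour_pressure_hpa(temp_c, rel_humidity_pct):
    """Water vapour pressure (hPa) from air temperature (deg C) and
    relative humidity (%), using a Magnus-type saturation formula."""
    sat = 6.105 * math.exp(17.27 * temp_c / (237.7 + temp_c))
    return rel_humidity_pct / 100.0 * sat

def simplified_wbgt(temp_c, rel_humidity_pct):
    """Simplified WBGT (shade, light-wind assumption):
    sWBGT = 0.567*Ta + 0.393*e + 3.94, with e in hPa."""
    e = vapour_pressure_hpa(temp_c, rel_humidity_pct)
    return 0.567 * temp_c + 0.393 * e + 3.94

# A hot, humid day as an example input
print(simplified_wbgt(35.0, 50.0))
```

Feeding daily maxima of such an index into the admission-count regressions and comparing AIC scores is the model-selection procedure the abstract describes.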

  12. Reliability of a rapid hematology stain for sputum cytology*

    PubMed Central

    Gonçalves, Jéssica; Pizzichini, Emilio; Pizzichini, Marcia Margaret Menezes; Steidle, Leila John Marques; Rocha, Cristiane Cinara; Ferreira, Samira Cardoso; Zimmermann, Célia Tânia

    2014-01-01

    Objective: To determine the reliability of a rapid hematology stain for the cytological analysis of induced sputum samples. Methods: This was a cross-sectional study comparing the standard technique (May-Grünwald-Giemsa stain) with a rapid hematology stain (Diff-Quik). Of the 50 subjects included in the study, 21 had asthma, 19 had COPD, and 10 were healthy (controls). From the induced sputum samples collected, we prepared four slides: two were stained with May-Grünwald-Giemsa, and two were stained with Diff-Quik. The slides were read independently by two trained researchers blinded to the identification of the slides. The reliability for cell counting using the two techniques was evaluated by determining the intraclass correlation coefficients (ICCs) for intraobserver and interobserver agreement. Agreement in the identification of neutrophilic and eosinophilic sputum between the observers and between the stains was evaluated with kappa statistics. Results: In our comparison of the two staining techniques, the ICCs indicated almost perfect interobserver agreement for neutrophil, eosinophil, and macrophage counts (ICC: 0.98-1.00), as well as substantial agreement for lymphocyte counts (ICC: 0.76-0.83). Intraobserver agreement was almost perfect for neutrophil, eosinophil, and macrophage counts (ICC: 0.96-0.99), whereas it was moderate to substantial for lymphocyte counts (ICC = 0.65 and 0.75 for the two observers, respectively). Interobserver agreement for the identification of eosinophilic and neutrophilic sputum using the two techniques ranged from substantial to almost perfect (kappa range: 0.91-1.00). Conclusions: The use of Diff-Quik can be considered a reliable alternative for the processing of sputum samples. PMID:25029648
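    The interobserver agreement statistics above are standard; for the categorical calls (eosinophilic vs. neutrophilic sputum), Cohen's kappa can be computed directly. The observer calls below are hypothetical, purely to show the calculation:

```python
def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two raters' categorical calls on the same items:
    observed agreement corrected for chance agreement."""
    n = len(labels_a)
    cats = set(labels_a) | set(labels_b)
    p_obs = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    p_exp = sum((labels_a.count(c) / n) * (labels_b.count(c) / n)
                for c in cats)
    return (p_obs - p_exp) / (1.0 - p_exp)

# Hypothetical eosinophilic/neutrophilic calls by two observers
obs1 = ["eos", "eos", "neu", "neu", "eos", "neu", "neu", "eos"]
obs2 = ["eos", "eos", "neu", "eos", "eos", "neu", "neu", "eos"]
print(cohens_kappa(obs1, obs2))  # -> 0.75
```

Values above roughly 0.8 are conventionally read as "almost perfect" agreement, matching the terminology used in the abstract.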

  13. Array-based infra-red detection: an enabling technology for people counting, sensing, tracking, and intelligent detection

    NASA Astrophysics Data System (ADS)

    Stogdale, Nick; Hollock, Steve; Johnson, Neil; Sumpter, Neil

    2003-09-01

    A 16x16 element un-cooled pyroelectric detector array has been developed which, when allied with advanced tracking and detection algorithms, has created a universal detector with multiple applications. Low-cost manufacturing techniques are used to fabricate a hybrid detector, intended for economic use in commercial markets. The detector has found extensive application in accurate people counting, detection, tracking, secure area protection, directional sensing and area violation; topics which are all pertinent to the provision of Homeland Security. The detection and tracking algorithms have, when allied with interpolation techniques, allowed a performance much higher than might be expected from a 16x16 array. This paper reviews the technology, with particular attention to the array structure, algorithms and interpolation techniques and outlines its application in a number of challenging market areas. Viewed from above, moving people are seen as 'hot blobs' moving through the field of view of the detector; background clutter or stationary objects are not seen and the detector works irrespective of lighting or environmental conditions. Advanced algorithms detect the people and extract size, shape, direction and velocity vectors allowing the number of people to be detected and their trajectories of motion to be tracked. Provision of virtual lines in the scene allows bi-directional counting of people flowing in and out of an entrance or area. Definition of a virtual closed area in the scene allows counting of the presence of stationary people within a defined area. Definition of 'counting lines' allows the counting of people, the ability to augment access control devices by confirming a 'one swipe one entry' judgement and analysis of the flow and destination of moving people. For example, passing the 'wrong way' up a denied passageway can be detected. 
Counting stationary people within a 'defined area' allows the behaviour and size of groups of stationary people to be analysed and counted, an alarm condition can also be generated when people stray into such areas.

  14. Reconfigurable Computing As an Enabling Technology for Single-Photon-Counting Laser Altimetry

    NASA Technical Reports Server (NTRS)

    Powell, Wesley; Hicks, Edward; Pinchinat, Maxime; Dabney, Philip; McGarry, Jan; Murray, Paul

    2003-01-01

    Single-photon-counting laser altimetry is a new measurement technique offering significant advantages in vertical resolution, reducing instrument size, mass, and power, and reducing laser complexity as compared to analog or threshold detection laser altimetry techniques. However, these improvements come at the cost of a dramatically increased requirement for onboard real-time data processing. Reconfigurable computing has been shown to offer considerable performance advantages in performing this processing. These advantages have been demonstrated on the Multi-KiloHertz Micro-Laser Altimeter (MMLA), an aircraft based single-photon-counting laser altimeter developed by NASA Goddard Space Flight Center with several potential spaceflight applications. This paper describes how reconfigurable computing technology was employed to perform MMLA data processing in real-time under realistic operating constraints, along with the results observed. This paper also expands on these prior results to identify concepts for using reconfigurable computing to enable spaceflight single-photon-counting laser altimeter instruments.

  15. Rain volume estimation over areas using satellite and radar data

    NASA Technical Reports Server (NTRS)

    Doneaud, A. A.; Vonderhaar, T. H.

    1985-01-01

    An investigation was undertaken of the feasibility of rain volume estimation using satellite data, following a technique recently developed with radar data called the Area-Time Integral. Case studies were selected on the basis of existing radar and satellite data sets which match in space and time. Four multicell clusters were analyzed. Routines for navigation, remapping and smoothing of satellite images were performed. Visible counts were normalized for solar zenith angle. A radar sector of interest was defined to delineate specific radar echo clusters for each radar time throughout the radar echo cluster lifetime. A satellite sector of interest was defined by applying small adjustments to the radar sector using a manual processing technique. The radar echo area, the IR maximum counts and the IR counts matching radar echo areas were found to evolve similarly, except during the decaying phase of the cluster, where cirrus debris keeps the IR counts high.

  16. Photon Counting - One More Time

    NASA Astrophysics Data System (ADS)

    Stanton, Richard H.

    2012-05-01

    Photon counting has been around for more than 60 years, and has been available to amateurs for most of that time. In most cases single photons are detected using photomultiplier tubes, "old technology" that became available after the Second World War. But over the last couple of decades the perfection of CCD devices has given amateurs the ability to perform accurate photometry with modest telescopes. Is there any reason to still count photons? This paper discusses some of the strengths of current photon counting technology, particularly relating to the search for fast optical transients. Technology advances in counters and photomultiplier modules are briefly mentioned. Illustrative data are presented including FFT analysis of bright star photometry and a technique for finding optical pulses in a large file of noisy data. This latter technique is shown to enable the discovery of a possible optical flare on the polar variable AM Her.

  17. Uncertainty of quantitative microbiological methods of pharmaceutical analysis.

    PubMed

    Gunar, O V; Sakhno, N G

    2015-12-30

    The total uncertainty of quantitative microbiological methods used in pharmaceutical analysis consists of several components. Analysis of the most important sources of variability in quantitative microbiological methods demonstrated no effect of culture media or plate-count technique on the estimated microbial count, whereas a highly significant effect of other factors (type of microorganism, pharmaceutical product, and individual reading and interpretation errors) was established. The most appropriate method for statistical analysis of such data was ANOVA, which enabled not only the effects of individual factors but also their interactions to be estimated. Considering all the elements of uncertainty and combining them mathematically, the combined relative uncertainty of the test results was estimated both for the quantitative examination of non-sterile pharmaceuticals and for the microbial count technique without any product. It did not exceed 35%, appropriate for traditional plate count methods. Copyright © 2015 Elsevier B.V. All rights reserved.

  18. Single Photon Counting Performance and Noise Analysis of CMOS SPAD-Based Image Sensors

    PubMed Central

    Dutton, Neale A. W.; Gyongy, Istvan; Parmesan, Luca; Henderson, Robert K.

    2016-01-01

    SPAD-based solid state CMOS image sensors utilising analogue integrators have attained deep sub-electron read noise (DSERN) permitting single photon counting (SPC) imaging. A new method is proposed to determine the read noise in DSERN image sensors by evaluating the peak separation and width (PSW) of single photon peaks in a photon counting histogram (PCH). The technique is used to identify and analyse cumulative noise in analogue integrating SPC SPAD-based pixels. The DSERN of our SPAD image sensor is exploited to confirm recent multi-photon threshold quanta image sensor (QIS) theory. Finally, various single and multiple photon spatio-temporal oversampling techniques are reviewed. PMID:27447643

  19. A Bayesian zero-truncated approach for analysing capture-recapture count data from classical scrapie surveillance in France.

    PubMed

    Vergne, Timothée; Calavas, Didier; Cazeau, Géraldine; Durand, Benoît; Dufour, Barbara; Grosbois, Vladimir

    2012-06-01

    Capture-recapture (CR) methods are used to study populations that are monitored with imperfect observation processes. They have recently been applied to the monitoring of animal diseases to evaluate the number of infected units that remain undetected by the surveillance system. This paper proposes three Bayesian models to estimate the total number of scrapie-infected holdings in France from CR count data obtained from the French classical scrapie surveillance programme. We fitted two zero-truncated Poisson (ZTP) models (with and without holding size as a covariate) and a zero-truncated negative binomial (ZTNB) model to the 2006 national surveillance count dataset. We detected a large amount of heterogeneity in the count data, making the use of the simple ZTP model inappropriate. However, including holding size as a covariate did not bring any significant improvement over the simple ZTP model. The ZTNB model proved to be the best model, giving an estimate of 535 (CI(95%) 401-796) infected and detectable sheep holdings in 2006, although only 141 were effectively detected, resulting in a holding-level prevalence of 4.4‰ (CI(95%) 3.2-6.3) and a sensitivity of holding-level surveillance of 26% (CI(95%) 18-35). The main limitation of the present study was the small amount of data collected during the surveillance programme. It was therefore not possible to build complex models that would more accurately depict the epidemiological and detection processes that generate the surveillance data. We discuss the perspectives of capture-recapture count models in the context of animal disease surveillance. Copyright © 2012 Elsevier B.V. All rights reserved.
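    The simplest version of the zero-truncated Poisson estimate can be sketched as follows. Under a homogeneous ZTP model the observed mean satisfies E[X | X > 0] = λ/(1 − e^(−λ)); solving for λ gives the detection probability 1 − e^(−λ) and hence the total (detected plus undetected) population. The count data below are hypothetical, not the French scrapie data:

```python
import math

def ztp_lambda(mean_obs, tol=1e-10):
    """Solve mean_obs = lam / (1 - exp(-lam)) for lam by bisection.
    mean_obs must exceed 1 (zero-truncated counts are all >= 1)."""
    # f(lam) = lam/(1-e^-lam) is increasing, f(0+) = 1, f(lam) > lam,
    # so the root lies in (0, mean_obs).
    lo, hi = 1e-9, mean_obs
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mid / (1.0 - math.exp(-mid)) < mean_obs:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def ztp_total(counts):
    """Estimate total population size from zero-truncated count data,
    assuming a single homogeneous Poisson rate."""
    n = len(counts)
    lam = ztp_lambda(sum(counts) / n)
    return n / (1.0 - math.exp(-lam))  # observed / detection probability

# Toy data: detections per detected holding (hypothetical)
print(ztp_total([1, 1, 1, 2, 1, 3, 1, 2, 1, 1]))
```

The abstract's finding is that such a homogeneous model is inadequate for the scrapie data; the negative binomial variant adds a dispersion parameter to absorb the heterogeneity.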

  20. Meningiomas: Objective assessment of proliferative indices by immunohistochemistry and automated counting method.

    PubMed

    Chavali, Pooja; Uppin, Megha S; Uppin, Shantveer G; Challa, Sundaram

    2017-01-01

    The most reliable histological correlate of recurrence risk in meningiomas is increased mitotic activity. The proliferative index with Ki-67 immunostaining is a helpful adjunct to manual counting. However, both show considerable inter-observer variability. A new immunohistochemical method for counting mitotic figures, using an antibody against the phosphohistone H3 (PHH3) protein, was introduced. Similarly, computer-based automated counting for the Ki-67 labelling index (LI) is available. To study the use of these new techniques in the objective assessment of proliferation indices in meningiomas. This was a retrospective study of intracranial meningiomas diagnosed during the year 2013. The hematoxylin and eosin (H and E) sections and immunohistochemistry (IHC) with Ki-67 were reviewed by two pathologists. Photomicrographs of the representative areas were subjected to Ki-67 analysis by ImmunoRatio (IR) software. Mean Ki-67 LI, both manual and by IR, was calculated. IHC with PHH3 was performed. PHH3-positive nuclei were counted and mean values calculated. Data analysis was done using SPSS software. A total of 64 intracranial meningiomas were diagnosed. Evaluation on H and E, PHH3, and Ki-67 LI (both manual and IR) was done in 32 cases (22 grade I and 10 grade II meningiomas). Statistically significant correlation was seen between the mitotic count in each grade and PHH3 values, and also between the grade of the tumor and the values of Ki-67 and PHH3. Both techniques offered advantages over, and correlated well with, the existing techniques and hence can be applied in routine practice.

  1. Color Counts, Too!

    ERIC Educational Resources Information Center

    Sewell, Julia H.

    1983-01-01

    Students with undetected color blindness can have problems with specific teaching methods and materials. The problem should be ruled out in children with suspected learning disabilities and taken into account in career counseling. Nine examples of simple classroom modifications are described. (CL)

  2. AMY trigger system

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakai, Yoshihide

    1989-04-01

    A trigger system of the AMY detector at the TRISTAN e⁺e⁻ collider is described briefly. The system uses a simple track-segment and shower-cluster counting scheme to classify events to be triggered. It has been operating successfully since 1987.

  3. The cognitive foundations of early arithmetic skills: It is counting and number judgment, but not finger gnosis, that count.

    PubMed

    Long, Imogen; Malone, Stephanie A; Tolan, Anne; Burgoyne, Kelly; Heron-Delaney, Michelle; Witteveen, Kate; Hulme, Charles

    2016-12-01

    Following on from ideas developed by Gerstmann, a body of work has suggested that impairments in finger gnosis may be causally related to children's difficulties in learning arithmetic. We report a study with a large sample of typically developing children (N=197) in which we assessed finger gnosis and arithmetic along with a range of other relevant cognitive predictors of arithmetic skills (vocabulary, counting, and symbolic and nonsymbolic magnitude judgments). Contrary to some earlier claims, we found no meaningful association between finger gnosis and arithmetic skills. Counting and symbolic magnitude comparison were, however, powerful predictors of arithmetic skills, replicating a number of earlier findings. Our findings seriously question theories that posit either a simple association or a causal connection between finger gnosis and the development of arithmetic skills. Crown Copyright © 2016. Published by Elsevier Inc. All rights reserved.

  4. Immunologic findings, thrombocytopenia and disease activity in lupus nephritis.

    PubMed Central

    Clark, W. F.; Linton, A. L.; Cordy, P. E.; Keown, P. E.; Lohmann, R. C.; Lindsay, R. M.

    1978-01-01

    Twenty patients with nephritis due to systemic lupus erythematosus were followed up for a mean of 34 months after renal biopsy with serial determinations of total serum complement and C3 and C4 concentrations, binding of deoxyribonucleic acid (DNA), antinuclear antibody pattern and platelet count. There were 25 episodes of nonhematologic observed disease activity in 16 of the 20 patients; elevated DNA binding and thrombocytopenia correlated well with these episodes. The mean platelet count during episodes of observed disease activity was 96 ± 42 × 10⁹/L, which was significantly different from the mean count of 248 ± 90 × 10⁹/L during disease quiescence. The proportion of false-positive results with the immunologic tests varied from 25% to 67% and with platelet counts it was 11%. It is suggested that thrombocytopenia may be a simple and accurate index of disease activity in lupus nephritis. PMID:350367

  5. A multi-purpose readout electronics for CdTe and CZT detectors for x-ray imaging applications

    NASA Astrophysics Data System (ADS)

    Yue, X. B.; Deng, Z.; Xing, Y. X.; Liu, Y. N.

    2017-09-01

    A multi-purpose readout electronics based on the DPLMS digital filter has been developed for CdTe and CZT detectors for X-ray imaging applications. Different filter coefficients can be synthesized, optimized either for high energy resolution at relatively low counting rates or for high-rate photon counting with reduced energy resolution. The effects of signal width constraints, sampling rate and length were numerically studied by Monte Carlo simulation with simple CR-RC shaper input signals. The signal width constraint had a minor effect: the ENC increased by only 6.5% when the signal width was shortened to 2τc. The sampling rate and length depended on the characteristic time constants of both input and output signals. For simple CR-RC input signals, the minimum number of filter coefficients was 12, with a 10% increase in ENC, when the output time constant was close to the input shaping time. A prototype readout electronics was developed for demonstration, using a previously designed analog front-end ASIC and a commercial ADC card. Two different DPLMS filters were successfully synthesized and applied for high-resolution and high-counting-rate applications, respectively. The readout electronics was also tested with a linear-array CdTe detector. The energy resolution at the Am-241 59.5 keV peak was measured to be 6.41% FWHM for the high-resolution filter and 13.58% FWHM for the high-counting-rate filter with a 160 ns signal width constraint.

  6. SERE: single-parameter quality control and sample comparison for RNA-Seq.

    PubMed

    Schulze, Stefan K; Kanwar, Rahul; Gölzenleuchter, Meike; Therneau, Terry M; Beutler, Andreas S

    2012-10-03

    Assessing the reliability of experimental replicates (or global alterations corresponding to different experimental conditions) is a critical step in analyzing RNA-Seq data. Pearson's correlation coefficient r has been widely used in the RNA-Seq field even though its statistical characteristics may be poorly suited to the task. Here we present a single-parameter test procedure for count data, the Simple Error Ratio Estimate (SERE), that can determine whether two RNA-Seq libraries are faithful replicates or globally different. Benchmarking shows that the interpretation of SERE is unambiguous regardless of the total read count or the range of expression differences among bins (exons or genes), a score of 1 indicating faithful replication (i.e., samples are affected only by Poisson variation of individual counts), a score of 0 indicating data duplication, and scores >1 corresponding to true global differences between RNA-Seq libraries. On the contrary the interpretation of Pearson's r is generally ambiguous and highly dependent on sequencing depth and the range of expression levels inherent to the sample (difference between lowest and highest bin count). Cohen's simple Kappa results are also ambiguous and are highly dependent on the choice of bins. For quantifying global sample differences SERE performs similarly to a measure based on the negative binomial distribution yet is simpler to compute. SERE can therefore serve as a straightforward and reliable statistical procedure for the global assessment of pairs or large groups of RNA-Seq datasets by a single statistical parameter.
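    A SERE-style statistic can be sketched from the description above: expected bin counts proportional to the library totals, a Pearson chi-square accumulated over bins, normalized per bin, and square-rooted. This reproduces the behavior the abstract describes (0 for duplicated data, ≈1 for pure Poisson replication, >1 for global differences); the exact normalization in the published method may differ:

```python
import math

def sere(lib1, lib2):
    """SERE-style dispersion statistic for two count libraries over the
    same bins (exons or genes)."""
    n1, n2 = sum(lib1), sum(lib2)
    total = n1 + n2
    chi2, bins = 0.0, 0
    for y1, y2 in zip(lib1, lib2):
        t = y1 + y2
        if t == 0:
            continue  # bin observed in neither library: uninformative
        e1, e2 = t * n1 / total, t * n2 / total  # depth-scaled expectations
        chi2 += (y1 - e1) ** 2 / e1 + (y2 - e2) ** 2 / e2
        bins += 1
    return math.sqrt(chi2 / bins)  # df per bin = (2 samples - 1) = 1

# A library compared against an exact copy scores 0 (data duplication)
a = [10, 40, 5, 120, 0, 33]
print(sere(a, a))  # -> 0.0
```

Note that scaling one library by a constant also yields 0, since the expected counts are proportional to each library's total: sequencing depth is normalized away, which is part of what makes the score's interpretation depth-independent.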

  7. SERE: Single-parameter quality control and sample comparison for RNA-Seq

    PubMed Central

    2012-01-01

    Background Assessing the reliability of experimental replicates (or global alterations corresponding to different experimental conditions) is a critical step in analyzing RNA-Seq data. Pearson’s correlation coefficient r has been widely used in the RNA-Seq field even though its statistical characteristics may be poorly suited to the task. Results Here we present a single-parameter test procedure for count data, the Simple Error Ratio Estimate (SERE), that can determine whether two RNA-Seq libraries are faithful replicates or globally different. Benchmarking shows that the interpretation of SERE is unambiguous regardless of the total read count or the range of expression differences among bins (exons or genes), a score of 1 indicating faithful replication (i.e., samples are affected only by Poisson variation of individual counts), a score of 0 indicating data duplication, and scores >1 corresponding to true global differences between RNA-Seq libraries. On the contrary the interpretation of Pearson’s r is generally ambiguous and highly dependent on sequencing depth and the range of expression levels inherent to the sample (difference between lowest and highest bin count). Cohen’s simple Kappa results are also ambiguous and are highly dependent on the choice of bins. For quantifying global sample differences SERE performs similarly to a measure based on the negative binomial distribution yet is simpler to compute. Conclusions SERE can therefore serve as a straightforward and reliable statistical procedure for the global assessment of pairs or large groups of RNA-Seq datasets by a single statistical parameter. PMID:23033915

  8. The IDEA model: A single equation approach to the Ebola forecasting challenge.

    PubMed

    Tuite, Ashleigh R; Fisman, David N

    2018-03-01

    Mathematical modeling is increasingly accepted as a tool that can inform disease control policy in the face of emerging infectious diseases, such as the 2014-2015 West African Ebola epidemic, but little is known about the relative performance of alternate forecasting approaches. The RAPIDD Ebola Forecasting Challenge (REFC) tested the ability of eight mathematical models to generate useful forecasts in the face of simulated Ebola outbreaks. We used a simple, phenomenological single-equation model (the "IDEA" model), which relies only on case counts, in the REFC. Model fits were performed using a maximum likelihood approach. We found that the model performed reasonably well relative to other more complex approaches, with performance metrics ranked on average 4th or 5th among participating models. IDEA appeared better suited to long- than short-term forecasts, and could be fit using nothing but reported case counts. Several limitations were identified, including difficulty in identifying epidemic peak (even retrospectively), unrealistically precise confidence intervals, and difficulty interpolating daily case counts when using a model scaled to epidemic generation time. More realistic confidence intervals were generated when case counts were assumed to follow a negative binomial, rather than Poisson, distribution. Nonetheless, IDEA represents a simple phenomenological model, easily implemented in widely available software packages that could be used by frontline public health personnel to generate forecasts with accuracy that approximates that which is achieved using more complex methodologies. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
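    The IDEA model referenced above is commonly written (in the original Fisman et al. formulation; treat the exact form here as an assumption) as I(t) = (R0/(1+d)^t)^t, with t in units of the epidemic generation time and d a discounting factor capturing control and behaviour change. A minimal sketch with illustrative parameters:

```python
def idea_incidence(t, r0, d):
    """IDEA model: incident case count in generation t, with the basic
    reproduction number r0 discounted by a factor (1+d) per generation:
    I(t) = (r0 / (1 + d)**t) ** t."""
    return (r0 / (1.0 + d) ** t) ** t

# Illustrative parameters only (not fitted to the REFC scenarios)
curve = [idea_incidence(t, r0=2.0, d=0.05) for t in range(21)]
peak_gen = max(range(21), key=lambda t: curve[t])
print(peak_gen)  # -> 7
```

The two free parameters (r0, d) are what make the model attractive for frontline use: a fit needs nothing but reported case counts, at the cost of the peak-identification and interval-calibration difficulties the abstract notes.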

  9. Isoelectronic substitutions and aluminium alloying in the Ta-Nb-Hf-Zr-Ti high-entropy alloy superconductor

    NASA Astrophysics Data System (ADS)

    von Rohr, Fabian O.; Cava, Robert J.

    2018-03-01

    High-entropy alloys (HEAs) are a new class of materials constructed from multiple principal elements statistically arranged on simple crystallographic lattices. Due to the large amount of disorder present, they are excellent model systems for investigating the properties of materials intermediate between crystalline and amorphous states. Here we report the effects of systematic isoelectronic replacements, using Mo-Y, Mo-Sc, and Cr-Sc mixtures, for the valence electron count 4 and 5 elements in the body-centered cubic (BCC) Ta-Nb-Zr-Hf-Ti high-entropy alloy (HEA) superconductor. We find that the superconducting transition temperature Tc strongly depends on the elemental makeup of the alloy, and not exclusively its electron count. The replacement of niobium or tantalum by an isoelectronic mixture lowers the transition temperature by more than 60%, while the isoelectronic replacement of hafnium, zirconium, or titanium has a limited impact on Tc. We further explore the alloying of aluminium into the nearly optimal electron count [TaNb]0.67(ZrHfTi)0.33 HEA superconductor. The electron count dependence of the superconducting Tc for (HEA)Alx is found to be more crystalline-like than for the [TaNb]1-x(ZrHfTi)x HEA solid solution. For an aluminum content of x = 0.4 the high-entropy stabilization of the simple BCC lattice breaks down. This material crystallizes in the tetragonal β-uranium structure type and superconductivity is not observed above 1.8 K.

  10. Radiation Discrimination in LiBaF3 Scintillator Using Digital Signal Processing Techniques

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aalseth, Craig E.; Bowyer, Sonya M.; Reeder, Paul L.

    2002-11-01

    The new scintillator material LiBaF3:Ce offers the possibility of measuring neutron or alpha count rates and energy spectra simultaneously while measuring gamma count rates and spectra using a single detector.

  11. Blood eosinophil counts for the prediction of the severity of exercise-induced bronchospasm in asthma.

    PubMed

    Koh, Y I; Choi, S

    2002-02-01

    It has been suggested that airway eosinophilic inflammation is associated with the severity of exercise-induced bronchospasm (EIB). Blood eosinophils are known to be an indirect marker of airway inflammation in asthma. The aim of this study was to investigate whether a simple and easy blood test for eosinophil counts may predict the severity of EIB in asthma. Seventy-seven men with perennial asthma (age range 18-23 years) were included. A lung function test, skin prick test, and blood tests for eosinophil counts and total IgE levels were performed. A methacholine bronchial provocation test and, 24 h later, a free running test were carried out. EIB was defined as a 15% reduction or more in post-exercise FEV1 compared with the pre-exercise FEV1 value. Atopy score was defined as the sum of mean wheal diameters to allergens. EIB was observed in 60 (78%) of 77 subjects. Asthmatics with EIB showed significantly increased percentages of eosinophils (P < 0.01), log eosinophil counts (P < 0.001), and atopy scores (P < 0.05) and decreased log PC20 values (P < 0.05) compared with asthmatics without EIB. Asthmatics with eosinophils of > 700 microl(-1) (36.9 +/- 12.7%) had a significantly greater maximal % fall in FEV1 after exercise than asthmatics with eosinophils of < 350 microl(-1) (24.7 +/- 16.6%, P < 0.05). Blood eosinophil counts > 350 microl(-1) yielded a specificity of 88% and a positive predictive value of 93% for the presence of EIB. In a multiple regression analysis of maximal % fall in FEV1 on log eosinophil counts, log PC20, log IgE and atopy score, only blood eosinophil counts were a significant factor contributing to the maximal % fall in FEV1 after exercise. These findings not only suggest that a simple blood test for eosinophils may be useful in predicting the severity of EIB, but also reinforce the view that airway eosinophilic inflammation may play a major role in EIB in asthma.
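    The reported specificity and positive predictive value follow from standard 2×2-table definitions. A small sketch, with hypothetical cell counts rather than the study's data:

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Standard 2x2-table measures for evaluating a diagnostic cut-off."""
    sensitivity = tp / (tp + fn)   # true positives among all with the condition
    specificity = tn / (tn + fp)   # true negatives among all without it
    ppv = tp / (tp + fp)           # positive predictive value
    return sensitivity, specificity, ppv

# hypothetical illustration: 9 true positives, 1 false positive,
# 3 false negatives, 7 true negatives
sens, spec, ppv = diagnostic_metrics(tp=9, fp=1, fn=3, tn=7)
```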

  12. Performance of today’s dual energy CT and future multi energy CT in virtual non-contrast imaging and in iodine quantification: A simulation study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faby, Sebastian, E-mail: sebastian.faby@dkfz.de; Kuchenbecker, Stefan; Sawall, Stefan

    2015-07-15

    Purpose: To study the performance of different dual energy computed tomography (DECT) techniques, which are available today, and future multi energy CT (MECT) employing novel photon counting detectors in an image-based material decomposition task. Methods: The material decomposition performance of different energy-resolved CT acquisition techniques is assessed and compared in a simulation study of virtual non-contrast imaging and iodine quantification. The material-specific images are obtained via a statistically optimal image-based material decomposition. A projection-based maximum likelihood approach was used for comparison with the authors’ image-based method. The different dedicated dual energy CT techniques are simulated employing realistic noise models and x-ray spectra. The authors compare dual source DECT with fast kV switching DECT and the dual layer sandwich detector DECT approach. Subsequent scanning and a subtraction method are studied as well. Further, the authors benchmark future MECT with novel photon counting detectors in a dedicated DECT application against the performance of today’s DECT using a realistic model. Additionally, possible dual source concepts employing photon counting detectors are studied. Results: The DECT comparison study shows that dual source DECT has the best performance, followed by the fast kV switching technique and the sandwich detector approach. Comparing DECT with future MECT, the authors found noticeable material image quality improvements for an ideal photon counting detector; however, a realistic detector model with multiple energy bins predicts a performance on the level of dual source DECT at 100 kV/Sn 140 kV. Employing photon counting detectors in dual source concepts can improve the performance again above the level of a single realistic photon counting detector and also above the level of dual source DECT.
Conclusions: Substantial differences in the performance of today’s DECT approaches were found for the application of virtual non-contrast and iodine imaging. Future MECT with realistic photon counting detectors currently can only perform comparably to dual source DECT at 100 kV/Sn 140 kV. Dual source concepts with photon counting detectors could be a solution to this problem, promising a better performance.

  13. Single-particle imaging for biosensor applications

    NASA Astrophysics Data System (ADS)

    Yorulmaz, Mustafa; Isil, Cagatay; Seymour, Elif; Yurdakul, Celalettin; Solmaz, Berkan; Koc, Aykut; Ünlü, M. Selim

    2017-10-01

    Current state-of-the-art technology for in-vitro diagnostics employs laboratory tests such as ELISA, which consist of multi-step test procedures and give results in analog format. Results of these tests are interpreted by the color change in a set of diluted samples in a multi-well plate. However, detection of minute changes in the color poses challenges and can lead to false interpretations. Instead, a technique that allows individual counting of specific binding events would be useful to overcome such challenges. Digital imaging has recently been applied to diagnostics applications. SPR is one of the techniques allowing quantitative measurements. However, the limit of detection of this technique is on the order of nM. The currently required detection limit, which is already achieved with the analog techniques, is around pM. Optical techniques that are simple to implement and can offer better sensitivities have great potential for use in medical diagnostics. Interference microscopy is one of the tools that have been investigated over the years in the optics field. Most of these studies have been performed in confocal geometry, with each individual nanoparticle observed separately. Here, we achieve wide-field imaging of individual nanoparticles in a large field-of-view (166 μm × 250 μm) on a micro-array based sensor chip in a fraction of a second. We tested the sensitivity of our technique on dielectric nanoparticles because they exhibit optical properties similar to viruses and cells. We can detect non-resonant dielectric polystyrene nanoparticles of 100 nm. Moreover, we perform post-processing to further enhance visibility.

  14. Characterization of the 2012-044C Briz-M Upper Stage Breakup

    NASA Technical Reports Server (NTRS)

    Hamilton, Joseph A.; Matney, Mark

    2013-01-01

    The NASA breakup model prediction was close to the observed population for catalog objects. The NASA breakup model predicted a larger population than was observed for objects under 10 cm. The stare technique produces low observation counts, but is readily comparable to model predictions. Customized stare parameters (Az, El, Range) were effective to increase the opportunities for HAX to observe the debris cloud. Other techniques to increase observation count will be considered for future breakup events.

  15. Comparison of Kato-Katz, ethyl-acetate sedimentation, and Midi Parasep® in the diagnosis of hookworm, Ascaris and Trichuris infections in the context of an evaluation of rural sanitation in India.

    PubMed

    Funk, Anna L; Boisson, Sophie; Clasen, Thomas; Ensink, Jeroen H J

    2013-06-01

    The Kato-Katz, conventional ethyl-acetate sedimentation, and Midi Parasep(®) methods for diagnosing infection with soil-transmitted helminths were compared. The Kato-Katz technique gave the best overall diagnostic performance with the highest results in all measures (prevalence, faecal egg count, sensitivity) followed by the conventional ethyl-acetate and then the Midi Parasep(®) technique. The Kato-Katz technique showed a significantly higher faecal egg count and sensitivity for both hookworm and Trichuris as compared to the Midi Parasep(®) technique. The conventional ethyl-acetate technique produced smaller pellets and showed lower pellet mobility as compared to the Midi Parasep(®). Copyright © 2013 Elsevier B.V. All rights reserved.
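    Kato-Katz faecal egg counts are conventionally scaled from the slide template to eggs per gram (EPG) of stool. A brief sketch assuming the common WHO 41.7 mg template (≈×24 multiplier); the template size actually used in this study is an assumption here, not stated above:

```python
def kato_katz_epg(eggs_counted, template_mg=41.7):
    """Scale a Kato-Katz slide egg count to eggs per gram (EPG) of stool.
    The 41.7 mg template (roughly a x24 multiplier) is the common WHO
    default, assumed for illustration."""
    return eggs_counted * (1000.0 / template_mg)
```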

  16. An algorithm for determining the rotation count of pulsars

    NASA Astrophysics Data System (ADS)

    Freire, Paulo C. C.; Ridolfi, Alessandro

    2018-06-01

    We present here a simple, systematic method for determining the correct global rotation count of a radio pulsar, an essential step for the derivation of an accurate phase-coherent ephemeris. We then build on this method by developing a new algorithm for determining the global rotation count for pulsars with sparse timing data sets. This makes it possible to obtain phase-coherent ephemerides for pulsars for which this has been impossible until now. As an example, we do this for PSR J0024-7205aa, an extremely faint millisecond pulsar (MSP) recently discovered in the globular cluster 47 Tucanae. This algorithm has the potential to significantly reduce the number of observations and the amount of telescope time needed to follow up on new pulsar discoveries.

  17. Effects of extending the one-more-than technique with the support of a mobile purchasing assistance system.

    PubMed

    Hsu, Guo-Liang; Tang, Jung-Chang; Hwang, Wu-Yuin

    2014-08-01

    The one-more-than technique is an effective strategy for individuals with intellectual disabilities (ID) to use when making purchases. However, the heavy cognitive demands of money counting skills potentially limit how individuals with ID shop. This study employed a multiple-probe design across participants and settings, via the assistance of a mobile purchasing assistance system (MPAS), to assess the effectiveness of the one-more-than technique on independent purchases for items with prices beyond the participants' money counting skills. Results indicated that the techniques with the MPAS could effectively convert participants' initial money counting problems into useful advantages for successfully promoting the independent purchasing skills of three secondary school students with ID. Also noteworthy is the fact that mobile technologies could be a permanent prompt for those with ID to make purchases in their daily lives. The treatment effects could be maintained for eight weeks and generalized across three community settings. Implications for practice and future studies are provided. Copyright © 2014 Elsevier Ltd. All rights reserved.

  18. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data.

    PubMed

    Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M

    2014-02-01

    The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.
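    The criteria-based approach described above can be illustrated with a toy rule set: each grain carries EBSD-derived features (mean misorientation, aspect ratio, area), thresholds assign a ferrite class, and class areas are accumulated into fractions. The class names and threshold values below are purely illustrative assumptions, not the paper's calibrated criteria:

```python
def classify_ferrite(grain):
    """Toy rule-based classifier; thresholds and labels are illustrative."""
    if grain["mean_misorientation"] > 1.0 and grain["aspect_ratio"] > 3.0:
        return "acicular ferrite"      # hypothetical criterion
    if grain["mean_misorientation"] > 1.0:
        return "bainitic ferrite"      # hypothetical criterion
    return "polygonal ferrite"

def area_fractions(grains):
    # accumulate per-class area fractions, as in automated quantification
    total = sum(g["area"] for g in grains)
    fractions = {}
    for g in grains:
        label = classify_ferrite(g)
        fractions[label] = fractions.get(label, 0.0) + g["area"] / total
    return fractions
```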

  19. Improving photoelectron counting and particle identification in scintillation detectors with Bayesian techniques

    NASA Astrophysics Data System (ADS)

    Akashi-Ronquest, M.; Amaudruz, P.-A.; Batygov, M.; Beltran, B.; Bodmer, M.; Boulay, M. G.; Broerman, B.; Buck, B.; Butcher, A.; Cai, B.; Caldwell, T.; Chen, M.; Chen, Y.; Cleveland, B.; Coakley, K.; Dering, K.; Duncan, F. A.; Formaggio, J. A.; Gagnon, R.; Gastler, D.; Giuliani, F.; Gold, M.; Golovko, V. V.; Gorel, P.; Graham, K.; Grace, E.; Guerrero, N.; Guiseppe, V.; Hallin, A. L.; Harvey, P.; Hearns, C.; Henning, R.; Hime, A.; Hofgartner, J.; Jaditz, S.; Jillings, C. J.; Kachulis, C.; Kearns, E.; Kelsey, J.; Klein, J. R.; Kuźniak, M.; LaTorre, A.; Lawson, I.; Li, O.; Lidgard, J. J.; Liimatainen, P.; Linden, S.; McFarlane, K.; McKinsey, D. N.; MacMullin, S.; Mastbaum, A.; Mathew, R.; McDonald, A. B.; Mei, D.-M.; Monroe, J.; Muir, A.; Nantais, C.; Nicolics, K.; Nikkel, J. A.; Noble, T.; O'Dwyer, E.; Olsen, K.; Orebi Gann, G. D.; Ouellet, C.; Palladino, K.; Pasuthip, P.; Perumpilly, G.; Pollmann, T.; Rau, P.; Retière, F.; Rielage, K.; Schnee, R.; Seibert, S.; Skensved, P.; Sonley, T.; Vázquez-Jáuregui, E.; Veloce, L.; Walding, J.; Wang, B.; Wang, J.; Ward, M.; Zhang, C.

    2015-05-01

    Many current and future dark matter and neutrino detectors are designed to measure scintillation light with a large array of photomultiplier tubes (PMTs). The energy resolution and particle identification capabilities of these detectors depend in part on the ability to accurately identify individual photoelectrons in PMT waveforms despite large variability in pulse amplitudes and pulse pileup. We describe a Bayesian technique that can identify the times of individual photoelectrons in a sampled PMT waveform without deconvolution, even when pileup is present. To demonstrate the technique, we apply it to the general problem of particle identification in single-phase liquid argon dark matter detectors. Using the output of the Bayesian photoelectron counting algorithm described in this paper, we construct several test statistics for rejection of backgrounds for dark matter searches in argon. Compared to simpler methods based on either observed charge or peak finding, the photoelectron counting technique improves both energy resolution and particle identification of low energy events in calibration data from the DEAP-1 detector and simulation of the larger MiniCLEAN dark matter detector.

  20. 42 CFR 493.1276 - Standard: Clinical cytogenetics.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... of accessioning, cell preparation, photographing or other image reproduction technique, photographic... records that document the following: (1) The media used, reactions observed, number of cells counted, number of cells karyotyped, number of chromosomes counted for each metaphase spread, and the quality of...

  1. Validation of an automated colony counting system for group A Streptococcus.

    PubMed

    Frost, H R; Tsoi, S K; Baker, C A; Laho, D; Sanderson-Smith, M L; Steer, A C; Smeesters, P R

    2016-02-08

    The practice of counting bacterial colony forming units on agar plates has long been used as a method to estimate the concentration of live bacteria in culture. However, due to the laborious and potentially error prone nature of this measurement technique, an alternative method is desirable. Recent technologic advancements have facilitated the development of automated colony counting systems, which reduce errors introduced during the manual counting process and recording of information. An additional benefit is the significant reduction in time taken to analyse colony counting data. Whilst automated counting procedures have been validated for a number of microorganisms, the process has not been successful for all bacteria due to the requirement for a relatively high contrast between bacterial colonies and growth medium. The purpose of this study was to validate an automated counting system for use with group A Streptococcus (GAS). Twenty-one different GAS strains, representative of major emm-types, were selected for assessment. In order to introduce the required contrast for automated counting, 2,3,5-triphenyl-2H-tetrazolium chloride (TTC) dye was added to Todd-Hewitt broth with yeast extract (THY) agar. Growth on THY agar with TTC was compared with growth on blood agar and THY agar to ensure the dye was not detrimental to bacterial growth. Automated colony counts using a ProtoCOL 3 instrument were compared with manual counting to confirm accuracy over the stages of the growth cycle (latent, mid-log and stationary phases) and in a number of different assays. The average percentage differences between plating and counting methods were analysed using the Bland-Altman method. A percentage difference of ±10 % was determined as the cut-off for a critical difference between plating and counting methods. All strains measured had an average difference of less than 10 % when plated on THY agar with TTC. 
This consistency was also observed over all phases of the growth cycle and when plated in blood following bactericidal assays. Agreement between these methods suggests that the use of an automated colony counting technique for GAS will significantly reduce time spent counting bacteria, enabling a more efficient and accurate measurement of bacterial concentration in culture.
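    The Bland-Altman analysis above compares paired counts via percentage differences relative to the pairwise mean, with ±10 % as the critical-difference cut-off. A minimal sketch (sample values hypothetical):

```python
def bland_altman_percent(manual, automated):
    """Mean percentage difference and 95% limits of agreement for paired
    counts, each difference taken relative to the pairwise mean."""
    diffs = [100.0 * (a - m) / ((a + m) / 2.0) for m, a in zip(manual, automated)]
    mean_diff = sum(diffs) / len(diffs)
    var = sum((d - mean_diff) ** 2 for d in diffs) / (len(diffs) - 1)
    limits = (mean_diff - 1.96 * var ** 0.5, mean_diff + 1.96 * var ** 0.5)
    return mean_diff, limits

def within_critical_difference(mean_diff, cutoff=10.0):
    # the study's +/-10 % criterion for acceptable agreement
    return abs(mean_diff) <= cutoff
```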

  2. THE HUBBLE SPACE TELESCOPE WIDE FIELD CAMERA 3 EARLY RELEASE SCIENCE DATA: PANCHROMATIC FAINT OBJECT COUNTS FOR 0.2-2 μm WAVELENGTH

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Windhorst, Rogier A.; Cohen, Seth H.; Mechtley, Matt

    2011-04-01

    We describe the Hubble Space Telescope (HST) Wide Field Camera 3 (WFC3) Early Release Science (ERS) observations in the Great Observatories Origins Deep Survey (GOODS) South field. The new WFC3 ERS data provide calibrated, drizzled mosaics in the UV filters F225W, F275W, and F336W, as well as in the near-IR filters F098M (Ys), F125W (J), and F160W (H) with 1-2 HST orbits per filter. Together with the existing HST Advanced Camera for Surveys (ACS) GOODS-South mosaics in the BViz filters, these panchromatic 10-band ERS data cover 40-50 arcmin² at 0.2-1.7 μm in wavelength at 0.07''-0.15'' FWHM resolution and 0.090'' Multidrizzled pixels to depths of AB ≈ 26.0-27.0 mag (5σ) for point sources, and AB ≈ 25.5-26.5 mag for compact galaxies. In this paper, we describe (1) the scientific rationale and the data taking plus reduction procedures of the panchromatic 10-band ERS mosaics, (2) the procedure of generating object catalogs across the 10 different ERS filters and the specific star-galaxy separation techniques used, and (3) the reliability and completeness of the object catalogs from the WFC3 ERS mosaics. The excellent 0.07''-0.15'' FWHM resolution of HST/WFC3 and ACS makes star-galaxy separation straightforward over a factor of 10 in wavelength to AB ≈ 25-26 mag from the UV to the near-IR, respectively.
Our main results are: (1) the proper motion of faint ERS stars is detected over 6 years at 3.06 ± 0.66 mas year⁻¹ (4.6σ), consistent with Galactic structure models; (2) both the Galactic star counts and the galaxy counts show mild but significant trends of decreasing count slopes from the mid-UV to the near-IR over a factor of 10 in wavelength; (3) combining the 10-band ERS counts with the panchromatic Galaxy and Mass Assembly survey counts at the bright end (10 mag ≲ AB ≲ 20 mag) and the Hubble Ultra Deep Field counts in the BVizYsJH filters at the faint end (24 mag ≲ AB ≲ 30 mag) yields galaxy counts that are well measured over the entire flux range 10 mag ≲ AB ≲ 30 mag for 0.2-2 μm in wavelength; (4) simple luminosity+density evolution models can fit the galaxy counts over this entire flux range. However, no single model can explain the counts over this entire flux range in all 10 filters simultaneously. More sophisticated models of galaxy assembly are needed to reproduce the overall constraints provided by the current panchromatic galaxy counts for 10 mag ≲ AB ≲ 30 mag over a factor of 10 in wavelength.

  3. An Algorithm to Automatically Generate the Combinatorial Orbit Counting Equations

    PubMed Central

    Melckenbeeck, Ine; Audenaert, Pieter; Michoel, Tom; Colle, Didier; Pickavet, Mario

    2016-01-01

    Graphlets are small subgraphs, usually containing up to five vertices, that can be found in a larger graph. Identifying the graphlets that a vertex in an explored graph touches can provide useful information about the local structure of the graph around that vertex. Actually finding all graphlets in a large graph can be time-consuming, however. As the graphlets grow in size, more distinct graphlets emerge and the time needed to find each graphlet also scales up. If it is not necessary to find each instance of each graphlet, and knowing the number of graphlets touching each node of the graph suffices, the problem is less hard. Previous research shows a way to simplify counting the graphlets: instead of looking for the graphlets needed, smaller graphlets are searched for, along with the number of common neighbors of vertices. Solving a system of equations then gives the number of times a vertex is part of each graphlet of the desired size. However, until now, equations existed only for counting graphlets with 4 or 5 nodes. In this paper, two new techniques are presented. The first allows the needed equations to be generated automatically. This eliminates the tedious work of deriving them manually each time an extra node is added to the graphlets. The technique is independent of the number of nodes in the graphlets and can thus be used to count larger graphlets than previously possible. The second technique gives all graphlets a unique ordering that is easily extended to name graphlets of any size. Both techniques were used to generate equations to count graphlets with 4, 5 and 6 vertices, which extends all previous results. Code can be found at https://github.com/IneMelckenbeeck/equation-generator and https://github.com/IneMelckenbeeck/graphlet-naming. PMID:26797021
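    The flavor of such counting equations is already visible at the 3-node scale: the neighbor pairs of a vertex v split exactly into triangles and v-centered paths, so one orbit count can be solved from the other. A small illustrative sketch (not the paper's equation generator):

```python
from itertools import combinations

def orbit_counts(adj, v):
    """3-node graphlet orbits at v via the counting equation
    C(deg(v), 2) = (#triangles at v) + (#paths centered at v),
    where adj maps each vertex to its set of neighbors."""
    nbrs = adj[v]
    pairs = len(nbrs) * (len(nbrs) - 1) // 2          # C(deg(v), 2)
    triangles = sum(1 for a, b in combinations(sorted(nbrs), 2) if b in adj[a])
    path_centers = pairs - triangles                  # solved from the equation
    return triangles, path_centers
```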

  4. Exploring the effects of transfers and readmissions on trends in population counts of hospital admissions for coronary heart disease: a Western Australian data linkage study.

    PubMed

    Lopez, Derrick; Nedkoff, Lee; Knuiman, Matthew; Hobbs, Michael S T; Briffa, Thomas G; Preen, David B; Hung, Joseph; Beilby, John; Mathur, Sushma; Reynolds, Anna; Sanfilippo, Frank M

    2017-11-17

    To develop a method for categorising coronary heart disease (CHD) subtype in linked data accounting for different CHD diagnoses across records, and to compare hospital admission numbers and ratios of unlinked versus linked data for each CHD subtype over time, and across age groups and sex. Cohort study. Person-linked hospital administrative data covering all admissions for CHD in Western Australia from 1988 to 2013. Ratios of (1) unlinked admission counts to contiguous admission (CA) counts (accounting for transfers), and (2) 28-day episode counts (accounting for transfers and readmissions) to CA counts stratified by CHD subtype, sex and age group. In all CHD subtypes, the ratios changed in a linear or quadratic fashion over time and the coefficients of the trend term differed across CHD subtypes. Furthermore, for many CHD subtypes the ratios also differed by age group and sex. For example, in women aged 35-54 years, the ratio of unlinked to CA counts for non-ST elevation myocardial infarction admissions in 2000 was 1.10, and this increased in a linear fashion to 1.30 in 2013, representing an annual increase of 0.0148. The use of unlinked counts in epidemiological estimates of CHD hospitalisations overestimates CHD counts. The CA and 28-day episode counts are more aligned with epidemiological studies of CHD. The degree of overestimation of counts using only unlinked counts varies in a complex manner with CHD subtype, time, sex and age group, and it is not possible to apply a simple correction factor to counts obtained from unlinked data. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
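    The contiguous-admission (CA) logic described above can be sketched as collapsing a person's date-sorted hospital records into episodes, treating a record as a transfer when it begins on or before the previous discharge date. The record layout and transfer rule below are simplifying assumptions, not the study's exact linkage algorithm:

```python
from datetime import date

def contiguous_admission_count(admissions):
    """Collapse (admit_date, discharge_date) records into contiguous-admission
    (CA) episodes: a record starting on or before the previous discharge is
    counted as a transfer, not a new episode."""
    episodes = 0
    prev_discharge = None
    for admit, discharge in sorted(admissions):
        if prev_discharge is None or admit > prev_discharge:
            episodes += 1                              # new CA episode
            prev_discharge = discharge
        else:
            prev_discharge = max(prev_discharge, discharge)  # transfer
    return episodes
```

With this rule, an unlinked count (one per record) overstates the CA count whenever transfers are present, which is the overestimation the study quantifies.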

  5. 32-channel single photon counting module for ultrasensitive detection of DNA sequences

    NASA Astrophysics Data System (ADS)

    Gudkov, Georgiy; Dhulla, Vinit; Borodin, Anatoly; Gavrilov, Dmitri; Stepukhovich, Andrey; Tsupryk, Andrey; Gorbovitski, Boris; Gorfinkel, Vera

    2006-10-01

    We continue our work on the design and implementation of multi-channel single photon detection systems for highly sensitive detection of ultra-weak fluorescence signals, for high-performance, multi-lane DNA sequencing instruments. A fiberized, 32-channel single photon detection (SPD) module based on the single photon avalanche diode (SPAD), model C30902S-DTC, from Perkin Elmer Optoelectronics (PKI) has been designed and implemented. The unavailability of high-performance, large-area SPAD arrays and our desire to design high-performance photon counting systems drive us to use individual diodes. Slight modifications to our quenching circuit have doubled the linear range of our system from 1 MHz to 2 MHz, which is the upper limit for these devices, and the maximum saturation count rate has increased to 14 MHz. The detector module comprises a single-board computer (PC-104) that enables data visualization, recording, processing, and transfer. Very low dark counts (300-1000 counts/s), robustness, efficient and simple data collection and processing, ease of connectivity to any other application with similar requirements, and performance comparable to the best commercially available single photon counting module (SPCM from PKI) are some of the features of this system.

  6. An interlaboratory comparison of sizing and counting of subvisible particles mimicking protein aggregates.

    PubMed

    Ripple, Dean C; Montgomery, Christopher B; Hu, Zhishang

    2015-02-01

    Accurate counting and sizing of protein particles has been limited by discrepancies of counts obtained by different methods. To understand the bias and repeatability of techniques in common use in the biopharmaceutical community, the National Institute of Standards and Technology has conducted an interlaboratory comparison for sizing and counting subvisible particles from 1 to 25 μm. Twenty-three laboratories from industry, government, and academic institutions participated. The circulated samples consisted of a polydisperse suspension of abraded ethylene tetrafluoroethylene particles, which closely mimic the optical contrast and morphology of protein particles. For restricted data sets, agreement between data sets was reasonably good: relative standard deviations (RSDs) of approximately 25% for light obscuration counts with lower diameter limits from 1 to 5 μm, and approximately 30% for flow imaging with specified manufacturer and instrument setting. RSDs of the reported counts for unrestricted data sets were approximately 50% for both light obscuration and flow imaging. Differences between instrument manufacturers were not statistically significant for light obscuration but were significant for flow imaging. We also report a method for accounting for differences in the reported diameter for flow imaging and electrical sensing zone techniques; the method worked well for diameters greater than 15 μm. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.

  7. White Blood Cells, Neutrophils, and Reactive Oxygen Metabolites among Asymptomatic Subjects.

    PubMed

    Kotani, Kazuhiko; Sakane, Naoki

    2012-06-01

    Chronic inflammation and oxidative stress are associated with health and disease status. The objective of the present study was to investigate the association among white blood cell (WBC) counts, neutrophil counts as a WBC subpopulation, and diacron reactive oxygen metabolites (d-ROMs) levels in an asymptomatic population. The clinical data, including general cardiovascular risk variables and high-sensitivity C-reactive protein (hs-CRP), were collected from 100 female subjects (mean age, 62 years) in outpatient clinics. The correlation of the d-ROMs with hs-CRP, WBC, and neutrophil counts was examined. The mean/median levels were WBC counts 5.9 × 10(9)/L, neutrophil counts 3.6 × 10(9)/L, hs-CRP 0.06 mg/dL, and d-ROMs 359 CARR U. A simple correlation analysis showed a significant positive correlation of the d-ROMs with the WBC counts, neutrophil counts, and hs-CRP levels. The correlation between d-ROMs and neutrophil counts (β = 0.22, P < 0.05), as well as that between d-ROMs and hs-CRP (β = 0.28, P < 0.01), remained significant and independent in a multiple linear regression analysis adjusted for other variables. In the multiple linear regression analysis, WBC counts showed only a tendency toward a positive correlation with the d-ROMs. Neutrophils may thus be slightly more involved in the oxidative stress status, as assessed by d-ROMs, than the overall WBC count. Further studies are needed to clarify the biologic mechanism(s) of the observed relationship.

  8. Proposal for a uniform designation of zearalenone and its metabolites.

    PubMed

    Metzler, Manfred

    2011-02-01

    The Fusarium mycotoxin zearalenone is a frequent contaminant of food and feed. Up to now, different abbreviations and counting systems for the numerous positions of this macrocyclic β-resorcylic acid lactone and its metabolites have been used. As the number of identified fungal and mammalian metabolites of zearalenone is still growing, the lack of a uniform designation makes the literature on these important toxins confusing and complicated. Here, we propose a logical set of abbreviations and a simple counting system, in order to facilitate future research communications on zearalenone and its congeners.

  9. Radionuclide counting technique for measuring wind velocity and direction

    NASA Technical Reports Server (NTRS)

    Singh, J. J. (Inventor)

    1984-01-01

    An anemometer utilizing a radionuclide counting technique for measuring both the velocity and the direction of wind is described. A pendulum consisting of a wire and a ball, with a source of radiation on the lower surface of the ball, is positioned by the wind. The detectors lie in a plane perpendicular to the undisturbed (no wind) pendulum, located on the circumference of a circle and equidistant from each other as well as from the undisturbed (no wind) source-ball position.

  10. Star counts and visual extinctions in dark nebulae

    NASA Technical Reports Server (NTRS)

    Dickman, R. L.

    1978-01-01

    Application of star count techniques to the determination of visual extinctions in compact, fairly high-extinction dark nebulae is discussed. Particular attention is devoted to the determination of visual extinctions for a cloud having a possibly anomalous ratio of total to selective extinction. The techniques discussed are illustrated in application at two colors to four well-known compact dust clouds or Bok globules: Barnard 92, B 133, B 134, and B 335. Minimum masses and lower limits to the central extinction of these objects are presented.
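    The classical star-count relation underlying such work: if cumulative counts to a limiting magnitude follow log10 N(m) = a + b·m, a foreground extinction A shifts apparent magnitudes so that A = (1/b)·log10(N_ref/N_cloud), comparing an obscured field with an unobscured reference field. A sketch with an assumed typical slope b ≈ 0.33 (the slope is an illustrative value, not taken from this paper):

```python
import math

def extinction_from_counts(n_ref, n_cloud, slope_b=0.33):
    """Classical star-count extinction estimate. With cumulative counts
    log10 N(m) = a + b*m, obscuration by extinction A gives
    N_cloud = N_ref * 10**(-b*A), hence A = log10(N_ref/N_cloud) / b."""
    return math.log10(n_ref / n_cloud) / slope_b
```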

  11. Night Sky Weather Monitoring System Using Fish-Eye CCD

    NASA Astrophysics Data System (ADS)

    Tomida, Takayuki; Saito, Yasunori; Nakamura, Ryo; Yamazaki, Katsuya

    The Telescope Array (TA) is an international joint experiment observing ultra-high-energy cosmic rays. TA employs the fluorescence detection technique to observe cosmic rays. In this technique, the presence of clouds significantly affects the quality of the data, so cloud monitoring provides important information. We are developing two new methods for evaluating night sky weather with pictures taken by a charge-coupled device (CCD) camera. One evaluates the amount of cloud from pixel brightness. The other counts the number of stars with a contour-detection technique. The results of these methods show a clear correlation, and we conclude that both analyses are reasonable methods for weather monitoring. We also discuss the reliability of the star counting method.

  12. SYNCHROTRON RADIATION, FREE ELECTRON LASER, APPLICATION OF NUCLEAR TECHNOLOGY, ETC. Physical design of positronium time of flight spectroscopy apparatus

    NASA Astrophysics Data System (ADS)

    Jiang, Xiao-Pan; Zhang, Zi-Liang; Qin, Xiu-Bo; Yu, Run-Sheng; Wang, Bao-Yi

    2010-12-01

    Positronium time of flight spectroscopy (Ps-TOF) is an effective technique for porous material research. It has advantages over other techniques for analyzing the porosity and pore tortuosity of materials. This paper describes a design for Ps-TOF apparatus based on the Beijing intense slow positron beam, supplying a new material characterization technique. In order to improve the time resolution and increase the count rate of the apparatus, the detector system is optimized. For 3 eV o-Ps, the time broadening is 7.66 ns and the count rate is 3 cps after correction.

  13. Multiple Heavy Metal Tolerance of Soil Bacterial Communities and Its Measurement by a Thymidine Incorporation Technique

    PubMed Central

    Díaz-Raviña, Montserrat; Bååth, Erland; Frostegård, Åsa

    1994-01-01

    A thymidine incorporation technique was used to determine the tolerance of a soil bacterial community to Cu, Cd, Zn, Ni, and Pb. An agricultural soil was artificially contaminated in our laboratory with individual metals at three different concentrations, and the results were compared with the results obtained by using the plate count technique. Thymidine incorporation was found to be a simple and rapid method for measuring tolerance. Data obtained by this technique were very reproducible. A linear relationship was found between changes in community tolerance levels obtained by the thymidine incorporation and plate count techniques (r = 0.732, P < 0.001). An increase in tolerance to the metal added to soil was observed for the bacterial community obtained from each polluted soil compared with the community obtained from unpolluted soil. The only exception was when Pb was added; no indication of Pb tolerance was found. An increase in the tolerance to metals other than the metal originally added to soil was also observed, indicating that there was multiple heavy metal tolerance at the community level. Thus, Cu pollution, in addition to increasing tolerance to Cu, also induced tolerance to Zn, Cd, and Ni. Zn and Cd pollution increased community tolerance to all five metals. Ni amendment increased tolerance to Ni the most but also increased community tolerance to Zn and, to lesser degrees, increased community tolerance to Pb and Cd. In soils polluted with Pb increased tolerance to other metals was found in the following order: Ni > Cd > Zn > Cu. We found significant positive relationships between changes in Cd, Zn, and Pb tolerance and, to a lesser degree, between changes in Pb and Ni tolerance when all metals and amendment levels were compared. The magnitude of the increase in heavy metal tolerance was found to be linearly related to the logarithm of the metal concentration added to the soil. 
Threshold tolerance concentrations were estimated from these linear relationships, and changes in tolerance could be detected at levels of soil contamination similar to those reported previously to result in changes in the phospholipid fatty acid pattern (Å. Frostegård, A. Tunlid, and E. Bååth, Appl. Environ. Microbiol. 59: 3605-3617, 1993). PMID:16349314

  14. Evaluation of Petrifilm Lactic Acid Bacteria Plates for Counting Lactic Acid Bacteria in Food.

    PubMed

    Kanagawa, Satomi; Ohshima, Chihiro; Takahashi, Hajime; Burenqiqige; Kikuchi, Misato; Sato, Fumina; Nakamura, Ayaka; Mohamed, Shimaa M; Kuda, Takashi; Kimura, Bon

    2018-06-01

    Although lactic acid bacteria (LAB) are used widely as starter cultures in the production of fermented foods, they are also responsible for food decay and deterioration. The undesirable growth of LAB in food causes spoilage, discoloration, and slime formation. Because of these adverse effects, food companies test for the presence of LAB in production areas and processed foods and consistently monitor the behavior of these bacteria. The 3M Petrifilm LAB Count Plates have recently been launched as time-saving, simple-to-use plates designed for detecting and quantifying LAB. This study compares the abilities of Petrifilm LAB Count Plates and the de Man, Rogosa and Sharpe (MRS) agar medium to determine the LAB count in a variety of foods and swab samples collected from a food production area. Bacterial strains isolated from Petrifilm LAB Count Plates were identified by 16S rDNA sequence analysis to confirm the specificity of these plates for LAB. The results showed no significant difference in bacterial counts measured by using Petrifilm LAB Count Plates and MRS medium. Furthermore, all colonies growing on Petrifilm LAB Count Plates were confirmed to be LAB, while yeast colonies also formed in MRS medium. Petrifilm LAB Count Plates eliminated the plate preparation and plate inoculation steps, and the cultures could be started as soon as a diluted food sample was available. Food companies are required to establish quality controls and perform tests to check the quality of food products; the use of Petrifilm LAB Count Plates can simplify this testing process for food companies.

  15. A Novel In-Beam Delayed Neutron Counting Technique for Characterization of Special Nuclear Materials

    NASA Astrophysics Data System (ADS)

    Bentoumi, G.; Rogge, R. B.; Andrews, M. T.; Corcoran, E. C.; Dimayuga, I.; Kelly, D. G.; Li, L.; Sur, B.

    2016-12-01

    A delayed neutron counting (DNC) system, where the sample to be analyzed remains stationary in a thermal neutron beam outside of the reactor, has been developed at the National Research Universal (NRU) reactor of the Canadian Nuclear Laboratories (CNL) at Chalk River. The new in-beam DNC is a novel approach for non-destructive characterization of special nuclear materials (SNM) that could enable identification and quantification of fissile isotopes within a large and shielded sample. Despite the orders of magnitude reduction in neutron flux, the in-beam DNC method can be as informative as the conventional in-core DNC for most cases while offering practical advantages and mitigated risk when dealing with large radioactive samples of unknown origin. This paper addresses (1) the qualification of in-beam DNC using a monochromatic thermal neutron beam in conjunction with a proven counting apparatus designed originally for in-core DNC, and (2) application of in-beam DNC to an examination of large sealed capsules containing unknown radioactive materials. Initial results showed that the in-beam DNC setup permits non-destructive analysis of bulky and gamma shielded samples. The method does not lend itself to trace analysis, and at best could only reveal the presence of a few milligrams of 235U via the assay of in-beam DNC total counts. Through analysis of DNC count rates, the technique could be used in combination with other neutron or gamma techniques to quantify isotopes present within samples.

  16. Performance assessment of self-interrogation neutron resonance densitometry for spent nuclear fuel assay

    NASA Astrophysics Data System (ADS)

    Hu, Jianwei; Tobin, Stephen J.; LaFleur, Adrienne M.; Menlove, Howard O.; Swinhoe, Martyn T.

    2013-11-01

    Self-Interrogation Neutron Resonance Densitometry (SINRD) is one of several nondestructive assay (NDA) techniques being integrated into systems to measure spent fuel as part of the Next Generation Safeguards Initiative (NGSI) Spent Fuel Project. The NGSI Spent Fuel Project is sponsored by the US Department of Energy's National Nuclear Security Administration to measure plutonium in, and detect diversion of fuel pins from, spent nuclear fuel assemblies. SINRD shows promising capability in determining the 239Pu and 235U content in spent fuel. SINRD is a relatively low-cost and lightweight instrument, and it is easy to implement in the field. The technique makes use of the passive neutron source existing in a spent fuel assembly, and it uses ratios between the count rates collected in fission chambers that are covered with different absorbing materials. These ratios are correlated to key attributes of the spent fuel assembly, such as the total mass of 239Pu and 235U. Using count rate ratios instead of absolute count rates makes SINRD less vulnerable to systematic uncertainties. Building upon the previous research, this work focuses on the underlying physics of the SINRD technique: quantifying the individual impacts on the count rate ratios of a few important nuclides using the perturbation method; examining new correlations between count rate ratio and mass quantities based on the results of the perturbation study; quantifying the impacts on the energy windows of the filtering materials that cover the fission chambers by tallying the neutron spectra before and after the neutrons go through the filters; and identifying the most important nuclides that cause cooling-time variations in the count rate ratios. The results of these studies show that 235U content has a major impact on the SINRD signal in addition to the 239Pu content. Plutonium-241 and 241Am are the two main nuclides responsible for the variation in the count rate ratio with cooling time. 
In short, this work provides insights into some of the main factors that affect the performance of SINRD, and it should help improve the hardware design and the algorithm used to interpret the signal for the SINRD technique. In addition, the modeling and simulation techniques used in this work can be easily adopted for analysis of other NDA systems, especially when complex systems like spent nuclear fuel are involved. These studies were conducted at Los Alamos National Laboratory.

  17. A cost-effective monitoring technique in particle therapy via uncollimated prompt gamma peak integration

    NASA Astrophysics Data System (ADS)

    Krimmer, J.; Angellier, G.; Balleyguier, L.; Dauvergne, D.; Freud, N.; Hérault, J.; Létang, J. M.; Mathez, H.; Pinto, M.; Testa, E.; Zoccarato, Y.

    2017-04-01

    For the purpose of detecting deviations from the prescribed treatment during particle therapy, the integrals of uncollimated prompt gamma-ray timing distributions are investigated. The intention is to provide information, with a simple and cost-effective setup, independent of the beamline monitoring devices. Measurements have been performed with 65 MeV protons at a clinical cyclotron. Prompt gamma rays emitted from the target are identified by means of time-of-flight. The proton range inside the PMMA target has been varied via a modulator wheel. The measured variation of the prompt gamma peak integrals as a function of the modulator position is consistent with simulations. With detectors covering a solid angle of 25 msr (corresponding to a diameter of 3-4 in. at a distance of 50 cm from the beam axis) and 10^8 incident protons, deviations of a few per cent in the prompt gamma-ray count rate can be detected. For the present configuration, this change in the count rate corresponds to a 3 mm change in the proton range in a PMMA target. Furthermore, simulation studies show that a combination of the signals from multiple detectors may be used to detect a misplacement of the target. A different combination of these signals yields a precise number of detected prompt gamma rays that is independent of the actual target position.
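
The counting-statistics argument behind such a detection threshold can be sketched numerically. The helper below is purely illustrative (its name and the example count values are not from the paper): for a Poisson-distributed count N, a fractional rate change stands out at roughly n_sigma standard deviations once it exceeds n_sigma/sqrt(N).

```python
import math

def min_detectable_fraction(n_counts, n_sigma=3.0):
    """Smallest fractional change in a Poisson count that stands out at
    n_sigma standard deviations: since sigma = sqrt(N), requiring
    delta_N >= n_sigma * sqrt(N) gives delta_N / N >= n_sigma / sqrt(N)."""
    return n_sigma / math.sqrt(n_counts)
```

Under this simple estimate, a few-percent sensitivity at the 3-sigma level requires on the order of 10^4 or more detected prompt gamma rays, which is why large incident-proton statistics are needed.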

  18. Refining aging criteria for northern sea otters in Washington State

    USGS Publications Warehouse

    Schuler, Krysten L.; Baker, Bridget B.; Mayer, Karl A.; Perez-Heydrich, Carolina; Holahan, Paula M.; Thomas, Nancy J.; White, C. LeAnn

    2018-01-01

    Measurement of skull ossification patterns is a standard method for aging various mammalian species and has been used to age Russian, Californian, and Alaskan sea otter populations. Cementum annuli counts have also been verified as an accurate aging method for the Alaskan sea otter population. In this study, cementum annuli count results and skull ossification patterns were compared as methods for aging the northern sea otter (Enhydra lutris kenyoni) population in Washington State. Significant agreement was found between the two methods suggesting that either method could be used to age the Washington population of otters. This study also found that ossification of the squamosal-jugal suture at the ventral glenoid fossa can be used to differentiate male subadults from adults. To assist field biologists or others without access to cementum annuli or skull ossification analysis techniques, a suite of morphologic, physiologic, and developmental characteristics were analyzed to assess whether a set of these more easily accessible parameters could also predict age class for the Washington population of otters. Tooth condition score, evidence of reproductive activity in females, and tooth eruption pattern were identified as the most useful criteria for classifying Washington sea otters as pups, juveniles, subadults, or adults/aged adults. A simple decision tree based on characteristics accessible in the field or at necropsy was created that can be used to reliably predict age class of Washington sea otters as determined by cementum annuli.

  19. A simple-rapid method to separate uranium, thorium, and protactinium for U-series age-dating of materials

    PubMed Central

    Knight, Andrew W.; Eitrheim, Eric S.; Nelson, Andrew W.; Nelson, Steven; Schultz, Michael K.

    2017-01-01

    Uranium-series dating techniques require the isolation of radionuclides in high yields and in fractions free of impurities. Within this context, we describe a novel, rapid method for the separation and purification of U, Th, and Pa. The method takes advantage of differences in the chemistry of U, Th, and Pa, utilizing a commercially-available extraction chromatographic resin (TEVA) and standard reagents. The elution behavior of U, Th, and Pa was optimized using liquid scintillation counting techniques, and fractional purity was evaluated by alpha-spectrometry. The overall method was further assessed by isotope dilution alpha-spectrometry for the preliminary age determination of an ancient carbonate sample obtained from the Lake Bonneville site in western Utah (United States). Preliminary evaluations of the method produced elemental purity of greater than 99.99% and radiochemical recoveries exceeding 90% for U and Th and 85% for Pa. Excellent purity and yields (76% for U, 96% for Th and 55% for Pa) were also obtained for the analysis of the carbonate samples, and the preliminary Pa and Th ages of about 39,000 years before present are consistent with the 14C-derived age of the material. PMID:24681438

  20. Unpredictable long-term tissue effects in laser-assisted vasovasostomy

    NASA Astrophysics Data System (ADS)

    Gilbert, Peter T. O.

    2000-05-01

    Macroscopic Nd:YAG laser-assisted vasovasostomy was introduced to clinical practice as an attractive alternative to conventional microsurgical suture techniques. In this simple procedure the approximated vasal ends are welded by 0.5 sec laser pulses of 10 W power. The anastomosis is secured by two superficial seromuscular 5 - 0 PDS sutures placed on diametrically opposed sites of the vasal circumference. To date, 17 patients have undergone macroscopic laser-assisted vasovasostomy. In each case the operation was carried out under general anesthesia. There were no serious intra- or postoperative complications. Twelve patients were available for long-term follow-up (4 years). Sperm counts were obtained two months following surgery and from then on every two years. Whereas the patency rate reached 75% at the first control examination, it dropped to 33% after two years. After that period no further deterioration was observed. Probably the main reason for this phenomenon is sperm leaking through mucosal defects at the anastomosis, with subsequent formation of intramural sperm granuloma and delayed stenosis of the vasal lumen. This tissue reaction may also occur with different suture techniques, thus accounting for the well-established discrepancy of patency and pregnancy rates in microsurgical vasovasostomy.

  1. Radioactivities of Long Duration Exposure Facility (LDEF) materials: Baggage and bonanzas

    NASA Technical Reports Server (NTRS)

    Smith, Alan R.; Hurley, Donna L.

    1991-01-01

    Radioactivities in materials onboard the returned Long Duration Exposure Facility (LDEF) satellite were studied by a variety of techniques. Among the most powerful is low background Ge semiconductor detector gamma ray spectrometry. The observed radioactivities are of two origins: those radionuclides produced by nuclear reactions with the radiation field in orbit and radionuclides present initially as contaminants in materials used for construction of the spacecraft and experimental assemblies. In the first category are experiment related monitor foils and tomato seeds, and such spacecraft materials as Al, stainless steel, and Ti. In the second category are Al, Be, Ti, Va, and some special glasses. Measured peak-area count rates from both categories range from a high of about 1 count per minute down to less than 0.001 count per minute. Successful measurement of count rates toward the low end of this range can be achieved only through low-background techniques, such as those used to obtain the results presented here.

  2. Radioactivities of Long Duration Exposure Facility (LDEF) materials: Baggage and bonanzas

    NASA Astrophysics Data System (ADS)

    Smith, Alan R.; Hurley, Donna L.

    1991-06-01

    Radioactivities in materials onboard the returned Long Duration Exposure Facility (LDEF) satellite were studied by a variety of techniques. Among the most powerful is low background Ge semiconductor detector gamma ray spectrometry. The observed radioactivities are of two origins: those radionuclides produced by nuclear reactions with the radiation field in orbit and radionuclides present initially as contaminants in materials used for construction of the spacecraft and experimental assemblies. In the first category are experiment related monitor foils and tomato seeds, and such spacecraft materials as Al, stainless steel, and Ti. In the second category are Al, Be, Ti, Va, and some special glasses. Measured peak-area count rates from both categories range from a high of about 1 count per minute down to less than 0.001 count per minute. Successful measurement of count rates toward the low end of this range can be achieved only through low-background techniques, such as those used to obtain the results presented here.

  3. Test of a mosquito eggshell isolation method and subsampling procedure.

    PubMed

    Turner, P A; Streever, W J

    1997-03-01

    Production of Aedes vigilax, the common salt-marsh mosquito, can be assessed by determining eggshell densities found in soil. In this study, 14 field-collected eggshell samples were used to test a subsampling technique and compare eggshell counts obtained with a flotation method to those obtained by direct examination of sediment (DES). Relative precision of the subsampling technique was assessed by determining the minimum number of subsamples required to estimate the true mean and confidence interval of a sample at a predetermined confidence level. A regression line was fitted to cube-root-transformed eggshell counts obtained from flotation and DES and found to be significant (P < 0.001, r^2 = 0.97). The flotation method allowed processing of samples in about one-third of the time required by DES, but recovered an average of 44% of the eggshells present. Eggshells obtained with the flotation method can be used to predict those from DES using the following equation: DES count = [1.386 x (flotation count)^0.33 - 0.01]^3.
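
The published conversion can be wrapped in a small helper; this is a minimal sketch (the function name is ours), using the coefficients exactly as fitted above:

```python
def des_count_from_flotation(flotation_count):
    """Predict the direct-examination-of-sediment (DES) eggshell count from
    a flotation-method count via the fitted cube-root regression:
    DES count = [1.386 * (flotation count)^0.33 - 0.01]^3."""
    return (1.386 * flotation_count ** 0.33 - 0.01) ** 3
```

Because flotation recovers only about 44% of eggshells on average, the predicted DES count exceeds the raw flotation count.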

  4. Simple system for measuring tritium ad/absorption using a 2π counter and thermal desorption spectrometer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miyake, H.; Matsuyama, M.; Watanabe, K.

    1992-03-01

    In this paper, the authors develop a simple system using tritium tracer and thermal desorption techniques to measure tritium adsorption and/or absorption on/in a material having typical surface conditions, namely a non-cleaned surface. The tritium counting devices used were a 2π counter and a conventional proportional counter. With this system, the amounts of ad/absorption could be measured without exposing the samples to air after exposing them to tritium gas. The overall efficiency (F) of the 2π counter was described as F = exp(-2.64h), where h is the distance from the sample to the detector. Ad/absorption measurements were carried out for several materials used for fabricating conventional vacuum systems. The results, in order of decreasing amounts of ad/absorption, were: fiber-reinforced plastics (FRP) > nickel (Ni) and molybdenum disulfide (MoS2) > stainless steel (SS304), iron (Fe), and aluminum alloy (A2219) > boron nitride (h-BN), silicon carbide (SiC), SS304 passivated by anodic oxidation layers (ASS), and SS304 passivated by boron nitride segregation layers (BSS). The relative amounts were about 100 for Ni and 0.1 for ASS and BSS, normalized to Fe = 1.
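
The reported efficiency relation is easy to evaluate directly; the helper below is a sketch (the function name is ours, with h in the same length units as the fit):

```python
import math

def overall_efficiency(h):
    """Overall efficiency F of the 2-pi counter as a function of the
    sample-to-detector distance h, per the fitted relation F = exp(-2.64 h)."""
    return math.exp(-2.64 * h)
```

As expected, the efficiency approaches 1 as the sample touches the detector and falls off exponentially with distance.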

  5. Integrating chronological uncertainties for annually laminated lake sediments using layer counting, independent chronologies and Bayesian age modelling (Lake Ohau, South Island, New Zealand)

    NASA Astrophysics Data System (ADS)

    Vandergoes, Marcus J.; Howarth, Jamie D.; Dunbar, Gavin B.; Turnbull, Jocelyn C.; Roop, Heidi A.; Levy, Richard H.; Li, Xun; Prior, Christine; Norris, Margaret; Keller, Liz D.; Baisden, W. Troy; Ditchburn, Robert; Fitzsimons, Sean J.; Bronk Ramsey, Christopher

    2018-05-01

    Annually resolved (varved) lake sequences are important palaeoenvironmental archives as they offer a direct incremental dating technique for high-frequency reconstruction of environmental and climate change. Despite the importance of these records, establishing a robust chronology and quantifying its precision and accuracy (estimations of error) remains an essential but challenging component of their development. We outline an approach for building reliable independent chronologies, testing the accuracy of layer counts and integrating all chronological uncertainties to provide quantitative age and error estimates for varved lake sequences. The approach incorporates (1) layer counts and estimates of counting precision; (2) radiometric and biostratigraphic dating techniques to derive an independent chronology; and (3) the application of Bayesian age modelling to produce an integrated age model. This approach is applied to a case study of an annually resolved sediment record from Lake Ohau, New Zealand. The most robust age model provides an average error of 72 years across the whole depth range. This represents a fractional uncertainty of ∼5%, higher than the <3% quoted for most published varve records. However, the age model and reported uncertainty represent the best fit between layer counts and independent chronology, and the uncertainties account for both layer counting precision and the chronological accuracy of the layer counts. This integrated approach provides a more representative estimate of age uncertainty and therefore represents a statistically more robust chronology.

  6. Short-Term Clinical Disease Progression in HIV-Infected Patients Receiving Combination Antiretroviral Therapy: Results from the TREAT Asia HIV Observational Database

    PubMed Central

    Srasuebkul, Preeyaporn; Lim, Poh Lian; Lee, Man Po; Kumarasamy, Nagalingeswaran; Zhou, Jialun; Sirisanthana, Thira; Li, Patrick C. K.; Kamarulzaman, Adeeba; Oka, Shinichi; Phanuphak, Praphan; Vonthanak, Saphonn; Merati, Tuti P.; Chen, Yi-Ming A.; Sungkanuparph, Somnuek; Tau, Goa; Zhang, Fujie; Lee, Christopher K. C.; Ditangco, Rossana; Pujari, Sanjay; Choi, Jun Y.; Smith, Jeffery; Law, Matthew G.

    2009-01-01

    Objective The aim of our study was to develop, on the basis of simple clinical data, predictive short-term risk equations for AIDS or death in Asian patients infected with human immunodeficiency virus (HIV) who were included in the TREAT Asia HIV Observational Database. Methods Inclusion criteria were highly active antiretroviral therapy initiation and completion of required laboratory tests. Predictors of short-term AIDS or death were assessed using Poisson regression. Three different models were developed: a clinical model, a CD4 cell count model, and a CD4 cell count and HIV RNA level model. We separated patients into low-risk, high-risk, and very high-risk groups according to the key risk factors identified. Results In the clinical model, patients with severe anemia or a body mass index (BMI; calculated as the weight in kilograms divided by the square of the height in meters) ≤18 were at very high risk, and patients who were aged <40 years or were male and had mild anemia were at high risk. In the CD4 cell count model, patients with a CD4 cell count <50 cells/µL, severe anemia, or a BMI ≤18 were at very high risk, and patients who had a CD4 cell count of 51–200 cells/µL, were aged <40 years, or were male and had mild anemia were at high risk. In the CD4 cell count and HIV RNA level model, patients with a CD4 cell count <50 cells/µL, a detectable viral load, severe anemia, or a BMI ≤18 were at very high risk, and patients with a CD4 cell count of 51–200 cells/µL and mild anemia were at high risk. The incidence of new AIDS or death in the clinical model was 1.3, 4.9, and 15.6 events per 100 person-years in the low-risk, high-risk, and very high-risk groups, respectively. In the CD4 cell count model, the respective incidences were 0.9, 2.7, and 16.02 events per 100 person-years; in the CD4 cell count and HIV RNA level model, the respective incidences were 0.8, 1.8, and 6.2 events per 100 person-years. 
Conclusions These models are simple enough for widespread use in busy clinics and should allow clinicians to identify patients who are at high risk of AIDS or death in Asia and the Pacific region and in resource-poor settings. PMID:19226231
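
As an illustration only (the function and flag names are ours; the thresholds are those quoted for the CD4 cell count model above), the risk grouping could be expressed as:

```python
def cd4_model_risk_group(cd4, severe_anemia, mild_anemia, bmi, age_years, male):
    """Assign a risk group under the CD4 cell count model: very high risk for
    CD4 < 50 cells/uL, severe anemia, or BMI <= 18; high risk for a CD4 count
    of 51-200 cells/uL, age < 40 years, or male with mild anemia; low otherwise."""
    if cd4 < 50 or severe_anemia or bmi <= 18:
        return "very high"
    if 51 <= cd4 <= 200 or age_years < 40 or (male and mild_anemia):
        return "high"
    return "low"
```

The appeal of such a rule is that every input is available in a busy clinic without viral load testing.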

  7. Mathematics anxiety affects counting but not subitizing during visual enumeration.

    PubMed

    Maloney, Erin A; Risko, Evan F; Ansari, Daniel; Fugelsang, Jonathan

    2010-02-01

    Individuals with mathematics anxiety have been found to differ from their non-anxious peers on measures of higher-level mathematical processes, but not simple arithmetic. The current paper examines differences between mathematics anxious and non-mathematics anxious individuals in more basic numerical processing using a visual enumeration task. This task allows for the assessment of two systems of basic number processing: subitizing and counting. Mathematics anxious individuals, relative to non-mathematics anxious individuals, showed a deficit in the counting but not in the subitizing range. Furthermore, working memory was found to mediate this group difference. These findings demonstrate that the problems associated with mathematics anxiety exist at a level more basic than would be predicted from the extant literature. Copyright 2009 Elsevier B.V. All rights reserved.

  8. Photocounting distributions for exponentially decaying sources.

    PubMed

    Teich, M C; Card, H C

    1979-05-01

    Exact photocounting distributions are obtained for a pulse of light whose intensity is exponentially decaying in time, when the underlying photon statistics are Poisson. It is assumed that the starting time for the sampling interval (which is of arbitrary duration) is uniformly distributed. The probability of registering n counts in the fixed time T is given in terms of the incomplete gamma function for n ≥ 1 and in terms of the exponential integral for n = 0. Simple closed-form expressions are obtained for the count mean and variance. The results are expected to be of interest in certain studies involving spontaneous emission, radiation damage in solids, and nuclear counting. They will also be useful in neurobiology and psychophysics, since habituation and sensitization processes may sometimes be characterized by the same stochastic model.
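
This setup can be checked numerically. The Monte Carlo sketch below (all parameter values are illustrative, and the paper's closed-form expressions are not reproduced) draws Poisson counts whose mean is the exponentially decaying intensity integrated over a sampling window with a uniformly distributed start time:

```python
import math
import random

def simulate_counts(i0, decay, window, start_span, trials, seed=0):
    """Monte Carlo of the photocounting experiment: the intensity is
    I(t) = i0 * exp(-decay * t); the counting window of length `window`
    starts at a time drawn uniformly over [0, start_span]; given the
    integrated intensity, the registered count is Poisson-distributed."""
    rng = random.Random(seed)
    counts = []
    for _ in range(trials):
        t0 = rng.uniform(0.0, start_span)
        # Integrated intensity over [t0, t0 + window]
        mean = (i0 / decay) * (math.exp(-decay * t0) - math.exp(-decay * (t0 + window)))
        # Draw a Poisson variate by CDF inversion (adequate for small means)
        u, k, p = rng.random(), 0, math.exp(-mean)
        cdf = p
        while u > cdf:
            k += 1
            p *= mean / k
            cdf += p
        counts.append(k)
    return counts
```

Histogramming the returned counts gives an empirical photocounting distribution that can be compared against any analytic form.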

  9. Comparative Study of Genotoxicity in Different Tobacco Related Habits using Micronucleus Assay in Exfoliated Buccal Epithelial Cells

    PubMed Central

    Guruprasad, Yadavalli; Jose, Maji; Saxena, Kartikay; K, Deepa; Prabhu, Vishnudas

    2014-01-01

    Background: Oral cancer is one of the most debilitating diseases afflicting mankind. Consumption of tobacco in various forms constitutes one of the most important etiological factors in the initiation of oral cancer. When the focus of today’s research is to determine early genotoxic changes in human cells, the micronucleus (MN) assay provides a simple, yet reliable indicator of genotoxic damage. Aims and Objectives: To identify and quantify micronuclei in the exfoliated cells of oral mucosa in individuals with different tobacco related habits and a control group, and to compare the genotoxicity of different tobacco related habits between each group and also with that of the control group. Patients and Methods: In the present study, buccal smears of 135 individuals with different tobacco related habits and buccal smears of 45 age- and sex-matched controls were obtained, stained using Giemsa stain, and then observed under 100X magnification in order to identify and quantify micronuclei in the exfoliated cells of oral mucosa. Results: The mean micronucleus (MN) count in individuals with a smoking habit was 3.11, while the count was 0.50, 2.13, and 1.67 in the normal control, smoking with betel quid, and smokeless tobacco habit groups, respectively. The MN count in the smokers group was 2.6 times higher than in normal controls. MN counts were also elevated in the other groups compared to the normal control, but to a lesser extent. Conclusion: From our study we conclude that tobacco in any form is genotoxic, that smokers especially are at higher risk, and that the micronucleus assay can be used as a simple yet reliable marker for genotoxic evaluation. PMID:24995238

  10. Multichannel temperature controller for hot air solar house

    NASA Technical Reports Server (NTRS)

    Currie, J. R.

    1979-01-01

    This paper describes an electronic controller that is optimized to operate a hot air solar system. Thermal information is obtained from copper constantan thermocouples and a wall-type thermostat. The signals from the thermocouples are processed through a single amplifier using a multiplexing scheme. The multiplexing reduces the component count and automatically calibrates the thermocouple amplifier. The processed signals connect to some simple logic that selects one of the four operating modes. This simple, inexpensive, and reliable scheme is well suited to control hot air solar systems.

  11. Fast distributed large-pixel-count hologram computation using a GPU cluster.

    PubMed

    Pan, Yuechao; Xu, Xuewu; Liang, Xinan

    2013-09-10

    Large-pixel-count holograms are an essential part of large-size holographic three-dimensional (3D) displays, but the generation of such holograms is computationally demanding. In order to address this issue, we have built a graphics processing unit (GPU) cluster with 32.5 Tflop/s computing power and implemented distributed hologram computation on it with speed improvement techniques, such as shared memory on GPU, GPU-level adaptive load balancing, and node-level load distribution. Using these speed improvement techniques on the GPU cluster, we have achieved a 71.4 times computation speed increase for 186M-pixel holograms. Furthermore, we have used the approaches of diffraction limits and subdivision of holograms to overcome the GPU memory limit in computing large-pixel-count holograms. 745M-pixel and 1.80G-pixel holograms were computed in 343 and 3326 s, respectively, for more than 2 million object points with RGB colors. Color 3D objects with 1.02M points were successfully reconstructed from a 186M-pixel hologram computed in 8.82 s with all three speed improvement techniques. It is shown that distributed hologram computation using a GPU cluster is a promising approach to increasing the computation speed of large-pixel-count holograms for large-size holographic display.
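    The node-level load distribution mentioned above can be sketched as partitioning hologram rows among nodes in proportion to each node's measured throughput, so faster nodes receive more work. This is a minimal illustration under that assumption; the paper's actual balancing scheme on its GPU cluster is more involved.

```python
# Sketch: split hologram rows into contiguous ranges proportional to
# each node's throughput (hypothetical node names and weights).

def partition_rows(total_rows, node_throughputs):
    """Assign contiguous row ranges so faster nodes get more rows."""
    total = sum(node_throughputs.values())
    ranges, start = {}, 0
    nodes = list(node_throughputs)
    for i, node in enumerate(nodes):
        if i == len(nodes) - 1:
            end = total_rows              # last node takes the remainder
        else:
            end = start + round(total_rows * node_throughputs[node] / total)
        ranges[node] = (start, end)
        start = end
    return ranges
```

    For example, `partition_rows(90, {"fast": 2.0, "slow": 1.0})` gives the faster node twice the rows of the slower one while covering every row exactly once.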

  12. What Are the Signs of Alzheimer's Disease? | NIH MedlinePlus the Magazine

    MedlinePlus

    ... in behavior and personality Conduct tests of memory, problem solving, attention, counting, and language Carry out standard medical ... over and over having trouble paying bills or solving simple math problems getting lost losing things or putting them in ...

  13. Visual counts as an index of White-Tailed Prairie Dog density

    USGS Publications Warehouse

    Menkens, George E.; Biggins, Dean E.; Anderson, Stanley H.

    1990-01-01

    Black-footed ferrets (Mustela nigripes) are dependent on prairie dogs (Cynomys spp.) for food and shelter and were historically restricted to prairie dog towns (Anderson et al. 1986). Because ferrets and prairie dogs are closely associated, successful ferret management and conservation depend on successful prairie dog management. A critical component of any management program for ferrets will be monitoring prairie dog population dynamics on towns containing ferrets or on towns proposed as ferret reintroduction sites. Three techniques for estimating prairie dog population size and density are counts of plugged and reopened burrows (Tietjen and Matschke 1982), mark-recapture (Otis et al. 1978; Seber 1982, 1986; Menkens and Anderson 1989), and visual counts (Fagerstone and Biggins 1986, Knowles 1986). The technique of plugging burrows and counting the number reopened by prairie dogs is too time and labor intensive for population evaluation on a large number of towns or over large areas. Total burrow counts are not correlated with white-tailed prairie dog (C. leucurus) densities and thus cannot be used for population evaluation (Menkens et al. 1988). Mark-recapture requires trapping that is expensive and time and labor intensive; monitoring a large number of prairie dog populations using mark-recapture would be difficult. Alternatively, a large number of populations could be monitored in short periods of time using the visual count technique (Fagerstone and Biggins 1986, Knowles 1986). However, the accuracy of visual counts has been evaluated in only a few locations, so it is not known whether the relationship between counts and prairie dog density is consistent throughout the prairie dog's range. Our objective was to evaluate the potential of using visual counts as a rapid means of estimating white-tailed prairie dog density in prairie dog towns throughout Wyoming.
We studied 18 white-tailed prairie dog towns in 4 white-tailed prairie dog complexes in Wyoming near Laramie (105°40'W, 41°20'N, 3 grids), Pathfinder reservoir (106°55'W, 42°30'N, 6 grids), Shirley Basin (106°10'W, 42°20'N, 6 grids), and Meeteetse (108°10'W, 44°10'N, 3 grids). All towns were dominated by grasses, forbs, and shrubs (details in Collins and Lichvar 1986). Topography of towns ranged from flat to gently rolling hills.

  14. A radionuclide counting technique for measuring wind velocity. [drag force anemometers

    NASA Technical Reports Server (NTRS)

    Singh, J. J.; Khandelwal, G. S.; Mall, G. H.

    1981-01-01

    A technique for measuring wind velocities of meteorological interest is described. It is based on inverse-square-law variation of the counting rates as the radioactive source-to-counter distance is changed by wind drag on the source ball. Results of a feasibility study using a weak bismuth 207 radiation source and three Geiger-Muller radiation counters are reported. The use of the technique is not restricted to Martian or Mars-like environments. A description of the apparatus, typical results, and frequency response characteristics are included. A discussion of a double-pendulum arrangement is presented. Measurements reported herein indicate that the proposed technique may be suitable for measuring wind speeds up to 100 m/sec, which are either steady or whose rates of fluctuation are less than 1 kHz.
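    The core of the technique above is the inverse-square law: if a source at a reference distance d0 from the counter yields rate r0, then the rate at distance d is r = r0 (d0/d)^2, so the drag-induced displacement of the source ball can be recovered from the measured counting rate. A minimal sketch of that inversion follows; converting the recovered distance to a wind speed would require the drag calibration of the pendulum, which is not reproduced here.

```python
import math

# Invert r = r0 * (d0 / d)**2 to recover source-to-counter distance
# from a measured counting rate (all values illustrative).

def distance_from_rate(r, r0, d0):
    """Source-counter distance implied by count rate r, given the
    reference rate r0 observed at distance d0."""
    return d0 * math.sqrt(r0 / r)
```

    For instance, a rate that drops to one quarter of the reference value implies the source has moved to twice the reference distance.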

  15. [Thunder-fire Moxibustion for Qi Deficiency-induced Fatigue in Breast Cancer Patients Under-going Chemotherapy].

    PubMed

    Lu, Lu; Li, Wei-Han; Guo, Xiao-Chuan; Fu, Wen-Bin

    2018-02-25

    To observe the clinical effect of thunder-fire moxibustion in the treatment of qi deficiency-induced fatigue in breast cancer patients undergoing chemotherapy. Sixty breast cancer patients undergoing chemotherapy were randomly divided into thunder-fire moxibustion (Moxi) and conventional nursing (nursing) groups (n=30 in each group). Patients in the Moxi group were treated with thunder-fire moxibustion applied to the back from Pishu (BL 20) to Qihaishu (BL 24) on the bilateral sides and to the abdomen from Zhongwan (CV 12) to Guanyuan (CV 4) for 30 min, once a day for 14 days. Patients in the nursing group were treated with health education and conventional nursing care. The simple fatigue scale, traditional Chinese medicine (TCM) syndrome score, and clinical curative effect were observed before and after the treatment, and the white blood cell (WBC) count was observed 5 days after chemotherapy and after the treatment, respectively. After the treatment, the simple fatigue scales and TCM syndrome scores were significantly decreased and WBC counts were significantly increased in both groups relative to their individual pre-treatment values (P<0.01). The therapeutic effect of the Moxi group was apparently superior to that of the nursing group in lowering the simple fatigue scale and TCM syndrome score and in up-regulating the WBC count (P<0.01, P<0.05). The total effective rate of the Moxi group was significantly higher than that of the nursing group (83.3% [25/30] vs 36.7% [11/30], P<0.01). Thunder-fire moxibustion can effectively relieve the degree of fatigue and the symptoms of qi deficiency in breast cancer patients undergoing chemotherapy.

  16. Effects of the frame acquisition rate on the sensitivity of gastro-oesophageal reflux scintigraphy

    PubMed Central

    Codreanu, I; Chamroonrat, W; Edwards, K

    2013-01-01

    Objective: To compare the sensitivity of gastro-oesophageal reflux (GOR) scintigraphy at 5-s and 60-s frame acquisition rates. Methods: GOR scintigraphy studies of 50 subjects (1 month to 20 years old, mean 42 months) were analysed concurrently using 5-s and 60-s acquisition frames. Reflux episodes were graded as low if activity was detected in the distal half of the oesophagus and high if activity was detected in its upper half or in the oral cavity. For comparison purposes, GOR detected in any number of 5-s frames corresponding to one 60-s frame was counted as one episode. Results: A total of 679 episodes of GOR to the upper oesophagus were counted using the 5-s acquisition technique; only 183 of these episodes were detected on 60-s acquisition images. In the lower oesophagus, a total of 1749 GOR episodes were detected using the 5-s acquisition technique and only 1045 episodes using 60-s acquisition frames (the latter also included high-level GOR on 5-s frames counted as low level on 60-s acquisition frames). 10 patients had high-level GOR episodes that were detected only with the 5-s acquisition technique, leading to a different diagnosis in these patients. No correlation between the number of reflux episodes and the gastric emptying rates was noted. Conclusion: The 5-s frame acquisition technique is more sensitive than the 60-s frame acquisition technique for detecting both high- and low-level GOR. Advances in knowledge: Brief GOR episodes with a relatively low number of radioactive counts are frequently indistinguishable from intense background activity on 60-s acquisition frames. PMID:23520226
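    The comparison convention described above (reflux seen in any number of 5-s frames within one 60-s frame counts as a single episode) can be sketched in a few lines. Frame indices here are illustrative; twelve 5-s frames make up each 60-s frame.

```python
# Collapse detections on 5-s frames into episodes per 60-s frame,
# as in the paper's comparison convention (frame indices hypothetical).

def episodes_per_60s(detected_5s_frames):
    """Count distinct 60-s frames (12 x 5-s frames each) containing
    at least one detected 5-s frame."""
    return len({frame // 12 for frame in detected_5s_frames})
```

    For example, detections on 5-s frames 0, 1, 2, and 13 span two different 60-s frames and therefore count as two episodes; this is exactly how a short burst of brief refluxes collapses into a single episode at the coarser frame rate.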

  17. Photographic techniques for characterizing streambed particle sizes

    USGS Publications Warehouse

    Whitman, Matthew S.; Moran, Edward H.; Ourso, Robert T.

    2003-01-01

    We developed photographic techniques to characterize coarse (>2-mm) and fine (≤2-mm) streambed particle sizes in 12 streams in Anchorage, Alaska. Results were compared with current sampling techniques to assess which provided greater sampling efficiency and accuracy. The streams sampled were wadeable and contained gravel-cobble streambeds. Gradients ranged from about 5% at the upstream sites to about 0.25% at the downstream sites. Mean particle sizes and size-frequency distributions resulting from digitized photographs differed significantly from those resulting from Wolman pebble counts for five sites in the analysis. Wolman counts were biased toward selecting larger particles. Photographic analysis also yielded a greater number of measured particles (mean = 989) than did the Wolman counts (mean = 328). Stream embeddedness ratings assigned from field and photographic observations were significantly different at 5 of the 12 sites, although both types of ratings showed a positive relationship with digitized surface fines. Visual estimates of embeddedness and digitized surface fines may both be useful indicators of benthic conditions, but digitizing surface fines produces quantitative rather than qualitative data. Benefits of the photographic techniques include reduced field time, minimal streambed disturbance, convenience of postfield processing, easy sample archiving, and improved accuracy and replication potential.

  18. Povidone Iodine Rectal Preparation at Time of Prostate Needle Biopsy is a Simple and Reproducible Means to Reduce Risk of Procedural Infection.

    PubMed

    Raman, Jay D; Lehman, Kathleen K; Dewan, Kalyan; Kirimanjeswara, Girish

    2015-09-21

    Single institution and population-based studies highlight that infectious complications following transrectal ultrasound guided prostate needle biopsy (TRUS PNB) are increasing. Such infections are largely attributable to quinolone-resistant microorganisms which colonize the rectal vault and are translocated into the bloodstream during the biopsy procedure. A povidone iodine rectal preparation (PIRP) at the time of biopsy is a simple, reproducible method to reduce rectal microorganism colony counts and therefore resultant infections following TRUS PNB. All patients are administered three days of oral antibiotic therapy prior to biopsy. The PIRP technique involves initially positioning the patient in the standard manner for a TRUS PNB. Following digital rectal examination, 15 ml of a 10% solution of commercially available povidone iodine is mixed with 5 ml of 1% lidocaine jelly to create a slurry. A 4 cm x 4 cm sterile gauze is soaked in this slurry and then inserted into the rectal vault for 2 min, after which it is removed. Thereafter, a disposable cotton gynecologic swab is used to paint both the perianal area and the rectal vault to a distance of 3 cm from the anus. The povidone iodine solution is then allowed to dry for 2-3 min prior to proceeding with standard transrectal ultrasonography and subsequent biopsy. This PIRP technique has been in practice at our institution since March of 2012, with an associated reduction of post-biopsy infections from 4.3% to 0.6% (p=0.02). The principal advantage of this prophylaxis regimen is its simplicity and reproducibility with use of an easily available, inexpensive agent to reduce infections. Furthermore, the technique avoids exposing patients to additional systemic antibiotics with potential further propagation of multi-drug resistant organisms. Usage of PIRP at TRUS PNB, however, is not applicable for patients with iodine or shellfish allergies.

  19. An Evaluation of the Accuracy of the Subtraction Method Used for Determining Platelet Counts in Advanced Platelet-Rich Fibrin and Concentrated Growth Factor Preparations

    PubMed Central

    Watanabe, Taisuke; Isobe, Kazushige; Suzuki, Taiji; Kawabata, Hideo; Nakamura, Masayuki; Tsukioka, Tsuneyuki; Okudera, Toshimitsu; Okudera, Hajime; Uematsu, Kohya; Okuda, Kazuhiro; Nakata, Koh; Kawase, Tomoyuki

    2017-01-01

    Platelet concentrates should be quality-assured for purity and identity prior to clinical use. Unlike for the liquid form of platelet-rich plasma, platelet counts cannot be directly determined in solid fibrin clots and are instead calculated by subtracting the counts in other liquid or semi-clotted fractions from those in whole blood samples. Having long suspected the validity of this method, we herein examined the possible loss of platelets in the preparation process. Blood samples collected from healthy male donors were immediately centrifuged for advanced platelet-rich fibrin (A-PRF) and concentrated growth factors (CGF) according to recommended centrifugal protocols. Blood cells in liquid and semi-clotted fractions were directly counted. Platelets aggregated on clot surfaces were observed by scanning electron microscopy. A higher centrifugal force increased the numbers of platelets and platelet aggregates in the liquid red blood cell fraction and the semi-clotted red thrombus in the presence and absence of the anticoagulant, respectively. Nevertheless, the calculated platelet counts in A-PRF/CGF preparations were much higher than expected, rendering the currently accepted subtraction method inaccurate for determining platelet counts in fibrin clots. To ensure the quality of solid types of platelet concentrates chairside in a timely manner, a simple and accurate platelet-counting method should be developed immediately. PMID:29563413
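    The subtraction method the authors question can be illustrated with simple arithmetic: the clot's platelet count is inferred as whole-blood platelets minus those counted in the other fractions, so any platelets lost to aggregation (counted in no fraction) are silently attributed to the clot. All numbers below are hypothetical, chosen only to show the bias.

```python
# Worked toy example of the subtraction method's bias: platelets that
# vanish into aggregates are attributed to the fibrin clot, inflating
# the calculated clot count. All quantities are illustrative totals.

def clot_platelets_by_subtraction(whole_blood, other_fractions):
    return whole_blood - sum(other_fractions)

true_clot = 120_000          # platelets actually in the clot (hypothetical)
lost_to_aggregates = 30_000  # platelets counted in no fraction
whole = 250_000              # platelets in the whole-blood sample
others = [whole - true_clot - lost_to_aggregates]  # liquid fraction count

estimate = clot_platelets_by_subtraction(whole, others)
# estimate exceeds true_clot by exactly the uncounted losses
```

    Here the subtraction yields 150,000 against a true clot content of 120,000, overestimating by precisely the 30,000 platelets lost to aggregation, which is the failure mode the study documents.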

  20. Electrochemical magneto-actuated biosensor for CD4 count in AIDS diagnosis and monitoring.

    PubMed

    Carinelli, S; Xufré Ballesteros, C; Martí, M; Alegret, S; Pividori, M I

    2015-12-15

    The counting of CD4(+) T lymphocytes is a clinical parameter used for AIDS diagnosis and follow-up. As this disease is particularly prevalent in developing countries, simple and affordable CD4 cell counting methods are urgently needed in resource-limited settings. This paper describes an electrochemical magneto-actuated biosensor for CD4 count in whole blood. The CD4(+) T lymphocytes were isolated, preconcentrated and labeled from 100 μL of whole blood by immunomagnetic separation with magnetic particles modified with antiCD3 antibodies. The captured cells were labeled with a biotinylated antiCD4 antibody, followed by the reaction with the electrochemical reporter streptavidin-peroxidase conjugate. The limit of detection for the CD4 counting magneto-actuated biosensor in whole blood was as low as 44 cells μL(-1), while the logistic range was found to be from 89 to 912 cells μL(-1), which spans the whole medical interest range for CD4 counts in AIDS patients. The electrochemical detection together with the immunomagnetic separation confers high sensitivity, resulting in a rapid, inexpensive, robust, user-friendly method for CD4 counting. This approach is a promising alternative to costly standard flow cytometry and is suitable as a diagnostic tool at decentralized practitioner sites in low resource settings, especially in less developed countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  1. Single Photon Counting UV Solar-Blind Detectors Using Silicon and III-Nitride Materials

    PubMed Central

    Nikzad, Shouleh; Hoenk, Michael; Jewell, April D.; Hennessy, John J.; Carver, Alexander G.; Jones, Todd J.; Goodsall, Timothy M.; Hamden, Erika T.; Suvarna, Puneet; Bulmer, J.; Shahedipour-Sandvik, F.; Charbon, Edoardo; Padmanabhan, Preethi; Hancock, Bruce; Bell, L. Douglas

    2016-01-01

    Ultraviolet (UV) studies in astronomy, cosmology, planetary studies, biological and medical applications often require precision detection of faint objects and in many cases require photon-counting detection. We present an overview of two approaches for achieving photon counting in the UV. The first approach involves UV enhancement of photon-counting silicon detectors, including electron multiplying charge-coupled devices and avalanche photodiodes. The approach used here employs molecular beam epitaxy for delta doping and superlattice doping for surface passivation and high UV quantum efficiency. Additional UV enhancements include antireflection (AR) and solar-blind UV bandpass coatings prepared by atomic layer deposition. Quantum efficiency (QE) measurements show QE > 50% in the 100–300 nm range for detectors with simple AR coatings, and QE ≅ 80% at ~206 nm has been shown when more complex AR coatings are used. The second approach is based on avalanche photodiodes in III-nitride materials with high QE and intrinsic solar blindness. PMID:27338399

  2. Single Photon Counting UV Solar-Blind Detectors Using Silicon and III-Nitride Materials.

    PubMed

    Nikzad, Shouleh; Hoenk, Michael; Jewell, April D; Hennessy, John J; Carver, Alexander G; Jones, Todd J; Goodsall, Timothy M; Hamden, Erika T; Suvarna, Puneet; Bulmer, J; Shahedipour-Sandvik, F; Charbon, Edoardo; Padmanabhan, Preethi; Hancock, Bruce; Bell, L Douglas

    2016-06-21

    Ultraviolet (UV) studies in astronomy, cosmology, planetary studies, biological and medical applications often require precision detection of faint objects and in many cases require photon-counting detection. We present an overview of two approaches for achieving photon counting in the UV. The first approach involves UV enhancement of photon-counting silicon detectors, including electron multiplying charge-coupled devices and avalanche photodiodes. The approach used here employs molecular beam epitaxy for delta doping and superlattice doping for surface passivation and high UV quantum efficiency. Additional UV enhancements include antireflection (AR) and solar-blind UV bandpass coatings prepared by atomic layer deposition. Quantum efficiency (QE) measurements show QE > 50% in the 100-300 nm range for detectors with simple AR coatings, and QE ≅ 80% at ~206 nm has been shown when more complex AR coatings are used. The second approach is based on avalanche photodiodes in III-nitride materials with high QE and intrinsic solar blindness.

  3. Generalized scaling relationships on transition metals: Influence of adsorbate-coadsorbate interactions

    NASA Astrophysics Data System (ADS)

    Majumdar, Paulami; Greeley, Jeffrey

    2018-04-01

    Linear scaling relations of adsorbate energies across a range of catalytic surfaces have emerged as a central interpretive paradigm in heterogeneous catalysis. They are, however, typically developed for low adsorbate coverages which are not always representative of realistic heterogeneous catalytic environments. Herein, we present generalized linear scaling relations on transition metals that explicitly consider adsorbate-coadsorbate interactions at variable coverages. The slopes of these scaling relations do not follow the simple bond counting principles that govern scaling on transition metals at lower coverages. The deviations from bond counting are explained using a pairwise interaction model wherein the interaction parameter determines the slope of the scaling relationship on a given metal at variable coadsorbate coverages, and the slope across different metals at fixed coadsorbate coverage is approximated by adding a coverage-dependent correction to the standard bond counting contribution. The analysis provides a compact explanation for coverage-dependent deviations from bond counting in scaling relationships and suggests a useful strategy for incorporation of coverage effects into catalytic trends studies.
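    The bond-counting picture behind these scaling relations can be stated compactly. In the low-coverage limit, the adsorption energy of a hydrogenated species AHx scales linearly with that of the central atom A, with a slope fixed by the fraction of unsaturated bonds; a first-order pairwise-interaction term then indicates how coadsorbate coverage shifts the relation. This is a generic sketch of the standard formulation, not the paper's exact equations or notation:

```latex
% Low-coverage scaling relation with bond-counting slope
\Delta E_{\mathrm{AH}_x} = \gamma\,\Delta E_{\mathrm{A}} + \xi,
\qquad
\gamma = \frac{x_{\max} - x}{x_{\max}} .
% First-order pairwise correction at coadsorbate coverage \theta,
% with n(\theta) interacting neighbours of interaction strength \varepsilon
E_{\mathrm{ads}}(\theta) \;\approx\; E_{\mathrm{ads}}(0) + n(\theta)\,\varepsilon .
```

    Deviations of the measured slope from the bond-counting value at finite coverage then reflect how the interaction parameter differs between the adsorbate and its coadsorbates, which is the effect the pairwise model in the abstract is used to capture.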

  4. Comparison of line transects and point counts for monitoring spring migration in forested wetlands

    USGS Publications Warehouse

    Wilson, R.R.; Twedt, D.J.; Elliott, A.B.

    2000-01-01

    We compared the efficacy of 400-m line transects and sets of three point counts at detecting avian richness and abundance in bottomland hardwood forests and intensively managed cottonwood (Populus deltoides) plantations within the Mississippi Alluvial Valley. We detected more species and more individuals on line transects than on three point counts during 218 paired surveys conducted between 24 March and 3 June, 1996 and 1997. Line transects also yielded more birds per unit of time, even though point counts yielded higher estimates of relative bird density. In structurally more-complex bottomland hardwood forests, we detected more species and individuals on line transects, but in more-open cottonwood plantations, transects surpassed point counts only at detecting species within 50 m of the observer. Species richness and total abundance of Nearctic-Neotropical migrants and temperate migrants were greater on line transects within bottomland hardwood forests. Within cottonwood plantations, however, only species richness of Nearctic-Neotropical migrants and total abundance of temperate migrants were greater on line transects. Because we compared survey techniques using the same observer, within the same forest stand on a given day, we assumed that the technique yielding greater estimates of avian species richness and total abundance per unit of effort is superior. Thus, for monitoring migration within hardwood forests of the Mississippi Alluvial Valley, we recommend using line transects instead of point counts.

  5. Simultaneous and quantitative monitoring of co-cultured Pseudomonas aeruginosa and Staphylococcus aureus with antibiotics on a diffusometric platform

    NASA Astrophysics Data System (ADS)

    Chung, Chih-Yao; Wang, Jhih-Cheng; Chuang, Han-Sheng

    2017-04-01

    Successful treatments against bacterial infections depend on antimicrobial susceptibility testing (AST). However, conventional AST requires more than 24 h to obtain an outcome, thereby contributing to high patient mortality. Empirical antibiotic therapy is therefore often administered to save lives, but it escalates the emergence of multidrug-resistant pathogens. Accordingly, a fast and effective drug screen is necessary for the appropriate administration of antibiotics. The mixed pathogenic nature of infectious diseases emphasizes the need to develop an assay system for polymicrobial infections. On this basis, we present a novel technique for simultaneous and quantitative monitoring of co-cultured microorganisms by coupling optical diffusometry with bead-based immunoassays. This simple integration achieves rapid AST analysis for two pathogens simultaneously. Three fluorescent particle colors were simultaneously recorded and subsequently analyzed, with particles of each color functionalized with a different pathogen-specific antibody. Results suggested that the effect of the antibiotic gentamicin on co-cultured Pseudomonas aeruginosa and Staphylococcus aureus was effectively distinguished by the proposed technique. This study demonstrated a multiplexed and time-saving (within 2 h) platform with a small sample volume (~0.5 μL) and a low initial bacterial count (50 CFU per droplet, ~10(5) CFU/mL) for continuously monitoring the growth of co-cultured microorganisms. This technique provides insights into timely therapies against polymicrobial diseases in the near future.

  6. Comparison of nerve trimming with the Er:YAG laser and steel knife

    NASA Astrophysics Data System (ADS)

    Josephson, G. D.; Bass, Lawrence S.; Kasabian, A. K.

    1995-05-01

    Best outcome in nerve repair requires precise alignment and minimization of scar at the repair interface. Surgeons attempt to create the sharpest cut surface at the nerve edge prior to approximation. Pulsed laser modalities are being investigated in several medical applications which require precise atraumatic cutting. We compared nerve trimming with the Er:YAG laser (1375 J/cm2) to conventional steel knife trimming prior to neurorrhaphy. Sprague-Dawley rats were anesthetized with ketamine and xylazine. Under operating microscope magnification the sciatic nerve was dissected and transected using one of the test techniques. In the laser group, the pulses were directed axially across the nerve using a stage which fixed laser fiber/nerve distance and orientation. Specimens were sent for scanning electron microscopy (SEM) at time zero. Epineurial repairs were performed with 10-0 nylon simple interrupted sutures. At intervals to 90 days, specimens were harvested and sectioned longitudinally and axially for histologic examination. Time zero SEM revealed clean cuts in both groups, but individual axons were clearly visible in all laser specimens. Small pits were also visible on the cut surface of laser treated nerves. No significant differences in nerve morphology were seen during healing. Further studies to quantify axon counts and functional outcome will be needed to assess this technique of nerve trimming. Delivery system improvements will also be required to make the technique clinically practical.

  7. The New Beat Spectrum

    ERIC Educational Resources Information Center

    Falter, H. Ellie

    2011-01-01

    How do teachers teach students to count rhythms? Teachers can choose from various techniques. Younger students may learn themed words (such as "pea," "carrot," or "avocado"), specific rhythm syllables (such as "ta" and "ti-ti"), or some other counting method to learn notation and internalize rhythms. As students grow musically, and especially when…

  8. Item Response Modeling of Multivariate Count Data with Zero Inflation, Maximum Inflation, and Heaping

    ERIC Educational Resources Information Center

    Magnus, Brooke E.; Thissen, David

    2017-01-01

    Questionnaires that include items eliciting count responses are becoming increasingly common in psychology. This study proposes methodological techniques to overcome some of the challenges associated with analyzing multivariate item response data that exhibit zero inflation, maximum inflation, and heaping at preferred digits. The modeling…

  9. Data indexing techniques for the EUVE all-sky survey

    NASA Technical Reports Server (NTRS)

    Lewis, J.; Saba, V.; Dobson, C.

    1992-01-01

    This poster describes techniques developed for manipulating large full-sky data sets for the Extreme Ultraviolet Explorer project. The authors have adapted the quadrilateralized cubic sphere indexing algorithm to efficiently store and process several types of large data sets, such as full-sky maps of photon counts, exposure time, and count rates. A variation of this scheme is used to index sparser data such as individual photon events and viewing times for selected areas of the sky, which are eventually used to create EUVE source catalogs.
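    The essence of cube-sphere indexing is to assign a sky direction to one of six cube faces and then bin it in that face's tangent-plane coordinates, yielding a single integer pixel index. The sketch below shows only this basic idea; the actual quadrilateralized cubic sphere algorithm adds a further curvilinear distortion to make pixels approximately equal-area, which is omitted here.

```python
# Simplified cube-face pixel indexing for a unit direction vector
# (illustrative; not the exact quadrilateralized cubic sphere mapping).

def cube_pixel(x, y, z, nside):
    """Map a unit vector to (face, row, col) and flatten to one index."""
    ax, ay, az = abs(x), abs(y), abs(z)
    if ax >= ay and ax >= az:                  # +/- x face
        face, u, v = (0 if x > 0 else 1), y / ax, z / ax
    elif ay >= az:                             # +/- y face
        face, u, v = (2 if y > 0 else 3), x / ay, z / ay
    else:                                      # +/- z face
        face, u, v = (4 if z > 0 else 5), x / az, y / az
    # u, v lie in [-1, 1]; bin them into an nside x nside grid
    col = min(int((u + 1) / 2 * nside), nside - 1)
    row = min(int((v + 1) / 2 * nside), nside - 1)
    return face * nside * nside + row * nside + col
```

    Such an index lets full-sky maps be stored as flat arrays and lets photon events be grouped by pixel with a single integer key, which is the storage-and-processing convenience the record describes.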

  10. Estimating the Effective System Dead Time Parameter for Correlated Neutron Counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Croft, Stephen; Cleveland, Steve; Favalli, Andrea

    Neutron time correlation analysis is one of the main technical nuclear safeguards techniques used to verify declarations of, or to independently assay, special nuclear materials. Quantitative information is generally extracted from the neutron-event pulse train, collected from moderated assemblies of 3He proportional counters, in the form of correlated count rates that are derived from event-triggered coincidence gates. These count rates, most commonly referred to as singles, doubles and triples rates etc., when extracted using shift-register autocorrelation logic, are related to the reduced factorial moments of the time correlated clusters of neutrons emerging from the measurement items. Correcting these various rates for dead time losses has received considerable attention recently. The dead time losses for the higher moments in particular, and especially for large mass (high rate and highly multiplying) items, can be significant. Consequently, even in thoughtfully designed systems, accurate dead time treatments are needed if biased mass determinations are to be avoided. In support of this effort, in this paper we discuss a new approach to experimentally estimate the effective system dead time of neutron coincidence counting systems. It involves counting a random neutron source (e.g. AmLi is a good approximation to a source without correlated emission) and relating the second and higher moments of the neutron number distribution recorded in random triggered interrogation coincidence gates to the effective value of the dead time parameter. We develop the theoretical basis of the method and apply it to the Oak Ridge Large Volume Active Well Coincidence Counter using sealed AmLi radionuclide neutron sources and standard multiplicity shift register electronics.
The method is simple to apply compared to the predominant present approach, which involves using a set of 252Cf sources spanning a wide range of emission rates; it gives excellent precision in a conveniently short time, and it yields consistent results as a function of the order of the moment used to extract the dead time parameter. This latter observation is reassuring in that it suggests the assumptions underpinning the theoretical analysis are fit for practical application purposes. However, we found that the effective dead time parameter obtained is not constant, as might be expected for a parameter that in the dead time model is characteristic of the detector system, but rather varies systematically with gate width.
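    The effect being corrected can be illustrated with a much simpler model than the paper's moments-based method: a Poisson pulse train passed through a non-paralyzable dead time tau is observed at the reduced rate m = n/(1 + n·tau), and the true rate is recovered as n = m/(1 − m·tau). The simulation below is a hedged sketch of that textbook relation, not the paper's procedure.

```python
import random

# Simulate a Poisson pulse train through a non-paralyzable dead time
# and compare with the classic rate correction m = n / (1 + n*tau).

def observed_rate(true_rate, tau, t_total, seed=1):
    """Measured count rate after imposing dead time tau (seconds)."""
    rng = random.Random(seed)
    t, last_counted, counted = 0.0, -1e9, 0
    while t < t_total:
        t += rng.expovariate(true_rate)   # time of next Poisson event
        if t - last_counted >= tau:       # counter is live again
            counted += 1
            last_counted = t
    return counted / t_total

def dead_time_corrected(measured_rate, tau):
    """Recover the true rate from the measured rate (non-paralyzable)."""
    return measured_rate / (1.0 - measured_rate * tau)
```

    At a true rate of 10^5 counts/s and tau = 2 microseconds, about 17% of events are lost, and the correction recovers the true rate to within statistical noise. The paper's point is that for correlated sources and higher moments a single such tau is not sufficient, since the effective parameter it measures varies with gate width.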

  11. Estimating the Effective System Dead Time Parameter for Correlated Neutron Counting

    DOE PAGES

    Croft, Stephen; Cleveland, Steve; Favalli, Andrea; ...

    2017-04-29

    We present that neutron time correlation analysis is one of the main technical nuclear safeguards techniques used to verify declarations of, or to independently assay, special nuclear materials. Quantitative information is generally extracted from the neutron-event pulse train, collected from moderated assemblies of 3He proportional counters, in the form of correlated count rates that are derived from event-triggered coincidence gates. These count rates, most commonly referred to as singles, doubles and triples rates etc., when extracted using shift-register autocorrelation logic, are related to the reduced factorial moments of the time correlated clusters of neutrons emerging from the measurement items. Correctingmore » these various rates for dead time losses has received considerable attention recently. The dead time losses for the higher moments in particular, and especially for large mass (high rate and highly multiplying) items, can be significant. Consequently, even in thoughtfully designed systems, accurate dead time treatments are needed if biased mass determinations are to be avoided. In support of this effort, in this paper we discuss a new approach to experimentally estimate the effective system dead time of neutron coincidence counting systems. It involves counting a random neutron source (e.g. AmLi is a good approximation to a source without correlated emission) and relating the second and higher moments of the neutron number distribution recorded in random triggered interrogation coincidence gates to the effective value of dead time parameter. We develop the theoretical basis of the method and apply it to the Oak Ridge Large Volume Active Well Coincidence Counter using sealed AmLi radionuclide neutron sources and standard multiplicity shift register electronics. 
The method is simple to apply compared to the predominant present approach which involves using a set of 252Cf sources of wide emission rate, it gives excellent precision in a conveniently short time, and it yields consistent results as a function of the order of the moment used to extract the dead time parameter. In addition, this latter observation is reassuring in that it suggests the assumptions underpinning the theoretical analysis are fit for practical application purposes. However, we found that the effective dead time parameter obtained is not constant, as might be expected for a parameter that in the dead time model is characteristic of the detector system, but rather, varies systematically with gate width.
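    The singles/doubles/triples formalism above rests on the reduced factorial moments of the gate multiplicity distribution. As a minimal illustration (not the authors' code), the moments m_k = E[C(n, k)] can be computed from a recorded multiplicity histogram like this:

```python
import math

def reduced_factorial_moments(multiplicity_hist, max_order=3):
    """Reduced factorial moments m_k = E[C(n, k)] of the number n of
    neutrons recorded per coincidence gate; singles-, doubles- and
    triples-style quantities are built from these moments.
    multiplicity_hist maps a gate multiplicity n to the number of
    gates in which that multiplicity occurred."""
    total = sum(multiplicity_hist.values())
    return [
        sum(gates * math.comb(n, k) for n, gates in multiplicity_hist.items()) / total
        for k in range(1, max_order + 1)
    ]
```

    For example, a histogram of 100 gates with multiplicities {0: 50, 1: 30, 2: 20} gives m_1 = 0.7 and m_2 = 0.2.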

  12. Estimating the effective system dead time parameter for correlated neutron counting

    NASA Astrophysics Data System (ADS)

    Croft, Stephen; Cleveland, Steve; Favalli, Andrea; McElroy, Robert D.; Simone, Angela T.

    2017-11-01

    Neutron time correlation analysis is one of the main technical nuclear safeguards techniques used to verify declarations of, or to independently assay, special nuclear materials. Quantitative information is generally extracted from the neutron-event pulse train, collected from moderated assemblies of 3He proportional counters, in the form of correlated count rates that are derived from event-triggered coincidence gates. These count rates, most commonly referred to as singles, doubles and triples rates etc., when extracted using shift-register autocorrelation logic, are related to the reduced factorial moments of the time correlated clusters of neutrons emerging from the measurement items. Correcting these various rates for dead time losses has received considerable attention recently. The dead time losses for the higher moments in particular, and especially for large mass (high rate and highly multiplying) items, can be significant. Consequently, even in thoughtfully designed systems, accurate dead time treatments are needed if biased mass determinations are to be avoided. In support of this effort, in this paper we discuss a new approach to experimentally estimate the effective system dead time of neutron coincidence counting systems. It involves counting a random neutron source (e.g. AmLi is a good approximation to a source without correlated emission) and relating the second and higher moments of the neutron number distribution recorded in random triggered interrogation coincidence gates to the effective value of dead time parameter. We develop the theoretical basis of the method and apply it to the Oak Ridge Large Volume Active Well Coincidence Counter using sealed AmLi radionuclide neutron sources and standard multiplicity shift register electronics. 
The method is simple to apply compared to the predominant present approach which involves using a set of 252Cf sources of wide emission rate, it gives excellent precision in a conveniently short time, and it yields consistent results as a function of the order of the moment used to extract the dead time parameter. This latter observation is reassuring in that it suggests the assumptions underpinning the theoretical analysis are fit for practical application purposes. However, we found that the effective dead time parameter obtained is not constant, as might be expected for a parameter that in the dead time model is characteristic of the detector system, but rather, varies systematically with gate width.

  13. Dynamic time-correlated single-photon counting laser ranging

    NASA Astrophysics Data System (ADS)

    Peng, Huan; Wang, Yu-rong; Meng, Wen-dong; Yan, Pei-qin; Li, Zhao-hui; Li, Chen; Pan, Hai-feng; Wu, Guang

    2018-03-01

    We demonstrate a photon-counting laser ranging experiment with a four-channel single-photon detector (SPD). The multi-channel SPD improves the counting rate to more than 4×10⁷ cps, which makes distance measurement possible even in daylight. However, the time-correlated single-photon counting (TCSPC) technique cannot easily distill the signal when fast-moving targets are submerged in a strong background. We propose a dynamic TCSPC method for measuring fast-moving targets by varying the coincidence window in real time. In the experiment, we show that targets with a velocity of 5 km/s can be detected with this method, at an echo rate of 20% against background counts of more than 1.2×10⁷ cps.
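    The core of the dynamic-TCSPC idea, a coincidence window that tracks the predicted return of a moving target, can be sketched as follows (a simplified illustration with hypothetical names, not the authors' implementation):

```python
def coincidence_filter(arrival_times_ns, predicted_ns, window_ns):
    """Keep only photon events whose time of flight lies inside a
    coincidence window centred on the predicted target return.  In a
    dynamic-TCSPC scheme the prediction (and hence the window) is
    updated in real time as the target moves; this shows one update."""
    lo = predicted_ns - window_ns / 2.0
    hi = predicted_ns + window_ns / 2.0
    return [t for t in arrival_times_ns if lo <= t <= hi]
```

    Background photons arriving outside the window are rejected, which is what allows the signal to be distilled even at high background rates.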

  14. A simple device to convert a small-animal PET scanner into a multi-sample tissue and injection syringe counter.

    PubMed

    Green, Michael V; Seidel, Jurgen; Choyke, Peter L; Jagoda, Elaine M

    2017-10-01

    We describe a simple fixture that can be added to the imaging bed of a small-animal PET scanner that allows for automated counting of multiple organ or tissue samples from mouse-sized animals and counting of injection syringes prior to administration of the radiotracer. The combination of imaging and counting capabilities in the same machine offers advantages in certain experimental settings. A polyethylene block of plastic, sculpted to mate with the animal imaging bed of a small-animal PET scanner, is machined to receive twelve 5-ml containers, each capable of holding an entire organ from a mouse-sized animal. In addition, a triangular cross-section slot is machined down the centerline of the block to secure injection syringes from 1 ml to 3 ml in size. The sample holder is scanned in PET whole-body mode to image all samples or in one bed position to image a filled injection syringe. Total radioactivity in each sample or syringe is determined from the reconstructed images of these objects using volume re-projection of the coronal images and a single region-of-interest for each. We tested the accuracy of this method by comparing PET estimates of sample and syringe activity with well counter and dose calibrator estimates of these same activities. PET and well counting of the same samples gave near-identical results (in MBq, R² = 0.99, slope = 0.99, intercept = 0.00 MBq). PET syringe and dose calibrator measurements of syringe activity in MBq were also similar (R² = 0.99, slope = 0.99, intercept = -0.22 MBq). A small-animal PET scanner can be easily converted into a multi-sample and syringe counting device by the addition of a sample block constructed for that purpose. This capability, combined with live animal imaging, can improve efficiency and flexibility in certain experimental settings. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Poka-yoke process controller: designed for individuals with cognitive impairments.

    PubMed

    Erlandson, R F; Sant, D

    1998-01-01

    Poka-yoke is a Japanese term meaning "error proofing." Poka-yoke techniques were developed to achieve zero defects in manufacturing and assembly processes. The application of these techniques tends to reduce both the physical and cognitive demands of tasks and thereby make them more accessible. Poka-yoke interventions create a dialogue between the worker and the process, and this dialogue provides the feedback necessary for workers to prevent errors. For individuals with cognitive impairments, weighing and counting tasks can be difficult or impossible. Interventions that provide sufficient feedback to workers without disabilities tend to be too subtle for workers with cognitive impairments; hence, the feedback must be enhanced. The Poka-Yoke Controller (PYC) was designed to assist individuals with counting and weighing tasks. The PYC interfaces to an Ohaus CT6000 digital scale for weighing parts and for counting parts by weight. It also interfaces to sensors and switches for object counting tasks. The PYC interfaces to a variety of programmable voice output devices so that voice feedback or prompting can be provided at specific points in the weighing or counting process. The PYC can also be interfaced to conveyor systems, indexed turntables, and other material handling systems for coordinated counting and material handling operations. In all of our applications to date, we have observed improved worker performance, improved process quality, and greater worker independence. These observed benefits have also significantly reduced the need for staff intervention. The process controller is described and three applications are presented: a weighing task and two counting applications.

  16. Wedge sampling for computing clustering coefficients and triangle counts on large graphs

    DOE PAGES

    Seshadhri, C.; Pinar, Ali; Kolda, Tamara G.

    2014-05-08

    Graphs are used to model interactions in a variety of contexts, and there is a growing need to quickly assess the structure of such graphs. Some of the most useful graph metrics are based on triangles, such as those measuring social cohesion. Despite the importance of these triadic measures, algorithms to compute them can be extremely expensive. We discuss the method of wedge sampling. This versatile technique allows for the fast and accurate approximation of various types of clustering coefficients and triangle counts. Furthermore, these techniques are extensible to counting directed triangles in digraphs. Our methods come with provable and practical time-approximation tradeoffs for all computations. We provide extensive results that show our methods are orders of magnitude faster than the state of the art, while providing nearly the accuracy of full enumeration.
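    The wedge-sampling idea can be sketched in a few lines: sample wedges (length-2 paths) uniformly by picking centers in proportion to their wedge counts, and report the fraction that close into triangles. This toy version (not the paper's implementation) estimates the global clustering coefficient:

```python
import random
from itertools import combinations

def clustering_coefficient_exact(adj):
    """Exact global clustering coefficient: closed wedges / all wedges.
    adj maps each vertex to the set of its neighbours."""
    closed = total = 0
    for v, nbrs in adj.items():
        for a, b in combinations(sorted(nbrs), 2):
            total += 1
            if b in adj[a]:          # wedge a-v-b closes into a triangle
                closed += 1
    return closed / total

def clustering_coefficient_sampled(adj, k=10000, seed=0):
    """Wedge sampling: pick a wedge uniformly (center weighted by its
    wedge count deg*(deg-1)/2, then two random neighbours) and count
    the fraction of sampled wedges that are closed."""
    rng = random.Random(seed)
    verts = [v for v in adj if len(adj[v]) >= 2]
    weights = [len(adj[v]) * (len(adj[v]) - 1) // 2 for v in verts]
    hits = 0
    for _ in range(k):
        v = rng.choices(verts, weights=weights)[0]
        a, b = rng.sample(sorted(adj[v]), 2)
        if b in adj[a]:
            hits += 1
    return hits / k
```

    On a triangle with one pendant edge, both functions agree on 3 closed wedges out of 5, i.e. a coefficient of 0.6; the sampling error shrinks as 1/sqrt(k), independent of graph size.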

  17. Computer measurement of particle sizes in electron microscope images

    NASA Technical Reports Server (NTRS)

    Hall, E. L.; Thompson, W. B.; Varsi, G.; Gauldin, R.

    1976-01-01

    Computer image processing techniques have been applied to particle counting and sizing in electron microscope images. Distributions of particle sizes were computed for several images and compared to manually computed distributions. The results of these experiments indicate that automatic particle counting within a reasonable error and computer processing time is feasible. The significance of the results is that the tedious task of manually counting a large number of particles can be eliminated while still providing the scientist with accurate results.

  18. Minimum Detectable Activity for Tomographic Gamma Scanning System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Venkataraman, Ram; Smith, Susan; Kirkpatrick, J. M.

    2015-01-01

    For any radiation measurement system, it is useful to explore and establish the detection limits and a minimum detectable activity (MDA) for the radionuclides of interest, even if the system is to be used at far higher values. The MDA serves as an important figure of merit, and often a system is optimized and configured so that it can meet the MDA requirements of a measurement campaign. The non-destructive assay (NDA) systems based on gamma ray analysis are no exception, and well-established conventions, such as the Currie method, exist for estimating the detection limits and the MDA. However, the Tomographic Gamma Scanning (TGS) technique poses some challenges for the estimation of detection limits and MDAs. The TGS combines high resolution gamma ray spectrometry (HRGS) with low spatial resolution image reconstruction techniques. In non-imaging gamma-ray-based NDA techniques, measured counts in a full energy peak can be used to estimate the activity of a radionuclide, independently of other counting trials. However, in the case of the TGS each “view” is a full spectral grab (each a counting trial), and each scan consists of 150 spectral grabs in the transmission and emission scans per vertical layer of the item. The set of views in a complete scan is then used to solve for the radionuclide activities on a voxel-by-voxel basis, over 16 layers of a 10x10 voxel grid. Thus, the raw count data are not independent trials any more, but rather constitute input to a matrix solution for the emission image values at the various locations inside the item volume used in the reconstruction. So, the validity of the methods used to estimate MDA for an imaging technique such as TGS warrants close scrutiny, because the pair-counting concept of Currie is not directly applicable. 
One can also raise questions as to whether the TGS, along with other image reconstruction techniques which heavily intertwine data, is a suitable method if one expects to measure samples whose activities are at or just above MDA levels. The paper examines methods used to estimate MDAs for a TGS system, and explores possible solutions that can be rigorously defended.
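    For the non-imaging baseline the abstract refers to, the Currie convention derives the detection limit from the background counts alone; a minimal sketch follows (the efficiency, yield, and count-time values in the conversion are illustrative assumptions, not from the paper):

```python
import math

def currie_detection_limit(background_counts):
    """Currie critical level L_C and detection limit L_D (in counts)
    for a paired-blank measurement at 5% false-positive and 5%
    false-negative risk."""
    b = background_counts
    l_c = 2.33 * math.sqrt(b)          # decision threshold
    l_d = 2.71 + 4.65 * math.sqrt(b)   # detection limit
    return l_c, l_d

def mda(background_counts, efficiency, branching_ratio, count_time_s):
    """Convert L_D to a minimum detectable activity (Bq) via the usual
    MDA = L_D / (efficiency * yield * time) conversion."""
    _, l_d = currie_detection_limit(background_counts)
    return l_d / (efficiency * branching_ratio * count_time_s)
```

    This is exactly the pair-counting picture that, as the abstract notes, does not carry over directly once counts are intertwined by a tomographic reconstruction.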

  19. Computer Needs of Severely Mentally Retarded Persons.

    ERIC Educational Resources Information Center

    Flanagan, Kelly

    1982-01-01

    The article reviews technology applicable for use by severely mentally retarded learners. Descriptions are given of assistive devices (including communication aids), controls and interfaces (such as single switch access to standard software), and software (including games to teach cause and effect and simple matching and counting). (CL)

  20. Adaptively Reevaluated Bayesian Localization (ARBL). A Novel Technique for Radiological Source Localization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, Erin A.; Robinson, Sean M.; Anderson, Kevin K.

    2015-01-19

    Here we present a novel technique for the localization of radiological sources in urban or rural environments from an aerial platform. The technique is based on a Bayesian approach to localization, in which measured count rates in a time series are compared with predicted count rates from a series of pre-calculated test sources to define likelihood. Furthermore, this technique is expanded by using a localized treatment with a limited field of view (FOV), coupled with a likelihood ratio reevaluation, allowing for real-time computation on commodity hardware for arbitrarily complex detector models and terrain. In particular, detectors with inherent asymmetry of response (such as those employing internal collimation or self-shielding for enhanced directional awareness) are leveraged by this approach to provide improved localization. Our results from the localization technique are shown for simulated flight data using monolithic as well as directionally-aware detector models, and the capability of the methodology to locate radioisotopes is estimated for several test cases. This localization technique is shown to facilitate urban search by allowing quick and adaptive estimates of source location, in many cases from a single flyover near a source. In particular, this method represents a significant advancement from earlier methods like full-field Bayesian likelihood, which is not generally fast enough to allow for broad-field search in real time, and highest-net-counts estimation, which has a localization error that depends strongly on flight path and cannot generally operate without exhaustive search.
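    The likelihood comparison at the heart of this approach can be sketched with a Poisson model: each pre-calculated test source predicts a count-rate profile along the flight path, and the candidate whose profile best explains the measured time series wins. A toy grid-search version (the candidate structure and names are hypothetical):

```python
import math

def poisson_log_likelihood(observed, expected):
    """Poisson log-likelihood of a measured count time series, dropping
    the log(n!) term, which is constant across candidate sources."""
    return sum(n * math.log(mu) - mu for n, mu in zip(observed, expected))

def localize(observed, candidates, background):
    """Score each candidate source (with its pre-computed predicted
    count profile along the flight path) against the measured series;
    with a flat prior, the maximum-likelihood candidate is the best
    source estimate."""
    best, best_ll = None, -math.inf
    for cand in candidates:
        expected = [background + p for p in cand["predicted"]]
        ll = poisson_log_likelihood(observed, expected)
        if ll > best_ll:
            best, best_ll = cand, ll
    return best
```

    The paper's method layers a limited field of view and likelihood-ratio reevaluation on top of this basic comparison to make it fast enough for real-time search.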

  1. Reevaluation of pollen quantitation by an automatic pollen counter.

    PubMed

    Muradil, Mutarifu; Okamoto, Yoshitaka; Yonekura, Syuji; Chazono, Hideaki; Hisamitsu, Minako; Horiguchi, Shigetoshi; Hanazawa, Toyoyuki; Takahashi, Yukie; Yokota, Kunihiko; Okumura, Satoshi

    2010-01-01

    Accurate and detailed pollen monitoring is useful for selection of medication and for allergen avoidance in patients with allergic rhinitis. Burkard and Durham pollen samplers are commonly used, but are labor and time intensive. In contrast, automatic pollen counters allow simple real-time pollen counting; however, these instruments have difficulty in distinguishing pollen from small nonpollen airborne particles. Misidentification and underestimation rates for an automatic pollen counter were examined to improve the accuracy of the pollen count. The characteristics of the automatic pollen counter were determined in a chamber study with exposure to cedar pollens or soil grains. The cedar pollen counts were monitored in 2006 and 2007, and compared with those from a Durham sampler. The pollen counts from the automatic counter showed a good correlation (r > 0.7) with those from the Durham sampler when pollen dispersal was high, but a poor correlation (r < 0.5) when pollen dispersal was low. The new correction method, which took into account the misidentification and underestimation, improved this correlation to r > 0.7 during the pollen season. The accuracy of automatic pollen counting can be improved using a correction to include rates of underestimation and misidentification in a particular geographical area.

  2. Evaluation of ICT filariasis card test using whole capillary blood: comparison with Knott's concentration and counting chamber methods.

    PubMed

    Njenga, S M; Wamae, C N

    2001-10-01

    An immunochromatographic card test (ICT) that uses fingerprick whole blood instead of serum for diagnosis of bancroftian filariasis has recently been developed. The card test was validated in the field in Kenya by comparing its sensitivity to the combined sensitivity of Knott's concentration and counting chamber methods. A total of 102 (14.6%) and 117 (16.7%) persons were found to be microfilaremic by Knott's concentration and counting chamber methods, respectively. The geometric mean intensities (GMI) were 74.6 microfilariae (mf)/ml and 256.5 mf/ml by Knott's concentration and counting chamber methods, respectively. All infected individuals detected by both Knott's concentration and counting chamber methods were also antigen positive by the ICT filariasis card test (100% sensitivity). Further, of 97 parasitologically amicrofilaremic persons, 24 (24.7%) were antigen positive by the ICT. The overall prevalence of antigenemia was 37.3%. Of 100 nonendemic area control persons, none was found to be filarial antigen positive (100% specificity). The results show that the new version of the ICT filariasis card test is a simple, sensitive, specific, and rapid test that is convenient in field settings.

  3. A Method of Recording and Predicting the Pollen Count.

    ERIC Educational Resources Information Center

    Buck, M.

    1985-01-01

    A hair dryer, plastic funnel, and microscope slide can be used for predicting pollen counts on a day-to-day basis. Materials, methods for assembly, collection technique, meteorological influences, and daily patterns are discussed. Data collected using the apparatus suggest that airborne grass products other than pollen also affect hay fever…

  4. Carbon fiber counting. [aircraft structures

    NASA Technical Reports Server (NTRS)

    Pride, R. A.

    1980-01-01

    A method was developed for characterizing the number and lengths of carbon fibers accidentally released by the burning of composite portions of civil aircraft structure in a jet fuel fire after an accident. Representative samplings of carbon fibers collected on transparent sticky film were counted from photographic enlargements with a computer aided technique which also provided fiber lengths.

  5. Surpassing Humans and Computers with JellyBean: Crowd-Vision-Hybrid Counting Algorithms.

    PubMed

    Sarma, Akash Das; Jain, Ayush; Nandi, Arnab; Parameswaran, Aditya; Widom, Jennifer

    2015-11-01

    Counting objects is a fundamental image processing primitive, and has many scientific, health, surveillance, security, and military applications. Existing supervised computer vision techniques typically require large quantities of labeled training data, and even with that, fail to return accurate results in all but the most stylized settings. Using vanilla crowdsourcing, on the other hand, can lead to significant errors, especially on images with many objects. In this paper, we present our JellyBean suite of algorithms that combines the best of crowds and computer vision to count objects in images, and uses judicious decomposition of images to greatly improve accuracy at low cost. Our algorithms have several desirable properties: (i) they are theoretically optimal or near-optimal, in that they ask as few questions as possible of humans (under certain intuitively reasonable assumptions that we justify experimentally); (ii) they operate in stand-alone or hybrid modes, in that they can either work independently of computer vision algorithms or in concert with them, depending on whether the computer vision techniques are available or useful for the given setting; (iii) they perform very well in practice, returning accurate counts on images that no individual worker or computer vision algorithm can count correctly, while not incurring high cost.

  6. Comparison of m-Endo LES, MacConkey, and Teepol media for membrane filtration counting of total coliform bacteria in water.

    PubMed Central

    Grabow, W O; du Preez, M

    1979-01-01

    Total coliform counts obtained by means of standard membrane filtration techniques, using MacConkey agar, m-Endo LES agar, Teepol agar, and pads saturated with Teepol broth as growth media, were compared. Various combinations of these media were used in tests on 490 samples of river water and city wastewater after different stages of conventional purification and reclamation processes including lime treatment, filtration, active carbon treatment, ozonation, and chlorination. Endo agar yielded the highest average counts for all these samples. Teepol agar generally had higher counts than Teepol broth, whereas MacConkey agar had the lowest average counts. Identification of 871 positive isolates showed that Aeromonas hydrophila was the species most commonly detected. Species of Escherichia, Citrobacter, Klebsiella, and Enterobacter represented 55% of isolates which conformed to the definition of total coliforms on Endo agar, 54% on Teepol agar, and 45% on MacConkey agar. Selection for species on the media differed considerably. Evaluation of these data and literature on alternative tests, including most probable number methods, indicated that the technique of choice for routine analysis of total coliform bacteria in drinking water is membrane filtration using m-Endo LES agar as growth medium without enrichment procedures or a cytochrome oxidase restriction. PMID:394678

  7. Detection of microbial concentration in ice-cream using the impedance technique.

    PubMed

    Grossi, M; Lanzoni, M; Pompei, A; Lazzarini, R; Matteuzzi, D; Riccò, B

    2008-06-15

    The detection of microbial concentration, essential for safe and high-quality food products, is traditionally made with the plate count technique, which is reliable but also slow and not easily realized in the automatic form required for direct use in industrial machines. To this purpose, the method based on impedance measurements represents an attractive alternative, since it can produce results in about 10 h, instead of the 24-48 h needed by standard plate counts, and can be easily realized in automatic form. In this paper such a method has been experimentally studied in the case of ice-cream products. In particular, all main ice-cream compositions of real interest have been considered, and no nutrient media were used to dilute the samples. A measurement set-up was realized using benchtop instruments for impedance measurements on samples whose bacteria concentration was independently measured by means of standard plate counts. The obtained results clearly indicate that impedance measurement represents a feasible and reliable technique to detect total microbial concentration in ice-cream, suitable to be implemented as an embedded system for industrial machines.

  8. Ratio of mean platelet volume to platelet count is a potential surrogate marker predicting liver cirrhosis.

    PubMed

    Iida, Hiroya; Kaibori, Masaki; Matsui, Kosuke; Ishizaki, Morihiko; Kon, Masanori

    2018-01-27

    To provide a simple surrogate marker predictive of liver cirrhosis (LC). Specimens from 302 patients who underwent resection for hepatocellular carcinoma between January 2006 and December 2012 were retrospectively analyzed. Based on pathologic findings, patients were divided into groups based on whether or not they had LC. Parameters associated with hepatic functional reserve were compared in these two groups using the Mann-Whitney U-test for univariate analysis. Factors differing significantly in univariate analyses were entered into multivariate logistic regression analysis. There were significant differences between the LC group (n = 100) and non-LC group (n = 202) in prothrombin activity, concentrations of alanine aminotransferase, aspartate aminotransferase, total bilirubin, albumin, cholinesterase, type IV collagen, hyaluronic acid, indocyanine green retention rate at 15 min, maximal removal rate of technetium-99m diethylene triamine penta-acetic acid-galactosyl human serum albumin, and ratio of mean platelet volume to platelet count (MPV/PLT). Multivariate analysis showed that prothrombin activity, concentrations of alanine aminotransferase, aspartate aminotransferase, total bilirubin and hyaluronic acid, and the MPV/PLT ratio were factors independently predictive of LC. The area under the curve value for MPV/PLT was 0.78, with a 0.8 cutoff value having a sensitivity of 65% and a specificity of 78%. The MPV/PLT ratio, which can be determined simply from the complete blood count, may be a simple surrogate marker predicting LC.

  9. Subnuclear foci quantification using high-throughput 3D image cytometry

    NASA Astrophysics Data System (ADS)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double-strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which forms gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are low-throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and are hence limited to counting a low number of foci per nucleus (about 5), as the quantification process is extremely labour-intensive. We have therefore developed high-throughput instrumentation and a computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged, in 3D with submicron resolution, using an in-house-developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.

  10. A New Method for Estimating Bacterial Abundances in Natural Samples using Sublimation

    NASA Technical Reports Server (NTRS)

    Glavin, Daniel P.; Cleaves, H. James; Schubert, Michael; Aubrey, Andrew; Bada, Jeffrey L.

    2004-01-01

    We have developed a new method based on the sublimation of adenine from Escherichia coli to estimate bacterial cell counts in natural samples. To demonstrate this technique, several types of natural samples including beach sand, seawater, deep-sea sediment, and two soil samples from the Atacama Desert were heated to a temperature of 500 C for several seconds under reduced pressure. The sublimate was collected on a cold finger and the amount of adenine released from the samples then determined by high performance liquid chromatography (HPLC) with UV absorbance detection. Based on the total amount of adenine recovered from DNA and RNA in these samples, we estimated bacterial cell counts ranging from approximately 10(exp 5) to 10(exp 9) E. coli cell equivalents per gram. For most of these samples, the sublimation-based cell counts were in agreement with total bacterial counts obtained by traditional DAPI staining. The simplicity and robustness of the sublimation technique compared to the DAPI staining method makes this approach particularly attractive for use by spacecraft instrumentation. NASA is currently planning to send a lander to Mars in 2009 in order to assess whether or not organic compounds, especially those that might be associated with life, are present in Martian surface samples. Based on our analyses of the Atacama Desert soil samples, several million bacterial cells per gram of Martian soil should be detectable using this sublimation technique.

  11. Aerial population estimates of wild horses (Equus caballus) in the adobe town and salt wells creek herd management areas using an integrated simultaneous double-count and sightability bias correction technique

    USGS Publications Warehouse

    Lubow, Bruce C.; Ransom, Jason I.

    2007-01-01

    An aerial survey technique combining simultaneous double-count and sightability bias correction methodologies was used to estimate the population of wild horses inhabiting Adobe Town and Salt Wells Creek Herd Management Areas, Wyoming. Based on 5 surveys over 4 years, we conclude that the technique produced estimates consistent with the known number of horses removed between surveys and an annual population growth rate of 16.2 percent per year. Therefore, evidence from this series of surveys supports the validity of this survey method. Our results also indicate that the ability of aerial observers to see horse groups is very strongly dependent on skill of the individual observer, size of the horse group, and vegetation cover. It is also more modestly dependent on the ruggedness of the terrain and the position of the sun relative to the observer. We further conclude that censuses, or uncorrected raw counts, are inadequate estimates of population size for this herd. Such uncorrected counts were all undercounts in our trials, and varied in magnitude from year to year and observer to observer. As of April 2007, we estimate that the population of the Adobe Town/Salt Wells Creek complex is 906 horses with a 95 percent confidence interval ranging from 857 to 981 horses.
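    The double-count component of such surveys is classically summarized by the Lincoln-Petersen estimator, shown here in its Chapman-corrected form. This is a textbook sketch of the observer-overlap idea, not the integrated sightability model the authors used:

```python
def chapman_estimate(seen_by_1, seen_by_2, seen_by_both):
    """Chapman-corrected Lincoln-Petersen estimator for a simultaneous
    double-count: two observers independently record sightings, and the
    overlap between their lists yields a population estimate that
    accounts for groups neither observer saw."""
    n1, n2, m = seen_by_1, seen_by_2, seen_by_both
    return (n1 + 1) * (n2 + 1) / (m + 1) - 1
```

    For example, if observer 1 sees 60 groups, observer 2 sees 50, and 30 are seen by both, the estimated total is about 99 groups, more than either observer's raw count, which is why uncorrected counts tend to be undercounts.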

  12. Evaluation of dead-time corrections for post-radionuclide-therapy (177)Lu quantitative imaging with low-energy high-resolution collimators.

    PubMed

    Celler, Anna; Piwowarska-Bilska, Hanna; Shcherbinin, Sergey; Uribe, Carlos; Mikolajczak, Renata; Birkenfeld, Bozena

    2014-01-01

    Dead-time (DT) effects rarely cause problems in diagnostic single-photon emission computed tomography (SPECT) studies; however, in post-radionuclide-therapy imaging, DT can be substantial. Therefore, corrections may be necessary if quantitative images are used in image-based dosimetry or for evaluation of therapy outcomes. This task is particularly challenging if low-energy collimators are used. Our goal was to design a simple method to determine the dead-time correction factor (DTCF) without the need for phantom experiments and complex calculations. Planar and SPECT/CT scans of a water phantom containing a 70 ml bottle filled with lutetium-177 (Lu) were acquired over 60 days. Two small Lu markers were used in all scans. The DTCF based on the ratio of observed to true count rates measured over the entire spectrum and using photopeak primary photons only was estimated for phantom (DT present) and marker (no DT) scans. In addition, variations in counts in SPECT projections (potentially caused by varying bremsstrahlung and scatter) were investigated. For count rates that were about two-fold higher than typically seen in post-therapy Lu scans, the maximum DTCF reached a level of about 17%. The DTCF values determined directly from the phantom experiments using the total energy spectrum and photopeak counts only were equal to 13 and 16%, respectively. They were closely matched by those from the proposed marker-based method, which uses only two energy windows and measures photopeak primary photons (15-17%). A simple, marker-based method allowing for determination of the DTCF in high-activity Lu imaging studies has been proposed and validated using phantom experiments.
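    A correction factor of the kind estimated here can be illustrated with the standard non-paralyzable dead-time model (a textbook sketch, not the paper's marker-based method itself):

```python
def observed_rate(true_rate, tau):
    """Non-paralyzable dead-time model: a detector busy for tau seconds
    after each recorded event observes m = n / (1 + n*tau)."""
    return true_rate / (1.0 + true_rate * tau)

def dead_time_correction_factor(observed_rate_cps, tau):
    """DTCF = true/observed rate.  Inverting the model above gives
    DTCF = 1 / (1 - m*tau), computable from the observed rate alone."""
    return 1.0 / (1.0 - observed_rate_cps * tau)
```

    Multiplying the observed rate by the DTCF recovers the true rate; at 10^5 cps true rate and tau = 1 microsecond, the DTCF is 1.1, i.e. a 10% loss.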

  13. Design, development and manufacture of a breadboard radio frequency mass gauging system

    NASA Technical Reports Server (NTRS)

    1975-01-01

    The feasibility of the RF mode-counting gauging technique was demonstrated for gauging liquid hydrogen and liquid oxygen under all attitude conditions. With LH2, it was also demonstrated under dynamic fluid conditions, in which the fluid assumes ever-changing positions within the tank, that the RF gauging technique on average provides a very good indication of mass. It is significant that the distribution of the mode-count data at each fill level during dynamic LH2 and LOX orientation testing approaches a statistical normal distribution. Multiple space-diversity probes provide better coupling to the resonant modes than a single probe element. The variable-sweep-rate generator technique provides a more uniform mode-versus-time distribution for processing.

  14. Evaluation of heterotrophic plate and chromogenic agar colony counting in water quality laboratories.

    PubMed

    Hallas, Gary; Monis, Paul

    2015-01-01

    The enumeration of bacteria using plate-based counts is a core technique used by food and water microbiology testing laboratories. However, manual counting of bacterial colonies is both time and labour intensive, can vary between operators, and also requires manual entry of results into laboratory information management systems (LIMS), which can be a source of data entry error. An alternative is to use automated digital colony counters, but there is a lack of peer-reviewed validation data to allow incorporation into standards. We compared the performance of digital counting technology (ProtoCOL3) against manual counting using criteria defined in internationally recognized standard methods. Digital colony counting provided a robust, standardized system suitable for adoption in a commercial testing environment. The digital technology has several advantages:
    • Improved measurement of uncertainty by using a standard and consistent counting methodology with less operator error.
    • Efficiency for labour and time (reduced cost).
    • Elimination of manual entry of data onto LIMS.
    • Faster result reporting to customers.

  15. Microbiology of cooked and dried edible Mediterranean field crickets (Gryllus bimaculatus) and superworms (Zophobas atratus) submitted to four different heating treatments.

    PubMed

    Grabowski, Nils Th; Klein, Günter

    2017-01-01

    To increase the shelf life of edible insects, modern techniques (e.g. freeze-drying) add to the traditional methods (degutting, boiling, sun-drying or roasting). However, microorganisms become inactivated rather than being killed, and when rehydrated, many return to vegetative stadia. Crickets (Gryllus bimaculatus) and superworms (Zophobas atratus) were submitted to four different drying techniques (T1 = 10' cooking, 24 h drying at 60℃; T2 = 10' cooking, 24 h drying at 80℃; T3 = 30' cooking, 12 h drying at 80℃, and 12 h drying at 100℃; T4 = boiling T3-treated insects after five days) and analysed for total bacteria counts, Enterobacteriaceae, staphylococci, bacilli, yeast and mould counts, E. coli, salmonellae, and Listeria monocytogenes (the latter three being negative throughout). The microbial counts varied strongly, displaying species- and treatment-specific patterns. T3 was the most effective of the drying treatments tested at decreasing all counts except bacilli, for which T2 was more efficient. Still, total bacteria counts remained high (G. bimaculatus > Z. atratus). Other opportunistically pathogenic microorganisms (Bacillus thuringiensis, B. licheniformis, B. pumilus, Pseudomonas aeruginosa, and Cryptococcus neoformans) were also encountered. The tyndallisation-like T4 reduced all counts to below the detection limit, but nutrient leakage should be considered regarding food quality. In conclusion, species-specific drying procedures should be devised to ensure food safety. © The Author(s) 2016.

  16. Application of Photoshop and Scion Image analysis to quantification of signals in histochemistry, immunocytochemistry and hybridocytochemistry.

    PubMed

    Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva

    2006-02-01

    To describe a simple method to achieve the differential selection and subsequent quantification of the strength signal using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry, without use of specific commercial image analysis systems, rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal we used the "Color range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that could be applied on similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to simultaneous detection of different signals on the same section or different parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.
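
    The same selection-then-quantification idea can be mimicked in a few lines of code (a hedged sketch, not the authors' Photoshop/Scion workflow; the function name, target color, and tolerance are hypothetical):

```python
def chromogen_signal(pixels, target_rgb, tolerance):
    """Sum a simple 'signal strength' (255 - brightness) over pixels that
    fall within a color range around target_rgb, mimicking a Color Range
    selection followed by an intensity measurement."""
    total = 0
    for r, g, b in pixels:
        if all(abs(c - t) <= tolerance for c, t in zip((r, g, b), target_rgb)):
            total += 255 - (r + g + b) // 3  # darker chromogen -> more signal
    return total

# One brown-ish chromogen pixel and one background (white) pixel.
signal = chromogen_signal([(100, 60, 40), (250, 250, 250)], (100, 60, 40), 20)
```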

  17. Counting Active Sites on Titanium Oxide-Silica Catalysts for Hydrogen Peroxide Activation through In Situ Poisoning with Phenylphosphonic Acid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eaton, Todd R.; Boston, Andrew M.; Thompson, Anthony B.

    2015-06-04

    Quantifying specific active sites in supported catalysts improves our understanding and assists in rational design. Supported oxides can undergo significant structural changes as surface densities increase from site-isolated cations to monolayers and crystallites, which changes the number of kinetically relevant sites. Herein, TiOx domains on TiOx–SiO2 are titrated selectively with phenylphosphonic acid (PPA). An ex situ method quantifies all fluid-accessible TiOx, whereas an in situ titration during cis-cyclooctene epoxidation provides previously unavailable values for the number of tetrahedral Ti sites on which H2O2 activation occurs. We use this method to determine the active site densities of 22 different catalysts with different synthesis methods, loadings, and characteristic spectra and find a single intrinsic turnover frequency for cis-cyclooctene epoxidation of (40 ± 7) h⁻¹. This simple method gives molecular-level insight into catalyst structure that is otherwise hidden when bulk techniques are used.
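
    The last step of such an analysis is simple arithmetic: dividing the measured reaction rate by the titrated site count gives the turnover frequency. A sketch with hypothetical numbers chosen only to reproduce the reported ~40 h⁻¹ value:

```python
def turnover_frequency(rate_mol_per_h, active_sites_mol):
    """TOF = reaction rate normalized by the titrated active-site count."""
    return rate_mol_per_h / active_sites_mol

# Hypothetical: 8.0e-4 mol/h of epoxide over 2.0e-5 mol of PPA-titrated sites.
tof = turnover_frequency(8.0e-4, 2.0e-5)
```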

  18. Input current shaped ac-to-dc converters

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Input current shaping techniques for ac-to-dc converters were investigated. Input frequencies much higher than normal, up to 20 kHz, were emphasized. Several methods of shaping the input current waveform in ac-to-dc converters were reviewed. The simplest method is the LC filter following the rectifier. The next simplest method is the resistor emulation approach, in which the inductor size is determined by the converter switching frequency and not by the line input frequency. Other methods require complicated switch drive algorithms to construct the input current waveshape. For a high-frequency line input, on the order of 20 kHz, the simple LC cannot be discarded so peremptorily, since its inductor size is comparable with that for the resistor emulation method. In fact, since a dc regulator will normally be required after the filter anyway, the total component count is almost the same as for the resistor emulation method, in which the filter is effectively incorporated into the regulator.
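
    The resistor-emulation approach reduces to one idea: the control loop commands an input current proportional to the instantaneous input voltage, so the converter appears resistive to the line. A hypothetical sketch of that control law (not a converter design; values are illustrative):

```python
import math

R_EMULATED = 50.0  # ohms; hypothetical emulated input resistance

def reference_current(v_in):
    """Resistor emulation: command i proportional to instantaneous v_in."""
    return v_in / R_EMULATED

# Over one cycle of a 20 kHz line, the commanded current tracks the voltage
# waveform, which is what yields a near-unity power factor.
voltages = [120.0 * math.sin(2 * math.pi * k / 16) for k in range(16)]
currents = [reference_current(v) for v in voltages]
```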

  19. Enumerating viruses by using fluorescence and the nature of the nonviral background fraction.

    PubMed

    Pollard, Peter C

    2012-09-01

    Bulk fluorescence measurements could be a faster and cheaper way of enumerating viruses than epifluorescence microscopy, flow cytometry, or transmission electron microscopy (TEM). However, since viruses are not imaged, the background fluorescence compromises the signal, and we know little about its nature. In this paper the size ranges of nucleotides that fluoresce in the presence of SYBR gold were determined for wastewater and a range of freshwater samples using a differential filtration method. Fluorescence excitation-emission matrices (FEEMs) showed that >70% of the SYBR fluorescence was in the <10-nm size fraction (background) and was not associated with intact viruses. This was confirmed using TEM. The use of FEEMs to develop a fluorescence-based method for counting viruses is an approach that is fundamentally different from the epifluorescence microscopy technique used for enumerating viruses. This high fluorescence background is currently overlooked, yet it has had a most pervasive influence on the development of a simple fluorescence-based method for quantifying viral abundance in water.

  20. Monopole operators and Hilbert series of Coulomb branches of 3d N = 4 gauge theories

    NASA Astrophysics Data System (ADS)

    Cremonesi, Stefano; Hanany, Amihay; Zaffaroni, Alberto

    2014-01-01

    This paper addresses a long-standing problem - to identify the chiral ring and moduli space (i.e. as an algebraic variety) on the Coulomb branch of an N = 4 superconformal field theory in 2+1 dimensions. Previous techniques involved a computation of the metric on the moduli space and/or mirror symmetry. These methods are limited to sufficiently small moduli spaces, with enough symmetry, or to Higgs branches of sufficiently small gauge theories. We introduce a simple formula for the Hilbert series of the Coulomb branch, which applies to any good or ugly three-dimensional N = 4 gauge theory. The formula counts monopole operators which are dressed by classical operators, the Casimir invariants of the residual gauge group that is left unbroken by the magnetic flux. We apply our formula to several classes of gauge theories. Along the way we make various tests of mirror symmetry, successfully comparing the Hilbert series of the Coulomb branch with the Hilbert series of the Higgs branch of the mirror theory.
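
    As a reader's sketch of the counting described above (our notation: Δ(m) is the dimension of the monopole operator of magnetic flux m, and P_G(t; m) counts the Casimir invariants of the residual gauge group):

```latex
H(t) \;=\; \sum_{m} t^{\Delta(m)}\, P_{G}(t;\, m),
\qquad
\Delta(m) \;=\; -\sum_{\alpha \in \Delta_{+}} \bigl|\alpha(m)\bigr|
\;+\; \tfrac{1}{2}\sum_{i}\sum_{\rho \in R_{i}} \bigl|\rho(m)\bigr|,
```

    where the first sum runs over positive roots and the second over the weights of the matter representations. In the simplest case, U(1) with N flavors, Δ(m) = N|m|/2 and P(t; m) = 1/(1 - t), so H(t) = Σ_{m∈ℤ} t^{N|m|/2}/(1 - t).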

  1. On the Bayesian Nonparametric Generalization of IRT-Type Models

    ERIC Educational Resources Information Center

    San Martin, Ernesto; Jara, Alejandro; Rolin, Jean-Marie; Mouchart, Michel

    2011-01-01

    We study the identification and consistency of Bayesian semiparametric IRT-type models, where the uncertainty on the abilities' distribution is modeled using a prior distribution on the space of probability measures. We show that for the semiparametric Rasch Poisson counts model, simple restrictions ensure the identification of a general…

  2. Your Child's Development: 3 Years

    MedlinePlus

    ... girl plays make-believe takes turns while playing Cognitive Skills (Thinking and Learning) knows first and last name and age engages in pretend play can count three objects does simple puzzles can retell a story from a book When to Talk to Your Doctor Every child develops at his or her own pace, but ...

  3. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources.

    PubMed

    Klumpp, John; Brandl, Alexander

    2015-03-01

    A particle counting and detection system is proposed that searches for elevated count rates in multiple energy regions simultaneously. The system analyzes time-interval data (e.g., time between counts), as this was shown to be a more sensitive technique for detecting low count rate sources than analyzing counts per unit interval (Luo et al. 2013). Two distinct versions of the detection system are developed. The first is intended for situations in which the sample is fixed and can be measured for an unlimited amount of time. The second version is intended to detect sources that are physically moving relative to the detector, such as a truck moving past a fixed roadside detector or a waste storage facility surveyed from a passing airplane. In both cases, the detection system is expected to be active indefinitely; i.e., it is an online detection system. Both versions of the multi-energy detection systems are compared to their respective gross count rate detection systems in terms of Type I and Type II error rates and sensitivity.
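
    The appeal of time-interval data can be illustrated with a toy log-likelihood-ratio test for exponential inter-arrival times (a hedged sketch with hypothetical rates, not the paper's Bayesian formulation): each interval dt contributes log(λ1/λ0) − (λ1 − λ0)·dt to the evidence for "source present" (total rate λ1) over "background only" (rate λ0).

```python
import math

def log_likelihood_ratio(intervals_s, bg_rate, total_rate):
    """Sum of per-interval log likelihood ratios for exponential
    inter-arrival times, f(dt; r) = r * exp(-r * dt)."""
    return sum(math.log(total_rate / bg_rate) - (total_rate - bg_rate) * dt
               for dt in intervals_s)

# Hypothetical: 1 cps background, a candidate source adding another 1 cps.
# Intervals of 0.5 s are short for a 1 cps background, so the ratio grows.
llr = log_likelihood_ratio([0.5] * 10, bg_rate=1.0, total_rate=2.0)
```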

  4. Tutorial on X-ray photon counting detector characterization.

    PubMed

    Ren, Liqiang; Zheng, Bin; Liu, Hong

    2018-01-01

    Recent advances in photon counting detection technology have led to significant research interest in X-ray imaging. As a tutorial-level review, this paper covers a wide range of aspects related to X-ray photon counting detector characterization. The tutorial begins with a detailed description of the working principle and operating modes of a pixelated X-ray photon counting detector, with basic architecture and detection mechanism. Currently available methods and techniques for characterizing major aspects, including energy response, noise floor, energy resolution, count rate performance (detector efficiency), and the charge sharing effect of photon counting detectors, are comprehensively reviewed. Other characterization aspects such as point spread function (PSF), line spread function (LSF), contrast transfer function (CTF), modulation transfer function (MTF), noise power spectrum (NPS), detective quantum efficiency (DQE), bias voltage, radiation damage, and polarization effect are also remarked upon. A cadmium telluride (CdTe) pixelated photon counting detector is employed for part of the characterization demonstration and the results are presented. This review can serve as a tutorial for X-ray imaging researchers and investigators to understand, operate, characterize, and optimize photon counting detectors for a variety of applications.

  5. A photographic technique for estimating egg density of the white pine weevil, Pissodes strobi (Peck)

    Treesearch

    Roger T. Zerillo

    1975-01-01

    Compares a photographic technique with visual and dissection techniques for estimating egg density of the white pine weevil, Pissodes strobi (Peck). The relatively high correlations (.67 and .79) between counts from photographs and those obtained by dissection indicate that the non-destructive photographic technique could be a useful tool for...

  6. Smart fast blood counting of trace volumes of body fluids from various mammalian species using a compact custom-built microscope cytometer (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Smith, Zachary J.; Gao, Tingjuan; Lin, Tzu-Yin; Carrade-Holt, Danielle; Lane, Stephen M.; Matthews, Dennis L.; Dwyre, Denis M.; Wachsmann-Hogiu, Sebastian

    2016-03-01

    Cell counting in human body fluids such as blood, urine, and CSF is a critical step in the diagnostic process for many diseases. Current automated methods for cell counting are based on flow cytometry systems. However, these automated methods are bulky, costly, require significant user expertise, and are not well suited to counting cells in fluids other than blood. Therefore, their use is limited to large central laboratories that process enough volume of blood to recoup the significant capital investment these instruments require. We present in this talk a combination of (1) a low-cost microscope system, (2) a simple sample preparation method, and (3) a fully automated analysis designed for providing cell counts in blood and body fluids. We show results on human samples as well as companion and farm animals, demonstrating that red cell, white cell, and platelet counts, as well as hemoglobin concentration, can be accurately obtained in blood, along with a 3-part white cell differential in human samples. We can also accurately count red and white cells in body fluids with a limit of detection ~3 orders of magnitude smaller than current automated instruments. This method uses less than 1 microliter of blood, and less than 5 microliters of body fluids, to make its measurements, making it highly compatible with finger-stick style collections, as well as appropriate for small animals such as laboratory mice, where larger-volume blood collections are dangerous to the animal's health.

  7. Correlation of total serum immunoglobulin E level, sputum, and peripheral eosinophil count in assessing the clinical severity in bronchial asthma.

    PubMed

    Kumar, Roshan M; Pajanivel, R; Koteeswaran, G; Menon, Surendra K; Charles, Pravin Mv

    2017-01-01

    Asthma is a chronic inflammatory disorder of the airway involving various cellular populations and the release of many inflammatory mediators. Eosinophils and serum immunoglobulin E (IgE) are considered good markers of airway inflammation in asthma. The correlation of clinical assessment with various markers of airway inflammation in asthma is not well established in the Indian population. This study aims to assess the correlation of serum IgE, sputum eosinophil count, and peripheral eosinophil count with the clinical severity of asthma. This is a cross-sectional study involving 76 stable asthmatic patients 18-60 years of age attending the pulmonary medicine OPD. Spirometry was measured at baseline. Participants were categorized according to the GINA criteria based on clinical symptoms and pulmonary function tests. Blood samples were collected for peripheral eosinophil count and serum IgE levels, and sputum samples for eosinophil count. All three parameters were compared with severity of asthma. The correlation of sputum eosinophil count, peripheral eosinophil count, and serum IgE with severity of asthma was analyzed by Pearson's chi-square test and Fisher's exact test, and the correlation coefficient was reported together with the standard error of the estimate. The mean age of patients in our study was 37.42 years and 56.6% were male. There was a significant inverse correlation between serum IgE levels and predicted forced expiratory volume in 1 s (FEV1). Sputum eosinophilia was seen significantly more often in severe persistent asthma patients (19.7%). There was a significant inverse correlation between sputum eosinophil count and predicted FEV1 and forced vital capacity. We also found a significant association between peripheral eosinophil count, sputum eosinophil count, and elevated serum IgE (≥100 IU/mL) with severe persistent asthma. The assessment of sputum eosinophil count is simple, inexpensive, noninvasive, and a direct measurement of airway inflammation. It could be the preferred method for monitoring airway inflammation and guiding management in day-to-day practice.

  8. Laser-induced photo emission detection: data acquisition based on light intensity counting

    NASA Astrophysics Data System (ADS)

    Yulianto, N.; Yudasari, N.; Putri, K. Y.

    2017-04-01

    Laser-Induced Breakdown Detection (LIBD) is a quantification technique for colloids. There are two modes of detection in LIBD: optical detection and acoustic detection. LIBD is based on the detection of plasma emission due to the interaction between a particle and the laser beam. In this research, the change in light intensity during plasma formation was detected by a photodiode sensor. A photo-emission data acquisition system was built to collect these signals and transform them into digital counts. The real-time system used a National Instruments DAQ 6009 data acquisition device and LabVIEW software. The system was tested on distilled water and tap water samples. The result showed 99.8% accuracy for the counting technique in comparison to acoustic detection at a sample rate of 10 Hz; thus the acquisition system can be applied as an alternative to the existing LIBD acquisition system.
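
    The "light intensity counting" step can be sketched as a rising-edge threshold counter over digitized photodiode samples (a hypothetical illustration; the actual system runs on the LabVIEW/DAQ hardware named in the abstract):

```python
def count_breakdown_events(samples, threshold):
    """Count rising-edge threshold crossings in a photodiode intensity trace.
    Consecutive above-threshold samples belong to one plasma event."""
    events = 0
    above = False
    for s in samples:
        if s > threshold and not above:
            events += 1
            above = True
        elif s <= threshold:
            above = False
    return events

# Two plasma flashes (hypothetical digitized trace, arbitrary units).
trace = [0.1, 0.2, 3.5, 3.6, 0.1, 0.2, 4.0, 0.1]
n_events = count_breakdown_events(trace, threshold=1.0)
```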

  9. Lorentz-Shaped Comet Dust Trail Cross Section from New Hybrid Visual and Video Meteor Counting Technique - Implications for Future Leonid Storm Encounters

    NASA Technical Reports Server (NTRS)

    Jenniskens, Peter; Crawford, Chris; Butow, Steven J.; Nugent, David; Koop, Mike; Holman, David; Houston, Jane; Jobse, Klaas; Kronk, Gary

    2000-01-01

    A new hybrid technique of visual and video meteor observations was developed to provide high-precision near-real-time flux measurements for satellite operators from airborne platforms. A total of 33,000 Leonids, recorded on video during the 1999 Leonid storm, were watched by a team of visual observers using a video head display and an automatic counting tool. The counts reveal that the activity profile of the Leonid storm is a Lorentz profile. By assuming a radial profile for the dust trail that is also a Lorentzian, we make predictions for future encounters. If that assumption is correct, we passed 0.0003 AU deeper into the 1899 trailet than expected during the storm of 1999, and future encounters with the 1866 trailet will be less intense than predicted elsewhere.
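
    The Lorentz (Cauchy) shape of the activity profile can be written down directly; a small sketch in our own notation (symbols and numbers are illustrative, not the paper's fit parameters):

```python
def lorentz_flux(t_h, peak_flux, t_max_h, fwhm_h):
    """Lorentz (Cauchy) activity profile: the flux falls to half the peak
    value at t_max +/- FWHM/2."""
    half_width = fwhm_h / 2.0
    return peak_flux * half_width**2 / ((t_h - t_max_h) ** 2 + half_width**2)

# Illustrative profile: peak flux 100 (arbitrary units) at t = 0, FWHM 1 h.
at_peak = lorentz_flux(0.0, 100.0, 0.0, 1.0)
at_half_width = lorentz_flux(0.5, 100.0, 0.0, 1.0)
```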

  10. Methods of detecting and counting raptors: A review

    USGS Publications Warehouse

    Fuller, M.R.; Mosher, J.A.; Ralph, C. John; Scott, J. Michael

    1981-01-01

    Most raptors are wide-ranging, secretive, and occur at relatively low densities. These factors, in conjunction with the nocturnal activity of owls, cause the counting of raptors by most standard census and survey efforts to be very time consuming and expensive. This paper reviews the most common methods of detecting and counting raptors. It is hoped that it will be of use to the ever-increasing number of biologists, land-use planners, and managers who must determine the occurrence, density, or population dynamics of raptors. Road counts of fixed-station or continuous-transect design are often used to sample large areas. Detection of spontaneous or elicited vocalizations, especially those of owls, provides a means of detecting and estimating raptor numbers. Searches for nests are accomplished from foot surveys, observations from automobiles and boats, or from aircraft when nest structures are conspicuous (e.g., Osprey). Knowledge of nest habitat, historic records, and inquiries of local residents are useful for locating nests. Often several of these techniques are combined to help find nest sites. Aerial searches have also been used to locate or count large raptors (e.g., eagles), or those that may be conspicuous in open habitats (e.g., tundra). Counts of birds entering or leaving nest colonies or colonial roosts have been attempted on a limited basis. Results from Christmas Bird Counts have provided an index of the abundance of some species. Trapping and banding generally has proven to be an inefficient method of detecting raptors or estimating their populations. Concentrations of migrants at strategically located points around the world afford the best opportunity to count many raptors in a relatively short period of time, but the influence of many unquantified variables has inhibited extensive interpretation of these counts. Few data exist to demonstrate the effectiveness of these methods.
We believe more research on sampling techniques, rather than complete counts or intensive searches, will provide adequate yet affordable estimates of raptor numbers in addition to providing methods for detecting the presence of raptors on areas of interest to researchers and managers.

  11. Draft SEI Program Plans: 1995-1999

    DTIC Science & Technology

    1994-08-01

    risk management because we believe that (a) structured techniques, even quite simple ones, can be effective in identifying and quantifying risk; and (b) techniques exist to…

  12. Simultaneous in situ Optical Monitoring Techniques during Crystal Growth of ZnSe by Physical Vapor Transport

    NASA Technical Reports Server (NTRS)

    Su, C.- H.; Feth, S.; Lehoczky, S. L.

    1998-01-01

    ZnSe crystals grown in sealed ampoules by the physical vapor transport method were monitored in situ using three techniques simultaneously. A Michelson interferometer was set up to observe the growth rate and surface morphological evolution. An interference pattern (interferogram) is formed by the interaction between the reflection of a HeNe laser (632.8 nm wavelength) off the crystal-vapor interface and a reference beam from the same laser. Preliminary results indicate that the rate of growth/thermal-etching can be calculated using analog data acquisition and simple fringe counting techniques. Gross surface features may also be observed using a digital frame grabber and fringe analysis software. The second in situ technique uses optical absorption to determine the partial pressures of the vapor species. The Se2 and Zn vapor species present in the sealed ampoule absorb light at characteristic wavelengths. The optical absorption is determined by monitoring the light intensity difference between the sample and reference beams. The Se2 partial pressure profile along the length of the ampoule was estimated from the vibronic absorption peaks at 340.5, 350.8, 361.3 and 379.2 nm using the Beer's law constants established in the calibration runs of pure Se. Finally, because the high-temperature crystal growth furnace contains windows, in situ visual observation of the growing crystal is also possible. The use of these techniques not only permits in situ investigation of high-temperature vapor growth of semiconductors, but also offers the potential for real-time feedback on the growing crystal and allows the possibility of actively controlling the growth process.
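
    The fringe-counting arithmetic is straightforward: at normal incidence, each full fringe corresponds to a λ/2 displacement of the crystal-vapor interface. A minimal sketch (function name and sample numbers are hypothetical):

```python
HENE_WAVELENGTH_NM = 632.8  # HeNe laser line used in the interferometer

def growth_rate_nm_per_s(fringe_count, elapsed_s):
    """Interface displacement rate: each fringe = lambda/2 of travel
    at normal incidence, so rate = N * lambda / (2 * dt)."""
    return fringe_count * HENE_WAVELENGTH_NM / 2.0 / elapsed_s

# Hypothetical: 10 fringes counted over one hour of growth.
rate = growth_rate_nm_per_s(10, 3600.0)  # in nm/s
```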

  13. A Land Manager's Guide to Point Counts of Birds in the Southeast

    Treesearch

    Paul B. Hamel; Winston P. Smith; Daniel J. Twedt; James R. Woehr; Eddie Morris; Robert B. Hamilton; Robert J. Cooper

    1996-01-01

    Current widespread concern for the status of neotropical migratory birds has sparked interest in techniques for inventorying and monitoring populations of these and other birds in southeastern forest habitats. The present guide gives detailed instructions for conducting point counts of birds. It further presents a detailed methodology for the design and conduct of...

  14. Assessment of occupational exposure to asbestos fibers: Contribution of analytical transmission electron microscopy analysis and comparison with phase-contrast microscopy.

    PubMed

    Eypert-Blaison, Céline; Romero-Hariot, Anita; Clerc, Frédéric; Vincent, Raymond

    2018-03-01

    From November 2009 to October 2010, the French general directorate for labor organized a large field-study using analytical transmission electron microscopy (ATEM) to characterize occupational exposure to asbestos fibers during work on asbestos containing materials (ACM). The primary objective of this study was to establish a method and to validate the feasibility of using ATEM for the analysis of airborne asbestos of individual filters sampled in various occupational environments. For each sampling event, ATEM data were compared to those obtained by phase-contrast optical microscopy (PCOM), the WHO-recommended reference technique. A total of 265 results were obtained from 29 construction sites where workers were in contact with ACM. Data were sorted depending on the combination of the ACM type and the removal technique. For each "ACM-removal technique" combination, ATEM data were used to compute statistical indicators on short, fine and WHO asbestos fibers. Moreover, exposure was assessed taking into account the use of respiratory protective devices (RPD). As in previous studies, no simple relationship was found between results by PCOM and ATEM counting methods. Some ACM, such as asbestos-containing plasters, generated very high dust levels, and some techniques generated considerable levels of dust whatever the ACM treated. On the basis of these observations, recommendations were made to measure and control the occupational exposure limit. General prevention measures to be taken during work with ACM are also suggested. Finally, it is necessary to continue acquiring knowledge, in particular regarding RPD and the dust levels measured by ATEM for the activities not evaluated during this study.

  15. Selective photon counter for digital x-ray mammography tomosynthesis

    NASA Astrophysics Data System (ADS)

    Goldan, Amir H.; Karim, Karim S.; Rowlands, J. A.

    2006-03-01

    Photon counting is an emerging detection technique that is promising for mammography tomosynthesis imagers. In photon counting systems, the value of each image pixel is equal to the number of photons that interact with the detector. In this research, we introduce the design and implementation of a novel low-noise selective photon counting pixel for digital mammography tomosynthesis in crystalline silicon CMOS (complementary metal oxide semiconductor) 0.18 micron technology. The design comprises a low-noise charge amplifier (CA), two low-offset-voltage comparators, a decision-making unit (DMU), a mode selector, and a pseudo-random counter. Theoretical calculations and simulation results for the linearity, gain, and noise of the photon counting pixel are presented.

  16. A land manager's guide to point counts of birds in the Southeast

    USGS Publications Warehouse

    Hamel, P.B.; Smith, W.P.; Twedt, D.J.; Woehr, J.R.; Morris, E.; Hamilton, R.B.; Cooper, R.J.

    1996-01-01

    Current widespread concern for the status of neotropical migratory birds has sparked interest in techniques for inventorying and monitoring populations of these and other birds in southeastern forest habitats. The present guide gives detailed instructions for conducting point counts of birds. It further presents a detailed methodology for the design and conduct of inventorial and monitoring surveys based on point counts, including discussion of sample size determination, distribution of counts among habitats, cooperation among neighboring land managers, vegetation sampling, standard data format, and other topics. Appendices provide additional information, making this guide a stand-alone text for managers interested in developing inventories of bird populations on their lands.

  17. Real-time bacterial microcolony counting using on-chip microscopy

    NASA Astrophysics Data System (ADS)

    Jung, Jae Hee; Lee, Jung Eun

    2016-02-01

    Observing microbial colonies is the standard method for determining the microbe titer and investigating the behaviors of microbes. Here, we report an automated, real-time bacterial microcolony-counting system implemented on a wide field-of-view (FOV), on-chip microscopy platform, termed ePetri. Using sub-pixel sweeping microscopy (SPSM) with a super-resolution algorithm, this system offers the ability to dynamically track individual bacterial microcolonies over a wide FOV of 5.7 mm × 4.3 mm without requiring a moving stage or lens. As a demonstration, we obtained high-resolution time-series images of S. epidermidis at 20-min intervals. We implemented an image-processing algorithm to analyze the spatiotemporal distribution of microcolonies, the development of which could be observed from a single bacterial cell. Test bacterial colonies with a minimum diameter of 20 μm could be enumerated within 6 h. We showed that our approach not only provides results that are comparable to conventional colony-counting assays but also can be used to monitor the dynamics of colony formation and growth. This microcolony-counting system using on-chip microscopy represents a new platform that substantially reduces the detection time for bacterial colony counting. It uses chip-scale image acquisition and is a simple and compact solution for the automation of colony-counting assays and microbe behavior analysis with applications in antibacterial drug discovery.
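
    The enumeration step of such a pipeline reduces to counting connected components in a thresholded (binary) image. A minimal pure-Python sketch of that step (the paper's actual algorithm operates on super-resolved ePetri frames and is not given in the abstract):

```python
def count_colonies(mask):
    """Count connected foreground regions (4-connectivity) in a binary
    image, each region standing in for one microcolony."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]

    def flood(r, c):
        # Iterative flood fill marking one connected component as seen.
        stack = [(r, c)]
        while stack:
            y, x = stack.pop()
            if 0 <= y < rows and 0 <= x < cols and mask[y][x] and not seen[y][x]:
                seen[y][x] = True
                stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]

    colonies = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                colonies += 1
                flood(r, c)
    return colonies

# Toy binary frame with two separate "colonies".
frame = [
    [1, 1, 0, 0],
    [1, 0, 0, 1],
    [0, 0, 1, 1],
]
n = count_colonies(frame)
```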

  18. Real-time bacterial microcolony counting using on-chip microscopy

    PubMed Central

    Jung, Jae Hee; Lee, Jung Eun

    2016-01-01

    Observing microbial colonies is the standard method for determining the microbe titer and investigating the behaviors of microbes. Here, we report an automated, real-time bacterial microcolony-counting system implemented on a wide field-of-view (FOV), on-chip microscopy platform, termed ePetri. Using sub-pixel sweeping microscopy (SPSM) with a super-resolution algorithm, this system offers the ability to dynamically track individual bacterial microcolonies over a wide FOV of 5.7 mm × 4.3 mm without requiring a moving stage or lens. As a demonstration, we obtained high-resolution time-series images of S. epidermidis at 20-min intervals. We implemented an image-processing algorithm to analyze the spatiotemporal distribution of microcolonies, the development of which could be observed from a single bacterial cell. Test bacterial colonies with a minimum diameter of 20 μm could be enumerated within 6 h. We showed that our approach not only provides results that are comparable to conventional colony-counting assays but also can be used to monitor the dynamics of colony formation and growth. This microcolony-counting system using on-chip microscopy represents a new platform that substantially reduces the detection time for bacterial colony counting. It uses chip-scale image acquisition and is a simple and compact solution for the automation of colony-counting assays and microbe behavior analysis with applications in antibacterial drug discovery. PMID:26902822
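    The core counting step in such a system reduces to thresholding each frame and enumerating connected bright regions. The sketch below shows that idea in plain Python: a generic flood-fill labeller, not the ePetri/SPSM pipeline itself, with an arbitrary toy frame and threshold.

```python
from collections import deque

def count_colonies(img, threshold):
    """Count connected bright regions (4-connectivity) in a 2-D
    intensity grid -- a generic stand-in for microcolony enumeration."""
    rows, cols = len(img), len(img[0])
    seen = [[False] * cols for _ in range(rows)]
    count = 0
    for r in range(rows):
        for c in range(cols):
            if img[r][c] >= threshold and not seen[r][c]:
                count += 1                      # new colony found
                q = deque([(r, c)])
                seen[r][c] = True
                while q:                        # flood-fill its extent
                    y, x = q.popleft()
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and not seen[ny][nx]
                                and img[ny][nx] >= threshold):
                            seen[ny][nx] = True
                            q.append((ny, nx))
    return count

frame = [
    [0, 9, 9, 0, 0],
    [0, 9, 0, 0, 8],
    [0, 0, 0, 0, 8],
]
print(count_colonies(frame, threshold=5))  # -> 2
```

    Tracking colonies over time, as the paper does, would repeat this per frame and match labels between frames.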

  19. Restoration of labral anatomy and biomechanics after superior labral anterior-posterior repair: comparison of mattress versus simple technique.

    PubMed

    Boddula, Madhav R; Adamson, Gregory J; Gupta, Akash; McGarry, Michelle H; Lee, Thay Q

    2012-04-01

    Both simple and mattress repair techniques have been utilized with success for type II superior labral anterior-posterior (SLAP) lesions; however, direct anatomic and biomechanical comparisons of these techniques have yet to be clearly demonstrated. For type II SLAP lesions, the mattress suture repair technique will result in greater labral height and better position on the glenoid face and exhibit stronger biomechanical characteristics, when cyclically loaded and loaded to failure through the biceps, compared with the simple suture repair technique. Controlled laboratory study. Six matched pairs of cadaveric shoulders were dissected, and a clock face was created on the glenoid from 9 o'clock (posterior) to 3 o'clock (anterior). For the intact specimen, labral height and labral distance from the glenoid edge were measured using a MicroScribe. A SLAP lesion was then created from 10 o'clock to 2 o'clock. Lesions were repaired with two 3.0-mm BioSuture-Tak anchors placed at 11 o'clock and 1 o'clock. For each pair, a mattress repair was used for one shoulder, and a simple repair was used for the contralateral shoulder. After repair, labral height and labral distance from the glenoid edge were again measured. The specimens were then cyclically loaded and loaded to failure through the biceps using an Instron machine. A paired t test was used for statistical analysis. After mattress repair, a significant increase in labral height occurred compared with intact from 2.5 ± 0.3 mm to 4.3 ± 0.3 mm at 11 o'clock (P = .013), 2.7 ± 0.5 mm to 4.2 ± 0.7 mm at 12:30 o'clock (P = .007), 3.1 ± 0.5 mm to 4.2 ± 0.7 mm at 1 o'clock (P = .006), and 2.8 ± 0.7 mm to 3.7 ± 0.8 mm at 1:30 o'clock (P = .037). There was no significant difference in labral height between the intact condition and after simple repair at any clock face position. 
Labral height was significantly increased in the mattress repairs compared with simple repairs at 11 o'clock (mean difference, 2.0 mm; P = .008) and 12:30 o'clock (mean difference, 1.3 mm; P = .044). Labral distance from the glenoid edge was not significantly different between techniques. No difference was observed between the mattress and simple repair techniques for all biomechanical parameters, except the simple technique had a higher load and energy absorbed at 2-mm displacement. The mattress technique created a greater labral height while maintaining similar biomechanical characteristics compared with the simple repair, with the exception of load and energy absorbed at 2-mm displacement, which was increased for the simple technique. Mattress repair for type II SLAP lesions creates a higher labral bumper compared with simple repairs, while both techniques resulted in similar biomechanical characteristics.

  20. Dermatoglyphics: A Diagnostic Aid?

    PubMed Central

    Fuller, I. C.

    1973-01-01

    Dermatoglyphics of patients suffering from diabetes, schizophrenia, duodenal ulcer, asthma, and various cancers have been contrasted and significant differences in the digital ridge counts, maximum atd angles, and distal palmar loop ridge counts have been found. A discriminant analysis of the digital ridge counts was performed and the function was used to attempt differential diagnosis between these conditions on dermatoglyphic evidence alone. This diagnostic trial failed, and possible reasons for its failure are discussed. Attention is drawn to the possibility that prognostic implications of dermatoglyphics might be relevant to screening techniques. PMID:4714584

  1. Submillimeter Galaxy Number Counts and Magnification by Galaxy Clusters

    NASA Astrophysics Data System (ADS)

    Lima, Marcos; Jain, Bhuvnesh; Devlin, Mark; Aguirre, James

    2010-07-01

    We present an analytical model that reproduces measured galaxy number counts from surveys in the wavelength range of 500 μm-2 mm. The model involves a single high-redshift galaxy population with a Schechter luminosity function that has been gravitationally lensed by galaxy clusters in the mass range 10^13-10^15 M_sun. This simple model reproduces both the low-flux and the high-flux end of the number counts reported by the BLAST, SCUBA, AzTEC, and South Pole Telescope (SPT) surveys. In particular, our model accounts for the most luminous galaxies detected by SPT as the result of high magnifications by galaxy clusters (magnification factors of 10-30). This interpretation implies that submillimeter (submm) and millimeter surveys of this population may prove to be a useful addition to ongoing cluster detection surveys. The model also implies that the bulk of submm galaxies detected at wavelengths larger than 500 μm lie at redshifts greater than 2.
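    The effect of magnification on counts can be illustrated numerically. The sketch below uses a Schechter-like flux distribution with illustrative parameters (`s_star`, `alpha` are made up, not the paper's fitted values) and the standard lensing transformation of differential counts, n_obs(S) = n(S/μ)/μ² (flux boost plus solid-angle dilution).

```python
import math

def dnds(s, s_star=10.0, alpha=-1.8):
    """Differential counts for a Schechter-like flux distribution
    (illustrative parameters, not the paper's fit)."""
    x = s / s_star
    return x ** alpha * math.exp(-x) / s_star

def lensed_cumulative(s_min, mu=1.0, s_max=2000.0, n=40000):
    """N(> s_min) when every source is magnified by mu, via the
    transformed differential counts n_obs(S) = n(S/mu) / mu**2,
    integrated with the midpoint rule."""
    ds = (s_max - s_min) / n
    total = 0.0
    for i in range(n):
        s = s_min + (i + 0.5) * ds
        total += dnds(s / mu) / mu ** 2 * ds
    return total

# Beyond the exponential cutoff, magnification boosts the bright counts:
bright_unlensed = lensed_cumulative(50.0, mu=1.0)
bright_lensed = lensed_cumulative(50.0, mu=10.0)
```

    This is the mechanism behind the paper's interpretation of the brightest SPT sources: a magnification of 10-30 moves intrinsically sub-cutoff sources past the exponential break.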

  2. The normalization of solar X-ray data from many experiments.

    NASA Technical Reports Server (NTRS)

    Wende, C. D.

    1972-01-01

    A conversion factor is used to convert Geiger (GM) tube count rates or ion chamber currents into units of the incident X-ray energy flux in a specified passband. A method is described which varies the passband to optimize these conversion factors such that they are relatively independent of the spectrum of the incident photons. This method was applied to GM tubes flown on Explorers 33 and 35 and Mariner 5 and to ion chambers flown on OSO 3 and OGO 4. Revised conversion factors and passbands are presented, and the resulting absolute solar X-ray fluxes based on these are shown to improve the agreement between the various experiments. Calculations have shown that, although the GM tubes on Explorer 33 viewed the Sun off-axis, the effective passband did not change appreciably, and the simple normalization of the count rates to the count rates of a similar GM tube on Explorer 35 was justified.

  3. Constraint counting for frictional jamming

    NASA Astrophysics Data System (ADS)

    Quint, D. A.; Henkes, S.; Schwarz, J. M.

    2012-02-01

    While the frictionless jamming transition has been intensely studied in recent years, more realistic frictional packings are less well understood. In frictionless sphere packings, the transition is predicted by a simple mean-field constraint counting argument, the isostaticity argument. For frictional packings, a modified constraint counting argument, which includes slipping contacts at the Coulomb threshold, has had limited success in accounting for the transition. We propose that the frictional jamming transition is not mean field and is triggered by the nucleation of unstable regions, which are themselves dynamical objects due to the Coulomb criterion. We create frictional packings using MD simulations and test for the presence and shape of rigid clusters with the pebble game to identify the partition of the packing into stable and unstable regions. To understand the dynamics of these unstable regions we follow perturbations at contacts crucial to the stability of the "frictional house of cards."
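    The mean-field counting argument itself is simple bookkeeping: compare force constraints against degrees of freedom, with contacts slipping at the Coulomb threshold contributing only their normal component. A minimal sketch, assuming d translational plus d(d-1)/2 rotational degrees of freedom per particle (a textbook rendition, not the paper's pebble-game analysis):

```python
def excess_constraints(n_particles, n_contacts, n_slipping, d=2):
    """Generalized Maxwell count for a frictional packing in d dimensions.
    A sticking contact constrains d relative motions (normal + tangential);
    a contact slipping at the Coulomb threshold constrains only 1, since
    its tangential force is pinned at mu * normal.  Zero excess marks the
    isostatic point; negative excess implies floppy modes."""
    dof = (d + d * (d - 1) // 2) * n_particles   # translations + rotations
    constraints = d * (n_contacts - n_slipping) + n_slipping
    return constraints - dof

# For frictional disks (d = 2) the isostatic coordination is z = 3:
print(excess_constraints(100, 150, 0))    # -> 0 (isostatic)
print(excess_constraints(100, 150, 10))   # -> -10 (slipping destabilizes)
```

    The paper's point is precisely that this global count misses spatially localized unstable regions.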

  4. A simple-rapid method to separate uranium, thorium, and protactinium for U-series age-dating of materials.

    PubMed

    Knight, Andrew W; Eitrheim, Eric S; Nelson, Andrew W; Nelson, Steven; Schultz, Michael K

    2014-08-01

    Uranium-series dating techniques require the isolation of radionuclides in high yields and in fractions free of impurities. Within this context, we describe a novel, rapid method for the separation and purification of U, Th, and Pa. The method takes advantage of differences in the chemistry of U, Th, and Pa, utilizing a commercially available extraction chromatographic resin (TEVA) and standard reagents. The elution behavior of U, Th, and Pa was optimized using liquid scintillation counting techniques, and fractional purity was evaluated by alpha-spectrometry. The overall method was further assessed by isotope dilution alpha-spectrometry for the preliminary age determination of an ancient carbonate sample obtained from the Lake Bonneville site in western Utah (United States). Preliminary evaluations of the method produced elemental purity of greater than 99.99% and radiochemical recoveries exceeding 90% for U and Th and 85% for Pa. Excellent purity and yields (76% for U, 96% for Th, and 55% for Pa) were also obtained for the analysis of the carbonate samples, and the preliminary Pa and Th ages of about 39,000 years before present are consistent with the 14C-derived age of the material. Copyright © 2014 Elsevier Ltd. All rights reserved.
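    For context, the age itself follows from the ingrowth of 230Th toward equilibrium with 238U. A minimal sketch of the closed-system age equation, assuming no initial 230Th and 234U/238U in secular equilibrium (the half-life value used is approximate, and this is a simplification of the full U-series age calculation):

```python
import math

TH230_HALF_LIFE_YR = 75_584.0   # approximate; exact value depends on calibration
LAMBDA_230 = math.log(2) / TH230_HALF_LIFE_YR

def th_u_age(activity_ratio):
    """Closed-system 230Th/238U age, assuming initial 230Th = 0 and
    234U in secular equilibrium with 238U:  R(t) = 1 - exp(-lambda * t)."""
    if not 0.0 <= activity_ratio < 1.0:
        raise ValueError("activity ratio must be in [0, 1) for a finite age")
    return -math.log(1.0 - activity_ratio) / LAMBDA_230

# An activity ratio near 0.30 gives an age of roughly 39 kyr,
# the ballpark reported for the Lake Bonneville carbonate.
age = th_u_age(0.30)
```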

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pomorski, Michal; Mer-Calfati, Christine; Foulon, Francois

    Diamond exhibits a combination of properties which makes it attractive for neutron detection in hostile conditions. In the particular case of detection in a nuclear reactor, it is resilient to radiation, exhibits a naturally low sensitivity to gamma rays, and its small size (compared with that of gas ionisation chambers) enables fluence monitoring with a high position resolution. We report here on the use of synthetic CVD diamond as a solid-state micro-fission chamber with U-235 converting material for in-core thermal neutron monitoring. Two types of thin diamond detectors were developed for this application. The first type of detector is fabricated using a thin diamond membrane obtained by etching low-cost, commercially available single crystal CVD intrinsic diamond, so-called 'optical grade' material. Starting from samples a few hundred micrometres thick, the sample is sliced with a laser and then plasma etched down to a few tens of micrometres. Here we report the results obtained with a 17 μm thick device. The detection surface of this detector is 1 mm². Detectors with surfaces up to 1 cm² can be fabricated with this technique. The second type of detector is fabricated by successively growing two thin films of diamond, by the microwave-enhanced chemical vapour deposition technique, on HPHT single crystal diamond. First, a film of boron-doped (p+) single crystal diamond, a few microns thick, is deposited. Then a second film of intrinsic diamond with a thickness of a few tens of microns is deposited. This results in a P-doped, Intrinsic, Metal (PIM) structure in which the intrinsic volume is the active part of the detector. Here we report the results obtained with a 20 μm thick intrinsic layer whose detection surface is 0.5 mm², with the possibility to enlarge the surface of the detector up to 1 cm². 
    These two types of detector were tested at the VR-1 research reactor at the Czech Technical University in Prague. The Training Reactor VR-1 is a pool-type (light water) reactor based on UO₂ low-enriched uranium. It has a nominal power of 1 kW and can be operated for short periods at up to 5 kW. The arrangement of the reactor pool facilitates access to the core, insertion and removal of various experimental samples and detectors, and safe and easy handling of fuel assemblies. The reactor is equipped with two horizontal channels (radial and tangential) and 10 vertical channels of varying diameters, which can be loaded into various core positions, and one pneumatic transfer system. It is also equipped with several specifically designed educational instrumentation systems that can be used to supply complementary measurements and characterization around the reactor. The reactor is operated by the Department of Nuclear Reactors of the Faculty of Nuclear Sciences and Physical Engineering of the Czech Technical University in Prague. The two detectors were placed in-core through one of the vertical insertion channels. They were coupled, via 5 m BNC cables, to remotely placed classical nuclear charge-sensitive electronics. Detection properties of both sensors, including pulse height spectra of U-235 fission fragments, response linearity with neutron flux, count rate, and gamma background, were evaluated while varying the power of the reactor from 0.005 W to 500 W. The evolution of the counting rate of the thinned optical-grade detector as a function of the counting rate of a gas ionization chamber currently used for reactor monitoring shows very good linearity of the detector over five decades. Similar results were obtained with the PIM detector. 
    Additionally, fast transient current signals of the detectors were recorded on a digital storage oscilloscope (DSO) using a broad-band amplifier and a simple bias-T, showing the potential use of such sensors for neutron counting with no need for an amplification stage, since non-amplified signals from fission fragments exceeded 4 mV in amplitude. One can therefore envisage a simple neutron counting system that feeds the diamond detector signals directly to low-threshold discriminators. The results obtained at VR-1 are described and discussed in detail in the paper and associated presentation. They demonstrate that diamond micro-fission chambers can be used for in-core neutron monitoring, where robust, simple, and compact devices are required.

  6. Why did we elaborate an entangled photons experiment in our engineering school?

    NASA Astrophysics Data System (ADS)

    Jacubowiez, Lionel; Avignon, Thierry

    2005-10-01

    We will describe a simple experimental setup that allows students to create polarization-entangled photon pairs. These photon pairs are in an entangled state first described in the famous 1935 Physical Review article by Einstein, Podolsky, and Rosen, often called the E.P.R. state. Photon pairs at 810 nm are produced in two nonlinear crystals by spontaneous parametric downconversion of photons at 405 nm emitted by a violet laser diode. The polarization state of the photon pairs is easily tunable with a half-wave plate and a Babinet compensator on the laser diode beam. After having adjusted the polarization-entangled state of the photon pairs, our students can perform a test of Bell's inequalities. They will find the amazing value for the Bell parameter of between 2.3 and 2.6, depending on the quality of the adjustment of the state of polarization. The experiments described can be done in 4 or 5 hours. What is the importance of an entangled-photon experiment for our engineering students? First of all, the entanglement concept is clearly one of the most strikingly nonclassical features of quantum theory, and it is playing an increasing role in present-day physics. But in this paper we will emphasise the experimental point of view. We will try to explain why we believe that for our students this lab experiment is a unique opportunity to deal with established concepts and experimental techniques: polarization, nonlinear effects, phase matching, photon-counting avalanche photodiodes, counting statistics, and coincidence detectors. Let us recall that the first convincing experimental violations of Bell's inequalities were performed by Alain Aspect and Philippe Grangier with pairs of entangled photons at the Institut d'Optique between 1976 and 1982. Twenty-five years later, thanks to recent advances in laser diode technology, new techniques for the generation of photon pairs, and avalanche photodiodes, this experiment is now part of the experimental lab courses for our students.
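    The quantity the students measure is the CHSH combination of polarization correlations. For an ideal EPR pair the quantum prediction is E(a, b) = cos 2(a - b), which at the canonical angle settings gives S = 2√2 ≈ 2.83; experimental imperfections pull measured values down to the 2.3-2.6 range quoted above. A minimal sketch of the ideal-case arithmetic:

```python
import math

def E(a, b):
    """Ideal quantum polarization correlation for an EPR pair,
    with analyzer angles a, b in degrees."""
    return math.cos(2 * math.radians(a - b))

def chsh(a, a2, b, b2):
    """CHSH combination S; local realism bounds |S| <= 2."""
    return E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2)

s = chsh(0, 45, 22.5, 67.5)
print(round(s, 3))   # -> 2.828, i.e. 2*sqrt(2), violating the classical bound
```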

  7. Optimizations for the EcoPod field identification tool

    PubMed Central

    Manoharan, Aswath; Stamberger, Jeannie; Yu, YuanYuan; Paepcke, Andreas

    2008-01-01

    Background We sketch our species identification tool for palm-sized computers that helps knowledgeable observers with census activities. An algorithm turns an identification matrix into a minimal-length series of questions that guide the operator towards identification. Historic observation data from the census geographic area helps minimize question volume. We explore how much historic data is required to boost performance, and whether the use of history negatively impacts identification of rare species. We also explore how characteristics of the matrix interact with the algorithm, and how best to predict the probability of observing a previously unseen species. Results Point counts of birds taken at Stanford University's Jasper Ridge Biological Preserve between 2000 and 2005 were used to examine the algorithm. A computer identified species by correctly answering the algorithm's questions and counting them. We also explored how the character density of the key matrix and the theoretical minimum number of questions for each bird in the matrix influenced the algorithm. Our investigation of the required probability smoothing determined whether Laplace smoothing of observation probabilities was sufficient, or whether the more complex Good-Turing technique is required. Conclusion Historic data improved identification speed, but only impacted the top 25% most frequently observed birds. For rare birds the history-based algorithms did not impose a noticeable penalty in the number of questions required for identification. For our dataset, neither the age of the historic data nor the number of observation years impacted the algorithm. Density of characters for different taxa in the identification matrix did not impact the algorithms. Intrinsic differences in identifying different birds did affect the algorithm, but the differences affected the baseline method of not using historic data to exactly the same degree. 
We found that Laplace smoothing performed better for rare species than Simple Good-Turing and that, contrary to expectation, the technique did not adversely affect identification performance for frequently observed birds. PMID:18366649
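    Laplace (add-alpha) smoothing, the simpler of the two techniques compared, just adds a pseudo-count to every species so that unobserved ones retain nonzero probability. A minimal sketch with hypothetical species names:

```python
from collections import Counter

def laplace_probs(observations, vocabulary, alpha=1.0):
    """Add-alpha (Laplace) smoothing: every species receives a
    pseudo-count of alpha, so species absent from the observation
    history still get nonzero probability."""
    counts = Counter(observations)
    total = len(observations) + alpha * len(vocabulary)
    return {sp: (counts[sp] + alpha) / total for sp in vocabulary}

history = ["towhee", "towhee", "junco"]            # hypothetical sightings
p = laplace_probs(history, ["towhee", "junco", "osprey"])
# the never-observed osprey still gets probability 1/6
```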

  8. Single Spore Isolation as a Simple and Efficient Technique to obtain fungal pure culture

    NASA Astrophysics Data System (ADS)

    Noman, E.; Al-Gheethi, AA; Rahman, N. K.; Talip, B.; Mohamed, R.; H, N.; Kadir, O. A.

    2018-04-01

    The successful identification of fungi by phenotypic or molecular methods depends mainly on using an effective technique for purifying the isolates. The most efficient is the single spore technique, owing to its simple requirements and its efficiency in preventing contamination by yeast, mites, or bacteria. The method described in the present work depends on using a light microscope to transfer one spore onto a new culture medium. The present work describes a simple and efficient single spore isolation procedure for purifying fungi recovered from clinical wastes.

  9. Automated Video-Based Traffic Count Analysis.

    DOT National Transportation Integrated Search

    2016-01-01

    The goal of this effort has been to develop techniques that could be applied to the : detection and tracking of vehicles in overhead footage of intersections. To that end we : have developed and published techniques for vehicle tracking based on dete...

  10. Intraosseous repair of the inferior alveolar nerve in rats: an experimental model.

    PubMed

    Curtis, N J; Trickett, R I; Owen, E; Lanzetta, M

    1998-08-01

    A reliable method of exposure of the inferior alveolar nerve in Wistar rats has been developed, to allow intraosseous repair with two microsurgical techniques under halothane inhalational anaesthesia. The microsuturing technique involves anastomosis with 10-0 nylon sutures; the laser-weld technique uses an albumin-based solder containing indocyanine green, plus an infrared (810 nm wavelength) diode laser. Seven animals had left inferior alveolar nerve repairs performed with the microsuture and laser-weld techniques. Controls were provided by the unoperated nerves in the repaired cases. Histochemical analysis was performed utilizing neuron counts and horseradish peroxidase (HRP) tracer uptake in the mandibular division of the trigeminal ganglion, following sacrifice and staining of frozen sections with cresyl violet and diaminobenzidine. The results of this analysis showed similar mean neuron counts and mean HRP uptake by neurons for the unoperated controls and both the microsuture and laser-weld groups. This new technique of intraosseous exposure of the inferior alveolar nerve in rats allows reliable and reproducible microsurgical repairs using both microsuture and laser-weld techniques.

  11. Simple Technique for Dark-Field Photography of Immunodiffusion Bands

    PubMed Central

    Jensh, Ronald P.; Brent, Robert L.

    1969-01-01

    A simple dark-field photographic technique was developed which enables laboratory personnel with minimal photographic training to easily record antigen-antibody patterns on immunodiffusion plates. Images PMID:4979944

  12. A technique for automatically extracting useful field of view and central field of view images.

    PubMed

    Pandey, Anil Kumar; Sharma, Param Dev; Aheer, Deepak; Kumar, Jay Prakash; Sharma, Sanjay Kumar; Patel, Chetan; Kumar, Rakesh; Bal, Chandra Sekhar

    2016-01-01

    It is essential to ensure the uniform response of a single photon emission computed tomography gamma camera system before using it for clinical studies by exposing it to a uniform flood source. Vendor-specific acquisition and processing protocols provide for studying flood source images along with quantitative uniformity parameters such as integral and differential uniformity. A significant difficulty, however, is that the time required to acquire a flood source image varies from 10 to 35 min, depending both on the activity of the Cobalt-57 flood source and on the counts prespecified in the vendor's protocol (usually 4000K-10,000K counts). If the acquired total counts are less than the prespecified total, the vendor's uniformity processing protocol does not proceed with the computation of the quantitative uniformity parameters. In this study, we have developed and verified a technique for reading the flood source image, removing unwanted information, and automatically extracting and saving the useful field of view and central field of view images for the calculation of the uniformity parameters. This was implemented using MATLAB R2013b running on the Ubuntu operating system and was verified by subjecting it to simulated and real flood source images. The accuracy of the technique was found to be encouraging, especially in view of practical difficulties with vendor-specific protocols. It may be used as a preprocessing step while calculating uniformity parameters of the gamma camera in less time and with fewer constraints.
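    Once the UFOV and CFOV images are extracted, the uniformity figures are simple pixel statistics. A sketch of NEMA-style integral uniformity and a crude central-crop stand-in for CFOV extraction (the 75% fraction is the conventional choice; this is not the authors' MATLAB code, and a real implementation also rebins and filters the flood image first):

```python
def integral_uniformity(pixels):
    """NEMA-style integral uniformity: 100 * (max - min) / (max + min)
    over the pixel counts of a field of view."""
    hi, lo = max(pixels), min(pixels)
    return 100.0 * (hi - lo) / (hi + lo)

def central_fov(image, fraction=0.75):
    """Crop the central `fraction` of each linear dimension of a 2-D
    count matrix -- a crude stand-in for UFOV -> CFOV extraction."""
    rows, cols = len(image), len(image[0])
    r0 = int(rows * (1 - fraction) / 2)
    c0 = int(cols * (1 - fraction) / 2)
    return [row[c0:cols - c0] for row in image[r0:rows - r0]]

flood = [[100] * 8 for _ in range(8)]
flood[0][0] = 90                       # a cold edge pixel
cfov = central_fov(flood)              # 6 x 6 central region
print(integral_uniformity([p for row in cfov for p in row]))  # -> 0.0
```

    The cold edge pixel inflates the UFOV uniformity but falls outside the CFOV crop, which is exactly why the two fields of view are reported separately.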

  13. Modifications of haematology analyzers to improve cell counting and leukocyte differentiating in cerebrospinal fluid controls of the Joint German Society for Clinical Chemistry and Laboratory Medicine.

    PubMed

    Kleine, Tilmann O; Nebe, C Thomas; Löwer, Christa; Lehmitz, Reinhard; Kruse, Rolf; Geilenkeuser, Wolf-Jochen; Dorn-Beineke, Alexandra

    2009-08-01

    Flow cytometry (FCM) is used with haematology analyzers (HAs) to count cells and differentiate leukocytes in cerebrospinal fluid (CSF). To evaluate the FCM techniques of HAs, 10 external DGKL trials with CSF controls were carried out from 2004 to 2008. Eight single-platform HAs with and without CSF equipment were evaluated with living blood leukocytes and erythrocytes in CSF-like DGKL controls: Coulter (LH750, 755), Abbott CD3200, CD3500, CD3700, CD4000, Sapphire, ADVIA 120(R) CSF assay, and Sysmex XE-2100(R). Results were compared with visual counting of native, unstained cells in a Fuchs-Rosenthal chamber and with absolute values of leukocyte differentiation, assayed by dual-platform analysis with immune FCM (FACSCalibur, CD45, CD14) and the chamber counts. Reference values X were compared with HA values Y by statistical evaluation with Passing/Bablok (P/B) linear regression analysis to reveal conformity of the two methods. The HAs studied produced no valid results with DGKL CSF controls, because P/B regression revealed no conformity with the reference values, owing to blank problems with impedance analysis; leukocyte loss, especially of monocytes, with preanalytical erythrocyte lysis procedures; and inaccurate results with ADVIA cell sphering and cell differentiation with algorithms and enzyme activities (e.g., peroxidase). HA techniques have to be improved, e.g., by avoiding erythrocyte lysis and using CSF-adequate techniques, to examine CSF samples precisely and accurately. Copyright 2009 International Society for Advancement of Cytometry.
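    Passing-Bablok regression, used here to test method conformity, fits the slope as a shifted median of all pairwise slopes, which makes it robust to outliers and free of distributional assumptions. A simplified sketch (it skips ties in x and the confidence bands of the full procedure, which are what actually decide conformity):

```python
from statistics import median

def passing_bablok(x, y):
    """Simplified Passing-Bablok fit.  Slope = shifted median of all
    pairwise slopes (slopes of exactly -1 discarded; median index
    offset by the number of slopes below -1); intercept = median of
    y - slope * x.  Ties in x are skipped and no confidence bands
    are computed, unlike the full procedure."""
    slopes = []
    n = len(x)
    for i in range(n):
        for j in range(i + 1, n):
            if x[j] != x[i]:
                s = (y[j] - y[i]) / (x[j] - x[i])
                if s != -1.0:
                    slopes.append(s)
    slopes.sort()
    k = sum(1 for s in slopes if s < -1.0)      # negative-slope offset
    m = len(slopes)
    idx = (m - 1) // 2 + k
    slope = slopes[idx] if m % 2 else 0.5 * (slopes[idx] + slopes[idx + 1])
    intercept = median(yi - slope * xi for xi, yi in zip(x, y))
    return slope, intercept

# Two hypothetical methods in perfect proportional agreement:
b, a = passing_bablok([1, 2, 3, 4, 5], [2.0, 4.0, 6.0, 8.0, 10.0])
print(b, a)   # -> 2.0 0.0
```

    Conformity of two methods corresponds to a slope near 1 and an intercept near 0; here the slope of 2 would flag a proportional bias.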

  14. Still No 40 Acres, Still No Mule

    ERIC Educational Resources Information Center

    Keels, Crystal L.

    2005-01-01

    The simple mention of reparations for African-Americans in the United States can be counted on to generate a firestorm. When it comes to the issue of recompense for injustices Black Americans have suffered throughout U.S. history--slavery, Jim Crow segregation, and other political and social mechanisms designed to maintain racial inequality--the…

  15. The Future of Humanities Labor

    ERIC Educational Resources Information Center

    Bauerlein, Mark

    2008-01-01

    "Publish or perish" has long been the formula of academic labor at research universities, but for many humanities professors that imperative has decayed into a simple rule of production. The publish-or-perish model assumed a peer-review process that maintained quality, but more and more it is the bare volume of printed words that counts. When…

  16. Identities for Generalized Fibonacci Numbers: A Combinatorial Approach

    ERIC Educational Resources Information Center

    Plaza, A.; Falcon, S.

    2008-01-01

    This note shows a combinatorial approach to some identities for generalized Fibonacci numbers. While it is a straightforward task to prove these identities with induction, and also by arithmetical manipulations such as rearrangements, the approach used here is quite simple to follow and eventually reduces the proof to a counting problem. (Contains…
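    A standard combinatorial model for such identities counts tilings of a 1×n board: with k colours of square and one kind of domino, the tiling counts satisfy the k-Fibonacci recurrence F(n) = k F(n-1) + F(n-2), and identities become statements about counting tilings two ways. A small sketch illustrating the general approach (not the note's specific identities):

```python
def k_fibonacci(k, n):
    """Generalized (k-)Fibonacci numbers: F(0)=0, F(1)=1,
    F(n) = k*F(n-1) + F(n-2)."""
    a, b = 0, 1
    for _ in range(n):
        a, b = b, k * b + a
    return a

def tilings(k, n):
    """Number of tilings of a 1 x n board with k colours of square
    and one kind of domino; satisfies T(n) = k*T(n-1) + T(n-2)."""
    if n == 0:
        return 1
    a, b = 1, k
    for _ in range(n - 1):
        a, b = b, k * b + a
    return b

# The bijection gives T(n) = F(n+1), turning identities into counting:
print(k_fibonacci(1, 10))                 # -> 55 (classical Fibonacci)
print(tilings(2, 5), k_fibonacci(2, 6))   # -> 70 70 (Pell numbers, k = 2)
```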

  17. Linkages Between Library Uses Through the Study of Individual Patron Behavior.

    ERIC Educational Resources Information Center

    Clark, Philip M.; Benson, James

    1985-01-01

    Proposes and investigates feasibility of using three new variables in addition to simple activity counts in measuring reference services: (1) user questioning regardless of where activity occurs; (2) treating individual user and question as unit of analysis; and (3) examination of questioning in context of other library use. (11 references) (EJS)

  18. Form-class volume tables for estimating board-foot content of northern conifers

    Treesearch

    C. Allen Bickford

    1951-01-01

    The timber cruiser counts volume tables among his most important working tools. He wants - if he can get them - tables that are simple, easy to use, and accurate. Before using a volume table in a new situation, the careful cruiser will check it by comparing table volumes with actual volumes.

  19. A pebble count procedure for assessing watershed cumulative effects

    Treesearch

    Gregory S. Bevenger; Rudy M. King

    1995-01-01

    Land management activities can result in the delivery of fine sediment to streams. Over time, such delivery can lead to cumulative impacts to the aquatic ecosystem. Because numerous laws require Federal land managers to analyze watershed cumulative effects, field personnel need simple monitoring procedures that can be used directly and consistently. One approach to...

  20. Counting the Nouns: Simple Structural Cues to Verb Meaning

    ERIC Educational Resources Information Center

    Yuan, Sylvia; Fisher, Cynthia; Snedeker, Jesse

    2012-01-01

    Two-year-olds use the sentence structures verbs appear in--"subcategorization frames"--to guide verb learning. This is syntactic bootstrapping. This study probed the developmental origins of this ability. The structure-mapping account proposes that children begin with a bias toward one-to-one mapping between nouns in sentences and participant…

  1. A loop-counting method for covariate-corrected low-rank biclustering of gene-expression and genome-wide association study data.

    PubMed

    Rangan, Aaditya V; McGrouther, Caroline C; Kelsoe, John; Schork, Nicholas; Stahl, Eli; Zhu, Qian; Krishnan, Arjun; Yao, Vicky; Troyanskaya, Olga; Bilaloglu, Seda; Raghavan, Preeti; Bergen, Sarah; Jureus, Anders; Landen, Mikael

    2018-05-14

    A common goal in data analysis is to sift through a large data matrix and detect any significant submatrices (i.e., biclusters) that have a low numerical rank. We present a simple algorithm for tackling this biclustering problem. Our algorithm accumulates information about 2-by-2 submatrices (i.e., 'loops') within the data matrix, and focuses on rows and columns of the data matrix that participate in an abundance of low-rank loops. We demonstrate, through analysis and numerical experiments, that this loop-counting method performs well in a variety of scenarios, outperforming simple spectral methods in many situations of interest. Another important feature of our method is that it can easily be modified to account for aspects of experimental design which commonly arise in practice. For example, our algorithm can be modified to correct for controls, categorical and continuous covariates, and sparsity within the data. We demonstrate these practical features with two examples: the first drawn from gene-expression analysis and the second drawn from a much larger genome-wide association study (GWAS).
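    The core idea can be illustrated on sign data, where a 2-by-2 loop has rank 1 exactly when the product of its four entries is +1. The toy sketch below scores rows by sampled loop consistency; it is a rendition of the idea only, with made-up data and none of the paper's covariate corrections:

```python
import random

def loop_scores(A, n_samples=4000, seed=0):
    """Score each row of a +1/-1 matrix by the fraction of randomly
    sampled 2-by-2 'loops' it participates in whose entry product is
    +1 (equivalently: the 2-by-2 submatrix has rank 1).  Rows of a
    hidden rank-1 bicluster accumulate systematically higher scores."""
    rng = random.Random(seed)
    n, m = len(A), len(A[0])
    hits, tries = [0] * n, [0] * n
    for _ in range(n_samples):
        i, j = rng.sample(range(n), 2)
        k, l = rng.sample(range(m), 2)
        consistent = A[i][k] * A[i][l] * A[j][k] * A[j][l] == 1
        for r in (i, j):
            tries[r] += 1
            hits[r] += consistent
    return [h / t if t else 0.0 for h, t in zip(hits, tries)]

# Toy data: rows 0-3 follow a rank-1 sign pattern s_i * v_k; rows 4-7 are noise.
rng = random.Random(1)
v = [1, -1] * 6
A = [[si * vk for vk in v] for si in (1, -1, 1, -1)]
A += [[rng.choice((-1, 1)) for _ in range(12)] for _ in range(4)]
scores = loop_scores(A)
```

    Loops drawn entirely from the rank-1 rows are always consistent, while loops touching noise are consistent only about half the time, so thresholding the scores recovers the planted rows.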

  2. An evaluation of population index and estimation techniques for tadpoles in desert pools

    USGS Publications Warehouse

    Jung, Robin E.; Dayton, Gage H.; Williamson, Stephen J.; Sauer, John R.; Droege, Sam

    2002-01-01

    Using visual (VI) and dip net indices (DI) and double-observer (DOE), removal (RE), and neutral red dye capture-recapture (CRE) estimates, we counted, estimated, and censused Couch's spadefoot (Scaphiopus couchii) and canyon treefrog (Hyla arenicolor) tadpole populations in Big Bend National Park, Texas. Initial dye experiments helped us determine appropriate dye concentrations and exposure times to use in mesocosm and field trials. The mesocosm study revealed higher tadpole detection rates, more accurate population estimates, and lower coefficients of variation among pools compared to those from the field study. In both mesocosm and field studies, CRE was the best method for estimating tadpole populations, followed by DOE and RE. In the field, RE, DI, and VI often underestimated populations in pools with higher tadpole numbers. DI improved with increased sampling. Larger pools supported larger tadpole populations, and tadpole detection rates in general decreased with increasing pool volume and surface area. Hence, pool size influenced bias in tadpole sampling. Across all techniques, tadpole detection rates differed among pools, indicating that sampling bias was inherent and techniques did not consistently sample the same proportion of tadpoles in each pool. Estimating bias (i.e., calculating detection rates) therefore was essential in assessing tadpole abundance. Unlike VI and DOE, DI, RE, and CRE could be used in turbid waters in which tadpoles are not visible. The tadpole population estimates we used accommodated differences in detection probabilities in simple desert pool environments but may not work in more complex habitats.
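    For reference, dye-based capture-recapture estimates of this kind are typically built on a Lincoln-Petersen-type estimator; Chapman's bias-corrected form is a common choice. The sketch below is a generic version with hypothetical numbers, not necessarily the authors' exact formula:

```python
def chapman_estimate(marked, caught, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen estimator for a closed
    population: mark M animals (e.g. dye-stained tadpoles), later catch
    C, and find R marked among them."""
    return (marked + 1) * (caught + 1) / (recaptured + 1) - 1

# Hypothetical numbers: 50 dyed tadpoles, 60 caught later, 20 recaptures.
print(round(chapman_estimate(50, 60, 20), 1))   # -> 147.1
```

    The estimator assumes a closed population and equal catchability of marked and unmarked animals, which is why the dye concentration and exposure trials mattered.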

  3. Deconvolution of astronomical images using SOR with adaptive relaxation.

    PubMed

    Vorontsov, S V; Strakhov, V N; Jefferies, S M; Borelli, K J

    2011-07-04

    We address the potential performance of the successive overrelaxation technique (SOR) in image deconvolution, focusing our attention on the restoration of astronomical images distorted by atmospheric turbulence. SOR is the classical Gauss-Seidel iteration, supplemented with relaxation. As indicated by earlier work, the convergence properties of SOR, and its ultimate performance in the deconvolution of blurred and noisy images, can be made competitive with other iterative techniques, including conjugate gradients, by a proper choice of the relaxation parameter. The question of how to choose the relaxation parameter, however, remained open, and in practical work one had to rely on experimentation. In this paper, using constructive (rather than exact) arguments, we suggest a simple strategy for choosing the relaxation parameter and for updating its value in consecutive iterations to optimize the performance of the SOR algorithm (and its positivity-constrained version, +SOR) at finite iteration counts. We suggest an extension of the algorithm to the notoriously difficult problem of "blind" deconvolution, where both the true object and the point-spread function have to be recovered from the blurred image. We report the results of numerical inversions with artificial and real data, where the algorithm is compared with techniques based on conjugate gradients. In all of our experiments +SOR provides the highest quality results. In addition +SOR is found to be able to detect moderately small changes in the true object between separate data frames: an important quality for multi-frame blind deconvolution where stationarity of the object is a necessity.
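    The core SOR update can be illustrated on a plain linear system (a generic sketch, not the paper's deconvolution code; the fixed relaxation parameter here stands in for the adaptive choice the authors propose):

```python
import numpy as np

def sor_solve(A, b, omega=1.5, iters=200, x0=None):
    """Successive over-relaxation: Gauss-Seidel sweeps blended with the
    previous iterate via the relaxation parameter omega (0 < omega < 2)."""
    n = len(b)
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(iters):
        for i in range(n):
            # Use already-updated components x[:i] (Gauss-Seidel style)
            sigma = A[i, :i] @ x[:i] + A[i, i + 1:] @ x[i + 1:]
            x[i] = (1 - omega) * x[i] + omega * (b[i] - sigma) / A[i, i]
    return x
```

With omega = 1 this reduces to plain Gauss-Seidel; for symmetric positive definite systems the iteration converges for any omega in (0, 2).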

  4. Transition Icons for Time-Series Visualization and Exploratory Analysis.

    PubMed

    Nickerson, Paul V; Baharloo, Raheleh; Wanigatunga, Amal A; Manini, Todd M; Tighe, Patrick J; Rashidi, Parisa

    2018-03-01

    The modern healthcare landscape has seen the rapid emergence of techniques and devices that temporally monitor and record physiological signals. The prevalence of time-series data within the healthcare field necessitates the development of methods that can analyze the data in order to draw meaningful conclusions. Time-series behavior is notoriously difficult to intuitively understand due to its intrinsic high-dimensionality, which is compounded in the case of analyzing groups of time series collected from different patients. Our framework, which we call transition icons, renders common patterns in a visual format useful for understanding the shared behavior within groups of time series. Transition icons are adept at detecting and displaying subtle differences and similarities, e.g., between measurements taken from patients receiving different treatment strategies or stratified by demographics. We introduce various methods that collectively allow for exploratory analysis of groups of time series, while being free of distribution assumptions and including simple heuristics for parameter determination. Our technique extracts discrete transition patterns from symbolic aggregate approXimation representations, and compiles transition frequencies into a bag of patterns constructed for each group. These transition frequencies are normalized and aligned in icon form to intuitively display the underlying patterns. We demonstrate the transition icon technique for two time-series datasets-postoperative pain scores, and hip-worn accelerometer activity counts. We believe transition icons can be an important tool for researchers approaching time-series data, as they give rich and intuitive information about collective time-series behaviors.
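    The pipeline described, a symbolic aggregate approximation (SAX) of the series followed by counting symbol-to-symbol transitions, can be sketched as follows; the alphabet size, segment count, and normalization details are assumptions, not the authors' exact settings:

```python
import numpy as np

# Breakpoints for a 4-symbol alphabet under a standard normal assumption
BREAKPOINTS = np.array([-0.6745, 0.0, 0.6745])

def transition_matrix(series, n_segments=16):
    """Bag of symbol-to-symbol transition frequencies from a SAX-style
    discretization: z-normalize, piecewise-aggregate, then bin into symbols."""
    x = np.asarray(series, dtype=float)
    x = (x - x.mean()) / (x.std() + 1e-12)      # z-normalize
    segs = np.array_split(x, n_segments)
    paa = np.array([s.mean() for s in segs])    # piecewise aggregate approximation
    symbols = np.digitize(paa, BREAKPOINTS)     # symbol indices 0..3
    T = np.zeros((4, 4))
    for a, b in zip(symbols[:-1], symbols[1:]):
        T[a, b] += 1
    return T / T.sum()                          # normalized transition frequencies
```

Averaging such matrices over a group of patients, and rendering the frequencies as glyphs, is the essence of the icon display.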

  5. Multiple sensitive estimation and optimal sample size allocation in the item sum technique.

    PubMed

    Perri, Pier Francesco; Rueda García, María Del Mar; Cobo Rodríguez, Beatriz

    2018-01-01

    For surveys of sensitive issues in life sciences, statistical procedures can be used to reduce nonresponse and social desirability response bias. Both of these phenomena provoke nonsampling errors that are difficult to deal with and can seriously flaw the validity of the analyses. The item sum technique (IST) is a very recent indirect questioning method derived from the item count technique that seeks to procure more reliable responses on quantitative items than direct questioning while preserving respondents' anonymity. This article addresses two important questions concerning the IST: (i) its implementation when two or more sensitive variables are investigated and efficient estimates of their unknown population means are required; (ii) the determination of the optimal sample size to achieve minimum variance estimates. These aspects are of great relevance for survey practitioners engaged in sensitive research and, to the best of our knowledge, have not been studied so far. In this article, theoretical results for multiple estimation and optimal allocation are obtained under a generic sampling design and then particularized to simple random sampling and stratified sampling designs. Theoretical considerations are integrated with a number of simulation studies based on data from two real surveys and conducted to ascertain the efficiency gain derived from optimal allocation in different situations. One of the surveys concerns cannabis consumption among university students. Our findings highlight some methodological advances that can be obtained in life sciences IST surveys when optimal allocation is achieved. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
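    The basic IST point estimate is the difference between the mean totals reported by the long-list sample (innocuous items plus the sensitive item) and the short-list sample (innocuous items only). A minimal sketch, omitting the article's multiple-variable and optimal-allocation machinery:

```python
from statistics import mean

def ist_mean(long_list_totals, short_list_totals):
    """Item sum technique point estimate of the sensitive item's mean:
    difference of the mean totals reported by the two split samples."""
    return mean(long_list_totals) - mean(short_list_totals)
```

Because respondents only ever report a sum, the individual value of the sensitive item is never revealed.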

  6. Comparative analysis of dose rates in bricks determined by neutron activation analysis, alpha counting and X-ray fluorescence analysis for the thermoluminescence fine grain dating method

    NASA Astrophysics Data System (ADS)

    Bártová, H.; Kučera, J.; Musílek, L.; Trojek, T.

    2014-11-01

    In order to evaluate the age from the equivalent dose and to obtain an optimized and efficient procedure for thermoluminescence (TL) dating, it is necessary to obtain the values of both the internal and the external dose rates from dated samples and from their environment. The measurements described and compared in this paper refer to bricks from historic buildings and a fine-grain dating method. The external doses are therefore negligible, if the samples are taken from a sufficient depth in the wall. However, both the alpha dose rate and the beta and gamma dose rates must be taken into account in the internal dose. The internal dose rate to fine-grain samples is caused by the concentrations of natural radionuclides 238U, 235U, 232Th and members of their decay chains, and by 40K concentrations. Various methods can be used for determining trace concentrations of these natural radionuclides and their contributions to the dose rate. The dose rate fraction from 238U and 232Th can be calculated, e.g., from the alpha count rate, or from the concentrations of 238U and 232Th, measured by neutron activation analysis (NAA). The dose rate fraction from 40K can be calculated from the concentration of potassium measured, e.g., by X-ray fluorescence analysis (XRF) or by NAA. Alpha counting and XRF are relatively simple and are accessible for an ordinary laboratory. NAA can be considered as a more accurate method, but it is more demanding regarding time and costs, since it needs a nuclear reactor as a neutron source. A comparison of these methods allows us to decide whether the time- and cost-saving simpler techniques introduce uncertainty that is still acceptable.

  7. Transforming geographic scale: a comparison of combined population and areal weighting to other interpolation methods.

    PubMed

    Hallisey, Elaine; Tai, Eric; Berens, Andrew; Wilt, Grete; Peipins, Lucy; Lewis, Brian; Graham, Shannon; Flanagan, Barry; Lunsford, Natasha Buchanan

    2017-08-07

    Transforming spatial data from one scale to another is a challenge in geographic analysis. As part of a larger, primary study to determine a possible association between travel barriers to pediatric cancer facilities and adolescent cancer mortality across the United States, we examined methods to estimate mortality within zones at varying distances from these facilities: (1) geographic centroid assignment, (2) population-weighted centroid assignment, (3) simple areal weighting, (4) combined population and areal weighting, and (5) geostatistical areal interpolation. For the primary study, we used county mortality counts from the National Center for Health Statistics (NCHS) and population data by census tract for the United States to estimate zone mortality. In this paper, to evaluate the five mortality estimation methods, we employed address-level mortality data from the state of Georgia in conjunction with census data. Our objective here is to identify the simplest method that returns accurate mortality estimates. The distribution of Georgia county adolescent cancer mortality counts mirrors the Poisson distribution of the NCHS counts for the U.S. Likewise, zone value patterns, along with the error measures of hierarchy and fit, are similar for the state and the nation. Therefore, Georgia data are suitable for methods testing. The mean absolute value arithmetic differences between the observed counts for Georgia and the five methods were 5.50, 5.00, 4.17, 2.74, and 3.43, respectively. Comparing the methods through paired t-tests of absolute value arithmetic differences showed no statistical difference among the methods. However, we found a strong positive correlation (r = 0.63) between estimated Georgia mortality rates and combined weighting rates at zone level. Most importantly, Bland-Altman plots indicated acceptable agreement between paired arithmetic differences of Georgia rates and combined population and areal weighting rates. 
This research contributes to the literature on areal interpolation, demonstrating that combined population and areal weighting, compared to other tested methods, returns the most accurate estimates of mortality in transforming small counts by county to aggregated counts for large, non-standard study zones. This conceptually simple cartographic method should be of interest to public health practitioners and researchers limited to analysis of data for relatively large enumeration units.
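    A minimal sketch of the population-weighting step (assuming, for simplicity, that census tracts nest entirely within counties and zones; the combined method additionally splits tract populations by area of intersection with zone boundaries):

```python
def allocate_counts(county_counts, tract_pop, tract_county, tract_zone):
    """Distribute each county's count across zones in proportion to the
    census-tract population located in each zone (population weighting)."""
    # Total population per county
    county_pop = {}
    for t, p in tract_pop.items():
        county_pop[tract_county[t]] = county_pop.get(tract_county[t], 0) + p
    # Each tract carries its county's count in proportion to its population
    zone_counts = {}
    for t, p in tract_pop.items():
        c = tract_county[t]
        share = county_counts[c] * p / county_pop[c]
        zone_counts[tract_zone[t]] = zone_counts.get(tract_zone[t], 0.0) + share
    return zone_counts
```

A county with 10 deaths whose tracts hold 300 and 100 people in two zones would contribute 7.5 and 2.5 estimated deaths to those zones, respectively.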

  8. How many Laysan Teal Anas laysanensis are on Midway Atoll? Methods for monitoring abundance after reintroduction

    USGS Publications Warehouse

    Reynolds, Michelle H.; Courtot, Karen; Hatfield, Jeffrey

    2017-01-01

    Wildlife managers often request a simple approach to monitor the status of species of concern. In response to that need, we used eight years of monitoring data to estimate population size and test the validity of an index for monitoring accurately the abundance of reintroduced, endangered Laysan Teal Anas laysanensis. The population was established at Midway Atoll in the Hawaiian archipelago after 42 wild birds were translocated from Laysan Island during 2004–2005. We fitted 587 birds with unique markers during 2004–2015, recorded 21,309 sightings until March 2016, and conducted standardised survey counts during 2007–2015. A modified Lincoln-Petersen mark-resight estimator and ANCOVA models were used to test the relationship between survey counts, seasonal detectability, and population abundance. Differences were found between the breeding and non-breeding seasons in detection and how maximum counts recorded related to population estimates. The results showed strong, positive correlations between the seasonal maximum counts and population estimates. The ANCOVA models supported the use of standardised bi-monthly counts of unmarked birds as a valid index to monitor trends among years within a season at Midway Atoll. The translocated population increased to 661 adult and juvenile birds (95% CI = 608–714) by 2010, then declined by 38% between 2010 and 2012 after the Tōhoku Japan earthquake-generated tsunami inundated 41% of the atoll and triggered an avian botulism (Clostridium botulinum type C) outbreak. Following another severe botulism outbreak during 2015, the population experienced a 37% decline. Data indicated that the Midway Atoll population, like the founding Laysan Island population, is susceptible to catastrophic population declines. Consistent standardised monitoring using simple counts, in place of mark-recapture and resightings surveys, can be used to evaluate population status over the long-term. 
We estimate there were 314–435 Laysan Teal (95% CI for population estimate; point estimate = 375 individuals) at Midway Atoll in 2015; c. 50% of the global population. In comparison, the most recent estimate for numbers on Laysan Island was of 339 individuals in 2012 (95% CI = 265–413). We suggest that this approach can be used to validate a survey index for any marked, reintroduced resident wildlife population.

  9. Immunological techniques as tools to characterize the subsurface microbial community at a trichloroethylene contaminated site

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fliermans, C.B.; Dougherty, J.M.; Franck, M.M.

    Effective in situ bioremediation strategies require an understanding of the effects pollutants and remediation techniques have on subsurface microbial communities. Therefore, detailed characterization of a site's microbial communities is important. Subsurface sediment borings and water samples were collected from a trichloroethylene (TCE) contaminated site, before and after horizontal well in situ air stripping and bioventing, as well as during methane injection for stimulation of methane-utilizing microorganisms. Subsamples were processed for heterotrophic plate counts, acridine orange direct counts (AODC), community diversity, direct fluorescent antibodies (DFA) enumeration for several nitrogen-transforming bacteria, and Biolog® evaluation of enzyme activity in collected water samples. Plate counts were higher in near-surface depths than in the vadose zone sediment samples. During the in situ air stripping and bioventing, counts increased at or near the saturated zone, remained elevated throughout the aquifer, but did not change significantly after the air stripping. Sporadic increases in plate counts at different depths as well as increased diversity appeared to be linked to differing lithologies. AODCs were orders of magnitude higher than plate counts and remained relatively constant with depth except for slight increases near the surface depths and the capillary fringe. Nitrogen-transforming bacteria, as measured by serospecific DFA, were greatly affected both by the in situ air stripping and the methane injection. Biolog® activity appeared to increase with subsurface stimulation both by air and methane. The complexity of subsurface systems makes the use of selective monitoring tools imperative.

  10. Lens-free microscopy of cerebrospinal fluid for the laboratory diagnosis of meningitis

    NASA Astrophysics Data System (ADS)

    Delacroix, Robin; Morel, Sophie Nhu An; Hervé, Lionel; Bordy, Thomas; Blandin, Pierre; Dinten, Jean-Marc; Drancourt, Michel; Allier, Cédric

    2018-02-01

    The cytology of the cerebrospinal fluid is traditionally performed by an operator (physician, biologist) by means of a conventional light microscope. The operator visually counts the leukocytes (white blood cells) present in a sample of cerebrospinal fluid (10 μl). It is a tedious job and the result is operator-dependent. Here, in order to circumvent the limitations of manual counting, we approach the question of the numeration of erythrocytes and leukocytes for the cytological diagnosis of meningitis by means of lens-free microscopy. In a first step, a prospective count of leukocytes was performed by five different operators using conventional optical microscopy. The visual counting yielded an overall 16.7% misclassification of 72 cerebrospinal fluid specimens in meningitis/non-meningitis categories using a 10 leukocyte/μL cut-off. In a second step, the lens-free microscopy algorithm was adapted step-by-step for counting cerebrospinal fluid cells and discriminating leukocytes from erythrocytes. The optimization of the automatic lens-free counting was based on the prospective analysis of 215 cerebrospinal fluid specimens. The optimized algorithm yielded a 100% sensitivity and an 86% specificity compared to confirmed diagnostics. In a third step, a blind lens-free microscopic analysis of 116 cerebrospinal fluid specimens, including six cases of microbiology confirmed infectious meningitis, yielded a 100% sensitivity and a 79% specificity. Adapted lens-free microscopy is thus emerging as an operator-independent technique for the rapid numeration of leukocytes and erythrocytes in cerebrospinal fluid. In particular, this technique is well suited to the rapid diagnosis of meningitis at point-of-care laboratories.

  11. A review of costing methodologies in critical care studies.

    PubMed

    Pines, Jesse M; Fager, Samuel S; Milzman, David P

    2002-09-01

    Clinical decision making in critical care has traditionally been based on clinical outcome measures such as mortality and morbidity. Over the past few decades, however, increasing competition in the health care marketplace has made it necessary to consider costs when making clinical and managerial decisions in critical care. Sophisticated costing methodologies have been developed to aid this decision-making process. We performed a narrative review of published costing studies in critical care during the past 6 years. A total of 282 articles were found, of which 68 met our search criteria. They involved a mean of 508 patients (range, 20-13,907). A total of 92.6% of the studies (63 of 68) used traditional cost analysis, whereas the remaining 7.4% (5 of 68) used cost-effectiveness analysis. None (0 of 68) used cost-benefit analysis or cost-utility analysis. A total of 36.7% (25 of 68) used hospital charges as a surrogate for actual costs. Of the 43 articles that actually counted costs, 37.2% (16 of 43) counted physician costs, 27.9% (12 of 43) counted facility costs, 34.9% (15 of 43) counted nursing costs, 9.3% (4 of 43) counted societal costs, and 90.7% (39 of 43) counted laboratory, equipment, and pharmacy costs. Our conclusion is that despite considerable progress in costing methodologies, critical care studies have not adequately implemented these techniques. Given the importance of financial implications in medicine, it would be prudent for critical care studies to use these more advanced techniques. Copyright 2002, Elsevier Science (USA). All rights reserved.

  12. Statistical tests to compare motif count exceptionalities

    PubMed Central

    Robin, Stéphane; Schbath, Sophie; Vandewalle, Vincent

    2007-01-01

    Background Finding over- or under-represented motifs in biological sequences is now a common task in genomics. Thanks to p-value calculation for motif counts, exceptional motifs are identified and represent candidate functional motifs. The present work addresses the related question of comparing the exceptionality of one motif in two different sequences. Just comparing the motif count p-values in each sequence is indeed not sufficient to decide if this motif is significantly more exceptional in one sequence compared to the other one. A statistical test is required. Results We develop and analyze two statistical tests, an exact binomial one and an asymptotic likelihood ratio test, to decide whether the exceptionality of a given motif is equivalent or significantly different in two sequences of interest. For that purpose, motif occurrences are modeled by Poisson processes, with special care for overlapping motifs. Both tests can take the sequence compositions into account. As an illustration, we compare the octamer exceptionalities in the Escherichia coli K-12 backbone versus variable strain-specific loops. Conclusion The exact binomial test is particularly adapted for small counts. For large counts, we advise using the likelihood ratio test, which is asymptotic but strongly correlated with the exact binomial test and very simple to use. PMID:17346349
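    The exact binomial comparison can be sketched as a one-sided tail probability (a generic illustration, not the authors' code; here p is the null share of occurrences expected in the first sequence, e.g. derived from sequence lengths and composition):

```python
from math import comb

def binom_pvalue_upper(n1, n_total, p):
    """P(X >= n1) for X ~ Binomial(n_total, p): one-sided exact test that a
    motif is over-represented in sequence 1 relative to sequence 2, given
    n1 occurrences in sequence 1 out of n_total occurrences overall."""
    return sum(comb(n_total, k) * p**k * (1 - p)**(n_total - k)
               for k in range(n1, n_total + 1))
```

For 8 of 10 total occurrences falling in sequence 1 under an even null split (p = 0.5), the tail probability is 56/1024, about 0.055.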

  13. Beta/alpha continuous air monitor

    DOEpatents

    Becker, Gregory K.; Martz, Dowell E.

    1989-01-01

    A single deep-layer silicon detector in combination with a microcomputer records both alpha and beta activity and the energy of each pulse, distinguishes energy peaks using a novel curve-fitting technique to reduce the natural alpha counts in the energy region where plutonium and other transuranic alpha emitters are present, and uses a novel algorithm to strip out the radon daughter contribution from the actual beta counts.

  14. An Effectiveness Evaluation Between Manual and Automated Readability Counting Techniques. CNETS Report 5-75.

    ERIC Educational Resources Information Center

    Bunde, Gary R.

    A statistical comparison was made between two automated devices which were used to count data points (words, sentences, and syllables) needed in the Flesch Reading Ease Score to determine the reading grade level of written material. Determination of grade level of all Rate Training Manuals and Non-Resident Career Courses had been requested by the…

  15. Estimation of U content in coffee samples by fission-track counting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sharma, P.K.; Lal, N.; Nagpaul, K.K.

    1985-06-01

    Because coffee is consumed in large quantities by humans, the authors undertook the study of the uranium content of coffee as a continuation of earlier work to estimate the U content of foodstuffs. Since literature on this subject is scarce, they decided to use the well-established fission-track-counting technique to determine the U content of coffee.

  16. Application of the microbiological method DEFT/APC to detect minimally processed vegetables treated with gamma radiation

    NASA Astrophysics Data System (ADS)

    Araújo, M. M.; Duarte, R. C.; Silva, P. V.; Marchioni, E.; Villavicencio, A. L. C. H.

    2009-07-01

    Marketing of minimally processed vegetables (MPV) is gaining impetus due to their convenience, freshness and apparent health benefits. However, minimal processing does not reduce pathogenic microorganisms to safe levels. Food irradiation is used to extend the shelf life and to inactivate food-borne pathogens. In combination with minimal processing it could improve the safety and quality of MPV. A microbiological screening method based on the use of the direct epifluorescent filter technique (DEFT) and the aerobic plate count (APC) has been established for the detection of irradiated foodstuffs. The aim of this study was to evaluate the applicability of this technique in detecting MPV irradiation. Samples from retail markets were irradiated with 0.5 and 1.0 kGy using a 60Co facility. In general, with a dose increment, DEFT counts remained similar regardless of irradiation while APC counts decreased gradually. The difference of the two counts gradually increased with dose increment in all samples. It could be suggested that a DEFT/APC difference over 2.0 log would be a criterion for judging whether an MPV was treated by irradiation. The DEFT/APC method could be used satisfactorily as a screening method for indicating irradiation processing.
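    The screening rule suggested by the abstract, flagging a sample when the DEFT count exceeds the APC by more than 2.0 log10 units, is simple to state in code (the threshold and count units are taken from the abstract; the decision rule itself is a sketch):

```python
from math import log10

def likely_irradiated(deft_count, apc_count, threshold=2.0):
    """Flag a sample as likely irradiated when the DEFT count (which survives
    irradiation) exceeds the APC (which drops with dose) by more than
    `threshold` log10 units."""
    return log10(deft_count) - log10(apc_count) > threshold
```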

  17. Refining comparative proteomics by spectral counting to account for shared peptides and multiple search engines

    PubMed Central

    Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J.; Li, Ming

    2013-01-01

    Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables. PMID:22552787

  18. Refining comparative proteomics by spectral counting to account for shared peptides and multiple search engines.

    PubMed

    Chen, Yao-Yi; Dasari, Surendra; Ma, Ze-Qiang; Vega-Montoto, Lorenzo J; Li, Ming; Tabb, David L

    2012-09-01

    Spectral counting has become a widely used approach for measuring and comparing protein abundance in label-free shotgun proteomics. However, when analyzing complex samples, the ambiguity of matching between peptides and proteins greatly affects the assessment of peptide and protein inventories, differentiation, and quantification. Meanwhile, the configuration of database searching algorithms that assign peptides to MS/MS spectra may produce different results in comparative proteomic analysis. Here, we present three strategies to improve comparative proteomics through spectral counting. We show that comparing spectral counts for peptide groups rather than for protein groups forestalls problems introduced by shared peptides. We demonstrate the advantage and flexibility of this new method in two datasets. We present four models to combine four popular search engines that lead to significant gains in spectral counting differentiation. Among these models, we demonstrate a powerful vote counting model that scales well for multiple search engines. We also show that semi-tryptic searching outperforms tryptic searching for comparative proteomics. Overall, these techniques considerably improve protein differentiation on the basis of spectral count tables.
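    The vote-counting idea can be illustrated with a simple majority rule across engines (a hypothetical sketch; the paper's actual model and its scaling behavior are not specified here):

```python
def vote_count(calls):
    """Majority-vote combination of search engines: a peptide group is
    flagged as differential when at least a majority of engines flag it.

    calls: dict mapping engine name -> set of flagged peptide-group ids.
    """
    votes = {}
    for flagged in calls.values():
        for pep in flagged:
            votes[pep] = votes.get(pep, 0) + 1
    majority = len(calls) // 2 + 1
    return {pep for pep, v in votes.items() if v >= majority}
```

Adding a fifth or sixth engine only changes `len(calls)`, which is why a vote scheme scales easily with the number of search engines.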

  19. CALCIUM ABSORPTION IN MAN: BASED ON LARGE VOLUME LIQUID SCINTILLATION COUNTER STUDIES.

    PubMed

    LUTWAK, L; SHAPIRO, J R

    1964-05-29

    A technique has been developed for the in vivo measurement of absorption of calcium in man after oral administration of 1 to 5 microcuries of calcium-47 and continuous counting of the radiation in the subject's arm with a large volume liquid scintillation counter. The maximum value for the arm counting technique is proportional to the absorption of tracer as measured by direct stool analysis. The rate of uptake by the arm is lower in subjects with either the malabsorption syndrome or hypoparathyroidism. The administration of vitamin D increases both the absorption rate and the maximum amount of calcium absorbed.

  20. Relativistic Transformations of Light Power.

    ERIC Educational Resources Information Center

    McKinley, John M.

    1979-01-01

    Using a photon-counting technique, finds the angular distribution of emitted and detected power and the total radiated power of an arbitrary moving source, and uses the technique to verify the predicted effect of the earth's motion through the cosmic blackbody radiation. (Author/GA)

  1. Chair-side detection of Prevotella Intermedia in mature dental plaque by its fluorescence.

    PubMed

    Nomura, Yoshiaki; Takeuchi, Hiroaki; Okamoto, Masaaki; Sogabe, Kaoru; Okada, Ayako; Hanada, Nobuhiro

    2017-06-01

    Prevotella intermedia/nigrescens is one of the well-known pathogens causing periodontal diseases, and the red fluorescence excited by visible blue light, caused by protoporphyrin IX in the bacterial cells, could be useful for chair-side detection. The aim of this study was to evaluate levels of periodontal pathogens, especially P. intermedia, in clinical samples of red fluorescent dental plaque. Thirty-two supragingival plaque samples from six individuals were measured for fluorescence at a 640 nm wavelength excited at 409 nm. Periodontopathic bacteria were counted by the Invader PLUS PCR assay. Correlations between fluorescence intensity and bacterial counts were analyzed by Pearson's correlation coefficient and by simple and multiple regression analysis. Positive and negative predictive values of the fluorescence intensities for the presence or absence of P. intermedia in supragingival plaque were calculated. When relative fluorescence units (RFU) were logarithmically transformed, statistically significant linear relations between RFU and bacterial counts were obtained for P. intermedia, Porphyromonas gingivalis and Tannerella forsythia. By multiple regression analysis, only P. intermedia had a statistically significant correlation with fluorescence intensity. All of the fluorescent dental plaque samples contained P. intermedia. In contrast, 28% of non-fluorescent plaques contained P. intermedia. Checking for fluorescent dental plaque in the oral cavity could thus serve as a simple chair-side screening for mature dental plaque before examining periodontal pathogens, especially P. intermedia, by the PCR method. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. SIMPL/AVIRIS-NG Greenland 2015: Flight Report

    NASA Technical Reports Server (NTRS)

    Brunt, Kelly M.; Neumann, Thomas A.; Markus, Thorsten

    2015-01-01

    In August 2015, NASA conducted a two-aircraft, coordinated campaign based out of Thule Air Base, Greenland, in support of Ice, Cloud, and land Elevation Satellite-2 (ICESat-2) algorithm development. The survey targeted the Greenland Ice Sheet and sea ice in the Arctic Ocean during the summer melt season. The survey was conducted with a photon-counting laser altimeter in one aircraft and an imaging spectrometer in the second aircraft. Ultimately, the mission, SIMPL/AVIRIS-NG Greenland 2015, conducted nine coordinated science flights, for a total of 37 flight hours over the ice sheet and sea ice.

  3. Uptake Index of 123I-metaiodobenzylguanidine Myocardial Scintigraphy for Diagnosing Lewy Body Disease

    PubMed Central

    Kamiya, Yoshito; Ota, Satoru; Okumiya, Shintaro; Yamashita, Kosuke; Takaki, Akihiro; Ito, Shigeki

    2017-01-01

    Objective(s): Iodine-123 metaiodobenzylguanidine (123I-MIBG) myocardial scintigraphy has been used to evaluate cardiac sympathetic denervation in Lewy body disease (LBD), including Parkinson’s disease (PD) and dementia with Lewy bodies (DLB). The heart-to-mediastinum ratio (H/M) in PD and DLB is significantly lower than that in Parkinson’s plus syndromes and Alzheimer’s disease. Although this ratio is useful for distinguishing LBD from non-LBD, it fluctuates depending on the system performance of the gamma cameras. Therefore, a new, simple quantification method using 123I-MIBG uptake analysis is required for clinical study. The purpose of this study was to develop a new uptake index with a simple protocol to determine 123I-MIBG uptake on planar images. Methods: The 123I-MIBG input function was obtained from the input counts of the pulmonary artery (PA), which were assessed by analyzing the PA time-activity curves. The heart region of interest used for determining the H/M was used for calculating the uptake index, which was obtained by dividing the heart count by the input count. Results: Forty-eight patients underwent 123I-MIBG chest angiography and planar imaging, after clinical feature assessment and tracer injection. The H/M and 123I-MIBG uptake index were calculated and correlated with clinical features. Values for LBD were significantly lower than those for non-LBD in all analyses (P<0.001). The overlapping ranges between non-LBD and LBD were 2.15 to 2.49 in the H/M method, and 1.04 to 1.22% in the uptake index method. The diagnostic accuracy of the uptake index (area under the curve (AUC), 0.98; sensitivity, 96%; specificity, 91%; positive predictive value (PPV), 90%; negative predictive value (NPV), 93%; and accuracy, 92%) was approximately equal to that of the H/M (AUC, 0.95; sensitivity, 93%; specificity, 91%; PPV, 90%; NPV, 93%; and accuracy, 92%) for discriminating patients with LBD and non-LBD. 
Conclusion: A simple uptake index method was developed using 123I-MIBG planar imaging and the input counts determined by analyzing chest radioisotope angiography images of the PA. The diagnostic accuracy of the uptake index was approximately equal to that of the H/M for discriminating patients with LBD and non-LBD. PMID:28840137

  4. Horizontal Running Mattress Suture Modified with Intermittent Simple Loops

    PubMed Central

    Chacon, Anna H; Shiman, Michael I; Strozier, Narissa; Zaiac, Martin N

    2013-01-01

    Using a horizontal running mattress suture combined with intermittent loops achieves both the good eversion of the horizontal running mattress and the ease of removal of the simple loops. This combination technique also avoids the characteristic railroad-track marks that result from prolonged non-absorbable suture retention. The unique feature of our technique is the incorporation of one simple running suture after every two runs of the horizontal running mattress suture. To demonstrate its utility, we used the suturing technique on several patients and analyzed the cosmetic outcome with post-operative photographs in comparison to other suturing techniques. In summary, the combination of a running horizontal mattress suture with simple intermittent loops offers functional and cosmetic benefits that can be readily taught, comprehended, and employed, leading to desirable aesthetic results and wound-edge eversion. PMID:23723610

  5. Multimorbidity and health-related quality of life (HRQoL) in a nationally representative population sample: implications of count versus cluster method for defining multimorbidity on HRQoL.

    PubMed

    Wang, Lili; Palmer, Andrew J; Cocker, Fiona; Sanderson, Kristy

    2017-01-09

    No universally accepted definition of multimorbidity (MM) exists, and the implications of different definitions have not been explored. This study examined the performance of the count and cluster definitions of multimorbidity on the sociodemographic profile and health-related quality of life (HRQoL) in a general population. Data were derived from the nationally representative 2007 Australian National Survey of Mental Health and Wellbeing (n = 8841). HRQoL scores were measured using the Assessment of Quality of Life (AQoL-4D) instrument. The simple count (2+ and 3+ conditions) and hierarchical cluster methods were used to define/identify clusters of multimorbidity. Linear regression was used to assess the associations between HRQoL and multimorbidity as defined by the different methods. Multimorbidity defined using the count method yielded prevalences of 26% (MM2+) and 10.1% (MM3+). Statistically significant clusters identified through hierarchical cluster analysis included heart or circulatory conditions (CVD)/arthritis (cluster-1, 9%) and major depressive disorder (MDD)/anxiety (cluster-2, 4%). A sensitivity analysis supported the stability of the clusters obtained from hierarchical clustering. The sociodemographic profiles were similar between MM2+, MM3+ and cluster-1, but differed from cluster-2. HRQoL was negatively associated with MM2+ (β: -0.18, SE: -0.01, p < 0.001), MM3+ (β: -0.23, SE: -0.02, p < 0.001), cluster-1 (β: -0.10, SE: 0.01, p < 0.001) and cluster-2 (β: -0.36, SE: 0.01, p < 0.001). Our findings confirm the existence of an inverse relationship between multimorbidity and HRQoL in the Australian population, and this head-to-head comparison indicates that the hierarchical clustering approach is valid when the outcome of interest is HRQoL. Moreover, a simple count fails to identify whether specific conditions of interest are driving poorer HRQoL. Researchers should exercise caution when selecting a definition of multimorbidity because it may significantly influence the study outcomes.
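
The simple count definition described above reduces to a threshold on per-respondent condition counts. A minimal sketch (the MM2+/MM3+ thresholds are from the abstract; the data below are invented for illustration):

```python
def mm_prevalence(condition_counts, threshold):
    """Fraction of respondents with at least `threshold` chronic conditions."""
    flagged = sum(1 for c in condition_counts if c >= threshold)
    return flagged / len(condition_counts)

# Hypothetical per-respondent counts of chronic conditions
counts = [0, 1, 2, 3, 1, 0, 4, 2, 0, 1]
mm2 = mm_prevalence(counts, 2)  # MM2+: two or more conditions
mm3 = mm_prevalence(counts, 3)  # MM3+: three or more conditions
```

Note that a count definition of this kind says nothing about *which* conditions co-occur, which is exactly the limitation the authors raise against it.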

  6. A simple technique for laparoscopic gastrostomy.

    PubMed

    Murphy, C; Rosemurgy, A S; Albrink, M H; Carey, L C

    1992-05-01

    While endoscopic placement of gastrostomy tubes is routinely simple, it is not always feasible. The endoscopic technique also does not uniformly secure the tube to the abdominal wall, which presents possible complications, including leakage, accidental early tube removal, intraperitoneal catheter migration, and necrosis of the stomach or abdominal wall because of excessive traction. Presented herein is a technique that is rapid, simple, and eliminates some of these potential complications. The technique is easily combined with other operative procedures such as tracheostomy, is done under direct vision, can be performed quickly with intravenous sedation and local anesthetic, and is a safe method of tube placement for enteral feeding or gastric decompression.

  7. Using Pinochle to motivate the restricted combinations with repetitions problem

    NASA Astrophysics Data System (ADS)

    Gorman, Patrick S.; Kunkel, Jeffrey D.; Vasko, Francis J.

    2011-07-01

    A standard example used in introductory combinatorics courses is to count the number of five-card poker hands possible from a standard deck of 52 distinct cards. A more interesting problem is to count the number of distinct hands possible from a Pinochle deck, in which there are multiple, but obviously limited, copies of each type of card (two copies for a single deck, four for a double deck). This problem is more interesting because our only concern is to count the number of distinguishable hands that can be dealt. In this note, under various scenarios, we discuss two combinatoric techniques for counting these hands, namely the inclusion-exclusion principle and generating functions. We then show that these Pinochle examples motivate a general counting formula for what Riordan calls 'regular' combinations. Finally, we prove the correctness of this formula using generating functions.
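
The two techniques the note discusses can be sketched for the single-deck case (24 distinct card types, 2 copies each; the 12-card hand size and function names are assumptions for illustration, not from the paper):

```python
from math import comb

def hands_gf(types=24, copies=2, hand=12):
    # Generating functions: the answer is the coefficient of x^hand in
    # (1 + x + ... + x^copies)^types, computed by repeated convolution.
    poly = [1]
    factor = [1] * (copies + 1)
    for _ in range(types):
        new = [0] * (len(poly) + copies)
        for i, a in enumerate(poly):
            for j, b in enumerate(factor):
                new[i + j] += a * b
        poly = new
    return poly[hand] if hand < len(poly) else 0

def hands_ie(types=24, copies=2, hand=12):
    # Inclusion-exclusion: subtract selections in which some type is used
    # more than `copies` times; C(k + n - 1, n - 1) counts unrestricted
    # multisets of size k from n types.
    total = 0
    for j in range(hand // (copies + 1) + 1):
        rest = hand - j * (copies + 1)
        total += (-1) ** j * comb(types, j) * comb(rest + types - 1, types - 1)
    return total
```

Both functions count the same distinguishable multisets, so they can be cross-checked against each other.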

  8. Characterization of photon-counting multislit breast tomosynthesis.

    PubMed

    Berggren, Karl; Cederström, Björn; Lundqvist, Mats; Fredenberg, Erik

    2018-02-01

    It has been shown that breast tomosynthesis may improve sensitivity and specificity compared to two-dimensional mammography, resulting in an increased detection rate of cancers or lowered call-back rates. The purpose of this study is to characterize a spectral photon-counting multislit breast tomosynthesis system that is able to do single-scan spectral imaging with multiple collimated x-ray beams. The system differs in many aspects from conventional tomosynthesis using energy-integrating flat-panel detectors. The investigated system was a prototype consisting of a dual-threshold photon-counting detector with 21 collimated line detectors scanning across the compressed breast. A review of the system is done in terms of detector, acquisition geometry, and reconstruction methods. Three reconstruction methods were used: simple back-projection, filtered back-projection and an iterative algebraic reconstruction technique. The image quality was evaluated by measuring the modulation transfer function (MTF), normalized noise power spectrum, detective quantum efficiency (DQE), and artifact spread function (ASF) on reconstructed spectral tomosynthesis images for a total-energy bin (defined by a low-energy threshold calibrated to remove electronic noise) and for a high-energy bin (with a threshold calibrated to split the spectrum in roughly equal parts). Acquisition was performed using a 29 kVp W/Al x-ray spectrum at a 0.24 mGy exposure. The difference in MTF between the two energy bins was negligible, that is, there was no energy dependence of resolution. The MTF dropped to 50% at 1.5 lp/mm to 2.3 lp/mm in the scan direction and 2.4 lp/mm to 3.3 lp/mm in the slit direction, depending on the reconstruction method. The full width at half maximum of the ASF was found to range from 13.8 mm to 18.0 mm for the different reconstruction methods. The zero-frequency DQE of the system was found to be 0.72. The fraction of counts in the high-energy bin was measured to be 59% of the total detected spectrum. Scan times ranged from 4 s to 16.5 s depending on voltage and current settings. The characterized system generates spectral tomosynthesis images with a dual-energy photon-counting detector. Measurements show a high DQE, enabling high image quality at a low dose, which is beneficial for low-dose applications such as screening. The single-scan spectral images open up applications such as quantitative material decomposition and contrast-enhanced tomosynthesis. © 2017 American Association of Physicists in Medicine.
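
A figure such as "the MTF dropped to 50% at 1.5 lp/mm" is read off sampled MTF data by interpolating the 0.5 crossing. A generic sketch (not the authors' code; function name and data are illustrative):

```python
def mtf50(freqs_lp_mm, mtf):
    # Return the spatial frequency (lp/mm) at which the MTF first falls to
    # 0.5, by linear interpolation between the bracketing samples.
    for f0, m0, f1, m1 in zip(freqs_lp_mm, mtf, freqs_lp_mm[1:], mtf[1:]):
        if m0 >= 0.5 >= m1:
            return f0 + (m0 - 0.5) * (f1 - f0) / (m0 - m1)
    return None  # MTF never crosses 0.5 in the sampled range
```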

  9. Evaluation of a Multicolor, Single-Tube Technique To Enumerate Lymphocyte Subpopulations

    PubMed Central

    Colombo, F.; Cattaneo, A.; Lopa, R.; Portararo, P.; Rebulla, P.; Porretti, L.

    2008-01-01

    To evaluate the fully automated FACSCanto software, we compared lymphocyte subpopulation counts obtained using three-color FACSCalibur-CELLQuest and six-color FACSCanto-FACSCanto software techniques. High correlations were observed between data obtained with these techniques. Our study indicated that FACSCanto clinical software is accurate and sensitive in single-platform lymphocyte immunophenotyping. PMID:18448621

  10. Early diagnosis of severe combined immunodeficiency (SCID) in Turkey: a pilot study.

    PubMed

    Can, Ceren; Hamilçıkan, Şahin; Can, Emrah

    2017-08-29

    Severe combined immunodeficiency (SCID) is a neonatal emergency. As the T-cell receptor excision circles (TREC) test is not cost effective for neonatal screening of SCID in developing countries, this pilot study aimed to identify preliminary data to enable SCID identification in the general population. This observational study was performed in Bagcılar Training and Research Hospital, Istanbul, Turkey. Cord-blood complete blood count (CBC) was recorded in all neonates included in the study. Absolute lymphopenia was considered in cord-blood samples if the absolute lymphocyte count was less than 2500/mm(3). A control blood count was performed 1 month later for cases with detected lymphopenia. A total of 2945 term neonates were included in the study. Absolute lymphopenia was found in nine (0.3%) neonates, while 2936 (99.7%) had an absolute lymphocyte count above 2.5 × 10(3)/mm(3). The mean counts of red blood cells (RBC), hemoglobin (HGB), hematocrit (HCT), platelets (PLT), and monocytes in the lymphopenia group did not differ significantly from those in the non-lymphopenia group. However, mean white blood cell (WBC), lymphocyte, and neutrophil counts differed significantly between the groups (p < .05). Absolute lymphopenia detected using CBC analysis is a simpler, non-invasive, and cheaper method than the TREC method for detecting SCID in neonates, and it may prove to be a useful alternative, especially in developing countries.

  11. Two-dimensional photon-counting detector arrays based on microchannel array plates

    NASA Technical Reports Server (NTRS)

    Timothy, J. G.; Bybee, R. L.

    1975-01-01

    The production of simple and rugged photon-counting detector arrays has been made possible by recent improvements in the performance of the microchannel array plate (MCP) and by the parallel development of compatible electronic readout systems. The construction of proximity-focused MCP arrays of novel design, in which photometric information from (n x m) picture elements is read out with a total of (n + m) amplifier and discriminator circuits, is described. Results obtained with a breadboard (32 x 32)-element array employing 64 charge-sensitive amplifiers are presented, and the application of systems of this type in spectrometers and cameras for use with ground-based telescopes and on orbiting spacecraft is discussed.

  12. On Matrices, Automata, and Double Counting

    NASA Astrophysics Data System (ADS)

    Beldiceanu, Nicolas; Carlsson, Mats; Flener, Pierre; Pearson, Justin

    Matrix models are ubiquitous for constraint problems. Many such problems have a matrix of variables M, with the same constraint defined by a finite-state automaton A on each row of M and a global cardinality constraint gcc on each column of M. We give two methods for deriving, by double counting, necessary conditions on the cardinality variables of the gcc constraints from the automaton A. The first method yields linear necessary conditions and simple arithmetic constraints. The second method introduces the cardinality automaton, which abstracts the overall behaviour of all the row automata and can be encoded by a set of linear constraints. We evaluate the impact of our methods on a large set of nurse rostering problem instances.
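
The flavour of the first method (linear necessary conditions obtained by double counting) can be sketched as follows. The bounds `lo`/`hi` and the data are illustrative assumptions, not taken from the paper:

```python
def row_occurrences_bounds(n_rows, lo, hi):
    # If each row automaton accepts only rows containing between `lo` and
    # `hi` occurrences of a given value, the matrix-wide total is bounded.
    return n_rows * lo, n_rows * hi

def necessary_condition_holds(n_rows, lo, hi, column_cardinalities):
    # Double counting: summing occurrences of the value row by row and
    # column by column must give the same total, so the sum of the gcc
    # cardinality variables must fall inside the row-derived bounds.
    total = sum(column_cardinalities)
    lower, upper = row_occurrences_bounds(n_rows, lo, hi)
    return lower <= total <= upper
```

In a constraint solver these would be posted as arithmetic constraints on the cardinality variables rather than checked after the fact; the check above only illustrates the counting argument.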

  13. Beyond core count: a look at new mainstream computing platforms for HEP workloads

    NASA Astrophysics Data System (ADS)

    Szostek, P.; Nowak, A.; Bitzes, G.; Valsan, L.; Jarp, S.; Dotti, A.

    2014-06-01

    As Moore's Law continues to deliver more and more transistors, the mainstream processor industry is preparing to expand its investments in areas other than simple core count. These new interests include deep integration of on-chip components, advanced vector units, memory, cache and interconnect technologies. We examine these moving trends with parallelized and vectorized High Energy Physics workloads in mind. In particular, we report on practical experience resulting from experiments with scalable HEP benchmarks on the Intel "Ivy Bridge-EP" and "Haswell" processor families. In addition, we examine the benefits of the new "Haswell" microarchitecture and its impact on multiple facets of HEP software. Finally, we report on the power efficiency of new systems.

  14. New non-randomised model to assess the prevalence of discriminating behaviour: a pilot study on mephedrone

    PubMed Central

    2011-01-01

    Background An advantage of randomised response and non-randomised models investigating sensitive issues arises from the characteristic that individual answers about discriminating behaviour cannot be linked to the individuals. This study proposed a new fuzzy response model, coined the 'Single Sample Count' (SSC), to estimate the prevalence of discriminating or embarrassing behaviour in epidemiologic studies. Methods The SSC was tested and compared to the established Forced Response (FR) model in estimating Mephedrone use. Estimations from both SSC and FR were then corroborated with qualitative hair screening data. Volunteers (n = 318, mean age = 22.69 ± 5.87, 59.1% male) in a rural area in north Wales and a metropolitan area in England completed a questionnaire containing the SSC and FR in alternating order, and four questions canvassing opinions and beliefs regarding Mephedrone. Hair samples were screened for Mephedrone using a qualitative liquid chromatography-mass spectrometry method. Results The SSC algorithm improves upon existing item count techniques by utilizing known population distributions and embedding the sensitive question among four unrelated innocuous questions with a binomial distribution. Respondents are only asked to indicate how many statements are true, without revealing which ones. The two probability models yielded similar estimates, with the FR between 2.6% and 15.0% and the new SSC between 0% and 10%. The six positive hair samples indicated that the prevalence rate in the sample was at least 4%. The close proximity of these estimates provides evidence to support the validity of the new SSC model. Using simulations, the recommended sample sizes as a function of statistical power and expected prevalence rate were calculated. Conclusion The main advantages of the SSC over other indirect methods are: simple administration, completion and calculation, maximum use of the data and good face validity for all respondents.
Owing to the key feature that respondents are not required to answer the sensitive question directly, coupled with the absence of forced response or obvious self-protective response strategy, the SSC has the potential to cut across self-protective barriers more effectively than other estimation models. This elegantly simple, quick and effective method can be successfully employed in public health research investigating compromising behaviours. PMID:21812979
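
A minimal sketch of the SSC estimator described above, assuming (as in the model) four innocuous yes/no items whose total follows Binomial(4, 0.5); the function name and the data are invented for illustration:

```python
from statistics import mean

def ssc_prevalence(counts, n_innocuous=4, p_innocuous=0.5):
    # Single Sample Count estimator (illustrative): each respondent reports
    # only how many of (n_innocuous + 1) statements are true. Because the
    # innocuous items have a known expected total, the excess of the
    # observed mean over that expectation estimates the prevalence of the
    # sensitive behaviour.
    expected_innocuous = n_innocuous * p_innocuous
    return mean(counts) - expected_innocuous

# Hypothetical responses: most report 2 true statements, one reports 3
reported = [2] * 9 + [3]
estimate = ssc_prevalence(reported)  # ≈ 0.10, i.e. 10% prevalence
```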

  15. Beta/alpha continuous air monitor

    DOEpatents

    Becker, G.K.; Martz, D.E.

    1988-06-27

    A single deep-layer silicon detector in combination with a microcomputer, recording both alpha and beta activity and the energy of each pulse, distinguishing energy peaks using a novel curve-fitting technique to reduce the natural alpha counts in the energy region where plutonium and other transuranic alpha emitters are present, and using a novel algorithm to strip out the radon-daughter contribution from the actual beta counts. 7 figs.

  16. Comparison study of membrane filtration direct count and an automated coliform and Escherichia coli detection system for on-site water quality testing.

    PubMed

    Habash, Marc; Johns, Robert

    2009-10-01

    This study compared an automated Escherichia coli and coliform detection system with the membrane filtration direct count technique for water testing. The automated instrument performed as well as or better than the membrane filtration test in analyzing E. coli-spiked samples and blind samples with interference from Proteus vulgaris or Aeromonas hydrophila.

  17. Determination of allergenic load and pollen count of Cupressus arizonica pollen by flow cytometry using Cup a1 polyclonal antibody.

    PubMed

    Benítez, Francisco Moreno; Camacho, Antonio Letrán; del Cuvillo Bernal, Alfonso; de Medina, Pedro Lobatón Sánchez; García Cózar, Francisco J; Romeu, Marisa Espinazo

    2014-01-01

    There is an increase in the incidence of pollen-related allergy, so information on pollen schedules would be a great asset for physicians to improve the clinical care of patients. Cypress pollen sensitization shows a high prevalence among the causes of allergic rhinitis, and it is therefore of interest as a study model, distinguishing cypress pollen, pollen count, and allergenic load level. In this work, we use a flow cytometry-based technique to obtain both the Cupressus arizonica pollen count and the allergenic load, using the specific rabbit polyclonal antibody Cup a1, and compare it with optical microscopy measurement. Airborne samples were collected with Burkard Spore-Trap and Burkard Cyclone samplers. Cupressus arizonica pollen was studied using the specific rabbit polyclonal antibody Cup a1, labeled with AlexaFluor(®) 488 or 750, and analysed by flow cytometry on both EPICS XL and Cyan ADP cytometers (Beckman Coulter(®)). The optical microscopy study was performed with a Leica optical microscope. Bland-Altman analysis was used to determine agreement between the two techniques. We identified three different populations based on rabbit polyclonal antibody Cup a1 staining. The main region (44.5%) had 97.3% recognition, a second region (25%) had 28%, and a third region (30.5%) had 68%. Immunofluorescence and confocal microscopy showed that the main region corresponds to whole pollen grains, the second region to pollen without exine, and the third region to smaller particles with allergenic properties. The pollen schedule showed a high correlation between optical microscopy and flow cytometry for the pollen count (P-value: 0.0008 E(-2)) and for the smaller particles (P-value: 0.0002), and the Bland-Altman measurement showed good agreement between them (P-value: 0.0003). Determination of pollen count and allergenic load by flow cytometry represents an important tool in the determination of airborne respiratory allergens. We showed that not only whole pollen but also smaller particles could induce allergic sensitization. This is the first study in which flow cytometry is used for calculating pollen counts and allergenic load. Copyright © 2013 Clinical Cytometry Society.

  18. Assessment of Intervertebral Disc Degeneration Based on Quantitative MRI Analysis: an in vivo study

    PubMed Central

    Grunert, Peter; Hudson, Katherine D.; Macielak, Michael R.; Aronowitz, Eric; Borde, Brandon H.; Alimi, Marjan; Njoku, Innocent; Ballon, Douglas; Tsiouris, Apostolos John; Bonassar, Lawrence J.; Härtl, Roger

    2015-01-01

    Study design: Animal experimental study. Objective: To evaluate a novel quantitative imaging technique for assessing disc degeneration. Summary of Background Data: T2-relaxation time (T2-RT) measurements have been used to quantitatively assess disc degeneration. T2 values correlate with the water content of intervertebral disc tissue and thereby allow for the indirect measurement of nucleus pulposus (NP) hydration. Methods: We developed an algorithm to subtract out MRI voxels not representing NP tissue based on T2-RT values. Filtered NP voxels were used to measure nuclear size by their amount and nuclear hydration by their mean T2-RT. This technique was applied to 24 rat-tail intervertebral discs (IVDs), which had been punctured with an 18-gauge needle according to different techniques to induce varying degrees of degeneration. NP voxel count and average T2-RT were used as parameters to assess the degeneration process at 1 and 3 months post puncture. NP voxel counts were evaluated against X-ray disc height measurements and qualitative MRI studies based on the Pfirrmann grading system. Tails were collected for histology to correlate NP voxel counts to histological disc degeneration grades and to NP cross-sectional area measurements. Results: NP voxel count measurements showed strong correlations to qualitative MRI analyses (R2=0.79, p<0.0001), histological degeneration grades (R2=0.902, p<0.0001) and histological NP cross-sectional area measurements (R2=0.887, p<0.0001). In contrast to NP voxel counts, the mean T2-RT for each punctured group remained constant between months 1 and 3. The mean T2-RTs for the punctured groups did not show a statistically significant difference from those of healthy IVDs (63.55 ms ± 5.88 ms at month 1 and 62.61 ms ± 5.02 ms at month 3). Conclusion: The NP voxel count proved to be a valid parameter to quantitatively assess disc degeneration in a needle-puncture model. The mean NP T2-RT does not change significantly in needle-puncture-induced degenerated IVDs. IVDs can be segmented into different tissue components according to their innate T2-RT. PMID:24384655

  19. Stereological evaluation of the volume and volume fraction of newborns' brain compartment and brain in magnetic resonance images.

    PubMed

    Nisari, Mehtap; Ertekin, Tolga; Ozçelik, Ozlem; Cınar, Serife; Doğanay, Selim; Acer, Niyazi

    2012-11-01

    Brain development in early life is thought to be a critical period in neurodevelopmental disorders, yet knowledge relating to this period is currently quite limited. This study aimed to evaluate the volumes of the total brain (TB), cerebrum, cerebellum and bulbus+pons using Archimedes' principle and the stereological (point-counting) method, and then to compare these approaches in newborns. The study was carried out on five newborn cadavers with a mean weight of 2.220 ± 1.056 g and no signs of neuropathology. The mean (±SD) age of the subjects was 39.7 (±1.5) weeks. The volume and volume fraction of the total brain, cerebrum, cerebellum and bulbus+pons were determined on magnetic resonance (MR) images using the point-counting approach of stereological methods and by the fluid displacement technique. The mean (±SD) TB, cerebrum, cerebellum and bulbus+pons volumes by fluid displacement were 271.48 ± 78.3, 256.6 ± 71.8, 12.16 ± 6.1 and 2.72 ± 1.6 cm3, respectively. By the Cavalieri principle (point-counting) using sagittal MRIs, they were 262.01 ± 74.9, 248.11 ± 68.03, 11.68 ± 6.1 and 2.21 ± 1.13 cm3, respectively. The mean (±SD) volumes by the point-counting technique using axial MR images were 288.06 ± 88.5, 275.2 ± 83.1, 19.75 ± 5.3 and 2.11 ± 0.7 cm3, respectively. There were no differences between fluid displacement and point-counting (using axial and sagittal images) for any structure (p > 0.05). This study presents basic data for studies of newborns' brain volume fractions according to the two methods. Stereological (point-counting) estimation may be accepted as a beneficial new tool for in vivo neurological evaluation of the brain. Based on the techniques we introduce here, the clinician may evaluate the growth of the brain in a more efficient and precise manner.
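
The Cavalieri point-counting estimate used above follows the standard formula V = d · a(p) · ΣP: section spacing times the area represented by one grid point times the total number of points hitting the structure. A minimal sketch with invented numbers:

```python
def cavalieri_volume(point_counts, section_spacing_cm, area_per_point_cm2):
    # Cavalieri (point-counting) volume estimate: on each equally spaced MR
    # section, count the test-grid points hitting the structure; volume is
    # spacing x area-per-point x total point count.
    return section_spacing_cm * area_per_point_cm2 * sum(point_counts)

# Hypothetical: three sections 0.5 cm apart, grid of 0.25 cm^2 per point
volume_cm3 = cavalieri_volume([10, 12, 8], 0.5, 0.25)  # 3.75 cm^3
```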

  20. The impact of varicocelectomy on sperm parameters: a meta-analysis.

    PubMed

    Schauer, Ingrid; Madersbacher, Stephan; Jost, Romy; Hübner, Wilhelm Alexander; Imhof, Martin

    2012-05-01

    We determined the impact of 3 surgical techniques (high ligation, inguinal varicocelectomy and the subinguinal approach) for varicocelectomy on sperm parameters (count and motility) and pregnancy rates. By searching the literature using MEDLINE and the Cochrane Library with the last search performed in February 2011, focusing on the last 20 years, a total of 94 articles published between 1975 and 2011 reporting on sperm parameters before and after varicocelectomy were identified. Inclusion criteria for this meta-analysis were at least 2 semen analyses (before and 3 or more months after the procedure), patient age older than 19 years, clinical subfertility and/or abnormal semen parameters, and a clinically palpable varicocele. To rule out skewing factors a bias analysis was performed, and statistical analysis was done with RevMan5(®) and SPSS 15.0(®). A total of 14 articles were included in the statistical analysis. All 3 surgical approaches led to significant or highly significant postoperative improvement of both parameters with only slight numeric differences among the techniques. This difference did not reach statistical significance for sperm count (p = 0.973) or sperm motility (p = 0.372). After high ligation surgery sperm count increased by 10.85 million per ml (p = 0.006) and motility by 6.80% (p <0.00001) on the average. Inguinal varicocelectomy led to an improvement in sperm count of 7.17 million per ml (p <0.0001) while motility changed by 9.44% (p = 0.001). Subinguinal varicocelectomy provided an increase in sperm count of 9.75 million per ml (p = 0.002) and sperm motility by 12.25% (p = 0.001). Inguinal varicocelectomy showed the highest pregnancy rate of 41.48% compared to 26.90% and 26.56% after high ligation and subinguinal varicocelectomy, respectively, and the difference was statistically significant (p = 0.035). 
This meta-analysis suggests that varicocelectomy leads to significant improvements in sperm count and motility regardless of surgical technique, with the inguinal approach offering the highest pregnancy rate. Copyright © 2012 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.

  1. A Review of Statistical Disclosure Control Techniques Employed by Web-Based Data Query Systems.

    PubMed

    Matthews, Gregory J; Harel, Ofer; Aseltine, Robert H

    We systematically reviewed the statistical disclosure control techniques employed for releasing aggregate data in Web-based data query systems listed by the National Association for Public Health Statistics and Information Systems (NAPHSIS). Each Web-based data query system was examined to see whether (1) it employed any type of cell suppression, (2) it used secondary cell suppression, and (3) suppressed cell counts could be calculated. No more than 30 minutes was spent on each system. Of the 35 systems reviewed, no suppression was observed in more than half (n = 18); counts below the threshold were displayed in 2 sites; and suppressed values were recoverable in 9 sites. Six sites effectively suppressed small counts. This inquiry revealed substantial weaknesses in the protective measures used in data query systems containing sensitive public health data. Many systems utilized no disclosure control whatsoever, and the vast majority of those that did deployed it inconsistently or inadequately.
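
Why recoverable suppressed values matter can be seen in one line: with only primary suppression, a single hidden cell in a row is recovered by subtracting the visible cells from the published margin. A generic illustration (not any reviewed system; names and numbers are invented):

```python
def recover_suppressed(row_total, visible_cells):
    # With primary suppression only, one suppressed cell per row is fully
    # determined by the published row total minus the visible cells. This
    # is exactly what secondary (complementary) suppression prevents.
    return row_total - sum(visible_cells)

# Hypothetical published row: total 50, two visible cells, one suppressed
hidden = recover_suppressed(50, [20, 27])  # the "suppressed" count is 3
```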

  2. Compact SPAD-Based Pixel Architectures for Time-Resolved Image Sensors

    PubMed Central

    Perenzoni, Matteo; Pancheri, Lucio; Stoppa, David

    2016-01-01

    This paper reviews the state of the art of single-photon avalanche diode (SPAD) image sensors for time-resolved imaging. The focus of the paper is on pixel architectures featuring small pixel size (<25 μm) and high fill factor (>20%) as a key enabling technology for the successful implementation of high spatial resolution SPAD-based image sensors. A summary of the main CMOS SPAD implementations, their characteristics and integration challenges, is provided from the perspective of targeting large pixel arrays, where one of the key drivers is the spatial uniformity. The main analog techniques aimed at time-gated photon counting and photon timestamping suitable for compact and low-power pixels are critically discussed. The main features of these solutions are the adoption of analog counting techniques and time-to-analog conversion, in NMOS-only pixels. Reliable quantum-limited single-photon counting, self-referenced analog-to-digital conversion, time gating down to 0.75 ns and timestamping with 368 ps jitter are achieved. PMID:27223284

  3. Detection of bremsstrahlung radiation of 90Sr-90Y for emergency lung counting.

    PubMed

    Ho, A; Hakmana Witharana, S S; Jonkmans, G; Li, L; Surette, R A; Dubeau, J; Dai, X

    2012-09-01

    This study explores the possibility of developing a field-deployable (90)Sr detector for rapid lung counting in emergency situations. The detection of beta-emitters (90)Sr and its daughter (90)Y inside the human lung via bremsstrahlung radiation was performed using a 3″ × 3″ NaI(Tl) crystal detector and a polyethylene-encapsulated source to emulate human lung tissue. The simulation results show that this method is a viable technique for detecting (90)Sr with a minimum detectable activity (MDA) of 1.07 × 10(4) Bq, using a realistic dual-shielded detector system in a 0.25-µGy h(-1) background field for a 100-s scan. The MDA is sufficiently sensitive to meet the requirement for emergency lung counting of Type S (90)Sr intake. The experimental data were verified using Monte Carlo calculations, including an estimate for internal bremsstrahlung, and an optimisation of the detector geometry was performed. Optimisations in background reduction techniques and in the electronic acquisition systems are suggested.
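
The abstract does not give the MDA expression the authors used; a common choice for counting systems is Currie's formula, sketched here with invented parameters (efficiency, yield, and background are placeholders, not values from the study):

```python
from math import sqrt

def currie_mda(background_counts, efficiency, count_time_s, yield_fraction=1.0):
    # Currie's minimum detectable activity (Bq) for a counting measurement:
    # MDA = (2.71 + 4.65 * sqrt(B)) / (efficiency * yield * t),
    # where B is the expected background counts in the counting interval.
    detection_limit_counts = 2.71 + 4.65 * sqrt(background_counts)
    return detection_limit_counts / (efficiency * yield_fraction * count_time_s)

# Hypothetical: 100 background counts, 10% efficiency, 100-s scan
mda_bq = currie_mda(100, 0.1, 100)
```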

  4. Counter Conjectures: Using Manipulatives to Scaffold the Development of Number Sense and Algebra

    ERIC Educational Resources Information Center

    West, John

    2016-01-01

    This article takes the position that teachers can use simple manipulative materials to model relatively complex situations and in doing so scaffold the development of students' number sense and early algebra skills. While students' early experiences are usually dominated by the cardinal aspect of number (i.e., counting the number of items in a…

  5. QUANTITATION OF MENSTRUAL BLOOD LOSS: A RADIOACTIVE METHOD UTILIZING A COUNTING DOME

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tauxe, W.N.

    A description has been given of a simple, accurate technique for the quantitation of menstrual blood loss, involving the determination of a three-dimensional isosensitivity curve and the fashioning of a lucite dome with cover to fit these specifications. Ten normal subjects lost no more than 50 ml each per menstrual period. (auth)

  6. Impact of donor- and collection-related variables on product quality in ex utero cord blood banking.

    PubMed

    Askari, Sabeen; Miller, John; Chrysler, Gayl; McCullough, Jeffrey

    2005-02-01

    Optimizing product quality is a current focus in cord blood banking. This study evaluates the role of selected donor- and collection-related variables. Retrospective review was performed of cord blood units (CBUs) collected ex utero between February 1, 2000, and February 28, 2002. Preprocessing volume and total nucleated cell (TNC) counts and postprocessing CD34 cell counts were used as product quality indicators. Of 2084 CBUs, volume determinations and TNC counts were performed on 1628 and CD34+ counts on 1124 CBUs. Mean volume and TNC and CD34+ counts were 85.2 mL, 118.9 x 10(7), and 5.2 x 10(6), respectively. In univariate analysis, placental weight of greater than 500 g and meconium in amniotic fluid correlated with better volume and TNC and CD34+ counts. Greater than 40 weeks' gestation predicted enhanced volume and TNC count. Cesarean section, two- versus one-person collection, and not greater than 5 minutes between placental delivery and collection produced superior volume. Increased TNC count was also seen in Caucasian women, primigravidae, female newborns, and collection duration of more than 5 minutes. A time between delivery of newborn and placenta of not greater than 10 minutes predicted better volume and CD34+ count. By regression analysis, collection within not greater than 5 minutes of placental delivery produced superior volume and TNC count. Donor selection and collection technique modifications may improve product quality. TNC count appears to be more affected by different variables than CD34+ count.

  7. Differential white cell count by centrifugal microfluidics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sommer, Gregory Jon; Tentori, Augusto M.; Schaff, Ulrich Y.

    We present a method for counting white blood cells that is uniquely compatible with centrifugation-based microfluidics. Blood is deposited on top of one or more layers of density media within a microfluidic disk. Spinning the disk causes the cell populations within whole blood to settle through the media, reaching an equilibrium based on the density of each cell type. Separation and fluorescence measurement of cell types stained with a DNA dye is demonstrated using this technique. The integrated signal from bands of fluorescent microspheres is shown to be proportional to their initial concentration in suspension. Among the current generation of medical diagnostics are devices based on the principle of centrifuging a CD-sized disk functionalized with microfluidics. These portable 'lab on a disk' devices are capable of conducting multiple assays directly from a blood sample, embodied by platforms developed by Gyros, Samsung, and Abaxis. [1,2] However, no centrifugal platform to date includes a differential white blood cell count, which is an important metric complementary to diagnostic assays. Measuring the differential white blood cell count (the relative fraction of granulocytes, lymphocytes, and monocytes) is a standard medical diagnostic technique useful for identifying sepsis, leukemia, AIDS, radiation exposure, and a host of other conditions that affect the immune system. Several methods exist for measuring the relative white blood cell count including flow cytometry, electrical impedance, and visual identification from a stained drop of blood under a microscope. However, none of these methods is easily incorporated into a centrifugal microfluidic diagnostic platform.

  8. Path-Following Solutions Of Nonlinear Equations

    NASA Technical Reports Server (NTRS)

    Barger, Raymond L.; Walters, Robert W.

    1989-01-01

    Report describes path-following techniques for the solution of nonlinear equations and compares them with other methods. Emphasis is on multipurpose techniques applicable at more than one stage of the path-following computation; incorporating them yields a concise computer code that is relatively simple to understand, program, and use. Comparison with the method of parametric differentiation (MPD) reveals definite advantages for the path-following methods.
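
    The core idea of path-following (continuation) can be illustrated with a minimal one-dimensional sketch: solve f(x, t) = 0 as the parameter t steps from 0 to 1, feeding each converged solution in as the starting guess for the next step. The example equation and step count below are illustrative, not from the report, and this simple natural-parameter form cannot traverse turning points the way the report's multipurpose techniques can.

```python
def newton(f, df, x, tol=1e-12, max_iter=50):
    """Newton's method for a scalar equation f(x) = 0."""
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def follow_path(f, df, x0, steps=20):
    """Natural-parameter continuation: solve f(x, t) = 0 as t steps from
    0 to 1, warm-starting each Newton solve from the previous solution."""
    x = x0
    for i in range(1, steps + 1):
        t = i / steps
        x = newton(lambda z: f(z, t), lambda z: df(z, t), x)
    return x

# Illustrative problem: x^3 + x - 10t = 0; at t = 1 the root is x = 2.
root = follow_path(lambda x, t: x**3 + x - 10*t,
                   lambda x, t: 3*x**2 + 1, 0.0)
print(round(root, 6))  # 2.0
```

Warm-starting is what keeps each Newton solve cheap: the solution at the previous parameter value is already close to the new root.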

  9. Effect of background correction on peak detection and quantification in online comprehensive two-dimensional liquid chromatography using diode array detection.

    PubMed

    Allen, Robert C; John, Mallory G; Rutan, Sarah C; Filgueira, Marcelo R; Carr, Peter W

    2012-09-07

    A singular value decomposition-based background correction (SVD-BC) technique is proposed for the reduction of background contributions in online comprehensive two-dimensional liquid chromatography (LC×LC) data. The SVD-BC technique was compared to simply subtracting a blank chromatogram from a sample chromatogram and to a previously reported background correction technique for one dimensional chromatography, which uses an asymmetric weighted least squares (AWLS) approach. AWLS was the only background correction technique to completely remove the background artifacts from the samples as evaluated by visual inspection. However, the SVD-BC technique greatly reduced or eliminated the background artifacts as well and preserved the peak intensity better than AWLS. The loss in peak intensity by AWLS resulted in lower peak counts at the detection thresholds established using standard samples. However, the SVD-BC technique was found to introduce noise which led to detection of false peaks at the lower detection thresholds. As a result, the AWLS technique gave more precise peak counts than the SVD-BC technique, particularly at the lower detection thresholds. While the AWLS technique resulted in more consistent percent residual standard deviation values, a statistical improvement in peak quantification after background correction was not found regardless of the background correction technique used. Copyright © 2012 Elsevier B.V. All rights reserved.

  10. Wild turkey poult survival in southcentral Iowa

    USGS Publications Warehouse

    Hubbard, M.W.; Garner, D.L.; Klaas, E.E.

    1999-01-01

    Poult survival is key to understanding annual change in wild turkey (Meleagris gallopavo) populations. Survival of eastern wild turkey poults (M. g. silvestris) 0-4 weeks posthatch was studied in southcentral Iowa during 1994-97. Survival estimates of poults were calculated based on biweekly flush counts and daily locations acquired via radiotelemetry. Poult survival averaged 0.52 ± 0.14 (mean ± SE) for telemetry counts and 0.40 ± 0.15 for flush counts. No within-year or across-year differences were detected between estimation techniques. More than 72% (n = 32) of documented poult mortality occurred ≤14 days posthatch, and mammalian predation accounted for 92.9% of documented mortality. If mortality agents are not of concern, we suggest biologists conduct 4-week flush counts to obtain poult survival estimates for use in population models and development of harvest recommendations.

  11. Stochastic hybrid systems for studying biochemical processes.

    PubMed

    Singh, Abhyudai; Hespanha, João P

    2010-11-13

    Many protein and mRNA species occur at low molecular counts within cells, and hence are subject to large stochastic fluctuations in copy numbers over time. Development of computationally tractable frameworks for modelling stochastic fluctuations in population counts is essential to understand how noise at the cellular level affects biological function and phenotype. We show that stochastic hybrid systems (SHSs) provide a convenient framework for modelling the time evolution of population counts of different chemical species involved in a set of biochemical reactions. We illustrate recently developed techniques that allow fast computations of the statistical moments of the population count, without having to run computationally expensive Monte Carlo simulations of the biochemical reactions. Finally, we review different examples from the literature that illustrate the benefits of using SHSs for modelling biochemical processes.
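
    The moment computations the abstract refers to can be illustrated with the simplest birth-death model of a chemical species: with constant production rate k and first-order decay rate g, the first moment (mean copy number) obeys a closed ODE, so no Monte Carlo simulation is needed. A minimal sketch, with illustrative rate constants not taken from the paper:

```python
def mean_copy_number(k, g, mu0=0.0, t_end=20.0, dt=1e-3):
    """Integrate the first-moment equation d(mu)/dt = k - g*mu by
    forward Euler. For a birth-death process this moment equation is
    exact and closed; the steady-state mean is k/g."""
    mu = mu0
    for _ in range(int(t_end / dt)):
        mu += dt * (k - g * mu)
    return mu

print(mean_copy_number(10.0, 1.0))  # approaches the steady state k/g = 10
```

For nonlinear reaction rates the moment equations do not close, which is where the moment-closure techniques developed for stochastic hybrid systems come in.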

  12. Marrow transplantation in the treatment of a murine heritable hemolytic anemia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Barker, J.E.; McFarland-Starr, E.C.

    1989-05-15

    Mice with hemolytic anemia, sphha/sphha, have extremely fragile RBCs with a lifespan of approximately one day. Neither splenectomy nor simple transplantation of normal marrow after lethal irradiation cures the anemia but instead causes rapid deterioration and death of the mutant unless additional prophylactic procedures are used. In this report, we show that normal marrow transplantation preceded by sublethal irradiation increases but does not normalize RBC count. The mutant RBCs but not all the WBCs are replaced by donor cells. Splenectomy of the improved recipient causes a dramatic decrease in RBC count, indicating that the mutant spleen is a site of donor-origin erythropoiesis as well as of RBC destruction. Injections of iron dextran did not improve RBC counts. Transplantation of primary recipient marrow cells into a secondary host with a heritable stem cell deficiency (W/Wv) corrects the defect caused by residence of the normal cells in the sphha/sphha host. The original +/+ donor cells replace the RBCs of the secondary host, and the RBC count is normalized. Results indicate that the environment in the sphha/sphha host is detrimental to normal (as well as mutant) erythroid cells but the restriction is not transmitted.

  13. Estimating abundance while accounting for rarity, correlated behavior, and other sources of variation in counts

    USGS Publications Warehouse

    Dorazio, Robert M.; Martin, Julien; Edwards, Holly H.

    2013-01-01

    The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.

  14. Estimating abundance while accounting for rarity, correlated behavior, and other sources of variation in counts.

    PubMed

    Dorazio, Robert M; Martin, Julien; Edwards, Holly H

    2013-07-01

    The class of N-mixture models allows abundance to be estimated from repeated, point count surveys while adjusting for imperfect detection of individuals. We developed an extension of N-mixture models to account for two commonly observed phenomena in point count surveys: rarity and lack of independence induced by unmeasurable sources of variation in the detectability of individuals. Rarity increases the number of locations with zero detections in excess of those expected under simple models of abundance (e.g., Poisson or negative binomial). Correlated behavior of individuals and other phenomena, though difficult to measure, increases the variation in detection probabilities among surveys. Our extension of N-mixture models includes a hurdle model of abundance and a beta-binomial model of detectability that accounts for additional (extra-binomial) sources of variation in detections among surveys. As an illustration, we fit this model to repeated point counts of the West Indian manatee, which was observed in a pilot study using aerial surveys. Our extension of N-mixture models provides increased flexibility. The effects of different sets of covariates may be estimated for the probability of occurrence of a species, for its mean abundance at occupied locations, and for its detectability.
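
    The data-generating process these two records describe can be sketched directly: a hurdle (occupancy) step, zero-truncated Poisson abundance at occupied sites, and a beta-distributed detection probability that varies survey to survey, producing the extra-binomial variation in counts. This is a simulation sketch of that model, not the authors' fitting code, and the parameter names are illustrative:

```python
import math
import random

def _poisson(lam, rng):
    """Knuth's multiplicative Poisson sampler."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulate_site_counts(psi, lam, a, b, n_surveys, rng):
    """Simulate repeated point counts at one site.

    psi  -- probability the site is occupied (hurdle part)
    lam  -- Poisson mean for abundance N at occupied sites (zero-truncated)
    a, b -- beta parameters for the survey-level detection probability,
            giving extra-binomial (beta-binomial) variation in detections
    """
    if rng.random() > psi:                  # unoccupied site: all-zero counts
        return [0] * n_surveys
    N = 0
    while N == 0:                           # zero-truncated Poisson abundance
        N = _poisson(lam, rng)
    counts = []
    for _ in range(n_surveys):
        p = rng.betavariate(a, b)           # detection prob varies by survey
        counts.append(sum(rng.random() < p for _ in range(N)))
    return counts
```

Fitting the model then amounts to recovering psi, lam, a, and b from many such sites' count histories while N and p remain unobserved.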

  15. An Anatomic and Biomechanical Comparison of Bankart Repair Configurations.

    PubMed

    Judson, Christopher H; Voss, Andreas; Obopilwe, Elifho; Dyrna, Felix; Arciero, Robert A; Shea, Kevin P

    2017-11-01

    Suture anchor repair for anterior shoulder instability can be performed using a number of different repair techniques, but none has been proven superior in terms of anatomic and biomechanical properties. Purpose/Hypothesis: The purpose was to compare the anatomic footprint coverage and biomechanical characteristics of 4 different Bankart repair techniques: (1) single row with simple sutures, (2) single row with horizontal mattress sutures, (3) double row with sutures, and (4) double row with labral tape. The hypotheses were as follows: (1) double-row techniques would improve the footprint coverage and biomechanical properties compared with single-row techniques, (2) horizontal mattress sutures would increase the footprint coverage compared with simple sutures, and (3) repair techniques with labral tape and sutures would not show different biomechanical properties. Controlled laboratory study. Twenty-four fresh-frozen cadaveric specimens were dissected. The native labrum was removed and the footprint marked and measured. Repair for each of the 4 groups was performed, and the uncovered footprint was measured using a 3-dimensional digitizer. The strength of the repair sites was assessed using a servohydraulic testing machine and a digital video system to record load to failure, cyclic displacement, and stiffness. The double-row repair techniques with sutures and labral tape covered 73.4% and 77.0% of the footprint, respectively. These percentages were significantly higher than the footprint coverage achieved by single-row repair techniques using simple sutures (38.1%) and horizontal mattress sutures (32.8%) (P < .001). The footprint coverage of the simple suture and horizontal mattress suture groups was not significantly different (P = .44). There were no significant differences in load to failure, cyclic displacement, or stiffness between the single-row and double-row groups or between the simple suture and horizontal mattress suture techniques. Likewise, there was no difference in the biomechanical properties of the double-row repair techniques with sutures versus labral tape. Double-row repair techniques provided better coverage of the native footprint of the labrum but did not provide superior biomechanical properties compared with single-row repair techniques. There was no difference in footprint coverage or biomechanical strength between the simple suture and horizontal mattress suture repair techniques. Although the double-row repair techniques had no difference in initial strength, they may improve healing in high-risk patients by improving the footprint coverage.

  16. A Next Generation Digital Counting System For Low-Level Tritium Studies (Project Report)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bowman, P.

    2016-10-03

    Since the early seventies, SRNL has pioneered low-level tritium analysis using various nuclear counting technologies and techniques. Since 1999, SRNL has successfully performed routine low-level tritium analyses with counting systems based on digital signal processor (DSP) modules developed in the late 1990s. Each of these counting systems is complex, unique to SRNL, and fully dedicated to performing routine tritium analyses of low-level environmental samples. It is time to modernize these systems due to a variety of issues including (1) age, (2) lack of direct replacement electronics modules and (3) advances in digital signal processing and computer technology. There has been considerable development in many areas associated with the enterprise of performing low-level tritium analyses. The objective of this LDRD project was to design, build, and demonstrate a Next Generation Tritium Counting System (NGTCS), while not disrupting the routine low-level tritium analyses underway in the facility on the legacy counting systems. The work involved (1) developing a test bed for building and testing new counting system hardware that does not interfere with our routine analyses, (2) testing a new counting system based on a modern, state-of-the-art DSP module, and (3) evolving the low-level tritium counter design to reflect the state of the science.

  17. Arraycount, an algorithm for automatic cell counting in microwell arrays.

    PubMed

    Kachouie, Nezamoddin; Kang, Lifeng; Khademhosseini, Ali

    2009-09-01

    Microscale technologies have emerged as a powerful tool for studying and manipulating biological systems and miniaturizing experiments. However, the lack of software complementing these techniques has made it difficult to apply them for many high-throughput experiments. This work establishes Arraycount, an approach to automatically count cells in microwell arrays. The procedure consists of fluorescent microscope imaging of cells that are seeded in microwells of a microarray system and then analyzing images via computer to recognize the array and count cells inside each microwell. To start counting, green and red fluorescent images (representing live and dead cells, respectively) are extracted from the original image and processed separately. A template-matching algorithm is proposed in which pre-defined well and cell templates are matched against the red and green images to locate microwells and cells. Subsequently, local maxima in the correlation maps are determined and local maxima maps are thresholded. At the end, the software records the cell counts for each detected microwell on the original image in high throughput. The automated counting was shown to be accurate compared with manual counting, with a difference of approximately 1-2 cells per microwell: based on cell concentration, the absolute difference between manual and automatic counting measurements was 2.5-13%.
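
    The template-matching step can be sketched in miniature. The code below is a simplified, pure-Python stand-in for the correlation-map computation described above (Arraycount itself operates on fluorescence images and adds local-maximum detection, which is omitted here): it slides a template over an image, computes normalized cross-correlation at every offset, and counts offsets whose score clears a threshold.

```python
import math

def template_match_count(image, template, threshold=0.99):
    """Count positions where the normalized cross-correlation (NCC)
    between the template and an image patch exceeds `threshold`.
    image and template are 2-D lists of numbers."""
    H, W = len(image), len(image[0])
    h, w = len(template), len(template[0])
    t = [v for row in template for v in row]
    n = len(t)
    t_mean = sum(t) / n
    t_dev = [v - t_mean for v in t]
    t_var = sum(d * d for d in t_dev)
    count = 0
    for y in range(H - h + 1):
        for x in range(W - w + 1):
            p = [image[y + i][x + j] for i in range(h) for j in range(w)]
            p_mean = sum(p) / n
            p_dev = [v - p_mean for v in p]
            p_var = sum(d * d for d in p_dev)
            if p_var == 0 or t_var == 0:
                continue                  # flat patch: NCC is undefined
            ncc = sum(a * b for a, b in zip(t_dev, p_dev)) / math.sqrt(t_var * p_var)
            if ncc > threshold:
                count += 1
    return count
```

With a strict threshold and well-separated targets, each target contributes one above-threshold offset; in practice the threshold is relaxed and local-maxima detection collapses the cluster of high scores around each target to a single count.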

  18. Evaluation of canoe surveys for anurans along the Rio Grande in Big Bend National Park, Texas

    USGS Publications Warehouse

    Jung, R.E.; Bonine, K.E.; Rosenshield, M.L.; de la Reza, A.; Raimondo, S.; Droege, S.

    2002-01-01

    Surveys for amphibians along large rivers pose monitoring and sampling problems. We used canoes at night to spotlight and listen for anurans along four stretches of the Rio Grande in Big Bend National Park, Texas, in 1998 and 1999. We explored temporal and spatial variation in amphibian counts and species richness and assessed relationships between amphibian counts and environmental variables, as well as amphibian-habitat associations along the banks of the Rio Grande. We documented seven anuran species, but Rio Grande leopard frogs (Rana berlandieri) accounted for 96% of the visual counts. Chorus surveys along the river detected similar or fewer numbers of species, but orders of magnitude fewer individuals compared to visual surveys. The number of species varied on average by 37% across monthly and nightly surveys. We found similar average coefficients of variation in counts of Rio Grande leopard frogs on monthly and nightly bases (CVs = 42-44%), suggesting that canoe surveys are a fairly precise technique for counts of this species. Numbers of Rio Grande leopard frogs observed were influenced by river gage levels and air and water temperatures, suggesting that surveys should be conducted under certain environmental conditions to maximize counts and maintain consistency. We found significant differences in species richness and bullfrog (Rana catesbeiana) counts among the four river stretches. Four rare anuran species were found along certain stretches but not others, which could represent either sampling error or unmeasured environmental or habitat differences among the river stretches. We found a greater association of Rio Grande leopard frogs with mud banks compared to rock or cliff (canyon) areas and with seepwillow and open areas compared to giant reed and other vegetation types. Canoe surveys appear to be a useful survey technique for anurans along the Rio Grande and may work for other large river systems as well.

  19. Rapid enumeration of viable bacteria by image analysis

    NASA Technical Reports Server (NTRS)

    Singh, A.; Pyle, B. H.; McFeters, G. A.

    1989-01-01

    A direct viable counting method for enumerating viable bacteria was modified and made compatible with image analysis. A comparison was made between viable cell counts determined by the spread plate method and direct viable counts obtained using epifluorescence microscopy either manually or by automatic image analysis. Cultures of Escherichia coli, Salmonella typhimurium, Vibrio cholerae, Yersinia enterocolitica and Pseudomonas aeruginosa were incubated at 35 degrees C in a dilute nutrient medium containing nalidixic acid. Filtered samples were stained for epifluorescence microscopy and analysed manually as well as by image analysis. Cells enlarged after incubation were considered viable. The viable cell counts determined using image analysis were higher than those obtained by either the direct manual count of viable cells or spread plate methods. The volume of sample filtered or the number of cells in the original sample did not influence the efficiency of the method. However, the optimal concentration of nalidixic acid (2.5-20 micrograms ml-1) and length of incubation (4-8 h) varied with the culture tested. The results of this study showed that under optimal conditions, the modification of the direct viable count method in combination with image analysis microscopy provided an efficient and quantitative technique for counting viable bacteria in a short time.

  20. Mean Platelet Volume, Red Cell Distribution Width to Platelet Count Ratio, Globulin Platelet Index, and 16 Other Indirect Noninvasive Fibrosis Scores: How Much Do Routine Blood Tests Tell About Liver Fibrosis in Chronic Hepatitis C?

    PubMed

    Thandassery, Ragesh B; Al Kaabi, Saad; Soofi, Madiha E; Mohiuddin, Syed A; John, Anil K; Al Mohannadi, Muneera; Al Ejji, Khalid; Yakoob, Rafie; Derbala, Moutaz F; Wani, Hamidullah; Sharma, Manik; Al Dweik, Nazeeh; Butt, Mohammed T; Kamel, Yasser M; Sultan, Khaleel; Pasic, Fuad; Singh, Rajvir

    2016-07-01

    Many indirect noninvasive scores to predict liver fibrosis are calculated from routine blood investigations. Only limited studies have compared their efficacy head to head. We aimed to compare these scores with liver biopsy fibrosis stages in patients with chronic hepatitis C. From blood investigations of 1602 patients with chronic hepatitis C who underwent a liver biopsy before initiation of antiviral treatment, 19 simple noninvasive scores were calculated. The area under the receiver operating characteristic curves and diagnostic accuracy of each of these scores were calculated (with reference to the Scheuer staging) and compared. The mean age of the patients was 41.8±9.6 years (1365 men). The most common genotype was genotype 4 (65.6%). Significant fibrosis, advanced fibrosis, and cirrhosis were seen in 65.1%, 25.6%, and 6.6% of patients, respectively. All the scores except the aspartate transaminase (AST) to alanine transaminase (ALT) ratio, Pohl score, mean platelet volume, fibro-alpha, and red cell distribution width to platelet count ratio index showed high predictive accuracy for the stages of fibrosis. King's score (cutoff, 17.5) showed the highest predictive accuracy for significant and advanced fibrosis. King's score, Göteborg university cirrhosis index, APRI (the AST/platelet count ratio index), and Fibrosis-4 (FIB-4) had the highest predictive accuracy for cirrhosis, with the APRI (cutoff, 2) and FIB-4 (cutoff, 3.25) showing the highest diagnostic accuracy. We derived the study score 8.5 - 0.2(albumin, g/dL) + 0.01(AST, IU/L) - 0.02(platelet count, 10(9)/L), which at a cutoff of >4.7 had a predictive accuracy of 0.868 (95% confidence interval, 0.833-0.904) for cirrhosis. King's score for significant and advanced fibrosis and the APRI or FIB-4 score for cirrhosis could be the best simple indirect noninvasive scores.
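
    Two of the scores the study highlights, APRI and FIB-4, have commonly published closed-form definitions and can be computed directly from routine blood values. A sketch using those standard formulas (the patient values in the usage lines are hypothetical, chosen to land on round numbers relative to the abstract's cirrhosis cutoffs of APRI > 2 and FIB-4 > 3.25):

```python
import math

def apri(ast, ast_uln, platelets):
    """AST-to-platelet ratio index.
    ast and ast_uln in IU/L (ast_uln = the lab's upper limit of normal);
    platelets in 10^9/L."""
    return (ast / ast_uln) * 100.0 / platelets

def fib4(age, ast, alt, platelets):
    """Fibrosis-4 index: age (years) * AST / (platelets * sqrt(ALT))."""
    return age * ast / (platelets * math.sqrt(alt))

# Hypothetical patient: AST 80 IU/L (ULN 40), ALT 64 IU/L,
# platelets 100 x 10^9/L, age 50.
print(apri(80, 40, 100))      # 2.0  (at the APRI cirrhosis cutoff)
print(fib4(50, 80, 64, 100))  # 5.0  (above the FIB-4 cutoff of 3.25)
```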

  1. On the Challenge of Interpreting Census Data: Insights from a Study of an Endangered Pinniped

    PubMed Central

    Trillmich, Fritz; Meise, Kristine; Kalberer, Stephanie; Mueller, Birte; Piedrahita, Paolo; Pörschmann, Ulrich; Wolf, Jochen B. W.; Krüger, Oliver

    2016-01-01

    Population monitoring is vital for conservation and management. However, simple counts of animals can be misleading and this problem is exacerbated in seals (pinnipeds) where individuals spend much time foraging away from colonies. We analyzed a 13-year series of census data of Galapagos sea lions (Zalophus wollebaeki) from the colony of Caamaño, an islet in the center of the Galapagos archipelago where a large proportion of animals was individually marked. Based on regular resighting efforts during the cold, reproductive (cold-R; August to January) and the warm, non-reproductive (warm-nR; February to May) season, we document changes in numbers for different sex and age classes. During the cold-R season the number of adults increased as the number of newborn pups increased. Numbers were larger in the morning and evening than around mid-day and not significantly influenced by tide levels. More adults frequented the colony during the warm-nR season than the cold-R season. Raw counts suggested a decline in numbers over the 13 years, but Lincoln-Petersen (LP) estimates (assuming a closed population) did not support that conclusion. Raw counts and LP estimates were not significantly correlated, demonstrating the overwhelming importance of variability in attendance patterns of individuals. The probability of observing a given adult in the colony varied between 16% (mean for cold-R season) and 23% (warm-nR season) and may be much less for independent 2 to 4 year olds. Dependent juveniles (up to the age of about 2 years) are observed much more frequently ashore (35% during the cold-R and 50% during the warm-nR seasons). Simple counts underestimate real population size by a factor of 4–6 and may lead to erroneous conclusions about trends in population size. PMID:27148735
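
    The Lincoln-Petersen estimator used here has a simple closed form: mark M animals, later observe a sample of C animals, count the R marked animals among them, and estimate population size from the recapture fraction. The sketch below uses Chapman's bias-corrected variant; the numbers in the usage line are illustrative, not from the study.

```python
def lincoln_petersen(marked, captured, recaptured):
    """Chapman's bias-corrected Lincoln-Petersen estimate of the size of
    a closed population: N_hat = (M+1)(C+1)/(R+1) - 1, where M animals
    were marked, C were captured/resighted later, and R of those were
    marked."""
    return (marked + 1) * (captured + 1) / (recaptured + 1) - 1

# Illustrative: 100 marked, 80 resighted, 20 of the resightings marked.
print(round(lincoln_petersen(100, 80, 20), 1))  # 388.6
```

The closed-population assumption is exactly what the authors flag: when attendance varies (animals foraging at sea), raw counts and LP estimates can diverge sharply.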

  2. Computer Simulation of the Population Growth (Schizosaccharomyces Pombe) Experiment.

    ERIC Educational Resources Information Center

    Daley, Michael; Hillier, Douglas

    1981-01-01

    Describes a computer program (available from authors) developed to simulate "Growth of a Population (Yeast) Experiment." Students actively revise the counting techniques with realistically simulated haemocytometer or eye-piece grid and are reminded of the necessary dilution technique. Program can be modified to introduce such variables…

  3. "Both Answers Make Sense!"

    ERIC Educational Resources Information Center

    Lockwood, Elise

    2014-01-01

    Formulas, problem types, keywords, and tricky techniques can certainly be valuable tools for successful counters. However, they can easily become substitutes for critical thinking about counting problems and for deep consideration of the set of outcomes. Formulas and techniques should serve as tools for students as they think critically about…

  4. Acta Aeronautica et Astronautica Sinica,

    DTIC Science & Technology

    1983-03-04

    power spectrum and counting methods [1,2,3]. If the stochastic load-time mechanism (such as gusts of wind, random vibrations, etc.), then we can use the power spectrum technique, and we can also use the counting method. However, the...simplification for treatment so that the differences in obtained results are very minute, and are also closest to the random spectrum. This then tells us

  5. Automatic detection and counting of cattle in UAV imagery based on machine vision technology (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Rahnemoonfar, Maryam; Foster, Jamie; Starek, Michael J.

    2017-05-01

    Beef production is the main agricultural industry in Texas, and livestock are managed on pasture and rangeland that are usually huge in size and not easily accessible by vehicles. The current research method for livestock location identification and counting is visual observation, which is very time consuming and costly. For animals on large tracts of land, manned aircraft may be necessary to count animals; these are noisy, disturb the animals, and may introduce a source of error into counts. Such manual approaches are expensive, slow and labor intensive. In this paper we study the combination of small unmanned aerial vehicle (sUAV) and machine vision technology as a valuable solution to manual animal surveying. A fixed-wing UAV fitted with GPS and a digital RGB camera for photogrammetry was flown at the Welder Wildlife Foundation in Sinton, TX. Over 600 acres were flown with four UAS flights, and individual photographs were used to develop orthomosaic imagery. To detect animals in UAV imagery, a fully automatic technique was developed based on spatial and spectral characteristics of objects. This automatic technique can even detect small animals that are partially occluded by bushes. Experimental results in comparison to ground-truth show the effectiveness of our algorithm.

  6. Bayesian analysis of energy and count rate data for detection of low count rate radioactive sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klumpp, John

    We propose a radiation detection system which generates its own discrete sampling distribution based on past measurements of background. The advantage to this approach is that it can take into account variations in background with respect to time, location, energy spectra, detector-specific characteristics (i.e. different efficiencies at different count rates and energies), etc. This would therefore be a 'machine learning' approach, in which the algorithm updates and improves its characterization of background over time. The system would have a 'learning mode,' in which it measures and analyzes background count rates, and a 'detection mode,' in which it compares measurements from an unknown source against its unique background distribution. By characterizing and accounting for variations in the background, general purpose radiation detectors can be improved with little or no increase in cost. The statistical and computational techniques to perform this kind of analysis have already been developed. The necessary signal analysis can be accomplished using existing Bayesian algorithms which account for multiple channels, multiple detectors, and multiple time intervals. Furthermore, Bayesian machine-learning techniques have already been developed which, with trivial modifications, can generate appropriate decision thresholds based on the comparison of new measurements against a nonparametric sampling distribution. (authors)
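
    The "learning mode / detection mode" idea can be reduced to a small sketch: accumulate an empirical (nonparametric) distribution of background counts per counting interval, then flag any new measurement that exceeds a quantile-based decision threshold. This is only the skeleton of the approach; the multi-channel, multi-detector Bayesian machinery the abstract describes is not reproduced here, and the class and parameter names are illustrative.

```python
import bisect

class BackgroundMonitor:
    """Nonparametric background characterization with a quantile-based
    decision threshold (a minimal sketch of the learning/detection idea)."""

    def __init__(self, false_alarm_rate=0.01):
        self.false_alarm_rate = false_alarm_rate
        self._samples = []                     # sorted background counts

    def learn(self, count):
        """Learning mode: record one background measurement."""
        bisect.insort(self._samples, count)

    def threshold(self):
        """Decision threshold = (1 - false_alarm_rate) empirical quantile
        of the learned background distribution."""
        k = int((1.0 - self.false_alarm_rate) * len(self._samples))
        return self._samples[min(k, len(self._samples) - 1)]

    def is_alarm(self, count):
        """Detection mode: compare a new measurement to the threshold."""
        return count > self.threshold()
```

Because the threshold comes from the measured background itself, it automatically tracks site- and detector-specific variation instead of assuming a fixed Poisson background.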

  7. Ocular Biocompatibility of Nitinol Intraocular Clips

    PubMed Central

    Velez-Montoya, Raul; Erlanger, Michael

    2012-01-01

    Purpose. To evaluate the tolerance and biocompatibility of a preformed nitinol intraocular clip in an animal model after anterior segment surgery. Methods. Yucatan mini-pigs were used. A 30-gauge prototype injector was used to attach a shape memory nitinol clip to the iris of five pigs. Another five eyes received conventional polypropylene suture with a modified Seipser slip knot. The authors compared the surgical time of each technique. All eyes underwent standard full-field electroretinogram at baseline and 8 weeks after surgery. The animals were euthanized and eyes collected for histologic analysis after 70 days (10 weeks) postsurgery. The corneal thickness, corneal endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram parameters were compared between the groups. A two-sample t-test for means and a P value of 0.05 were used for assessing statistical differences between measurements. Results. The injection of the nitinol clip was 15 times faster than conventional suturing. There were no statistical differences between the groups for corneal thickness, endothelial cell counts, specular microscopy parameters, retina cell counts, and electroretinogram measurements. Conclusions. The nitinol clip prototype is well tolerated and showed no evidence of toxicity in the short-term. The injectable delivery system was faster and technically less challenging than conventional suture techniques. PMID:22064995

  8. On the use of positron counting for radio-Assay in nuclear pharmaceutical production.

    PubMed

    Maneuski, D; Giacomelli, F; Lemaire, C; Pimlott, S; Plenevaux, A; Owens, J; O'Shea, V; Luxen, A

    2017-07-01

    Current techniques for the measurement of radioactivity at various points during PET radiopharmaceutical production and R&D are based on the detection of the annihilation gamma rays from the radionuclide in the labelled compound. The detection systems to measure these gamma rays are usually variations of NaI or CsF scintillation based systems requiring costly and heavy lead shielding to reduce background noise. These detectors inherently suffer from low detection efficiency, high background noise and very poor linearity. They are also unable to provide any reasonably useful position information. A novel positron counting technique is proposed for the radioactivity assay during radiopharmaceutical manufacturing that overcomes these limitations. Detection of positrons instead of gammas offers an unprecedented level of position resolution of the radiation source (down to sub-mm) thanks to the nature of the positron interaction with matter. Counting capability instead of charge integration in the detector brings the sensitivity down to the statistical limits at the same time as offering very high dynamic range and linearity from zero to any arbitrarily high activity. This paper reports on a quantitative comparison between conventional detector systems and the proposed positron counting detector. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  9. Estimating the burden of recurrent events in the presence of competing risks: the method of mean cumulative count.

    PubMed

    Dong, Huiru; Robison, Leslie L; Leisenring, Wendy M; Martin, Leah J; Armstrong, Gregory T; Yasui, Yutaka

    2015-04-01

    Cumulative incidence has been widely used to estimate the cumulative probability of developing an event of interest by a given time, in the presence of competing risks. When it is of interest to measure the total burden of recurrent events in a population, however, the cumulative incidence method is not appropriate because it considers only the first occurrence of the event of interest for each individual in the analysis: subsequent occurrences are not included. Here, we discuss a straightforward and intuitive method termed "mean cumulative count," which summarizes all events that occur in the population by a given time, not just the first event for each subject. We explore the mathematical relationship between mean cumulative count and cumulative incidence. Detailed calculation of mean cumulative count is described using a simple hypothetical example, and the computation code with an illustrative example is provided. Using follow-up data from January 1975 to August 2009 collected in the Childhood Cancer Survivor Study, we show applications of mean cumulative count and cumulative incidence for the outcome of subsequent neoplasms to demonstrate the different but complementary information obtained from the 2 approaches and the specific utility of the former. © The Author 2015. Published by Oxford University Press on behalf of the Johns Hopkins Bloomberg School of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
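
    The complete-follow-up case of this estimator can be sketched in a few lines of Python. This is an illustrative toy (the function name and event data are invented, and the weighting needed under censoring or competing-risk death is deliberately omitted), not the authors' published code.

```python
# Mean cumulative count (MCC) in the simplest setting: every subject is
# followed for the whole period, so MCC(t) is the total number of events
# (including recurrences) observed by time t, divided by the initial
# cohort size. Cumulative incidence, by contrast, would count at most
# one event per subject.

def mean_cumulative_count(event_times, n_subjects, t):
    """Total events observed by time t per subject initially at risk."""
    return sum(1 for e in event_times if e <= t) / n_subjects

# Five subjects; all their event times pooled into one list
# (two subjects have recurrences).
events = [1.0, 2.0, 2.0, 3.5, 4.0, 4.0, 4.0]
print(mean_cumulative_count(events, 5, 3.0))  # 3 events / 5 subjects = 0.6
```

    With censoring, each event increment would additionally be weighted by the estimated probability of still being under observation at that time, which is the refinement the paper develops.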

  10. Vehicle counting system using real-time video processing

    NASA Astrophysics Data System (ADS)

    Crisóstomo-Romero, Pedro M.

    2006-02-01

    Transit studies are important for planning a road network with optimal vehicular flow, and a vehicular count is essential. This article presents a vehicle counting system based on video processing. An advantage of such a system is the greater detail it can obtain, such as the shape, size and speed of vehicles. The system uses a video camera placed above the street to image transit in real time. The video camera must be placed at least 6 meters above street level to achieve proper acquisition quality. Fast image-processing algorithms and small image dimensions are used to allow real-time processing. Digital filters, mathematical morphology, segmentation and other techniques allow identifying and counting all vehicles in the image sequences. The system was implemented under Linux on a 1.8 GHz Pentium 4 computer. A successful count was obtained at frame rates of 15 frames per second for images of size 240x180 pixels and 24 frames per second for images of size 180x120 pixels, thus being able to count vehicles whose speeds do not exceed 150 km/h.
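
    The segmentation-and-count stage of such a pipeline can be illustrated with a minimal connected-component pass over a binary foreground mask (as produced by background subtraction and morphological cleanup). This is a generic pure-Python sketch with an invented test frame, not the article's optimized implementation.

```python
# Count distinct foreground blobs (candidate vehicles) in a binary mask
# using 4-connected component labelling via iterative flood fill.

def count_blobs(mask):
    """mask: list of lists of 0/1. Returns the number of 4-connected blobs."""
    rows, cols = len(mask), len(mask[0])
    seen = [[False] * cols for _ in range(rows)]
    blobs = 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and not seen[r][c]:
                blobs += 1
                stack = [(r, c)]          # flood-fill this component
                while stack:
                    y, x = stack.pop()
                    if 0 <= y < rows and 0 <= x < cols \
                            and mask[y][x] and not seen[y][x]:
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return blobs

frame = [[0, 1, 1, 0, 0],
         [0, 1, 1, 0, 1],
         [0, 0, 0, 0, 1],
         [1, 0, 0, 0, 0]]
print(count_blobs(frame))  # 3 separate blobs
```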

  11. SUBMILLIMETER GALAXY NUMBER COUNTS AND MAGNIFICATION BY GALAXY CLUSTERS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lima, Marcos; Jain, Bhuvnesh; Devlin, Mark

    2010-07-01

    We present an analytical model that reproduces measured galaxy number counts from surveys in the wavelength range of 500 μm-2 mm. The model involves a single high-redshift galaxy population with a Schechter luminosity function that has been gravitationally lensed by galaxy clusters in the mass range 10^13-10^15 M_sun. This simple model reproduces both the low-flux and the high-flux end of the number counts reported by the BLAST, SCUBA, AzTEC, and South Pole Telescope (SPT) surveys. In particular, our model accounts for the most luminous galaxies detected by SPT as the result of high magnifications by galaxy clusters (magnification factors of 10-30). This interpretation implies that submillimeter (submm) and millimeter surveys of this population may prove to be a useful addition to ongoing cluster detection surveys. The model also implies that the bulk of submm galaxies detected at wavelengths larger than 500 μm lie at redshifts greater than 2.
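
    The bright-end boost invoked here follows from the standard count transformation under magnification: flux is stretched (S -> μS) while solid angle is diluted by μ, so the observed differential counts are n_obs(S) = n(S/μ)/μ². The sketch below evaluates this for a Schechter-like flux distribution; all parameter values are illustrative, not fitted to any survey.

```python
# Near an exponential bright-end cutoff, mapping S -> S/mu moves sources
# back onto the shallow part of the luminosity function, so even the
# 1/mu^2 dilution cannot prevent a large boost in bright counts.
import math

def schechter_counts(S, S_star=1.0, alpha=-2.0, n0=1.0):
    """Differential counts dN/dS for a Schechter-like flux function."""
    x = S / S_star
    return n0 * x**alpha * math.exp(-x)

def lensed_counts(S, mu, **kw):
    """Observed counts after magnification mu: n(S/mu)/mu^2."""
    return schechter_counts(S / mu, **kw) / mu**2

S_bright = 10.0   # well beyond the cutoff S_star
boost = lensed_counts(S_bright, mu=20.0) / schechter_counts(S_bright)
print(boost > 1)  # strong lensing boosts the bright-end counts
```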

  12. Wavelets, ridgelets, and curvelets for Poisson noise removal.

    PubMed

    Zhang, Bo; Fadili, Jalal M; Starck, Jean-Luc

    2008-07-01

    In order to denoise Poisson count data, we introduce a variance stabilizing transform (VST) applied on a filtered discrete Poisson process, yielding a near-Gaussian process with asymptotically constant variance. This new transform, which can be viewed as an extension of the Anscombe transform to filtered data, is simple, fast, and efficient in (very) low-count situations. We combine this VST with the filter banks of wavelets, ridgelets and curvelets, leading to multiscale VSTs (MS-VSTs) and nonlinear decomposition schemes. By doing so, the noise-contaminated coefficients of these MS-VST-modified transforms are asymptotically normally distributed with known variances. A classical hypothesis-testing framework is adopted to detect the significant coefficients, and a sparsity-driven iterative scheme properly reconstructs the final estimate. A range of examples shows the power of this MS-VST approach for recovering important structures of various morphologies in (very) low-count images. These results also demonstrate that the MS-VST approach is competitive relative to many existing denoising methods.
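
    The classical Anscombe transform that the MS-VST extends can be checked numerically in a few lines. This stdlib-only sketch (sample size and rate are arbitrary) verifies that A(x) = 2·sqrt(x + 3/8) maps Poisson counts of moderate mean to values with variance close to 1.

```python
# Variance stabilization of Poisson data with the Anscombe transform.
# Raw Poisson counts have variance equal to their mean; after the
# transform the variance is approximately 1, independent of the mean.
import math
import random
import statistics

def anscombe(x):
    return 2.0 * math.sqrt(x + 3.0 / 8.0)

def poisson_sample(lam, rng):
    """Knuth's multiplication algorithm; fine for moderate lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

rng = random.Random(0)
counts = [poisson_sample(20.0, rng) for _ in range(50_000)]
stabilized = [anscombe(c) for c in counts]
print(round(statistics.pvariance(counts), 1))      # near the mean, ~20
print(round(statistics.pvariance(stabilized), 2))  # near 1 after the VST
```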

  13. Pile-up corrections for high-precision superallowed β decay half-life measurements via γ-ray photopeak counting

    NASA Astrophysics Data System (ADS)

    Grinyer, G. F.; Svensson, C. E.; Andreoiu, C.; Andreyev, A. N.; Austin, R. A. E.; Ball, G. C.; Bandyopadhyay, D.; Chakrawarthy, R. S.; Finlay, P.; Garrett, P. E.; Hackman, G.; Hyland, B.; Kulp, W. D.; Leach, K. G.; Leslie, J. R.; Morton, A. C.; Pearson, C. J.; Phillips, A. A.; Sarazin, F.; Schumaker, M. A.; Smith, M. B.; Valiente-Dobón, J. J.; Waddington, J. C.; Williams, S. J.; Wong, J.; Wood, J. L.; Zganjar, E. F.

    2007-09-01

    A general technique that corrects γ-ray gated β decay-curve data for detector pulse pile-up is presented. The method includes corrections for non-zero time-resolution and energy-threshold effects in addition to a special treatment of saturating events due to cosmic rays. This technique is verified through a Monte Carlo simulation and experimental data using radioactive beams of ²⁶Na implanted at the center of the 8π γ-ray spectrometer at the ISAC facility at TRIUMF in Vancouver, Canada. The β-decay half-life of ²⁶Na obtained from counting 1809-keV γ-ray photopeaks emitted by the daughter ²⁶Mg was determined to be T1/2 = 1.07167 ± 0.00055 s following a 27σ correction for detector pulse pile-up. This result is in excellent agreement with the result of a previous measurement that employed direct β counting and demonstrates the feasibility of high-precision β-decay half-life measurements through the use of high-purity germanium γ-ray detectors. The technique presented here, while motivated by superallowed-Fermi β decay studies, is general and can be used for all half-life determinations (e.g. α-, β-, X-ray, fission) in which a γ-ray photopeak is used to select the decays of a particular isotope.
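
    After pile-up correction, extracting the half-life reduces to fitting an exponential to the photopeak decay curve: on a semilog scale the decay is a straight line, so a log-linear least-squares fit gives the decay constant. The sketch below uses noiseless synthetic data and the half-life quoted above; it is illustrative only, and the paper's point is precisely that pile-up must be corrected before such a fit can be trusted at the 0.05% level.

```python
# Recover a half-life from decay-curve data by least-squares fitting
# ln(counts) versus time; the slope is -lambda and T1/2 = ln(2)/lambda.
import math

def fit_half_life(times, counts):
    """Least-squares slope of ln(counts) vs t gives -lambda."""
    n = len(times)
    mt = sum(times) / n
    ml = sum(math.log(c) for c in counts) / n
    slope = (sum((t - mt) * (math.log(c) - ml)
                 for t, c in zip(times, counts))
             / sum((t - mt) ** 2 for t in times))
    return math.log(2.0) / -slope

t_half_true = 1.07167  # s, the 26Na result quoted above
lam = math.log(2.0) / t_half_true
times = [0.1 * i for i in range(1, 60)]
counts = [1e6 * math.exp(-lam * t) for t in times]  # noiseless curve
print(round(fit_half_life(times, counts), 5))  # recovers 1.07167
```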

  14. Evaluation of Am–Li neutron spectra data for active well type neutron multiplicity measurements of uranium

    DOE PAGES

    Goddard, Braden; Croft, Stephen; Lousteau, Angela; ...

    2016-05-25

    Safeguarding nuclear material is an important and challenging task for the international community. One particular safeguards technique commonly used for uranium assay is active neutron correlation counting. This technique involves irradiating unused uranium with (α,n) neutrons from an Am-Li source and recording the resultant neutron pulse signal, which includes induced fission neutrons. Although this non-destructive technique is widely employed in safeguards applications, the neutron energy spectrum from an Am-Li source is not well known. Several measurements over the past few decades have been made to characterize this spectrum; however, little work has been done comparing the measured spectra of various Am-Li sources to each other. This paper examines fourteen different Am-Li spectra, focusing on how these spectra affect simulated neutron multiplicity results using the code Monte Carlo N-Particle eXtended (MCNPX). Two measurement and simulation campaigns were completed using Active Well Coincidence Counter (AWCC) detectors and uranium standards of varying enrichment. The results of this work indicate that for standard AWCC measurements, the fourteen Am-Li spectra produce similar doubles and triples count rates. The singles count rates varied by as much as 20% between the different spectra, although they are usually not used in quantitative analysis.

  15. Diagnostic performance of T lymphocyte subpopulations in assessment of liver fibrosis stages in hepatitis C virus patients: simple noninvasive score.

    PubMed

    Toson, El-Shatat A; Shiha, Gamal E; El-Mezayen, Hatem A; El-Sharkawy, Aml M

    2016-08-01

    Evaluation of liver fibrosis in patients infected with hepatitis C virus is highly useful for the diagnosis of the disease as well as for therapeutic decisions. Our aim was to develop and validate a simple noninvasive score for liver fibrosis staging in chronic hepatitis C (CHC) patients and compare its performance against three published simple noninvasive indexes. CHC patients were divided into two groups: an estimation group (n=70) and a validation group (n=52). Liver fibrosis was staged in biopsies using the Metavir score system. CD4 and CD8 counts/percentages were assayed by fluorescence-activated cell sorting analysis. Multivariate discriminant analysis selected a function based on the absolute values of five biochemical markers, the immune fibrosis index (IFI): score = 3.07 + 3.06×CD4/CD8 + 0.02×α-fetoprotein (U/l) - 0.07×alanine aminotransferase ratio - 0.005×platelet count (10⁹/l) - 1.4×albumin (g/dl). The IFI score produced areas under the curve of 0.949, 0.947, and 0.806 for differentiation of all patient categories [significant fibrosis (F2-F4), advanced fibrosis (F3-F4), and cirrhosis (F4)]. The IFI score, a novel noninvasive test, can be used easily for the prediction of liver fibrosis stage in CHC patients. Our score was more efficient than the aspartate aminotransferase to platelet ratio index, fibrosis index, and FibroQ, and more suitable for use in Egyptian hepatitis C virus patients.
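
    The discriminant function above is a plain linear combination, so it transcribes directly to code. The sketch below assumes the units given in the abstract (AFP in U/l, platelets in 10⁹/l, albumin in g/dl); the input values are illustrative, not patient data, and the function name is invented.

```python
# IFI score as published in the abstract:
# IFI = 3.07 + 3.06*(CD4/CD8) + 0.02*AFP
#       - 0.07*ALT_ratio - 0.005*platelets - 1.4*albumin

def ifi_score(cd4_cd8, afp, alt_ratio, platelets, albumin):
    return (3.07 + 3.06 * cd4_cd8 + 0.02 * afp
            - 0.07 * alt_ratio - 0.005 * platelets - 1.4 * albumin)

# Illustrative inputs: CD4/CD8 ratio 1.5, AFP 10 U/l, ALT ratio 2,
# platelets 200 x 10^9/l, albumin 4.0 g/dl.
print(round(ifi_score(1.5, 10.0, 2.0, 200.0, 4.0), 2))  # 1.12
```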

  16. The effect of microchannel plate gain depression on PAPA photon counting cameras

    NASA Astrophysics Data System (ADS)

    Sams, Bruce J., III

    1991-03-01

    PAPA (precision analog photon address) cameras are photon counting imagers which employ microchannel plates (MCPs) for image intensification. They have been used extensively in astronomical speckle imaging. The PAPA camera can produce artifacts when light incident on its MCP is highly concentrated. The effect is exacerbated by adjusting the strobe detection level too low, so that the camera accepts very small MCP pulses. The artifacts can occur even at low total count rates if the image has a highly concentrated bright spot. This paper describes how to optimize PAPA camera electronics, and describes six techniques that can avoid or minimize addressing errors.

  17. Garment Counting in a Textile Warehouse by Means of a Laser Imaging System

    PubMed Central

    Martínez-Sala, Alejandro Santos; Sánchez-Aartnoutse, Juan Carlos; Egea-López, Esteban

    2013-01-01

    Textile logistic warehouses are highly automated, mechanized places where control points are needed to count and validate the number of garments in each batch. This paper proposes and describes a low-cost, small-size automated system designed to count the number of garments by processing an image of the corresponding hanger hooks generated using an array of phototransistor sensors and a linear laser beam. The generated image is processed using computer vision techniques to infer the number of garment units. The system has been tested in two logistic warehouses with a mean error in the estimated number of hangers of 0.13%. PMID:23628760

  18. Garment counting in a textile warehouse by means of a laser imaging system.

    PubMed

    Martínez-Sala, Alejandro Santos; Sánchez-Aartnoutse, Juan Carlos; Egea-López, Esteban

    2013-04-29

    Textile logistic warehouses are highly automated, mechanized places where control points are needed to count and validate the number of garments in each batch. This paper proposes and describes a low-cost, small-size automated system designed to count the number of garments by processing an image of the corresponding hanger hooks generated using an array of phototransistor sensors and a linear laser beam. The generated image is processed using computer vision techniques to infer the number of garment units. The system has been tested in two logistic warehouses with a mean error in the estimated number of hangers of 0.13%.
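
    The counting principle can be sketched in one dimension: each hanger hook occludes the laser line over a run of adjacent phototransistors, so counting contiguous below-threshold runs in the sensor profile counts hooks. The function name, threshold, and signal below are invented; the deployed system builds a full 2-D image and applies computer-vision segmentation.

```python
# Count units from a 1-D sensor profile by counting contiguous runs
# where the signal drops below a threshold (one run per occluding hook).

def count_hooks(profile, threshold):
    hooks, in_run = 0, False
    for v in profile:
        if v < threshold and not in_run:
            hooks, in_run = hooks + 1, True
        elif v >= threshold:
            in_run = False
    return hooks

signal = [9, 9, 2, 1, 9, 9, 3, 2, 2, 9, 9, 1, 9]  # three occlusions
print(count_hooks(signal, threshold=5))  # 3
```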

  19. Simple and accurate methods for quantifying deformation, disruption, and development in biological tissues

    PubMed Central

    Boyle, John J.; Kume, Maiko; Wyczalkowski, Matthew A.; Taber, Larry A.; Pless, Robert B.; Xia, Younan; Genin, Guy M.; Thomopoulos, Stavros

    2014-01-01

    When mechanical factors underlie growth, development, disease or healing, they often function through local regions of tissue where deformation is highly concentrated. Current optical techniques to estimate deformation can lack precision and accuracy in such regions due to challenges in distinguishing a region of concentrated deformation from an error in displacement tracking. Here, we present a simple and general technique for improving the accuracy and precision of strain estimation and an associated technique for distinguishing a concentrated deformation from a tracking error. The strain estimation technique improves accuracy relative to other state-of-the-art algorithms by directly estimating strain fields without first estimating displacements, resulting in a very simple method and low computational cost. The technique for identifying local elevation of strain enables for the first time the successful identification of the onset and consequences of local strain concentrating features such as cracks and tears in a highly strained tissue. We apply these new techniques to demonstrate a novel hypothesis in prenatal wound healing. More generally, the analytical methods we have developed provide a simple tool for quantifying the appearance and magnitude of localized deformation from a series of digital images across a broad range of disciplines. PMID:25165601

  20. Micro-computed tomography in murine models of cerebral cavernous malformations as a paradigm for brain disease.

    PubMed

    Girard, Romuald; Zeineddine, Hussein A; Orsbon, Courtney; Tan, Huan; Moore, Thomas; Hobson, Nick; Shenkar, Robert; Lightle, Rhonda; Shi, Changbin; Fam, Maged D; Cao, Ying; Shen, Le; Neander, April I; Rorrer, Autumn; Gallione, Carol; Tang, Alan T; Kahn, Mark L; Marchuk, Douglas A; Luo, Zhe-Xi; Awad, Issam A

    2016-09-15

    Cerebral cavernous malformations (CCMs) are hemorrhagic brain lesions for which murine models have allowed major mechanistic discoveries, enabling genetic manipulation and preclinical assessment of therapies. Histology for lesion counting and morphometry is essential yet tedious and time consuming. We herein describe the application and validation of X-ray micro-computed tomography (micro-CT), a non-destructive technique allowing three-dimensional CCM lesion counts and volumetric measurements, in transgenic murine brains. We describe a new contrast soaking technique not previously applied to murine models of CCM disease. A volumetric segmentation and image processing paradigm allowed histologic correlations and quantitative validations not previously reported with the micro-CT technique in brain vascular disease. Twenty-two hyper-dense areas on micro-CT images, identified as CCM lesions, were matched by histology. The inter-rater reliability analysis showed strong consistency in CCM lesion identification and staging (κ=0.89, p<0.0001) between the two techniques. Micro-CT revealed a 29% greater CCM lesion detection efficiency and an 80% improvement in time efficiency. Serial integrated lesional area by histology showed a strong positive correlation with micro-CT estimated volume (r²=0.84, p<0.0001). Micro-CT allows high-throughput assessment of lesion count and volume in preclinical murine models of CCM. This approach complements histology with improved accuracy and efficiency, and can be applied for lesion burden assessment in other brain diseases. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Simple performance evaluation of pulsed spontaneous parametric down-conversion sources for quantum communications.

    PubMed

    Smirr, Jean-Loup; Guilbaud, Sylvain; Ghalbouni, Joe; Frey, Robert; Diamanti, Eleni; Alléaume, Romain; Zaquine, Isabelle

    2011-01-17

    Fast characterization of pulsed spontaneous parametric down conversion (SPDC) sources is important for applications in quantum information processing and communications. We propose a simple method to perform this task, which only requires measuring the counts on the two output channels and the coincidences between them, as well as modeling the filter used to reduce the source bandwidth. The proposed method is experimentally tested and used for a complete evaluation of SPDC sources (pair emission probability, total losses, and fidelity) of various bandwidths. This method can find applications in the setting up of SPDC sources and in the continuous verification of the quality of quantum communication links.
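
    In the low pair-emission-probability limit with uncorrelated losses, the three measured rates (singles on each channel and coincidences between them) determine the channel transmissions and the pair rate through the standard Klyshko-style relations: η1 = C/S2, η2 = C/S1, and R_pair = S1·S2/C. The sketch below illustrates that inversion on forward-simulated numbers; it is deliberately simpler than the paper's method, which additionally models the spectral filter.

```python
# Invert singles and coincidence rates into SPDC source parameters
# (Klyshko relations, valid for low pair probability per pulse).

def characterize_pairs(s1, s2, c):
    """s1, s2: singles rates (Hz); c: coincidence rate (Hz)."""
    eta1, eta2 = c / s2, c / s1     # channel transmissions
    pair_rate = s1 * s2 / c         # pairs generated per second
    return eta1, eta2, pair_rate

# Forward-simulated check: 1e6 pairs/s seen through 20% and 10%
# transmission gives S1 = 200 kHz, S2 = 100 kHz, C = 20 kHz.
eta1, eta2, pair_rate = characterize_pairs(s1=200_000.0, s2=100_000.0,
                                           c=20_000.0)
print(eta1, eta2, pair_rate)  # 0.2 0.1 1000000.0
```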

  2. Dressed tunneling approximation for electronic transport through molecular transistors

    NASA Astrophysics Data System (ADS)

    Seoane Souto, R.; Yeyati, A. Levy; Martín-Rodero, A.; Monreal, R. C.

    2014-02-01

    A theoretical approach for the nonequilibrium transport properties of nanoscale systems coupled to metallic electrodes with strong electron-phonon interactions is presented. It consists of a resummation of the dominant Feynman diagrams from the perturbative expansion in the coupling to the leads. We show that this scheme eliminates the main pathologies found in previous simple analytical approaches for the polaronic regime. The results for the spectral and transport properties are compared with those from several other approaches for a wide range of parameters. The method can be formulated in a simple way to obtain the full counting statistics. Results for the shot and thermal noise are presented.

  3. Book review: Bird census techniques, Second edition

    USGS Publications Warehouse

    Sauer, John R.

    2002-01-01

    Conservation concerns, federal mandates to monitor birds, and citizen science programs have spawned a variety of surveys that collect information on bird populations. Unfortunately, all too frequently these surveys are poorly designed and use inappropriate counting methods. Some of the flawed approaches reflect a lack of understanding of statistical design; many ornithologists simply are not aware that many of our most entrenched counting methods (such as point counts) cannot appropriately be used in studies that compare densities of birds over space and time. It is likely that most of the readers of The Condor have participated in a bird population survey that has been criticized for poor sampling methods. For example, North American readers may be surprised to read in Bird Census Techniques that the North American Breeding Bird Survey 'is seriously flawed in its design,' and that 'Analysis of trends is impossible from points that are positioned along roads' (p. 109). Our conservation efforts are at risk if we do not acknowledge these concerns and improve our survey designs. Other surveys suffer from a lack of focus. In Bird Census Techniques, the authors emphasize that all surveys require clear statements of objectives and an understanding of appropriate survey designs to meet their objectives. Too often, we view survey design as the realm of ornithologists who know the life histories and logistical issues relevant to counting birds. This view reflects pure hubris: survey design is a collaboration between ornithologists, statisticians, and managers, in which goals based on management needs are met by applying statistical principles for design to the biological context of the species of interest. Poor survey design is often due to exclusion of some of these partners from survey development. 
Because ornithologists are too frequently unaware of these issues, books such as Bird Census Techniques take on added importance as manuals for educating ornithologists about the relevance of survey design and methods and the often subtle interdisciplinary nature of surveys. Review info: Bird Census Techniques, Second Edition. By Colin J. Bibby, Neil D. Burgess, David A. Hill, and Simon H. Mustoe. 2000. Academic Press, London, UK. xvii + 302 pp. ISBN 0-12-095831-7.

  4. Hemodynamic and inflammatory responses following transumbilical and transthoracic lung wedge resection in a live canine model.

    PubMed

    Lu, Hung-Yi; Chu, Yen; Wu, Yi-Cheng; Liu, Chien-Ying; Hsieh, Ming-Ju; Chao, Yin-Kai; Wu, Ching-Yang; Yuan, Hsu-Chia; Ko, Po-Jen; Liu, Yun-Hen; Liu, Hui-Ping

    2015-04-01

    Single-port transumbilical surgery is a well-established platform for minimally invasive abdominal surgery. The aim of this study was to compare the hemodynamics and inflammatory response of a novel transumbilical technique with those of a conventional transthoracic technique in thoracic exploration and lung resection in a canine model. Sixteen dogs were randomly assigned to undergo transumbilical thoracoscopy (n = 8) or standard thoracoscopy (n = 8). Animals in the umbilical group received lung resection via a 3-cm transumbilical incision in combination with a 2.5-cm transdiaphragmatic incision. Animals in the standard thoracoscopy group underwent lung resection via a 3-cm thoracic incision. Hemodynamic parameters (e.g., mean arterial pressure, heart rate, cardiac index, systemic vascular resistance, and global end-diastolic volume index) and inflammatory parameters (e.g., neutrophil count, neutrophil 2',7'-dichlorodihydrofluorescein [DCFH] expression, monocyte count, monocyte inducible nitric oxide synthase expression, total lymphocyte count, CD4+ and CD8+ lymphocyte counts, the CD4+/CD8+ ratio, plasma C-reactive protein level, and interleukin-6 level) were evaluated before surgery, during the operation, and on postoperative days 1, 3, 7, and 14. Lung resections were successfully performed in all 16 animals. There were 2 surgery-related deaths (1 animal in each group). In the transumbilical group, 1 death was caused by early extubation before the animal had fully recovered from anesthesia. In the thoracoscopy group, 1 death was caused by respiratory distress and the complication of sepsis 5 days after surgery. There was no significant difference between the two techniques with regard to the hemodynamic and immunologic impact of the surgeries. 
This study suggests that the hemodynamic and inflammatory changes with endoscopic lung resection performed by the transumbilical approach are comparable to those after using the conventional transthoracic approach. This information is novel and relevant for surgeons interested in developing new surgical techniques in minimally invasive surgery. Copyright © 2015 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  5. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.

  6. Contrast-enhanced spectral mammography based on a photon-counting detector: quantitative accuracy and radiation dose

    NASA Astrophysics Data System (ADS)

    Lee, Seungwan; Kang, Sooncheol; Eom, Jisoo

    2017-03-01

    Contrast-enhanced mammography has been used to demonstrate functional information about a breast tumor by injecting contrast agents. However, a conventional technique with a single exposure degrades the efficiency of tumor detection due to overlapping structures. Dual-energy techniques with energy-integrating detectors (EIDs) also cause an increase in radiation dose and inaccuracy in material decomposition due to the limitations of EIDs. On the other hand, spectral mammography with photon-counting detectors (PCDs) is able to resolve the issues induced by the conventional technique and EIDs using their energy-discrimination capabilities. In this study, contrast-enhanced spectral mammography based on a PCD was implemented by using a polychromatic dual-energy model, and the proposed technique was compared with the dual-energy technique with an EID in terms of quantitative accuracy and radiation dose. The results showed that the proposed technique improved quantitative accuracy as well as reduced radiation dose compared to the dual-energy technique with an EID. The quantitative accuracy of the contrast-enhanced spectral mammography based on a PCD improved slightly as a function of radiation dose. Therefore, contrast-enhanced spectral mammography based on a PCD is able to provide useful information for detecting breast tumors and improving diagnostic accuracy.
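
    The core of two-material decomposition behind any dual-energy or two-bin spectral technique can be shown in linearized form: with effective attenuation coefficients for each material in each energy bin, the two measured line integrals form a 2x2 linear system that inverts to material thicknesses. The coefficients below are made-up illustrative numbers, and a real polychromatic model (as used in the paper) replaces this monochromatic linearization.

```python
# Two-bin material decomposition: solve M @ [t1, t2] = [a_low, a_high]
# for the thicknesses t1, t2 of two basis materials, where M holds the
# effective attenuation coefficients per bin.

def decompose(a_low, a_high, M):
    (m11, m12), (m21, m22) = M
    det = m11 * m22 - m12 * m21
    t1 = (a_low * m22 - m12 * a_high) / det
    t2 = (m11 * a_high - a_low * m21) / det
    return t1, t2

# Illustrative mu (1/cm) of [tissue, contrast agent] in low/high bins.
M = [[0.50, 4.0],
     [0.30, 1.2]]
t_tissue, t_contrast = 4.0, 0.05  # ground-truth thicknesses (cm)
a_low = M[0][0] * t_tissue + M[0][1] * t_contrast
a_high = M[1][0] * t_tissue + M[1][1] * t_contrast
print(decompose(a_low, a_high, M))  # recovers the thicknesses (up to rounding)
```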

  7. Use of Surveillance Data on HIV Diagnoses with HIV-Related Symptoms to Estimate the Number of People Living with Undiagnosed HIV in Need of Antiretroviral Therapy

    PubMed Central

    van Sighem, Ard; Sabin, Caroline A.; Phillips, Andrew N.

    2015-01-01

    Background It is important to have methods available to estimate the number of people who have undiagnosed HIV and are in need of antiretroviral therapy (ART). Methods The method uses the concept that a predictable level of occurrence of AIDS or other HIV-related clinical symptoms which lead to presentation for care, and hence diagnosis of HIV, arises in undiagnosed people with a given CD4 count. The method requires surveillance data on numbers of new HIV diagnoses with HIV-related symptoms, and the CD4 count at diagnosis. The CD4 count-specific rate at which HIV-related symptoms develop is estimated from cohort data. 95% confidence intervals can be constructed using a simple simulation method. Results For example, if there were 13 HIV diagnoses with HIV-related symptoms made in one year with CD4 count at diagnosis between 150–199 cells/mm3, then since the CD4 count-specific rate of HIV-related symptoms is estimated as 0.216 per person-year, the estimated number of person years lived in people with undiagnosed HIV with CD4 count 150–199 cells/mm3 is 13/0.216 = 60 (95% confidence interval: 29–100), which is considered an estimate of the number of people living with undiagnosed HIV in this CD4 count stratum. Conclusions The method is straightforward to implement within a short period once a surveillance system of all new HIV diagnoses, collecting data on HIV-related symptoms at diagnosis, is in place and is most suitable for estimating the number of undiagnosed people with CD4 count <200 cells/mm3 due to the low rate of developing HIV-related symptoms at higher CD4 counts. A potential source of bias is under-diagnosis and under-reporting of diagnoses with HIV-related symptoms. Although this method has limitations as with all approaches, it is important for prompting increased efforts to identify undiagnosed people, particularly those with low CD4 count, and for informing levels of unmet need for ART. PMID:25768925

  8. Use of surveillance data on HIV diagnoses with HIV-related symptoms to estimate the number of people living with undiagnosed HIV in need of antiretroviral therapy.

    PubMed

    Lodwick, Rebecca K; Nakagawa, Fumiyo; van Sighem, Ard; Sabin, Caroline A; Phillips, Andrew N

    2015-01-01

    It is important to have methods available to estimate the number of people who have undiagnosed HIV and are in need of antiretroviral therapy (ART). The method uses the concept that a predictable level of occurrence of AIDS or other HIV-related clinical symptoms which lead to presentation for care, and hence diagnosis of HIV, arises in undiagnosed people with a given CD4 count. The method requires surveillance data on numbers of new HIV diagnoses with HIV-related symptoms, and the CD4 count at diagnosis. The CD4 count-specific rate at which HIV-related symptoms develop is estimated from cohort data. 95% confidence intervals can be constructed using a simple simulation method. For example, if there were 13 HIV diagnoses with HIV-related symptoms made in one year with CD4 count at diagnosis between 150-199 cells/mm3, then since the CD4 count-specific rate of HIV-related symptoms is estimated as 0.216 per person-year, the estimated number of person years lived in people with undiagnosed HIV with CD4 count 150-199 cells/mm3 is 13/0.216 = 60 (95% confidence interval: 29-100), which is considered an estimate of the number of people living with undiagnosed HIV in this CD4 count stratum. The method is straightforward to implement within a short period once a surveillance system of all new HIV diagnoses, collecting data on HIV-related symptoms at diagnosis, is in place and is most suitable for estimating the number of undiagnosed people with CD4 count <200 cells/mm3 due to the low rate of developing HIV-related symptoms at higher CD4 counts. A potential source of bias is under-diagnosis and under-reporting of diagnoses with HIV-related symptoms. Although this method has limitations as with all approaches, it is important for prompting increased efforts to identify undiagnosed people, particularly those with low CD4 count, and for informing levels of unmet need for ART.
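
    The worked example above translates directly to code: the point estimate is diagnoses divided by the CD4-specific symptom rate, and a simple simulation interval follows from Poisson resampling of the diagnosis count. The resampling step below is a sketch of the kind of simulation described, not the authors' exact procedure.

```python
# Estimate undiagnosed person-years in a CD4 stratum and a simulation CI.
import math
import random

def undiagnosed_person_years(diagnoses, symptom_rate):
    """Point estimate: symptomatic diagnoses / CD4-specific symptom rate."""
    return diagnoses / symptom_rate

def poisson_sample(lam, rng):
    """Knuth's multiplication algorithm; fine for moderate lam."""
    L, k, p = math.exp(-lam), 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def simulation_ci(diagnoses, symptom_rate, n_sim=10_000, seed=1):
    rng = random.Random(seed)
    draws = sorted(poisson_sample(diagnoses, rng) / symptom_rate
                   for _ in range(n_sim))
    return draws[int(0.025 * n_sim)], draws[int(0.975 * n_sim)]

est = undiagnosed_person_years(13, 0.216)
lo, hi = simulation_ci(13, 0.216)
print(round(est))     # 60, matching the worked example in the abstract
print(lo < est < hi)  # the simulation interval brackets the estimate
```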

  9. Neonatal nucleated red blood cell counts in small-for-gestational age fetuses with abnormal umbilical artery Doppler studies.

    PubMed

    Bernstein, P S; Minior, V K; Divon, M Y

    1997-11-01

    The presence of elevated nucleated red blood cell counts in neonatal blood has been associated with fetal hypoxia. We sought to determine whether small-for-gestational-age fetuses with abnormal umbilical artery Doppler velocity waveforms have elevated nucleated red blood cell counts. Hospital charts of neonates with the discharge diagnosis of small for gestational age (birth weight < 10th percentile) who were delivered between October 1988 and June 1995 were reviewed for antepartum testing, delivery conditions, and neonatal outcome. We studied fetuses who had an umbilical artery systolic/diastolic ratio within 3 days of delivery and a complete blood cell count on the first day of life. Multiple gestations, anomalous fetuses, and infants of diabetic mothers were excluded. Statistical analysis included the Student t test, chi-square analysis, analysis of variance, and simple and stepwise regression. Fifty-two infants met the inclusion criteria. Those with absent or reversed end-diastolic velocity (n = 19) had significantly greater nucleated red blood cell counts than did those with end-diastolic velocity present (n = 33) (nucleated red blood cells/100 nucleated cells +/- SD: 135.5 +/- 138 vs 17.4 +/- 23.7, p < 0.0001). These infants exhibited significantly longer time intervals for clearance of nucleated red blood cells from their circulation (p < 0.0001). They also had lower birth weights (p < 0.05), lower initial platelet counts (p = 0.0006), lower arterial cord blood pH (p < 0.05), higher cord blood base deficit (p < 0.05), and an increased likelihood of cesarean section for "fetal distress" (p < 0.05). Multivariate analysis demonstrated that absent or reversed end-diastolic velocity (p < 0.0001) and low birth weight (p < 0.0001) contributed to the elevation of the nucleated red blood cell count, whereas gestational age at delivery was not a significant contributor. 
We observed significantly greater nucleated red blood cell counts and lower platelet counts in small-for-gestational-age fetuses with abnormal umbilical artery Doppler studies. This may suggest that antenatal thrombotic events lead to an increased placental impedance. Fetal response to this chronic condition may result in an increased nucleated red blood cell count.

  10. [Identification of Env-specific monoclonal antibodies from Chinese HIV-1 infected person by magnetic beads separating B cells and single cell RT-PCR cloning].

    PubMed

    Huang, Xiang-Ying; Yu, Shuang-Qing; Cheng, Zhan; Ye, Jing-Rong; Xu, Ke; Feng, Xia; Zeng, Yi

    2013-04-01

    To establish a simple and practical method for screening Env-specific monoclonal antibodies from HIV-1 infected individuals. Human B cells were purified by negative sorting from PBMCs, and memory B cells were further enriched using anti-CD27 microbeads. Biotin-labeled gp120 antigen was incubated with the memory B cells to specifically bind IgG on the cell membrane. The memory B cells expressing Env-specific antibody were harvested by magnetic bead separation, counted, and diluted to a single cell per PCR well preloaded with catch buffer containing RNase inhibitor to recover the RNA. The antibody genes were amplified by single-cell RT-PCR and nested PCR, cloned into eukaryotic expression vectors, and transfected into 293T cells. The binding activity of the recombinant antibodies to Env was tested by ELISA. Three monoclonal Env-specific antibodies were isolated from one HIV-1 infected individual. Env-specific antibodies can thus be obtained using biotin-labeled antigen and magnetic bead separation coupled with single-cell RT-PCR and expression cloning.

  11. Inference with minimal Gibbs free energy in information field theory.

    PubMed

    Ensslin, Torsten A; Weig, Cornelius

    2010-11-01

    Non-linear and non-gaussian signal inference problems are difficult to tackle. Renormalization techniques permit us to construct good estimators for the posterior signal mean within information field theory (IFT), but the approximations and assumptions made are not very obvious. Here we introduce the simple concept of minimal Gibbs free energy to IFT, and show that previous renormalization results emerge naturally. They can be understood as being the gaussian approximation to the full posterior probability, which has maximal cross information with it. We derive optimized estimators for three applications, to illustrate the usage of the framework: (i) reconstruction of a log-normal signal from poissonian data with background counts and point spread function, as it is needed for gamma ray astronomy and for cosmography using photometric galaxy redshifts, (ii) inference of a gaussian signal with unknown spectrum, and (iii) inference of a poissonian log-normal signal with unknown spectrum, the combination of (i) and (ii). Finally we explain how gaussian knowledge states constructed by the minimal Gibbs free energy principle at different temperatures can be combined into a more accurate surrogate of the non-gaussian posterior.
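    The variational principle named in the title can be sketched in standard variational-inference notation (a hedged reconstruction, not the paper's exact formulas): for an approximating distribution $Q$ over the signal $s$ given data $d$,

```latex
G[Q] \;=\; U[Q] \;-\; T\,S[Q], \qquad
U[Q] \;=\; \langle H(d,s)\rangle_Q, \qquad
H(d,s) \;=\; -\ln P(d,s),
```

    where $S[Q]$ is the entropy of $Q$. At temperature $T = 1$ this becomes $G[Q] = \mathrm{KL}\left(Q \,\|\, P(s|d)\right) - \ln P(d)$, so minimizing the Gibbs free energy over Gaussian $Q$ selects the Gaussian with minimal Kullback-Leibler divergence from (equivalently, maximal cross information with) the full posterior.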

  12. Efficient Levenberg-Marquardt minimization of the maximum likelihood estimator for Poisson deviates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Laurence, T; Chromy, B

    2009-11-10

    Histograms of counted events are Poisson distributed, but are typically fitted without justification using nonlinear least squares fitting. The more appropriate maximum likelihood estimator (MLE) for Poisson distributed data is seldom used. We extend the use of the Levenberg-Marquardt algorithm commonly used for nonlinear least squares minimization for use with the MLE for Poisson distributed data. In so doing, we remove any excuse for not using this more appropriate MLE. We demonstrate the use of the algorithm and the superior performance of the MLE using simulations and experiments in the context of fluorescence lifetime imaging. Scientists commonly form histograms of counted events from their data, and extract parameters by fitting to a specified model. Assuming that the probability of occurrence for each bin is small, event counts in the histogram bins will be distributed according to the Poisson distribution. We develop here an efficient algorithm for fitting event counting histograms using the maximum likelihood estimator (MLE) for Poisson distributed data, rather than the non-linear least squares measure. This algorithm is a simple extension of the common Levenberg-Marquardt (L-M) algorithm and is simple to implement, quick, and robust. Fitting using a least squares measure is most common, but it is the maximum likelihood estimator only for Gaussian-distributed data. Non-linear least squares methods may be applied to event counting histograms in cases where the number of events is very large, so that the Poisson distribution is well approximated by a Gaussian. However, this criterion, which requires a large number of events, is not easy to satisfy in practice. It has been well known for years that least squares procedures lead to biased results when applied to Poisson-distributed data; a recent paper provides an extensive characterization of these biases in exponential fitting. 
The more appropriate measure based on the maximum likelihood estimator (MLE) for the Poisson distribution is also well known, but has not become generally used. This is primarily because, in contrast to non-linear least squares fitting, there has been no quick, robust, and general fitting method. In the field of fluorescence lifetime spectroscopy and imaging, there have been some efforts to use this estimator through minimization routines such as Nelder-Mead optimization, exhaustive line searches, and Gauss-Newton minimization. Minimization based on specific one- or multi-exponential models has been used to obtain quick results, but this procedure does not allow the incorporation of the instrument response, and is not generally applicable to models found in other fields. Methods for using the MLE for Poisson-distributed data have been published by the wider spectroscopic community, including iterative minimization schemes based on Gauss-Newton minimization. The slow acceptance of these procedures for fitting event counting histograms may also be explained by the ubiquity of the fast Levenberg-Marquardt (L-M) fitting procedure for fitting non-linear models using least squares (simple searches turn up approximately 10,000 references, not counting those who use it without knowing they are using it). The benefits of L-M include a seamless transition between Gauss-Newton minimization and downward gradient minimization through the use of a regularization parameter. This transition is desirable because Gauss-Newton methods converge quickly, but only within a limited domain of convergence; downward gradient methods, on the other hand, have a much wider domain of convergence but converge extremely slowly near the minimum. L-M has the advantages of both procedures: relative insensitivity to initial parameters and rapid convergence. Scientists, when wanting an answer quickly, will fit data using L-M, get an answer, and move on. 
Only those who are aware of the bias issues will bother to fit using the more appropriate MLE for Poisson deviates. However, since there is a simple analytical formula for the appropriate MLE measure for Poisson deviates, it is inexcusable that least squares estimators are used almost exclusively when fitting event counting histograms. Ways have been found to use successive non-linear least squares fits to obtain similarly unbiased results, but this procedure is justified only by simulation, must be re-tested when conditions change significantly, and requires two successive fits. There is a great need for a fitting routine for the MLE for Poisson deviates with convergence domains and rates comparable to those of non-linear least squares L-M fitting. We show in this report that a simple way to achieve that goal is to use the L-M fitting procedure to minimize not the least squares measure, but the MLE for Poisson deviates.
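    The core idea, minimizing the Poisson deviance (the MLE-appropriate measure) with a Levenberg-Marquardt-style damped Newton step, can be sketched as follows. This is an illustrative reimplementation for a single-exponential decay histogram, not the authors' code; the Fisher-information approximation to the Hessian and all parameter values are assumptions.

```python
import numpy as np

def poisson_deviance(n, m):
    """MLE-appropriate fitting measure for Poisson-distributed counts;
    bins with n = 0 contribute m only."""
    with np.errstate(divide="ignore", invalid="ignore"):
        term = np.where(n > 0, n * np.log(n / m), 0.0)
    return 2.0 * np.sum(m - n + term)

def fit_exponential_mle(t, n, A0, tau0, iters=200):
    """L-M-style damped Newton minimization of the Poisson deviance for a
    single-exponential model m(t) = A * exp(-t / tau)."""
    A, tau, lam = A0, tau0, 1e-3
    best = poisson_deviance(n, A * np.exp(-t / tau))
    for _ in range(iters):
        m = np.maximum(A * np.exp(-t / tau), 1e-12)
        J = np.stack([m / A, m * t / tau ** 2], axis=1)  # dm/dA, dm/dtau
        g = 2.0 * J.T @ (1.0 - n / m)                    # gradient of deviance
        H = 2.0 * (J / m[:, None]).T @ J                 # Fisher (expected Hessian)
        step = np.linalg.solve(H + lam * np.diag(np.diag(H)), -g)
        A_try, tau_try = A + step[0], tau + step[1]
        if A_try > 0 and tau_try > 0:
            d = poisson_deviance(n, A_try * np.exp(-t / tau_try))
            if d < best:  # accept the step, reduce damping
                A, tau, best, lam = A_try, tau_try, d, lam / 10.0
                continue
        lam *= 10.0   # reject the step and increase damping
    return A, tau

rng = np.random.default_rng(0)
t = np.arange(0.0, 10.0, 0.1)
n = rng.poisson(50.0 * np.exp(-t / 2.0))   # simulated histogram, A = 50, tau = 2
A_hat, tau_hat = fit_exponential_mle(t, n, A0=30.0, tau0=1.0)
```

    The damping parameter plays the usual L-M role: it shrinks toward a Gauss-Newton step when the deviance decreases and grows toward a small gradient step when it does not.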

  13. Proton radiography in three dimensions: A proof of principle of a new technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Raytchev, Milen; Seco, Joao

    2013-10-15

    Purpose: Monte Carlo simulations were used to investigate a range of phantom configurations to establish enabling three-dimensional proton radiographic techniques. Methods: A large parameter space of stacked phantom geometries composed of tissue inhomogeneity materials such as lung, bone, and cartilage inserted within a water background was simulated using a purposefully modified version of TOPAS, an application running on top of the GEANT4 Monte Carlo code. The phantoms were grouped in two classes, one with the inhomogeneity inserted only half-way in the lateral direction and another with complete inhomogeneity insertion. The former class was used to calculate the track count and the energy fluence of the protons as they exit the phantoms, either having traversed the inhomogeneity or not. The latter class was used to calculate one yield value accounting for loss of protons due to physical processes only and another yield value accounting for deliberately discarded protons due to large scattering angles. A graphical fingerprinting method was developed to determine the inhomogeneity thickness and location within the phantom based on track count and energy fluence information. Two additional yield values extended this method to the general case, which also determines the inhomogeneity material and the phantom thickness. Results: The graphical fingerprinting method was manually validated for two, and automatically tested for all, tissue materials using an exhaustive set of inhomogeneity geometries for 16 cm thick phantoms. Unique recognition of test phantom configurations was achieved in the large majority of cases. The method in the general case was further tested using an exhaustive set of inhomogeneity and phantom tissues and geometries where the phantom thicknesses ranged between 8 and 24 cm. Unique recognition of the test phantom configurations was achieved only for part of the phantom parameter space. 
The correlations between the remaining false positive recognitions were analyzed. Conclusions: The concept of 3D proton radiography for tissue inhomogeneities of simple geometries was established with the current work. In contrast to conventional 2D proton radiography, the main objective of the demonstrated 3D technique is not to measure proton range; rather, it is to measure the depth and thickness of an inhomogeneity located in an imaged geometry. Further work is needed to extend and apply the method to more complex geometries.

  14. Using Google Scholar Citations to Support the Impact of Scholarly Work

    ERIC Educational Resources Information Center

    Pitney, William A.; Gilson, Todd A.

    2012-01-01

    Athletic training faculty seeking tenure and promotion, or simply undergoing an annual merit review, may need an understanding of the impact of their scholarly work. To that end, citation counts are frequently used as a measure of impact that a journal article has had in a given discipline. As compared to the simple quantity of publications, the…

  15. The Combinatorial Trace Method in Action

    ERIC Educational Resources Information Center

    Krebs, Mike; Martinez, Natalie C.

    2013-01-01

    On any finite graph, the number of closed walks of length k is equal to the sum of the kth powers of the eigenvalues of any adjacency matrix. This simple observation is the basis for the combinatorial trace method, wherein we attempt to count (or bound) the number of closed walks of a given length so as to obtain information about the graph's…

  16. Indicators as Judgment Devices: An Empirical Study of Citizen Bibliometrics in Research Evaluation

    ERIC Educational Resources Information Center

    Hammarfelt, Björn; Rushforth, Alexander D.

    2017-01-01

    A researcher's number of publications has been a fundamental merit in the competition for academic positions since the late 18th century. Today, the simple counting of publications has been supplemented with a whole range of bibliometric indicators, which supposedly not only measures the volume of research but also its impact. In this study, we…

  17. A singlechip-computer-controlled conductivity meter based on conductance-frequency transformation

    NASA Astrophysics Data System (ADS)

    Chen, Wenxiang; Hong, Baocai

    2005-02-01

    A portable conductivity meter controlled by a single-chip computer was designed. The instrument uses a conductance-to-frequency transformation method to measure the conductivity of a solution. The circuitry is simple and reliable. Another feature of the instrument is that temperature compensation is realized by changing the counting time of the timing counter. The theoretical basis and the use of this temperature compensation are described.
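    The compensation scheme described above can be sketched numerically. The oscillator gain and the 2%/°C temperature coefficient below are illustrative assumptions, not the instrument's actual values; the point is only that shortening the gate time by the same factor that conductivity rises with temperature makes the count read directly in reference-temperature units.

```python
# Hypothetical constants (assumptions for illustration only)
K_HZ_PER_USCM = 10.0   # oscillator output: 10 Hz per uS/cm of conductance
ALPHA = 0.02           # ~2%/degC temperature coefficient of ionic conductivity
T_REF = 25.0           # reference temperature, degC

def gate_time(temp_c, base_gate_s=1.0):
    """Temperature compensation by adjusting the counting time: at temp_c the
    raw conductance is (1 + ALPHA*(temp_c - T_REF)) times its 25 degC value,
    so counting for a proportionally shorter time cancels the excess."""
    return base_gate_s / (1.0 + ALPHA * (temp_c - T_REF))

def displayed_conductivity(sigma_25, temp_c):
    """Simulate the measurement chain: oscillator frequency -> timed count."""
    f = K_HZ_PER_USCM * sigma_25 * (1.0 + ALPHA * (temp_c - T_REF))
    count = f * gate_time(temp_c)       # pulses accumulated by the timing counter
    return count / K_HZ_PER_USCM        # reads back the 25 degC conductivity
```

    With this scheme the displayed value is independent of the solution temperature, to first order in the linear temperature model assumed here.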

  18. The value of crossdating to retain high-frequency variability, climate signals, and extreme events in environmental proxies

    Treesearch

    Bryan A. Black; Daniel Griffin; Peter van der Sleen; Alan D. Wanamaker; James H. Speer; David C. Frank; David W. Stahle; Neil Pederson; Carolyn A. Copenheaver; Valerie Trouet; Shelly Griffin; Bronwyn M. Gillanders

    2016-01-01

    High-resolution biogenic and geologic proxies in which one increment or layer is formed per year are crucial to describing natural ranges of environmental variability in Earth's physical and biological systems. However, dating controls are necessary to ensure temporal precision and accuracy; simple counts cannot ensure that all layers are placed correctly in time...

  19. Counting Parasites: Using Shrimp to Teach Students about Estimation

    ERIC Educational Resources Information Center

    Gunzburger, Lindsay; Curran, Mary Carla

    2013-01-01

    Estimation is an important skill that we rely on every day for simple tasks, such as providing food for a dinner party or arriving at an appointment on time. Despite its importance, most people have never been formally taught how to estimate. Estimation can also be a vital tool for scientific inquiry. We have created an activity designed to teach…

  20. Arthroscopic Medial Meniscus Posterior Root Fixation Using a Modified Mason-Allen Stitch.

    PubMed

    Chung, Kyu Sung; Ha, Jeong Ku; Ra, Ho Jong; Kim, Jin Goo

    2016-02-01

    A complete radial tear of the meniscus posterior root, which can effectively cause a state of total meniscectomy via loss of hoop tension, requires that the torn root be repaired. Several methods have been used to repair medial meniscus posterior root tears, most of which are based on a simple stitch technique that is known to have limited stitch-holding strength. We applied a modified version of the Mason-Allen stitch technique, which is well recognized as a method for rotator cuff repair surgery, because its locking effect overcomes the potential weakness of simple stitches. This article introduces a medial meniscus posterior root tear repair procedure based on a modified Mason-Allen stitch in which 2 strands (i.e., 1 simple horizontal and 1 simple vertical stitch) are used.

  1. A combined model based on spleen stiffness measurement and Baveno VI criteria to rule out high-risk varices in advanced chronic liver disease.

    PubMed

    Colecchia, Antonio; Ravaioli, Federico; Marasco, Giovanni; Colli, Agostino; Dajti, Elton; Di Biase, Anna Rita; Bacchi Reggiani, Maria Letizia; Berzigotti, Annalisa; Pinzani, Massimo; Festi, Davide

    2018-05-03

    Recently, Baveno VI guidelines suggested that esophagogastroduodenoscopy (EGD) can be avoided in patients with compensated advanced chronic liver disease (cACLD) who have a liver stiffness measurement (LSM) <20 kPa and platelet count >150,000/mm³. We aimed to assess the performance of spleen stiffness measurement (SSM) in ruling out patients with high-risk varices (HRV), to validate the Baveno VI criteria in a large population, and to assess how the sequential use of the Baveno VI criteria and SSM could safely avoid the need for endoscopy. We retrospectively analyzed 498 patients with cACLD, referred to our tertiary centre from 2012 to 2016, who had undergone LSM/SSM by transient elastography (TE, FibroScan®), platelet count, and EGD. The new combined model was validated internally by a split-validation method, and externally in a prospective multicentre cohort of 115 patients. SSM, LSM, platelet count and Child-Pugh-B were independent predictors of HRV. Applying the newly identified SSM cut-off (≤46 kPa) or Baveno VI criteria, 35.8% and 21.7% of patients in the internal validation cohort could have avoided EGD, with only 2% of HRVs being missed with either model. The combination of SSM with Baveno VI criteria would have avoided an additional 22.5% of EGDs, reaching a final value of 43.8% spared EGDs, with <5% missed HRVs. Results were confirmed in the prospective external validation cohort, as the combined Baveno VI/SSM ≤46 model would have safely spared (0 HRV missed) 37.4% of EGDs, compared to 16.5% when using the Baveno VI criteria alone. A non-invasive prediction model combining SSM with Baveno VI criteria may be useful to rule out HRV and could make it possible to avoid a significantly larger number of unnecessary EGDs compared to Baveno VI criteria only. 
Spleen stiffness measurement assessed by transient elastography, the most widely used elastography technique, is a non-invasive technique that can help the physician to better stratify the degree of portal hypertension and the risk of esophageal varices in patients with compensated advanced chronic liver disease. Performing spleen stiffness measurement together with liver stiffness measurement during the same examination is simple and fast and this sequential model can identify a greater number of patients that can safely avoid endoscopy, which is an invasive and expensive examination. Crown Copyright © 2018. Published by Elsevier B.V. All rights reserved.
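    The sequential rule-out logic reported in the abstract can be written as a small decision function. This is an illustrative sketch using the published thresholds only, not a clinical tool:

```python
def can_avoid_egd(lsm_kpa, platelets_per_mm3, ssm_kpa):
    """Sequential rule-out sketch from the abstract's thresholds:
    Baveno VI (LSM < 20 kPa and platelets > 150,000/mm3) first,
    then spleen stiffness (SSM <= 46 kPa). Either criterion spares EGD."""
    baveno_vi = lsm_kpa < 20.0 and platelets_per_mm3 > 150_000
    return baveno_vi or ssm_kpa <= 46.0
```

    Patients passing the Baveno VI criteria are spared directly; the SSM cut-off then spares an additional fraction of those who fail it, which is how the combined model reaches 43.8% spared EGDs versus 21.7% for Baveno VI alone.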

  2. SU-E-I-42: Some New Aspects of the Energy Weighting Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ganezer, K; Krmar, M; Josipovic, I

    2015-06-15

    Purpose: The development in the last few years of photon-counting pixel detectors creates a significant opportunity for x-ray spectroscopy to be applied in diagnostics. The energy weighting technique was originally developed to obtain the maximum benefit from the spectroscopic information. In all previously published papers, the concept of an energy weighting function was tested on relatively simple test objects having only two materials with different absorption coefficients. Methods: In this study the shape of the energy weighting function was investigated with a set of ten trabecular bone test objects, each with a similar geometry and structure but with different attenuation properties. In previous publications it was determined that the function E^-3 was a very good choice for the weighting function (wi). Results: The most important result of this study was the discovery that a single function of the form E^-b was not sufficient to explain the energy dependence of the different types of materials that might be used to describe common bodily tissues such as trabecular bone. It was also determined from the data contained in this paper that the exponent b is often significantly dependent upon the attenuation properties of the materials used to make the test objects. Conclusion: Studies of the attenuation properties will be useful in further studies involving energy weighting.
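    The energy weighting idea itself is simple to sketch: instead of counting every detected photon equally, each count is weighted by a function of its energy before summation. The spectra below are illustrative numbers, not data from the study:

```python
import numpy as np

def energy_weighted_counts(energies_kev, counts, b=3.0):
    """Energy-weighting sketch: weight each count by w(E) = E**-b before
    summing, instead of simple (unweighted) counting. b = 3 is the exponent
    reported as a good choice in earlier work; the study's point is that one
    fixed b does not fit all materials."""
    return np.sum(energies_kev ** -b * counts)

e = np.array([20.0, 40.0, 60.0, 80.0])    # bin energies, keV (illustrative)
n_open = np.array([100, 400, 900, 1200])   # counts without the object
n_bone = np.array([60, 350, 870, 1180])    # counts behind the attenuating object

contrast_simple = (n_open.sum() - n_bone.sum()) / n_open.sum()
contrast_weighted = (energy_weighted_counts(e, n_open)
                     - energy_weighted_counts(e, n_bone)) / energy_weighted_counts(e, n_open)
```

    Because attenuation contrast is largest at low energies, the E^-b weighting emphasizes the low-energy bins and typically improves contrast over simple counting.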

  3. Detecting trends in raptor counts: power and type I error rates of various statistical tests

    USGS Publications Warehouse

    Hatfield, J.S.; Gould, W.R.; Hoover, B.A.; Fuller, M.R.; Lindquist, E.L.

    1996-01-01

    We conducted simulations that estimated power and type I error rates of statistical tests for detecting trends in raptor population count data collected from a single monitoring site. Results of the simulations were used to help analyze count data of bald eagles (Haliaeetus leucocephalus) from 7 national forests in Michigan, Minnesota, and Wisconsin during 1980-1989. Seven statistical tests were evaluated, including simple linear regression on the log scale and linear regression with a permutation test. Using 1,000 replications each, we simulated n = 10 and n = 50 years of count data and trends ranging from -5 to 5% change/year. We evaluated the tests at 3 critical levels (alpha = 0.01, 0.05, and 0.10) for both upper- and lower-tailed tests. Exponential count data were simulated by adding sampling error with a coefficient of variation of 40% from either a log-normal or autocorrelated log-normal distribution. Not surprisingly, tests performed with 50 years of data were much more powerful than tests with 10 years of data. Positive autocorrelation inflated alpha-levels upward from their nominal levels, making the tests less conservative and more likely to reject the null hypothesis of no trend. Of the tests studied, Cox and Stuart's test and Pollard's test clearly had lower power than the others. Surprisingly, the linear regression t-test, Collins' linear regression permutation test, and the nonparametric Lehmann's and Mann's tests all had similar power in our simulations. Analyses of the count data suggested that bald eagles had increasing trends on at least 2 of the 7 national forests during 1980-1989.
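    The simulation design described above (an upper-tailed t-test on a log-scale linear regression, log-normal sampling error with a 40% coefficient of variation, 1,000 replications) can be sketched as follows; the implementation details are illustrative, not the authors' code:

```python
import numpy as np

rng = np.random.default_rng(1)

def reject_trend(counts, crit=1.397):
    """Upper-tailed t-test on the slope of a simple linear regression of
    log(count) on year; crit = t_{0.90, df=8} for n = 10 years of data."""
    y = np.log(counts)
    x = np.arange(y.size, dtype=float)
    n = y.size
    sxx = np.sum((x - x.mean()) ** 2)
    b = np.sum((x - x.mean()) * (y - y.mean())) / sxx   # slope estimate
    resid = y - (y.mean() + b * (x - x.mean()))
    se = np.sqrt(resid @ resid / (n - 2) / sxx)          # standard error of slope
    return b / se > crit

def simulate_power(trend_pct, n_years=10, cv=0.40, reps=1000):
    """Fraction of replicates in which an increasing trend is declared."""
    sigma = np.sqrt(np.log(1.0 + cv ** 2))   # log-normal error with CV = 40%
    hits = 0
    for _ in range(reps):
        mean = 100.0 * (1.0 + trend_pct / 100.0) ** np.arange(n_years)
        counts = mean * rng.lognormal(-sigma ** 2 / 2, sigma, n_years)
        hits += reject_trend(counts)
    return hits / reps

power_5pct = simulate_power(5.0)       # power against a +5%/year trend
false_positive = simulate_power(0.0)   # type I error rate under no trend
```

    With only 10 years of independent data, power against a 5%/year trend is modest, consistent with the abstract's finding that 50-year series are far more powerful; adding positive autocorrelation would inflate the type I error rate above its nominal level.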

  4. Can Simple Biophysical Principles Yield Complicated Biological Functions?

    NASA Astrophysics Data System (ADS)

    Liphardt, Jan

    2011-03-01

    About once a year, a new regulatory paradigm is discovered in cell biology. As of last count, eukaryotic cells have more than 40 distinct ways of regulating protein concentration and function. Regulatory possibilities include site-specific phosphorylation, epigenetics, alternative splicing, mRNA (re)localization, and modulation of nucleo-cytoplasmic transport. This raises a simple question. Do all the remarkable things cells do, require an intricately choreographed supporting cast of hundreds of molecular machines and associated signaling networks? Alternatively, are there a few simple biophysical principles that can generate apparently very complicated cellular behaviors and functions? I'll discuss two problems, spatial organization of the bacterial chemotaxis system and nucleo-cytoplasmic transport, where the latter might be true. In both cases, the ability to precisely quantify biological organization and function, at the single-molecule level, helped to find signatures of basic biological organizing principles.

  5. Three-dimensional reconstruction of glycosomes in trypanosomatids of the genus Phytomonas.

    PubMed

    Attias, M; de Souza, W

    1995-02-01

    Computer-aided three-dimensional (3-D) reconstructions of cells from two isolates of protozoa of the genus Phytomonas, trypanosomatids found in plants, were made from 0.3-μm-thick sections imaged on a Zeiss 902 electron microscope with an energy filter for inelastically scattered electrons, in order to obtain information about glycosomal shape diversity. Direct counts of peroxisomes (glycosomes) from Phytomonas sp. from Chamaesyce thymifolia indicated that there were fewer glycosomes per cell than the simple count of ultrathin section profiles would suggest, and that these organelles could be long and branched. On the other hand, the stacked glycosomes observed in the isolate from Euphorbia characias were small individual structures, and no connection was seen between them.

  6. The supercontinuum laser as a flexible source for quasi-steady state and time resolved fluorescence studies

    NASA Astrophysics Data System (ADS)

    Fenske, Roger; Näther, Dirk U.; Dennis, Richard B.; Smith, S. Desmond

    2010-02-01

    Commercial Fluorescence Lifetime Spectrometers have long suffered from the lack of a simple, compact and relatively inexpensive broad spectral band light source that can be flexibly employed for both quasi-steady state and time resolved measurements (using Time Correlated Single Photon Counting [TCSPC]). This paper reports the integration of an optically pumped photonic crystal fibre supercontinuum source (Fianium model SC400PP) as a light source in Fluorescence Lifetime Spectrometers (Edinburgh Instruments FLS920 and Lifespec II), with single photon counting detectors (micro-channel plate photomultiplier and a near-infrared photomultiplier) covering the UV to NIR range. An innovative method of spectral selection of the supercontinuum source involving wedge interference filters is also discussed.

  7. Analysis and control of employee turnover.

    PubMed

    McConnell, Charles R

    2007-01-01

    Turnover is a relatively simple and easily described concept. However, considerable confusion often results when addressing turnover because of differences in how it is defined; that is, what is counted, how it is counted, and how the turnover rates are expressed. Turnover is also costly, although not enough attention is paid to its cost because so much of it is indirect and thus not readily visible. There are a variety of causes of turnover, some of which can be corrected and some of which cannot be avoided. Reducing or otherwise controlling turnover requires continuing management attention to its causes and constant recognition of what can and should be controlled and what cannot be controlled. Ongoing attention to turnover is an essential part of the department manager's role.

  8. The use of FDEM in hydrogeophysics: A review

    NASA Astrophysics Data System (ADS)

    Boaga, Jacopo

    2017-04-01

    Hydrogeophysics is a rapidly evolving discipline emerging from geophysical methods. Geophysical methods are nowadays able to illustrate not only the fabric and structure of the underground, but also the subsurface processes that occur within it, such as fluid dynamics and biogeochemical reactions. This is a wide and growing interdisciplinary field, specifically dedicated to revealing soil properties and monitoring processes of change due to soil/bio/atmosphere interactions. The discipline involves environmental, hydrological, and agricultural research and counts applications for several engineering purposes. The most frequently used techniques in the hydrogeophysical framework are the electric and electromagnetic methods because they are highly sensitive to soil physical properties such as texture, salinity, mineralogy, porosity and water content. Non-invasive techniques are applied in a number of problems related to the characterization of subsurface hydrology and groundwater dynamic processes. Ground-based methods, such as electrical tomography, provide considerable resolution but are difficult to extend to wider exploration purposes due to their logistical limitations. Methods that do not need electrical contact with the soil can, on the contrary, be easily applied to broad areas. Among these methods, a rapidly growing role is played by frequency domain electromagnetic (FDEM) surveying. This is thanks to improvements in multi-frequency and multi-coil instrumentation, simple time-lapse repeatability, cheap and accurate topographical referencing, and the emerging development of inversion codes. From a raw terrain apparent-conductivity meter, FDEM surveying is becoming a key tool for 3D soil characterization and dynamics observation in near-surface hydrological studies. Dozens of papers are summarized and presented here to describe the promising potential of the technique.

  9. Real-Time Airborne Gamma-Ray Background Estimation Using NASVD with MLE and Radiation Transport for Calibration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kulisek, Jonathan A.; Schweppe, John E.; Stave, Sean C.

    2015-06-01

    Helicopter-mounted gamma-ray detectors can provide law enforcement officials the means to quickly and accurately detect, identify, and locate radiological threats over a wide geographical area. The ability to accurately distinguish radiological threat-generated gamma-ray signatures from background gamma radiation in real time is essential in order to realize this potential. This problem is non-trivial, especially in urban environments, for which the background may change very rapidly during flight. This exacerbates the challenge of estimating background due to the poor counting statistics inherent in real-time airborne gamma-ray spectroscopy measurements. To address this, we have developed a new technique for real-time estimation of background gamma radiation from aerial measurements. This method is built upon the noise-adjusted singular value decomposition (NASVD) technique that was previously developed for estimating the potassium (K), uranium (U), and thorium (T) concentrations in soil post-flight. The method can be calibrated using K, U, and T spectra determined from radiation transport simulations along with basis functions, which may be determined empirically by applying maximum likelihood estimation (MLE) to previously measured airborne gamma-ray spectra. The method was applied to both measured and simulated airborne gamma-ray spectra, with and without man-made radiological source injections. Compared to schemes based on simple averaging, this technique was less sensitive to background contamination from the injected man-made sources and may be particularly useful when the gamma-ray background changes frequently during the course of the flight.
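    The NASVD step at the heart of the method can be sketched as follows. This is an illustrative reconstruction: scale spectra so Poisson noise is roughly uniform across channels, truncate the SVD, and unscale; the simulated data and component count are assumptions:

```python
import numpy as np

def nasvd_smooth(spectra, n_components=1):
    """Noise-adjusted SVD sketch: scale channels so Poisson noise is roughly
    uniform, keep the leading SVD components, then unscale. Truncation
    suppresses statistical noise while preserving shared spectral structure."""
    mean_spec = spectra.mean(axis=0)
    scale = np.sqrt(np.maximum(mean_spec, 1e-12))  # Poisson sigma ~ sqrt(mean)
    u, s, vt = np.linalg.svd(spectra / scale, full_matrices=False)
    s[n_components:] = 0.0                         # truncate to low rank
    return (u * s) @ vt * scale

# Simulated flight line: 40 spectra sharing one spectral shape with varying
# intensity, plus Poisson counting noise (illustrative only).
rng = np.random.default_rng(2)
channels = np.arange(64)
shape = np.exp(-channels / 20.0)
truth = np.outer(rng.uniform(50.0, 150.0, size=40), shape)
noisy = rng.poisson(truth).astype(float)
smoothed = nasvd_smooth(noisy, n_components=1)
```

    In the airborne setting the retained components would be related to the K, U, and T basis spectra from the radiation transport calibration rather than chosen by simple rank truncation.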

  10. Efficient statistical mapping of avian count data

    USGS Publications Warehouse

    Royle, J. Andrew; Wikle, C.K.

    2005-01-01

    We develop a spatial modeling framework for count data that is efficient to implement in high-dimensional prediction problems. We consider spectral parameterizations for the spatially varying mean of a Poisson model. The spectral parameterization of the spatial process is very computationally efficient, enabling effective estimation and prediction in large problems using Markov chain Monte Carlo techniques. We apply this model to creating avian relative abundance maps from North American Breeding Bird Survey (BBS) data. Variation in the ability of observers to count birds is modeled as spatially independent noise, resulting in over-dispersion relative to the Poisson assumption. This approach represents an improvement over existing approaches used for spatial modeling of BBS data which are either inefficient for continental scale modeling and prediction or fail to accommodate important distributional features of count data thus leading to inaccurate accounting of prediction uncertainty.
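    The computational appeal of a spectral parameterization can be illustrated with a 1-D stand-in (not the authors' model): the log of the Poisson mean is expanded in a few Fourier basis functions, and the small coefficient vector is estimated by Newton/IRLS iterations:

```python
import numpy as np

# Spectral-parameterization sketch: the log-mean of a Poisson process on a
# fine grid is described by 5 Fourier coefficients, so estimation stays cheap
# even when the prediction grid is large. All values are illustrative.
rng = np.random.default_rng(3)
x = np.linspace(0.0, 1.0, 200)
basis = np.column_stack([np.ones_like(x),
                         np.cos(2 * np.pi * x), np.sin(2 * np.pi * x),
                         np.cos(4 * np.pi * x), np.sin(4 * np.pi * x)])
true_coef = np.array([2.0, 0.8, -0.3, 0.2, 0.1])
counts = rng.poisson(np.exp(basis @ true_coef))

# Newton/IRLS iterations for the Poisson log-likelihood with log link
coef = np.zeros(5)
coef[0] = np.log(counts.mean())          # stable starting point
for _ in range(25):
    mu = np.exp(basis @ coef)
    grad = basis.T @ (counts - mu)                 # score
    hess = basis.T @ (basis * mu[:, None])         # Fisher information
    coef = coef + np.linalg.solve(hess, grad)
```

    The BBS application replaces this point estimate with Markov chain Monte Carlo sampling and adds an observer-effect term to capture the over-dispersion noted in the abstract, but the efficiency argument is the same: the parameter count follows the number of spectral coefficients, not the grid size.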

  11. STAR (Simple Targeted Arterial Rendering) Technique: a Novel and Simple Method to Visualize the Fetal Cardiac Outflow Tracts

    PubMed Central

    Yeo, Lami; Romero, Roberto; Jodicke, Cristiano; Kim, Sun Kwon; Gonzalez, Juan M.; Oggè, Giovanna; Lee, Wesley; Kusanovic, Juan Pedro; Vaisbuch, Edi; Hassan, Sonia S.

    2010-01-01

    Objective To describe a novel and simple technique (STAR: Simple Targeted Arterial Rendering) to visualize the fetal cardiac outflow tracts from dataset volumes obtained with spatiotemporal image correlation (STIC) and applying a new display technology (OmniView). Methods We developed a technique to image the outflow tracts by drawing three dissecting lines through the four-chamber view of the heart contained in a STIC volume dataset. Each line generated the following plane: 1) Line 1: ventricular septum “en face” with both great vessels (pulmonary artery anterior to the aorta); 2) Line 2: pulmonary artery with continuation into the longitudinal view of the ductal arch; and 3) Line 3: long axis view of the aorta arising from the left ventricle. The pattern formed by all 3 lines intersecting approximately through the crux of the heart resembles a “star”. The technique was then tested in 50 normal hearts (15.3 – 40.4 weeks of gestation). To determine if the technique could identify planes that departed from the normal images, we tested the technique in 4 cases with proven congenital heart defects (ventricular septal defect, transposition of great vessels, tetralogy of Fallot, and pulmonary atresia with intact ventricular septum). Results The STAR technique was able to generate the intended planes in all 50 normal cases. In the abnormal cases, the STAR technique allowed identification of the ventricular septal defect, demonstrated great vessel anomalies, and displayed views that deviated from what was expected from the examination of normal hearts. Conclusions This novel and simple technique can be used to visualize the outflow tracts and ventricular septum “en face” in normal fetal hearts. The inability to obtain expected views or the appearance of abnormal views in the generated planes should raise the index of suspicion for congenital heart disease involving the great vessels and/or the ventricular septum. 
The STAR technique may simplify examination of the fetal heart and could reduce operator dependency. PMID:20878672

  12. A New Statistics-Based Online Baseline Restorer for a High Count-Rate Fully Digital System.

    PubMed

    Li, Hongdi; Wang, Chao; Baghaei, Hossain; Zhang, Yuxuan; Ramirez, Rocio; Liu, Shitao; An, Shaohui; Wong, Wai-Hoi

    2010-04-01

    The goal of this work is to develop a novel, accurate, real-time digital baseline restorer using online statistical processing for a high count-rate digital system such as positron emission tomography (PET). In high count-rate nuclear instrumentation applications, analog signals are DC-coupled for better performance. However, the detectors, pre-amplifiers and other front-end electronics cause a signal baseline drift in a DC-coupled system, which degrades energy resolution and positioning accuracy. Event pileups normally exist in a high count-rate system, and baseline drift creates errors in the event pileup correction. Hence, a baseline restorer (BLR) is required in a high count-rate system to remove the DC drift ahead of the pileup correction. Many BLR methods have been reported, from classic analog methods to digital filter solutions. However, a single-channel analog BLR can only work below a 500 kcps count rate, and an analog front-end application-specific integrated circuit (ASIC) is normally required for applications involving hundreds of BLRs, such as a PET camera. We have developed a simple statistics-based online baseline restorer (SOBLR) for a high count-rate fully digital system. In this method, we acquire additional samples, excluding the real gamma pulses, from the existing free-running ADC in the digital system, and perform online statistical processing to generate a baseline value. This baseline value is subtracted from the digitized waveform to retrieve the original pulse with zero baseline drift. The method self-tracks the baseline without involving a micro-controller. The circuit consists of two digital counter/timers, one comparator, one register and one subtraction unit. Simulation shows a single channel works at a 30 Mcps count rate under pileup conditions. 336 baseline restorer circuits have been implemented in 12 field-programmable gate arrays (FPGAs) for our new fully digital PET system.
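
    The statistical idea behind the SOBLR can be sketched in software (a minimal illustration, not the published FPGA circuit; the threshold, window size, and function name are assumptions): samples that a comparator classifies as pulse-free are averaged into a running baseline estimate, which is then subtracted from the waveform.

```python
def restore_baseline(samples, pulse_threshold, window=1024):
    """Illustrative sketch of a statistics-based online baseline
    restorer: average only the samples a comparator deems pulse-free,
    hold the estimate in a register, and subtract it from every sample.
    Parameter names and the window size are illustrative assumptions."""
    baseline = 0.0
    count = 0          # counter: pulse-free samples accumulated so far
    acc = 0.0          # accumulator for the next baseline estimate
    restored = []
    for s in samples:
        # comparator: samples near the current baseline are treated as
        # baseline samples; large excursions are gamma pulses, excluded
        if abs(s - baseline) < pulse_threshold:
            acc += s
            count += 1
            if count == window:          # register update
                baseline = acc / window  # new self-tracked baseline
                acc = 0.0
                count = 0
        restored.append(s - baseline)    # subtraction unit
    return restored
```

    Because the pulse samples never enter the average, a slow DC drift is tracked without distorting the pulse amplitudes, which is the property the pileup correction downstream relies on.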

  13. The superiority of indium ratio over blood pool subtraction in analysis of indium-111 platelet deposition on thoraco-abdominal prosthetic grafts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ripley, S.; Wakefield, T.; Spaulding, S.

    1985-05-01

    In this investigation platelet deposition in polytetrafluoroethylene (PTFE) thoraco-abdominal grafts (TAGs) was evaluated using two different semi-quantitative techniques. Ten PTFE TAGs 6 mm in diameter and 30 cm in length were inserted into 10 mongrel dogs. One, 4 and 6 weeks after graft implantation the animals were injected with autologous In-111 platelets labelled by a modified Thakur technique. Platelet imaging in grafts was performed 48 hrs after injection. Blood pool was determined by Tc99m labelled RBCs (in vivo/in vitro technique). Semi-quantitative analysis was performed by subdividing the imaged graft into three major regions and selecting a reference region from either the native aorta or common iliac artery. Excess platelet deposition was determined by two methods: 1) the ratio of In-111 counts in the graft ROIs to the reference region and 2) the percent In-111 excess using the Tc99m blood pool subtraction technique (TBPST). Animals were sacrificed 7 weeks after implantation and radioactivity in the excised grafts was determined using a well counter. A positive correlation was found to exist between the In-111 ratio percent analysis (IRPA) and direct gamma counting (DGC) for all three segments of the prosthetic graft. Correlation coefficients for the thoracic, midsegment and abdominal segments were 0.80, 0.73 and 0.48 respectively. There was no correlation between TBPST and DGC. Using the IRPA technique the thrombogenicity of TAGs can be routinely assessed and is clinically applicable for patient use. TBPST should probably be limited to the extremities to avoid error due to free Tc99m counts from kidneys and ureters.

  14. Extending unbiased stereology of brain ultrastructure to three-dimensional volumes

    NASA Technical Reports Server (NTRS)

    Fiala, J. C.; Harris, K. M.; Koslow, S. H. (Principal Investigator)

    2001-01-01

    OBJECTIVE: Analysis of brain ultrastructure is needed to reveal how neurons communicate with one another via synapses and how disease processes alter this communication. In the past, such analyses have usually been based on single or paired sections obtained by electron microscopy. Reconstruction from multiple serial sections provides a much needed, richer representation of the three-dimensional organization of the brain. This paper introduces a new reconstruction system and new methods for analyzing in three dimensions the location and ultrastructure of neuronal components, such as synapses, which are distributed non-randomly throughout the brain. DESIGN AND MEASUREMENTS: Volumes are reconstructed by defining transformations that align the entire area of adjacent sections. Whole-field alignment requires rotation, translation, skew, scaling, and second-order nonlinear deformations. Such transformations are implemented by a linear combination of bivariate polynomials. Computer software for generating transformations based on user input is described. Stereological techniques for assessing structural distributions in reconstructed volumes are the unbiased bricking, disector, unbiased ratio, and per-length counting techniques. A new general method, the fractional counter, is also described. This unbiased technique relies on the counting of fractions of objects contained in a test volume. A volume of brain tissue from stratum radiatum of hippocampal area CA1 is reconstructed and analyzed for synaptic density to demonstrate and compare the techniques. RESULTS AND CONCLUSIONS: Reconstruction makes practicable volume-oriented analysis of ultrastructure using such techniques as the unbiased bricking and fractional counter methods. These analysis methods are less sensitive to the section-to-section variations in counts and section thickness, factors that contribute to the inaccuracy of other stereological methods. 
In addition, volume reconstruction facilitates visualization and modeling of structures and analysis of three-dimensional relationships such as synaptic connectivity.

  15. A Simple Ultrasonic Experiment Using a Phase Shift Detection Technique.

    ERIC Educational Resources Information Center

    Yunus, W. Mahmood Mat; Ahmad, Maulana

    1996-01-01

    Describes a simple ultrasonic experiment that can be used to measure the purity of liquid samples by detecting variations in the velocity of sound. Uses a phase shift detection technique that incorporates the use of logic gates and a piezoelectric transducer. (JRH)

  16. How-to-Do-It: A Simple DNA Isolation Technique Using Halophilic Bacteria.

    ERIC Educational Resources Information Center

    Guilfoile, Patrick

    1989-01-01

    Described is a simple technique for isolating DNA from halophilic bacteria. Materials, procedure, and additional experiments are outlined. It is stated that the DNA obtained will be somewhat contaminated with cellular proteins and RNA. Offers a procedure for greater purification. (RT)

  17. The calibration of photographic and spectroscopic films. The utilization of the digital image processor in the determination of aging of the surf clam (Spisula solidissima)

    NASA Technical Reports Server (NTRS)

    Peters, Kevin A.; Hammond, Ernest C., Jr.

    1987-01-01

    The age of the surf clam (Spisula solidissima) can be determined with the use of the Digital Image Processor. This technique is used in conjunction with a modified method for aging, refined by John Ropes of the Woods Hole Laboratory, Massachusetts. This method utilizes a thinned sectioned chondrophore of the surf clam which contains annual rings. The rings of the chondrophore are then counted to determine age. By digitizing the chondrophore, the Digital Image Processor is clearly able to separate these annual rings more accurately. This technique produces an easier and more efficient way to count annual rings to determine the age of the surf clam.

  18. Counting malaria parasites with a two-stage EM based algorithm using crowdsourced data.

    PubMed

    Cabrera-Bean, Margarita; Pages-Zamora, Alba; Diaz-Vilor, Carles; Postigo-Camps, Maria; Cuadrado-Sanchez, Daniel; Luengo-Oroz, Miguel Angel

    2017-07-01

    Worldwide malaria eradication is currently one of the WHO's main global goals. In this work, we focus on the use of human-machine interaction strategies for low-cost, fast and reliable malaria diagnosis based on a crowdsourced approach. The technical problem addressed consists in detecting spots in images even under very harsh conditions, where positive objects are very similar to some artifacts. The clicks or tags delivered by several annotators labeling an image are modeled as a robust finite mixture, and techniques based on the Expectation-Maximization (EM) algorithm are proposed for accurately counting malaria parasites on thick blood smears obtained by microscopic Giemsa-stained techniques. This approach outperforms other traditional methods, as is shown through experimentation with real data.
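
    The kind of finite-mixture EM fit described above can be illustrated with a minimal two-component Gaussian mixture in one dimension (an illustrative stand-in for the authors' robust mixture of annotator clicks, not their published algorithm; the initialisation and iteration count are assumptions):

```python
import math

def em_two_gaussians(xs, iters=50):
    """Minimal EM for a two-component 1-D Gaussian mixture.
    Alternates E-step (responsibilities) and M-step (re-estimation)
    until the component means, variances, and weights stabilise."""
    mu = [min(xs), max(xs)]          # crude initialisation
    var = [1.0, 1.0]
    pi = [0.5, 0.5]                  # mixture weights
    for _ in range(iters):
        # E-step: responsibility of each component for each point
        resp = []
        for x in xs:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k]))
                 for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(xs)
            mu[k] = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            var[k] = max(1e-6, sum(r[k] * (x - mu[k]) ** 2
                                   for r, x in zip(resp, xs)) / nk)
    return mu, var, pi
```

    In the crowdsourcing setting, one component would model clicks on true parasites and the other clicks on artifacts; the responsibilities then provide a soft vote that can be aggregated into a count.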

  19. γγ coincidence spectrometer for instrumental neutron-activation analysis

    NASA Astrophysics Data System (ADS)

    Tomlin, B. E.; Zeisler, R.; Lindstrom, R. M.

    2008-05-01

    Neutron-activation analysis (NAA) is an important technique for the accurate and precise determination of trace and ultra-trace elemental compositions. The application of γγ coincidence counting to NAA in order to enhance specificity was first explored over 40 years ago but has not evolved into a regularly used technique. A γγ coincidence spectrometer has been constructed at the National Institute of Standards and Technology, using two HPGe γ-ray detectors and an all-digital data-acquisition system, for the purpose of exploring coincidence NAA and its value in characterizing reference materials. This paper describes the initial evaluation of the quantitative precision of coincidence counting versus singles spectrometry, based upon a sample of neutron-irradiated bovine liver material.

  20. A new method of passive counting of nuclear missile warheads -a white paper for the Defense Threat Reduction Agency

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Morris, Christopher; Durham, J. Matthew; Guardincerri, Elena

    Cosmic ray muon imaging has been studied for the past several years as a possible technique for nuclear warhead inspection and verification as part of the New Strategic Arms Reduction Treaty between the United States and the Russian Federation. The Los Alamos team has studied two different muon imaging methods for this application, using detectors on two sides and one side of the object of interest. In this report we present results obtained on single-sided imaging of configurations aimed at demonstrating the potential of this technique for counting nuclear warheads in place with detectors above the closed hatch of a ballistic missile submarine.

  1. Determination of nitrogen in coal macerals using electron microprobe technique-experimental procedure

    USGS Publications Warehouse

    Mastalerz, Maria; Gurba, L.W.

    2001-01-01

    This paper discusses nitrogen determination with the Cameca SX50 electron microprobe using PCO as an analyzing crystal. A set of conditions with differing accelerating voltages, beam currents, beam sizes, and counting times was tested to determine the parameters that would give the most reliable nitrogen determination. The results suggest that, for the instrumentation used, an accelerating voltage of 10 kV, a beam current of 20 nA, and a counting time of 20 s provide the most reliable nitrogen determination, with a detection limit much lower than the typical concentration of this element in coal. The study demonstrates that the electron microprobe technique can be used to determine the nitrogen content of coal macerals successfully and accurately. © 2001 Elsevier Science B.V. All rights reserved.

  2. Antimicrobial activity of hydroxyl radicals generated by hydrogen peroxide photolysis against Streptococcus mutans biofilm.

    PubMed

    Nakamura, Keisuke; Shirato, Midori; Kanno, Taro; Örtengren, Ulf; Lingström, Peter; Niwano, Yoshimi

    2016-10-01

    Prevention of dental caries with maximum conservation of intact tooth substance remains a challenge in dentistry. The present study aimed to evaluate the antimicrobial effect of H2O2 photolysis on Streptococcus mutans biofilm, which may be a novel antimicrobial chemotherapy for treating caries. S. mutans biofilm was grown on disk-shaped hydroxyapatite specimens. After 1-24 h of incubation, growth was assessed by confocal laser scanning microscopy and viable bacterial counting. Resistance to antibiotics (amoxicillin and erythromycin) was evaluated by comparing bactericidal effects on the biofilm with those on planktonic bacteria. To evaluate the effect of the antimicrobial technique, the biofilm was immersed in 3% H2O2 and was irradiated with an LED at 365 nm for 1 min. Viable bacterial counts in the biofilm were determined by colony counting. The thickness and surface coverage of S. mutans biofilm increased with time, whereas viable bacterial counts plateaued after 6 h. When 12- and 24-h-old biofilms were treated with the minimum concentration of antibiotics that killed viable planktonic bacteria with 3 log reduction, their viable counts were not significantly decreased, suggesting the biofilm acquired antibiotic resistance by increasing its thickness. By contrast, hydroxyl radicals generated by photolysis of 3% H2O2 effectively killed S. mutans in 24-h-old biofilm, with greater than 5 log reduction. The technique based on H2O2 photolysis is a potentially powerful adjunctive antimicrobial chemotherapy for caries treatment. Copyright © 2016 Elsevier B.V. and International Society of Chemotherapy. All rights reserved.

  3. Design and evaluation of a nondestructive fissile assay device for HTGR fuel samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McNeany, S. R.; Knoll, R. W.; Jenkins, J. D.

    1979-02-01

    Nondestructive assay of fissile material plays an important role in nuclear fuel processing facilities. Information for product quality control, plant criticality safety, and nuclear materials accountability can be obtained from assay devices; all of this is necessary for safe, efficient, and orderly operation of a production plant. Presented here is a design description and an operational evaluation of a device developed to nondestructively assay small samples of High-Temperature Gas-Cooled Reactor (HTGR) fuel. The measurement technique consists of thermal-neutron irradiation of a sample followed by pneumatic transfer to a high-efficiency neutron detector where delayed neutrons are counted. In general, samples undergo several irradiation and count cycles during a measurement. The total number of delayed-neutron counts accumulated is translated into grams of fissile mass through comparison with the counts accumulated in an identical irradiation and count sequence of calibration standards. Successful operation of the device through many experiments over a one-year period indicates high operational reliability. Tests of assay precision show it to be better than 0.25% for 10-min measurements. Assay biases may be encountered if calibration standards are not representative of unknown samples, but reasonable care in construction and control of standards should lead to no more than 0.2% bias in the measurements. Nondestructive fissile assay of HTGR fuel samples by thermal-neutron irradiation and delayed-neutron detection has been demonstrated to be a rapid and accurate analysis technique. However, careful attention must be given to calibration standards to ensure that they remain representative of unknown samples.

  4. Techniques and clinical effect of aseptic procedures on patients with acute leukemia in laminar airflow rooms.

    PubMed

    Takeo, H; Sakurai, T; Amaki, I

    1983-01-01

    The techniques of aseptic procedures in the laminar airflow room (LAF) were evaluated in 110 adult patients undergoing antileukemic chemotherapy for remission induction. The patients were divided into three groups according to the regimens: Group A, consisting of 20 patients who stayed in the LAF and received the gown technique + sterile food + prophylactic oral and topical antibiotics; Group B, consisting of 12 patients who stayed in the LAF and received sterile food + prophylactic oral antibiotics; and Group C, consisting of 78 patients in open wards, who received prophylactic oral antibiotics alone. Species and numbers of microorganisms on the skin surface were far less in the patients in Group A than in those in Group B. Airborne microorganisms were counted by the air sampling method. No microorganisms could be detected at the time of the patient's rest and of blood collection in either Group A or B. Electrocardiography and X-ray examination caused an increase in the number of colonies to more than one colony in Group B, but Group A had a count of less than 0.5 colony. The colony counts became negative within 5 min after the cessation of each operation. The percentage of febrile days for patients with a peripheral granulocyte count of less than 100/microliter was 29% in Group A, 21% in Group B and 44% in Group C. The incidence of documented infections during the total hospital stay was 25% (5/20), 42% (5/12) and 86% (67/78), respectively. The aseptic procedures in Group B were not as strict as in Group A, but the incidence of infections in Group B was significantly lower than in Group C.

  5. A new model to predict weak-lensing peak counts. II. Parameter constraint strategies

    NASA Astrophysics Data System (ADS)

    Lin, Chieh-An; Kilbinger, Martin

    2015-11-01

    Context. Peak counts have been shown to be an excellent tool for extracting the non-Gaussian part of the weak lensing signal. Recently, we developed a fast stochastic forward model to predict weak-lensing peak counts. Our model is able to reconstruct the underlying distribution of observables for analysis. Aims: In this work, we explore and compare various strategies for constraining a parameter using our model, focusing on the matter density Ωm and the density fluctuation amplitude σ8. Methods: First, we examine the impact from the cosmological dependency of covariances (CDC). Second, we perform the analysis with the copula likelihood, a technique that makes a weaker assumption than does the Gaussian likelihood. Third, direct, non-analytic parameter estimations are applied using the full information of the distribution. Fourth, we obtain constraints with approximate Bayesian computation (ABC), an efficient, robust, and likelihood-free algorithm based on accept-reject sampling. Results: We find that neglecting the CDC effect enlarges parameter contours by 22% and that the covariance-varying copula likelihood is a very good approximation to the true likelihood. The direct techniques work well in spite of noisier contours. Concerning ABC, the iterative process converges quickly to a posterior distribution that is in excellent agreement with results from our other analyses. The time cost for ABC is reduced by two orders of magnitude. Conclusions: The stochastic nature of our weak-lensing peak count model allows us to use various techniques that approach the true underlying probability distribution of observables, without making simplifying assumptions. Our work can be generalized to other observables where forward simulations provide samples of the underlying distribution.
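
    The accept-reject ABC loop mentioned in the Methods can be sketched on a toy problem (inferring a Gaussian mean from a sample mean; the prior, tolerance, and summary statistic are illustrative assumptions, not the paper's peak-count setup):

```python
import random

def abc_rejection(observed, simulate, prior_sample, distance, eps,
                  n_draws=10000):
    """Minimal accept-reject ABC: draw parameters from the prior,
    forward-simulate a summary statistic, and keep only draws whose
    simulated summary lies within eps of the observed one. The kept
    draws approximate the posterior without evaluating a likelihood."""
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample()
        if distance(simulate(theta), observed) < eps:
            accepted.append(theta)
    return accepted

# Toy setup (assumed, for illustration): recover the mean of a
# unit-variance Gaussian from the sample mean of 50 observations.
random.seed(0)
data = [random.gauss(3.0, 1.0) for _ in range(50)]
obs_mean = sum(data) / len(data)

def simulate(theta):
    xs = [random.gauss(theta, 1.0) for _ in range(50)]
    return sum(xs) / len(xs)

posterior = abc_rejection(
    observed=obs_mean,
    simulate=simulate,
    prior_sample=lambda: random.uniform(0.0, 6.0),  # flat prior
    distance=lambda a, b: abs(a - b),
    eps=0.1,
)
```

    In the peak-count application, `simulate` would be the stochastic forward model of the paper and the summary would be the peak-count statistic; the accept-reject structure is unchanged.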

  6. Chroma key without color restrictions based on asynchronous amplitude modulation of background illumination on retroreflective screens

    NASA Astrophysics Data System (ADS)

    Vidal, Borja; Lafuente, Juan A.

    2016-03-01

    A simple technique to avoid color limitations in image capture systems based on chroma key video composition using retroreflective screens and light-emitting diode (LED) rings is proposed and demonstrated. The combination of an asynchronous temporal modulation of the background illumination and simple image processing removes the usual restrictions on foreground colors in the scene. The technique removes technical constraints on stage composition, allowing its design to be based purely on artistic grounds. Since it only requires adding a very simple electronic circuit to widely used chroma keying hardware based on retroreflective screens, the technique is easily applicable to TV and filming studios.

  7. [Left ventricular volume determination by first-pass radionuclide angiocardiography using a semi-geometric count-based method].

    PubMed

    Kinoshita, S; Suzuki, T; Yamashita, S; Muramatsu, T; Ide, M; Dohi, Y; Nishimura, K; Miyamae, T; Yamamoto, I

    1992-01-01

    A new radionuclide technique for the calculation of left ventricular (LV) volume by the first-pass (FP) method was developed and examined. Using a semi-geometric count-based method, the LV volume can be measured by the following equations: CV = CM/(L/d) and V = (CT/CV) × d³ = (CT/CM) × L × d², where V = LV volume, CV = voxel count, CM = the maximum LV count, CT = the total LV count, L = the LV depth where the maximum count was obtained, and d = pixel size. This theorem was applied to FP LV images obtained in the 30-degree right anterior oblique position. Frame-mode acquisition was performed and the LV end-diastolic maximum count and total count were obtained. The maximum LV depth was obtained as the maximum width of the LV on the FP end-diastolic image, using the assumption that the LV cross-section is circular. These values were substituted into the above equations and the LV end-diastolic volume (FP-EDV) was calculated. A routine equilibrium (EQ) study was done, and the end-diastolic maximum count and total count were obtained. The LV maximum depth was measured on the FP end-diastolic frame as the maximum length of the LV image. Using these values, the EQ-EDV was calculated and the FP-EDV was compared to the EQ-EDV. The correlation coefficient for these two values was r = 0.96 (n = 23, p less than 0.001), and the standard error of the estimated volume was 10 ml. (ABSTRACT TRUNCATED AT 250 WORDS)
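
    The count-based volume relations, CV = CM/(L/d) and V = (CT/CV)·d³ = (CT/CM)·L·d², can be evaluated directly; a minimal sketch (variable names are illustrative, not from the paper):

```python
def lv_volume(total_count, max_count, depth_cm, pixel_cm):
    """Semi-geometric count-based LV volume:
    voxel count CV = CM / (L / d), then
    V = (CT / CV) * d**3, which equals (CT / CM) * L * d**2."""
    voxel_count = max_count / (depth_cm / pixel_cm)        # CV
    return (total_count / voxel_count) * pixel_cm ** 3     # V, in ml
```

    For example, with CT = 10000, CM = 200, L = 8 cm, and d = 0.5 cm, both forms give (10000/200) × 8 × 0.25 = 100 ml.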

  8. A Simple ELISA Exercise for Undergraduate Biology.

    ERIC Educational Resources Information Center

    Baker, William P.; Moore, Cathy R.

    Understanding of immunological techniques such as the Enzyme Linked Immuno Sorbent Assay (ELISA) is an important part of instructional units in human health, developmental biology, microbiology, and biotechnology. This paper describes a simple ELISA exercise for undergraduate biology that effectively simulates the technique using a paper model.…

  9. Fourier Spectroscopy: A Simple Analysis Technique

    ERIC Educational Resources Information Center

    Oelfke, William C.

    1975-01-01

    Presents a simple method of analysis in which the student can integrate, point by point, any interferogram to obtain its Fourier transform. The manual technique requires no special equipment and is based on relationships that most undergraduate physics students can derive from the Fourier integral equations. (Author/MLH)
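
    The manual point-by-point integration described above amounts to a discrete cosine transform of the interferogram, which can be sketched as follows (an illustration of the underlying relationship, not the article's worksheet; names and sampling are assumptions):

```python
import math

def spectrum_from_interferogram(interferogram, dx, wavenumbers):
    """Point-by-point numerical Fourier (cosine) transform of a
    symmetric interferogram I(x):
        B(k) ≈ sum_n I(x_n) * cos(2*pi*k*x_n) * dx
    Each spectral point is the hand-integrable sum a student would
    accumulate term by term from the interferogram samples."""
    out = []
    for k in wavenumbers:
        s = sum(I * math.cos(2 * math.pi * k * n * dx)
                for n, I in enumerate(interferogram))
        out.append(s * dx)
    return out
```

    Feeding in a pure-cosine interferogram of wavenumber k₀ yields a spectrum that peaks at k = k₀ and vanishes elsewhere, which is the check a student can perform by hand.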

  10. A method for the in vivo measurement of americium-241 at long times post-exposure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Neton, J.W.

    1988-01-01

    This study investigated an improved method for the quantitative measurement, calibration and calculation of ²⁴¹Am organ burdens in humans. The techniques developed correct for cross-talk, i.e., count-rate contributions from surrounding and adjacent organ burdens, and assure the proper assignment of activity to the lungs, liver and skeleton. In order to predict the net count rates for the measurement geometries of the skull, liver and lung, a background prediction method was developed. This method utilizes data obtained from the measurement of a group of control subjects. Based on these data, a linear prediction equation was developed for each measurement geometry. In order to correct for the cross-contributions among the various deposition loci, a series of surrogate human phantom structures was measured. The results of measurements of ²⁴¹Am depositions in six exposure cases have been evaluated using these new techniques and indicate that lung burden estimates could be in error by as much as 100 percent when corrections are not made for contributions to the count rate from other organs.

  11. Effect of surgical hand scrub time on subsequent bacterial growth.

    PubMed

    Wheelock, S M; Lookinland, S

    1997-06-01

    In this experimental study, the researchers evaluated the effect of surgical hand scrub time on subsequent bacterial growth and assessed the effectiveness of the glove juice technique in a clinical setting. In a randomized crossover design, 25 perioperative staff members scrubbed for two or three minutes in the first trial and vice versa in the second trial, after which they wore sterile surgical gloves for one hour under clinical conditions. The researchers then sampled the subjects' nondominant hands for bacterial growth, cultured aliquots from the sampling solution, and counted microorganisms. Scrubbing for three minutes produced lower mean log bacterial counts than scrubbing for two minutes. Although the mean bacterial count differed significantly (P = .02) between the two-minute and three-minute surgical hand scrub times, the difference fell below 0.5 log, which is the threshold for practical and clinical significance. This finding suggests that a two-minute surgical hand scrub is clinically as effective as a three-minute surgical hand scrub. The glove juice technique demonstrated sensitivity and reliability in enumerating bacteria on the hands of perioperative staff members in a clinical setting.

  12. Determinants of personal exposure to PM2.5, ultrafine particle counts, and CO in a transport microenvironment.

    PubMed

    Kaur, S; Nieuwenhuijsen, M J

    2009-07-01

    Short-term human exposure concentrations to PM2.5, ultrafine particle counts (particle range: 0.02-1 microm), and carbon monoxide (CO) were investigated at and around a street canyon intersection in Central London, UK. During a four-week field campaign, groups of four volunteers collected samples at three timings (morning, lunch, and afternoon), along two different routes (a heavily trafficked route and a backstreet route) via five modes of transport (walking, cycling, bus, car, and taxi). This was followed by an investigation into the determinants of exposure using a regression technique which incorporated the site-specific traffic counts, meteorological variables (wind speed and temperature) and the mode of transport used. The analyses explained 9, 62, and 43% of the variability observed in the exposure concentrations to PM2.5, ultrafine particle counts, and CO, respectively. The mode of transport was a statistically significant determinant of personal exposure to PM2.5, ultrafine particle counts, and CO, and for PM2.5 and ultrafine particle counts it was the most important determinant. Traffic count explained little of the variability in the PM2.5 concentrations, but it had a greater influence on ultrafine particle count and CO concentrations. The analyses showed that temperature had a statistically significant impact on ultrafine particle count and CO concentrations. Wind speed also had a statistically significant, though smaller, effect. The small proportion of variability explained for PM2.5, compared with the larger proportions for ultrafine particle counts and CO, may be due to the effect of long-range transboundary sources, whereas for ultrafine particle counts and CO local traffic is the main source.

  13. Effect of face washing with soap and water and cleaning with antiseptics on upper-lid bacteria of surgical eye patients.

    PubMed

    Bekibele, Charles O; Kehinde, Aderemi O; Ajayi, Benedictus G K

    2010-12-01

    To determine the effect of face washing with soap and water and cleaning with povidone iodine and cetrimide/chlorhexidine gluconate (Savlon) on upper-lid bacteria. Prospective, nonrandomized clinical trial. Eighty patients attending the Eye Clinic, University College Hospital, Ibadan, Nigeria. Eighty patients assigned to 4 groups had swabs of the upper eyelid skin taken before and after face wash with soap and water, and cleansing with Savlon and 5% povidone iodine. Specimens were cultured and Gram stained. Bacterial counts were carried out using standard techniques. Face washing with soap and water increased the proportion of patients with bacterial isolates from 80.0% to 87.5%. The average colony count increased from 187.1 to 318.5 colony units per mL (p = 0.02). Application of 5% povidone iodine without face washing with soap and water reduced the proportion of patients with bacterial isolates from 82.6% (mean count 196.5) to 28.6% (mean count 34.1)(p = 0.001); in comparison, the application of 5% povidone iodine after face washing with soap and water reduced the proportion from 71.4% (mean count 133.9) to 40.0% (mean count 69.0)(p = 0.01). Application of Savlon without face washing with soap and water reduced the proportion of patients with bacterial isolates from 100% (mean count 310.9) to 41.2% (mean count 19.8)(p = 0.004) compared with the application after face washing, which reduced the proportion from 89.5% (mean count 240.3) to 41.2% (mean count 82.9)(p = 0.02). Both povidone and Savlon are effective in reducing periocular bacteria in an African setting. Prior face washing with soap and water had no added benefit in reducing bacterial colony count.

  14. A stitch in time saves nine: suture technique does not affect intestinal growth in a young, growing animal model.

    PubMed

    Gurien, Lori A; Wyrick, Deidre L; Smith, Samuel D; Maxson, R Todd

    2016-05-01

    Although this issue remains unexamined, pediatric surgeons commonly use simple interrupted sutures for bowel anastomosis, as this is thought to improve intestinal growth postoperatively compared to continuous running sutures. However, effects on intestinal growth are unclear. We compared intestinal growth using different anastomotic techniques during the postoperative period in young rats. Young, growing rats underwent small bowel transection and anastomosis using either the simple interrupted or the continuous running technique. At 7 weeks postoperatively, after a four-fold growth, the anastomotic site was resected. Diameters and burst pressures were measured. Thirteen rats underwent anastomosis with the simple interrupted technique and sixteen with the continuous running method. No differences were found in body weight at the first (102.46 vs 109.75 g) or second operation (413.85 vs 430.63 g). Neither the diameters (0.69 vs 0.79 cm) nor the burst pressures were statistically different, although the calculated circumference was smaller in the simple interrupted group (2.18 vs 2.59 cm; p = 0.03). No ruptures occurred at the anastomotic line. This pilot study is the first to compare continuous running to simple interrupted intestinal anastomosis in a pediatric model, and it showed no difference in growth. Adopting continuous running techniques for bowel anastomosis in young children may lead to faster operative time without affecting intestinal growth. Copyright © 2016 Elsevier Inc. All rights reserved.

  15. Determination of allergenic load and pollen count of Cupressus arizonica pollen by flow cytometry using Cup a1 polyclonal antibody.

    PubMed

    Benítez, Francisco Moreno; Camacho, Antonio Letrán; Del Cuvillo Bernal, Alfonso; de Medina, Pedro Lobatón Sánchez; Cózar, Francisco J García; Romeu, Ma Luisa Espinazo

    2013-07-10

    Background: There is an increase in the incidence of pollen-related allergy, so information on pollen schedules would be a great asset for physicians to improve the clinical care of patients. Cypress pollen sensitization shows a high prevalence among the causes of allergic rhinitis, making it a useful study model for distinguishing pollen count from allergenic load. In this work, we used a flow cytometry-based technique to obtain both the Cupressus arizonica pollen count and the allergenic load, using a specific rabbit polyclonal antibody against Cup a1, and compared it with measurement by optical microscopy. Methods: Airborne samples were collected with Burkard Spore-Trap and Burkard Cyclone samplers. Cupressus arizonica pollen was studied using the specific rabbit polyclonal antibody Cup a1, labelled with AlexaFluor® 488 or 750, and analysed by flow cytometry on both EPICS XL and Cyan ADP cytometers (Beckman Coulter®). The optical microscopy study was performed with a Leica optical microscope. Bland-Altman analysis was used to determine agreement between the two techniques. Results: We identified three different populations based on rabbit polyclonal antibody Cup a1 staining. The main region (44.5% of events) showed 97.3% antibody recognition, a second region (25%) showed 28%, and a third region (30.5%) showed 68%. Immunofluorescence and confocal microscopy showed that the main region corresponds to whole pollen grains, the second region to pollen without exine, and the third region to smaller particles with allergenic properties. The pollen schedule showed a high correlation between optical microscopy and flow cytometry for pollen counts (p = 0.0008E-2) and for smaller particles (p = 0.0002), and Bland-Altman analysis showed good agreement between the two techniques (p = 0.0003). 
Conclusion: Determination of pollen count and allergenic load by flow cytometry represents an important tool in the determination of airborne respiratory allergens. We showed that not only whole pollen but also smaller particles could induce allergic sensitization. This is the first study in which flow cytometry is used to calculate pollen counts and allergenic load. © 2013 Clinical Cytometry Society.
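    The Bland-Altman agreement analysis named in this record reduces to computing the bias (mean paired difference) and the 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch in Python; the paired pollen counts are hypothetical illustrative values, not the study's measurements:

    ```python
    import math

    def bland_altman(x, y):
        """Bias and 95% limits of agreement for paired measurements x and y."""
        diffs = [a - b for a, b in zip(x, y)]
        n = len(diffs)
        bias = sum(diffs) / n
        sd = math.sqrt(sum((d - bias) ** 2 for d in diffs) / (n - 1))
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # Hypothetical daily pollen counts (grains/m^3) from the two techniques:
    microscopy = [120, 300, 45, 210, 80, 500, 60, 150]
    cytometry  = [115, 310, 50, 200, 85, 480, 65, 160]

    bias, lo, hi = bland_altman(microscopy, cytometry)
    ```

    Good agreement corresponds to a bias near zero with most paired differences falling inside the limits.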

  16. Effect of a dual inlet channel on cell loading in microfluidics.

    PubMed

    Yun, Hoyoung; Kim, Kisoo; Lee, Won Gu

    2014-11-01

    Unwanted sedimentation and attachment of cells onto the channel bottom often occur at the relatively large-scale inlets of conventional microfluidic channels as a result of gravity and fluid shear. Sedimentation is a recognized problem that must be accounted for to perform microfluidic experiments properly, for example when calculating a meaningful output efficiency with respect to the real input. Here, we present a dual-inlet design method for reducing cell loss at the channel inlet by adding a new "upstream inlet" to a single main inlet design. The simple addition of an upstream inlet creates a vertically layered sheath flow prior to the main inlet for cell loading. The bottom layer flow plays a critical role in preventing cells from attaching to the bottom of the channel entrance, resulting in a low probability of cell sedimentation at the main channel entrance. As proof-of-concept validation, we applied our design to a microfabricated flow cytometer system (μFCS) and compared the cell counting efficiency of the proposed μFCS with that of the previous single-inlet μFCS and a conventional FCS. We used human white blood cells and fluorescent microspheres to quantitatively evaluate the rate of cell sedimentation in the main inlet and to measure fluorescence sensitivity at the detection zone of the flow cytometer microchip. Generating a sheath flow as the bottom layer reduced the depth of field as well as the relative deviation of targets in the z-direction (compared to the x-y flow plane), leading to an increased counting sensitivity of fluorescent detection signals. Counting results using fluorescent microspheres showed both a 40% reduction in the rate of sedimentation and a 2-fold higher sensitivity in comparison with the single-inlet μFCS. 
The results of CD4(+) T-cell counting also showed that the proposed design results in a 25% decrease in the rate of cell sedimentation and a 28% increase in sensitivity when compared to the single-inlet μFCS. This method is simple and easy to use in design, yet requires no additional time or cost in fabrication. Furthermore, we expect that this approach could potentially be helpful for calculating exact cell loading and counting efficiency for a small input number of cells, such as primary cells and rare cells, in microfluidic channel applications.

  17. Prior automatic posture and activity identification improves physical activity energy expenditure prediction from hip-worn triaxial accelerometry.

    PubMed

    Garnotel, M; Bastian, T; Romero-Ugalde, H M; Maire, A; Dugas, J; Zahariev, A; Doron, M; Jallon, P; Charpentier, G; Franc, S; Blanc, S; Bonnet, S; Simon, C

    2018-03-01

    Accelerometry is increasingly used to quantify physical activity (PA) and related energy expenditure (EE). Linear regression models designed to derive PAEE from accelerometry counts have shown their limits, mostly due to the lack of consideration of the nature of the activities performed. Here we tested whether a model coupling an automatic activity/posture recognition (AAR) algorithm with an activity-specific count-based model, developed in 61 subjects in laboratory conditions, improved PAEE and total EE (TEE) predictions from a hip-worn triaxial accelerometer (Actigraph GT3X+) in free-living conditions. Data from two independent subject groups of varying body mass index and age were considered: 20 subjects engaged in a 3-h urban circuit, with activity-by-activity reference PAEE from combined heart-rate and accelerometry monitoring (Actiheart); and 56 subjects involved in a 14-day trial, with PAEE and TEE measured using the doubly labeled water method. PAEE was estimated from accelerometry using the activity-specific model coupled to the AAR algorithm (AAR model), a simple linear model (SLM), and the equations provided by the companion software of the activity devices used (Freedson and Actiheart models). AAR-model predictions were in closer agreement with the selected references than those from the other count-based models, both for PAEE during the urban circuit (RMSE = 6.19 vs 7.90 for SLM and 9.62 kJ/min for Freedson) and for EE over the 14-day trial, reaching Actiheart performance in the latter (PAEE: RMSE = 0.93 vs. 1.53 for SLM, 1.43 for Freedson, 0.91 MJ/day for Actiheart; TEE: RMSE = 1.05 vs. 1.57 for SLM, 1.70 for Freedson, 0.95 MJ/day for Actiheart). Overall, the AAR model resulted in a 43% increase in the daily PAEE variance explained by the accelerometry predictions. 
NEW & NOTEWORTHY Although triaxial accelerometry is widely used in free-living conditions to assess the impact of physical activity energy expenditure (PAEE) on health, its precision and accuracy are often debated. Here we developed and validated an activity-specific model which, coupled with an automatic activity-recognition algorithm, improved the variance explained by the predictions from accelerometry counts by 43% of daily PAEE compared with models relying on a simple relationship between accelerometry counts and EE.
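    The model comparison in this record hinges on RMSE between predicted and reference PAEE. A minimal sketch of that metric in Python; the PAEE series below are hypothetical kJ/min values, constructed only so that the activity-specific model tracks the reference more closely than a single linear count model, mirroring the reported ordering:

    ```python
    import math

    def rmse(pred, ref):
        """Root-mean-square error between predicted and reference values."""
        return math.sqrt(sum((p - r) ** 2 for p, r in zip(pred, ref)) / len(ref))

    # Hypothetical minute-level PAEE (kJ/min), illustrative only:
    reference = [4.0, 6.0, 9.0, 5.0, 7.0]
    aar_model = [4.2, 5.8, 9.5, 5.1, 6.8]   # activity-specific (AAR-coupled) predictions
    simple_lm = [5.5, 5.0, 7.0, 6.5, 8.5]   # single linear count-to-EE predictions
    ```

    Comparing `rmse(aar_model, reference)` against `rmse(simple_lm, reference)` reproduces, in miniature, the kind of head-to-head evaluation reported above.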

  18. Radioisotope penile plethysmography: A technique for evaluating corpora cavernosal blood flow during early tumescence

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz, A.N.; Graham, M.M.; Ferency, G.F.

    1989-04-01

    Radioisotope penile plethysmography is a nuclear medicine technique which assists in the evaluation of patients with erectile dysfunction. This technique attempts to noninvasively quantitate penile corpora cavernosal blood flow during early penile tumescence using technetium-99m-labeled red blood cells. Penile images and counts were acquired in a steady-state blood-pool phase prior to and after the administration of intracorporal papaverine. Penile counts, images, and time-activity curves were computer analyzed in order to determine peak corporal flow and volume changes. Peak corporal flow rates were compared to arterial integrity (determined by angiography) and venosinusoidal corporal leak (determined by cavernosometry). Peak corporal flow correlated well with arterial integrity (r = 0.91) but did not correlate with venosinusoidal leak parameters (r = 0.01). This report focuses on the methodology and the assumptions which form the foundation of this technique. The strong correlation of peak corporal flow and angiography suggests that radioisotope penile plethysmography could prove useful in the evaluation of arterial inflow disorders in patients with erectile dysfunction.
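    The correlations quoted in this record (r = 0.91 and r = 0.01) are Pearson coefficients between peak corporal flow and the reference measures. A minimal sketch of the computation in Python; the paired values are hypothetical illustrative data, not the study's measurements:

    ```python
    import math

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = math.sqrt(sum((a - mx) ** 2 for a in x))
        sy = math.sqrt(sum((b - my) ** 2 for b in y))
        return cov / (sx * sy)

    # Hypothetical paired scores: peak corporal flow vs. an angiographic
    # arterial-integrity score for five patients (perfectly linear here).
    peak_flow = [1.0, 2.0, 3.0, 4.0, 5.0]
    angio = [2.0, 4.0, 6.0, 8.0, 10.0]

    r = pearson_r(peak_flow, angio)
    ```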

  19. Multiphoton entanglement concentration and quantum cryptography.

    PubMed

    Durkin, Gabriel A; Simon, Christoph; Bouwmeester, Dik

    2002-05-06

    Multiphoton states from parametric down-conversion can be entangled both in polarization and photon number. Maximal high-dimensional entanglement can be concentrated postselectively from these states via photon counting. This makes them natural candidates for quantum key distribution, where the presence of more than one photon per detection interval has up to now been considered undesirable. We propose a simple multiphoton cryptography protocol for the case of low losses.

  20. Bird-habitat relationships in subalpine riparian shrublands of the Central Rocky Mountains

    Treesearch

    Deborah M. Finch

    1987-01-01

    Breeding birds were counted in 1982, 1983, and 1984 using the spot-map method on seven 8.1-ha plots in the Medicine Bow National Forest, Wyoming. At elevations of 2,280 to 3,000 m, riparian habitats were structurally simple, dominated by one or more bush willow species. Subalpine riparian avifaunas were depauperate, with only four abundant species - song sparrow, white-...
